US20210402288A1 - Information processing system, information processing device, controller device and accessory - Google Patents

Information processing system, information processing device, controller device and accessory

Info

Publication number
US20210402288A1
Authority
US
United States
Prior art keywords: controller, main unit, attached, information processing, input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/369,672
Inventor
Yoshiaki Koizumi
Yui Ehara
Munetaka Nishikawa
Kochi Kawai
Kenichi Mae
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nintendo Co Ltd filed Critical Nintendo Co Ltd
Priority to US17/369,672
Publication of US20210402288A1
Legal status: Abandoned (current)



Classifications

    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/22 Setup operations, e.g. calibration, key configuration or button assignment
    • A63F13/235 Input arrangements for video game devices for interfacing with the game device using a wireless connection, e.g. infrared or piconet
    • A63F13/25 Output arrangements for video game devices
    • A63F13/26 Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F13/31 Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • A63F13/843 Special adaptations for executing a specific game genre or game mode involving concurrently two or more players on the same game device, e.g. requiring the use of a plurality of controllers or of a specific view of game data for each player
    • A63F13/92 Video game devices specially adapted to be hand-held while playing
    • A63F13/98 Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers
    • G06F1/1607 Arrangements to support accessories mechanically attached to the display housing
    • G06F1/1671 Special purpose buttons or auxiliary keyboards, e.g. retractable mini keypads, keypads or buttons that remain accessible at closed laptop
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338 Pointing devices with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/1423 Digital output to display device; cooperation and interconnection of the display device with other functional units; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G08C19/00 Electric signal transmission systems
    • A63F2300/1075 Input arrangements for converting player-generated signals into game device control signals, specially adapted to detect the point of contact of the player on a surface using a touch screen
    • A63F2300/204 Details of the game platform, the platform being a handheld device
    • G06F1/1669 Detachable keyboards
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • the present technique relates to an information processing system, an information processing device, a controller device and an accessory.
  • an information processing device can be used in different modes.
  • the present application discloses an information processing device that can be used in different modes. Also, the present application discloses a novel information processing device.
  • An example information processing system described herein includes a main unit, a first controller device and a second controller device.
  • the main unit includes a display.
  • the first controller device is configured to be removably attached to the main unit, and the first controller device is configured to transmit first operation data representing an operation performed on the first controller device to the main unit, irrespective of whether the first controller device is attached to the main unit.
  • the second controller device is configured to be removably attached to the main unit, and the second controller device is configured to transmit second operation data representing an operation performed on the second controller device to the main unit, irrespective of whether the second controller device is attached to the main unit.
  • the main unit is configured to display, on the display, an execution result of an information process based on the first operation data transmitted from the first controller device and the second operation data transmitted from the second controller device.
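  • By way of illustration only, the following sketch (in Python, with hypothetical class and function names not taken from the present disclosure) shows how a main unit might combine the first operation data and the second operation data, received irrespective of whether the controllers are attached, and display the result of the information process.

```python
# Illustrative sketch only; all names are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class OperationData:
    stick_x: float = 0.0   # directional input, -1.0 .. 1.0
    stick_y: float = 0.0
    buttons: int = 0       # bitmask of pressed buttons

class MainUnit:
    def __init__(self, display, game):
        self.display = display   # assumed object exposing show(frame)
        self.game = game         # assumed object exposing update(left, right)
        self.latest = {"left": OperationData(), "right": OperationData()}

    def on_operation_data(self, controller_id: str, data: OperationData) -> None:
        # Called for every report, whether it arrived over the mated
        # terminals (controller attached) or over wireless (detached).
        self.latest[controller_id] = data

    def step(self) -> None:
        # Run one step of the information process and show the result.
        frame = self.game.update(self.latest["left"], self.latest["right"])
        self.display.show(frame)
```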
  • the first controller device may include a first input section and a second input section.
  • the second controller device may include a third input section of the same type as the first input section and a fourth input section of the same type as the second input section.
  • the first input section may have substantially the same input mechanism as the third input section.
  • the second input section may have substantially the same input mechanism as the fourth input section.
  • the first input section may have substantially the same shape as the third input section.
  • the second input section may have substantially the same shape as the fourth input section.
  • a positional relationship between the first input section and the second input section of the first controller device placed in a predetermined orientation may be the same as a positional relationship between the third input section and the fourth input section of the second controller device placed in a predetermined orientation.
  • a positional relationship between the first input section and the second input section may be opposite from a positional relationship between the third input section and the fourth input section.
  • the first input section and the third input section may each be a directional input section configured to accept a directional input.
  • the directional input section may include an operation member configured to be tilted or slid in a predetermined direction.
  • the second input section and the fourth input section may each be a button which can be pressed.
  • the first controller device may be configured to be integrally attached to the main unit with a predetermined surface of a housing of the first controller device facing a predetermined surface of the main unit.
  • the first controller device may be configured to be integrally attached to the main unit so as to be facing one of a left side surface and a right side surface of the main unit.
  • the second controller device may be configured to be attached to the main unit so as to be facing the other one of the left side surface and the right side surface of the main unit.
  • the first controller device may include a light-emitting portion on the predetermined surface for notifying a user of predetermined information.
  • the first controller device may include an operation section on the predetermined surface.
  • a connecting portion between a first side surface of four side surfaces of the first controller device and a side surface adjacent to the first side surface may have a more rounded shape than a connecting portion between a second side surface of the four side surfaces, which is opposite from the first side surface, and a side surface adjacent to the second side surface.
  • a connecting portion between a third side surface of four side surfaces of the second controller device and a side surface adjacent to the third side surface may have a more rounded shape than a connecting portion between a fourth side surface of the four side surfaces, which is opposite from the third side surface, and a side surface adjacent to the fourth side surface.
  • the first controller device may be attached to the main unit with the second side surface of the first controller device facing a fifth side surface of four side surfaces of the main unit.
  • the second controller device may be attached to the main unit with the fourth side surface of the second controller device facing a sixth side surface, which is opposite from the fifth side surface, of the main unit.
  • the second controller device may include an input section having a first function, the first controller device not having the first function.
  • the second controller device may include an image-capturing device as an input section having the first function.
  • the second controller device may include a button as an input section having the first function.
  • the first controller device may include an input section having a second function different from the first function.
  • the second controller device may include an input section having the second function.
  • the first controller device may include one or more input sections having a predetermined number of types of functions.
  • the second controller device may include one or more input sections having a number of types of functions, the number being different from the predetermined number.
  • Communication between the main unit and the first controller device when the first controller device is attached to the main unit may use a first communication scheme, and communication between the main unit and the first controller device when the first controller device is detached from the main unit may use a second communication scheme different from the first communication scheme.
  • Communication between the main unit and the first controller device when the first controller device is detached from the main unit may be wireless communication.
  • Communication between the main unit and the first controller device when the first controller device is attached to the main unit may be wired communication.
  • the wired communication between the main unit and the first controller device is communication through a wired communication channel formed by an electrical connection between a first terminal of the main unit and a second terminal of the first controller device.
  • a first terminal of the main unit and a second terminal of the first controller device may be electrically connected to each other, and communication between the main unit and the first controller device and power supply from the main unit to the first controller device may be implemented through the first terminal and the second terminal.
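  • The communication-scheme switching described above can be illustrated with the following sketch, in which a controller sends its operation data over the mated terminals while attached and over a wireless link while detached; the driver objects and method names are assumptions for illustration and are not taken from the specification.

```python
# Illustrative sketch only; hypothetical names.
class ControllerLink:
    def __init__(self, wired_port, radio):
        self.wired_port = wired_port   # assumed driver for the controller-side terminal
        self.radio = radio             # assumed driver for the wireless transceiver
        self.attached = False

    def on_attach_changed(self, attached: bool) -> None:
        self.attached = attached
        if attached:
            self.radio.disable()       # terminals also carry power supply while attached
        else:
            self.radio.enable()

    def send_operation_data(self, payload: bytes) -> None:
        # First communication scheme (wired) while attached,
        # second communication scheme (wireless) while detached.
        if self.attached:
            self.wired_port.write(payload)
        else:
            self.radio.transmit(payload)
```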
  • the information processing system may include: a first sensor configured to sense attachment of the first controller device to the main unit; and a second sensor configured to sense attachment of the second controller device to the main unit.
  • the main unit may register, as a set, the first controller device and the second controller device based on a sensing result from the first sensor and a sensing result from the second sensor.
  • the main unit may register, as a set, the first controller device and the second controller device attached thereto.
  • the main unit may execute a predetermined information process using the two received operation data as a set.
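  • A minimal sketch of the registration behavior described above is given below: when the first sensor and the second sensor each report that a controller has been attached, the two controllers are registered as a set so that their operation data can later be processed together. All names are hypothetical.

```python
# Illustrative sketch only; hypothetical names.
class ControllerRegistry:
    def __init__(self):
        self.sets = []                               # registered (left_id, right_id) pairs
        self.pending = {"left": None, "right": None}

    def on_attach_sensed(self, side: str, controller_id: str) -> None:
        # Called from the first sensor (left rail) or the second sensor (right rail).
        self.pending[side] = controller_id
        if self.pending["left"] and self.pending["right"]:
            self.sets.append((self.pending["left"], self.pending["right"]))
            self.pending = {"left": None, "right": None}

    def set_for(self, controller_id: str):
        # Look up the pair so the two operation data can be processed as a set.
        for pair in self.sets:
            if controller_id in pair:
                return pair
        return None
```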
  • the main unit may include a housing including a first engagement portion configured for engagement with a housing of the first controller device when the first controller device is attached to the main unit, and a second engagement portion configured for engagement with a housing of the second controller device when the second controller device is attached to the main unit.
  • the first controller device may include a housing including a third engagement portion configured for engagement with the first engagement portion of the main unit.
  • the second controller device may include a housing including a fourth engagement portion configured for engagement with the second engagement portion of the main unit.
  • the main unit may selectively output an execution result of the information process either to the display or to a display device separate from the main unit.
  • Another example information processing device described herein includes a first attachment and detachment mechanism, a second attachment and detachment mechanism and a display.
  • the first attachment and detachment mechanism is configured to allow a first controller device to be removably attached thereto.
  • the second attachment and detachment mechanism is configured to allow a second controller device to be removably attached thereto.
  • the information processing device displays a result of an information process based on operation data transmitted from the first controller device and the second controller device on the display, irrespective of whether the first controller device and the second controller device are attached to the information processing device.
  • Another example information processing system described herein includes a main unit and a first controller device.
  • the main unit includes: a display; and a housing including a main unit-side engagement portion configured for engagement with a housing of the first controller device.
  • the first controller device is configured to be removably attached to the main unit by means of the main unit-side engagement portion.
  • the first controller device may include a housing including a controller device-side engagement portion configured for engagement with the main unit-side engagement portion.
  • the main unit-side engagement portion may include a first slide member provided on a surface of the housing of the main unit.
  • the first controller device may include a second slide member configured for slidable and detachable engagement with the first slide member.
  • the first slide member and the second slide member may together comprise a slide mechanism.
  • the first slide member may be configured for slidable engagement with the second slide member in a predetermined direction, and the first slide member may be configured to allow the second slide member to be inserted into and detached from the first slide member via one end of the first slide member in the predetermined direction.
  • the first slide member may extend in an up-down direction of the main unit and may be configured to allow the second slide member to be inserted into and detached from the first slide member via an upper end of the first slide member.
  • the first slide member may be configured to extend generally over an entirety of a surface of the housing of the main unit in a predetermined direction.
  • the first slide member may have a C-shaped cross section.
  • the second slide member may have a T-shaped cross section.
  • the first controller device may include a terminal for communicating with the main unit.
  • the main unit may include a terminal on the housing thereof at such a location to allow the terminal to be connected to the terminal of the first controller device when the first controller device is attached to the main unit.
  • the first controller device may include an input section on the predetermined surface.
  • Another example information processing system described herein includes a main unit and a controller device configured to be removably attached to the main unit.
  • the main unit includes a display. Communication between the main unit and the controller device when the controller device is attached to the main unit may use a first communication scheme. Communication between the main unit and the controller device when the controller device is detached from the main unit may use a second communication scheme different from the first communication scheme.
  • the first communication scheme may be wired communication.
  • the first terminal of the main unit and the second terminal of the controller device may be electrically connected to each other via contact.
  • the second communication scheme may be wireless communication.
  • Another example information processing system described herein includes a main unit, a first controller device, a second controller device and an accessory.
  • the first controller device is configured to be removably attached to the main unit and to be removably attached to the accessory.
  • the second controller device is configured to be removably attached to the main unit and to be removably attached to the accessory.
  • the main unit includes a display and displays, on the display, an execution result of an information process based on an operation performed on the first controller device and the second controller device.
  • the accessory is configured to allow the first controller device and the second controller device to be attached thereto at the same time.
  • the accessory may include a charging control section configured to use power supplied to the accessory to charge one or both of the first controller device and the second controller device attached to the accessory.
  • the first controller device may be configured to be attached to the accessory on a left side of a center of the accessory.
  • the second controller device may be configured to be attached to the accessory on a right side of the center of the accessory.
  • the accessory may include a first grip portion on a left side of a center of the accessory and a second grip portion on a right side of the center of the accessory.
  • the first grip portion may be on a left side of an area where the first controller device is attached.
  • the second grip portion may be on a right side of an area where the second controller device is attached.
  • When the first controller device is attached to the accessory, communication between the main unit and the first controller device may be wireless communication.
  • When the second controller device is attached to the accessory, communication between the main unit and the second controller device may be wireless communication.
  • the first controller device may be configured to be attached to the accessory with a tilt in a first direction from a reference attitude.
  • the second controller device may be configured to be attached to the accessory with a tilt in a direction opposite to the first direction from a reference attitude.
  • the main unit may execute the information process while modifying operation data obtained from the at least one controller device attached to the accessory and/or modifying information obtained based on the operation data.
  • among all the operation data, the main unit may be configured to modify the data from a directional input section.
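  • The modification of directional-input data mentioned above could, for example, compensate for the tilt at which a controller is mounted on the accessory. The sketch below rotates the stick reading by an assumed mounting angle; the angle, sign convention and all names are illustrative assumptions, since the specification does not state how the data is modified.

```python
# Illustrative sketch only; hypothetical values and names.
import math

ACCESSORY_TILT_DEG = {"left": 20.0, "right": -20.0}   # assumed mounting tilt per side

def compensate_stick(side: str, x: float, y: float, on_accessory: bool):
    """Rotate the stick reading by the mounting tilt so that a push toward the
    top of the accessory is reported as 'up' to the application."""
    if not on_accessory:
        return x, y
    a = math.radians(ACCESSORY_TILT_DEG[side])
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))
```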
  • An example accessory described herein is an accessory to which a first controller device and a second controller device are removably attachable.
  • the first controller device is configured to be removably attached to a main unit separate from the accessory.
  • the second controller device is configured to be removably attached to the main unit.
  • the main unit includes a display and is configured to display, on the display, an execution result of an information process based on an operation performed on the first controller device and the second controller device.
  • the accessory includes: a first engagement portion configured for detachable engagement with the first controller device; and a second engagement portion configured for detachable engagement with the second controller device.
  • An example game system described herein includes a main unit, a first controller device and a second controller device.
  • the main unit includes a display.
  • the first controller device is configured to be removably attached to the main unit.
  • the second controller device is configured to be removably attached to the main unit. At least when the first controller device and the second controller device are detached from the main unit, each of the first controller device and the second controller device transmits, to the main unit, via wireless communication, operation data representing an operation performed thereon.
  • An example information processing device described herein is a hand-held information processing device.
  • the information processing device includes a main section including a display, a first controller section and a second controller section, and the information processing device is configured to perform an information process in response to an operation performed on either the first controller section or the second controller section.
  • the first controller section is configured to be removably attached to the main section.
  • the second controller section is configured to be removably attached to the main section.
  • the main section is configured to perform the information process based on operation data representing an operation performed on at least one of the first controller section and the second controller section, and displays a result of the information process on the display.
  • When the first controller section and the second controller section are attached to the main section, the first controller section can be operated by one hand of a user and the second controller section can be operated by the other hand of the user.
  • the present specification also discloses an example of an information processing device, a controller device or an accessory of the information processing system.
  • the present specification also discloses an example of a method to be carried out in the information processing system.
  • the present specification also discloses an example of a computer-readable storage medium storing therein an information processing program which causes a computer to execute some of the processes to be executed on the information processing system or the information processing device (in other words, causes a computer to function as some of the various units of the information processing system or the information processing device).
  • an information processing device and/or an information processing system can be used in different modes.
  • FIG. 1 is a diagram showing an example information processing device according to the present embodiment
  • FIG. 2 is a diagram showing an example where controllers are detached from a main unit
  • FIG. 3 is a six-sided view showing an example main unit
  • FIG. 4 is a diagram showing an example where the main unit is placed upright
  • FIG. 5 is a six-sided view showing an example left controller
  • FIG. 6 is a six-sided view showing an example right controller
  • FIG. 7 is a diagram showing an example of how a left rail member and a slider engage with each other;
  • FIGS. 8A and 8B are diagrams showing an example of how the slider is locked to the left rail member
  • FIG. 9 is a diagram showing an overall configuration of an example information processing system according to the present embodiment.
  • FIG. 10 is a diagram showing an external configuration of an example cradle
  • FIG. 11 is a block diagram showing an example internal configuration of the main unit
  • FIG. 12 is a block diagram showing an example internal configuration of the information processing device
  • FIG. 13 is a block diagram showing an example internal configuration of the cradle
  • FIG. 14 is a diagram showing an example of how the information processing device is used with the controllers attached to the main unit;
  • FIG. 15 is a diagram showing an example of a single user holding two controllers to use the information processing device in a detached state
  • FIG. 16 is a diagram showing an example of two users each holding one controller to use the information processing device in a detached state
  • FIGS. 17A and 17B are diagrams showing example modes of use where three or more controllers are used.
  • FIG. 18 is a diagram showing an example mode of use where the image is displayed on a TV
  • FIGS. 19A, 19B, 19C, and 19D are diagrams showing an example operation flow where the image is displayed on a TV;
  • FIG. 20 is a flow chart showing an example flow of a registration process executed on the main unit
  • FIG. 21 is a diagram showing an example of registration information
  • FIG. 22 is a diagram showing an example of pairing information
  • FIG. 23 is a flow chart showing an example flow of a wireless setting process executed on the main unit
  • FIG. 24 is a flow chart showing an example flow of a mode setting process executed on the main unit
  • FIG. 25 is a flow chart showing an example flow of an information process executed on the main unit
  • FIG. 26 is a flow chart showing an example flow of an information process executed on the main unit
  • FIG. 27 is a flow chart showing an example flow of an information process executed on the main unit
  • FIG. 28 is a diagram showing another example left controller
  • FIG. 29 is a diagram showing another example left controller
  • FIG. 30 is a diagram showing an example information processing device with a right controller different from that of FIG. 1 attached thereto;
  • FIG. 31 is a diagram showing an example accessory to which controllers can be attached.
  • FIG. 32 is a diagram showing another example accessory.
  • FIG. 33 is a diagram showing an example accessory to which the main unit can be attached.
  • the information processing system includes an information processing device 1 and a cradle 5 (see FIG. 9 ).
  • the information processing device 1 of the present embodiment includes a main unit 2 and controllers 3 and 4 , which can be attached to and detached from each other, and the controllers 3 and 4 can be used separately from the main unit 2 (see FIG. 2 ).
  • the information processing device 1 can be used both in a mode of use in which the image is displayed on the main unit 2 and in another mode of use in which the image is displayed on a separate display device such as a TV.
  • the information processing device 1 is used as a portable device (e.g., a portable game device) in the former mode, and the information processing device 1 is used as a console-type device (e.g., a console-type game device) in the latter mode.
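  • The following sketch illustrates, with hypothetical names only, how the image output target might be selected between the two modes of use: the built-in display 12 when the device is used as a portable device, and a TV connected through the cradle 5 when it is used as a console-type device.

```python
# Illustrative sketch only; hypothetical names.
class OutputRouter:
    def __init__(self, internal_display, cradle_video_out):
        self.internal_display = internal_display   # assumed driver for the display 12
        self.cradle_video_out = cradle_video_out   # assumed driver for the cradle's video output
        self.docked = False

    def on_dock_changed(self, docked: bool) -> None:
        # Called when the main unit is attached to or detached from the cradle.
        self.docked = docked

    def present(self, frame) -> None:
        if self.docked:
            self.cradle_video_out.output(frame)    # image shown on the TV (console-type use)
        else:
            self.internal_display.show(frame)      # image shown on the main unit (portable use)
```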
  • FIG. 1 is a diagram showing an example information processing device 1 according to the present embodiment.
  • the information processing device 1 includes a main unit 2 , a left controller 3 and a right controller 4 .
  • the main unit 2 , including a display 12 , executes various processes of the information processing device 1 .
  • the controllers 3 and 4 each include an operation section allowing a user to provide an input.
  • FIG. 2 is a diagram showing an example where the controllers 3 and 4 are detached from the main unit 2 .
  • the controllers 3 and 4 can be attached to and detached from the main unit 2 .
  • the left controller 3 can be attached to the left side of the main unit 2 (the x-axis positive direction side shown in FIG. 1 ).
  • the right controller 4 can be attached to the right side of the main unit 2 (the x-axis negative direction side shown in FIG. 1 ).
  • the left controller and the right controller may be referred to generally as “controllers”.
  • a specific example configuration of the main unit 2 and the controllers 3 and 4 will now be described.
  • FIG. 3 is a six-sided view showing an example main unit.
  • the main unit 2 includes a generally plate-shaped housing 11 .
  • the primary surface (in other words, the front-side surface, i.e., the surface on which the display 12 is provided) of the housing 11 has a generally rectangular shape.
  • the housing 11 has a horizontally-elongated shape. That is, in the present embodiment, the longitudinal direction of the primary surface of the housing 11 (i.e., the x-axis direction shown in FIG. 1 ) is denoted as the horizontal direction (also referred to as the left-right direction), and the width direction of the primary surface (i.e., the y-axis direction shown in FIG. 1 ) is denoted as the up-down direction (also referred to as the vertical direction).
  • the main unit 2 may be used in a landscape position or may be used in a portrait position.
  • the housing 11 may include a projection or a grip portion for making it easier for a user to hold the device.
  • the main unit 2 includes the display 12 provided on the primary surface of the housing 11 .
  • the display 12 displays an image (which may be a still image or a video image) obtained or produced by the main unit 2 . While the display 12 is assumed to be a liquid crystal display device (LCD) in the present embodiment, it may be any type of a display device.
  • the main unit 2 includes a touch panel 13 on the screen of the display 12 .
  • the touch panel 13 is of a type (e.g., the capacitive type) that enables a multi-touch input.
  • alternatively, the touch panel 13 may be of a type (e.g., the resistive type) that enables a single-touch input, for example.
  • the main unit 2 includes a speaker (i.e., a speaker 88 shown in FIG. 11 ) inside the housing 11 .
  • speaker holes 11 a and 11 b are formed in the primary surface of the housing 11 . Output sounds from the speaker 88 are output through these speaker holes 11 a and 11 b .
  • the main unit 2 includes two speakers, and speaker holes are located respectively for the left speaker and the right speaker.
  • the speaker hole 11 a for the left speaker is formed in a left portion of the display 12 .
  • the speaker hole 11 b for the right speaker is formed in a right portion of the display 12 .
  • the main unit 2 also includes an ambient light sensor (i.e., an ambient light sensor 94 shown in FIG. 11 ) inside the housing 11 .
  • a window portion 14 is provided in the primary surface of the housing 11 so as to allow light from outside the housing 11 to be received by the ambient light sensor 94 .
  • the window portion 14 is provided for example as a transparent member that allows light to pass therethrough, or a filter member that allows light of a predetermined wavelength that can be sensed by the ambient light sensor 94 to pass therethrough.
  • there is no particular limitation on the position, the shape and the number of the speaker holes 11 a and 11 b and the window portion 14 .
  • the speaker holes 11 a and 11 b may be provided on the side surface or the back surface of the housing 11 .
  • while the window portion 14 is provided on the lower left side of the display 12 in the present embodiment, it may be provided in any other position on the primary surface of the housing 11 or may be provided on the side surface of the housing 11 .
  • the main unit 2 includes a left rail member 15 on the left side surface of the housing 11 .
  • the left rail member 15 is a member that allows the left controller 3 to be detachably attached to the main unit 2 .
  • the left rail member 15 is provided so as to extend in the up-down direction on the left side surface of the housing 11 .
  • the left rail member 15 has such a shape that it can engage with a slider of the left controller 3 (i.e., a slider 40 shown in FIG. 5 ).
  • the left rail member 15 and the slider 40 together form a slide mechanism, the details of which will be described later. This slide mechanism allows the left controller 3 to be slidably and detachably attached to the main unit 2 .
  • the left rail member 15 has a shape with a groove.
  • the cross section (specifically, the cross section perpendicular to the up-down direction) of the left rail member 15 is C-shaped. More specifically, the cross section of the left rail member 15 is such that the end portions of the cross section extend in the outside-to-center direction. Therefore, the slider 40 in engagement with the left rail member 15 is securely locked so as not to come off in the direction perpendicular to the sliding direction (in other words, the direction in which the left rail member 15 extends) (see FIG. 7 to be discussed below).
  • the left rail member 15 is provided with an engagement hole 16 .
  • the engagement hole 16 is located so as to face a projection 41 provided on the slider 40 when the left controller 3 is attached to the main unit 2 .
  • the engagement hole 16 is provided on the bottom surface of the left rail member 15 (in other words, the bottom surface of the groove of the left rail member 15 ).
  • the engagement hole 16 is shaped so that the projection (i.e., the projection 41 shown in FIG. 5 ) can engage with the engagement hole 16 .
  • the projection 41 is inserted into and engages with the engagement hole 16 , thereby locking the left controller 3 to the main unit 2 , the details of which will be described later.
  • the left rail member 15 may be provided with a projection and the slider 40 may be provided with an engagement hole.
  • the main unit 2 includes a left-side terminal 17 .
  • the left-side terminal 17 allows the main unit 2 to communicate with the left controller 3 in wired communication.
  • the left-side terminal 17 is located so as to be in contact with the terminal of the left controller 3 (a terminal 42 shown in FIG. 5 ) when the left controller 3 is attached to the main unit 2 .
  • the left-side terminal 17 is provided on the bottom surface of the left rail member 15 .
  • the left-side terminal 17 is provided near the lower end on the bottom surface of the left rail member 15 .
  • the left-side terminal 17 is provided below the engagement hole 16 (in other words, on the far side with respect to the direction in which the slider 40 is inserted into the left rail member 15 ).
  • a stopper 18 is provided on the left side surface of the housing 11 . As shown in FIG. 3 , the stopper 18 is provided near the end (in the present embodiment, near the lower end) of the left rail member 15 . The stopper 18 is provided inside the groove of the left rail member 15 . The stopper 18 is provided in order to limit the slide of the slider 40 in engagement with the left rail member 15 , the details of which will be described later.
  • the main unit 2 includes a right rail member 19 on the right side surface of the housing 11 .
  • the right rail member 19 is provided so as to extend in the up-down direction on the right side surface of the housing 11 .
  • the right rail member 19 has such a shape that it can engage with a slider of the right controller 4 (i.e., a slider 62 shown in FIG. 6 ).
  • the right rail member 19 and the slider 62 together form a slide mechanism, the details of which will be described later. This slide mechanism allows the right controller 4 to be slidably and detachably attached to the main unit 2 .
  • the right rail member 19 has a similar shape to that of the left rail member 15 . That is, the right rail member 19 has a shape with a groove whose cross-sectional shape is similar to that of the left rail member 15 . Note however that the right rail member 19 does not need to have exactly the same shape as that of the left rail member 15 .
  • the groove of the left rail member 15 and the groove of the right rail member 19 may differ from each other in terms of the size and/or the shape so that the slider 62 of the right controller 4 cannot engage with the left rail member 15 (and/or so that the slider 40 of the left controller 3 cannot engage with the right rail member 19 ).
  • the right rail member 19 is provided with an engagement hole 20 .
  • the engagement hole 20 is located so as to face a projection 63 provided on the slider 62 when the right controller 4 is attached to the main unit 2 .
  • the engagement hole 20 is provided on the bottom surface of the right rail member 19 (in other words, the bottom surface of the groove of the right rail member 19 ).
  • the engagement hole 20 is shaped so that the projection (i.e., the projection 63 shown in FIG. 6 ) can engage with the engagement hole 20 .
  • the projection 63 is inserted into and engages with the engagement hole 20 , thereby locking the right controller 4 to the main unit 2 , the details of which will be described later.
  • the right rail member 19 may be provided with a projection and the slider 62 with an engagement hole.
  • the main unit 2 includes a right-side terminal 21 .
  • the right-side terminal 21 allows the main unit 2 to communicate with the right controller 4 in wired communication.
  • the right-side terminal 21 is located so as to be in contact with the terminal of the right controller 4 (a terminal 64 shown in FIG. 6 ) when the right controller 4 is attached to the main unit 2 .
  • the right-side terminal 21 is provided on the bottom surface of the right rail member 19 .
  • the right-side terminal 21 is provided near the lower end on the bottom surface of the right rail member 19 .
  • the right-side terminal 21 is provided below the engagement hole 20 (in other words, on the far side with respect to the direction in which the slider 62 is inserted into the right rail member 19 ).
  • a stopper 22 is provided on the right side surface of the housing 11 . As shown in FIG. 3 , the stopper 22 is provided near the end (in the present embodiment, near the lower end) of the right rail member 19 . The stopper 22 is provided inside the groove of the right rail member 19 . The stopper 22 is provided in order to limit the slide of the slider 62 in engagement with the right rail member 19 , the details of which will be described later.
  • the housing 11 of the main unit 2 is provided with the left rail member 15 and the right rail member 19 .
  • the housing 11 is configured on the assumption that controllers are attached thereto.
  • the rail members 15 and 19 may be provided on the left and right end portions, respectively, on the primary surface and/or the reverse surface of the housing 11 .
  • as for the mechanism for allowing the controllers 3 and 4 to be detachably attached to the main unit 2 , a slide mechanism different from that of the present embodiment may be used, or a mechanism different from a slide mechanism may be used.
  • the main unit 2 includes a first slot 23 .
  • the first slot 23 is provided on the upper side surface of the housing 11 .
  • the first slot 23 is shaped so as to accommodate a storage medium of a first type.
  • a cover that can be opened/closed is provided for the opening of the first slot 23 , and a storage medium of the first type can be inserted into the first slot 23 with the cover being open.
  • a storage medium of the first type is, for example, a dedicated storage medium (e.g., a dedicated memory card) for the information processing device 1 or other information processing devices of the same type.
  • the storage medium of the first type is used, for example, for storing data used in the main unit 2 (e.g., application save data, etc.) and/or for storing programs to be executed on the main unit 2 (e.g., application programs, etc.).
  • the main unit 2 also includes a power button 28 .
  • the power button 28 is provided on the upper side surface of the housing 11 .
  • the power button 28 is a button for turning ON/OFF the power of the main unit 2 .
  • the power button 28 can be used to switch between the ON mode and the sleep mode.
  • the ON mode is a mode in which the screen display of the display 12 is turned on, for example
  • the sleep mode is a mode in which the screen display of the display 12 is turned off, for example.
  • the main unit 2 executes a process of turning ON/OFF the power of the main unit 2 .
  • the main unit 2 executes a process of switching between the ON mode and the sleep mode.
  • the power button 28 can be used to turn the power ON/OFF and to switch between the ON mode and the sleep mode.
  • the main unit 2 may be provided with a button only for the function of turning the power ON/OFF or only for the function of switching between the ON mode and the sleep mode.
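  • As an illustration only, the sketch below distinguishes the two functions of the power button 28 by press duration; this split, the threshold and all names are assumptions, since the present description does not specify how the power ON/OFF function and the ON/sleep switching function are distinguished.

```python
# Illustrative sketch only; hypothetical names and behavior.
LONG_PRESS_SECONDS = 2.0   # assumed threshold

class PowerButtonHandler:
    def __init__(self, system):
        # `system` is assumed to expose power_off(), sleep(), wake() and is_sleeping().
        self.system = system

    def on_release(self, held_seconds: float) -> None:
        if held_seconds >= LONG_PRESS_SECONDS:
            self.system.power_off()      # assumed: long press turns the power off
        elif self.system.is_sleeping():
            self.system.wake()           # assumed: short press leaves the sleep mode
        else:
            self.system.sleep()          # assumed: short press enters the sleep mode
```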
  • the main unit 2 includes a sound input/output terminal (specifically, an earphone jack) 25 . That is, the main unit 2 allows a microphone or an earphone to be attached to the sound input/output terminal 25 . As shown in FIG. 3 , the sound input/output terminal 25 is provided on the upper side surface of the housing 11 .
  • the main unit 2 includes sound volume buttons 26 a and 26 b . As shown in FIG. 3 , the sound volume buttons 26 a and 26 b are provided on the upper side surface of the housing 11 .
  • the sound volume buttons 26 a and 26 b are buttons for giving instructions to adjust the volume of the sound output from the main unit 2 . That is, the sound volume button 26 a is a button for giving an instruction to lower the sound volume, and the sound volume button 26 b is a button for giving an instruction to raise the sound volume.
  • the housing 11 is provided with an air outlet hole 11 c .
  • the air outlet hole 11 c is provided on the upper side surface of the housing 11 .
  • the air outlet hole 11 c is provided so as to radiate (in other words, discharge) the heat generated inside the housing 11 to the outside of the housing 11 .
  • the main unit 2 includes a lower terminal 27 .
  • the lower terminal 27 is a terminal for allowing the main unit 2 to communicate with the cradle 5 to be described later. As shown in FIG. 3 , the lower terminal 27 is provided on the lower side surface of the housing 11 .
  • the lower terminal 27 is connected to a terminal of the cradle 5 (a main body terminal 73 shown in FIG. 10 ) when the main unit 2 is attached to the cradle 5 , the details of which will be described later.
  • the lower terminal 27 is a USB connector (more specifically, a female-side connector).
  • the main unit 2 also includes a second slot 24 .
  • the second slot 24 is provided on the lower side surface of the housing 11 .
  • the second slot 24 may be provided on the same surface as the first slot 23 .
  • the second slot 24 is shaped so as to accommodate a storage medium of a second type, which is different from the first type.
  • a cover that can be opened/closed is provided for the opening of the second slot 24 , and a storage medium of the second type can be inserted into the second slot 24 with the cover being open.
  • a storage medium of the second type may be, for example, a general-purpose storage medium, e.g., an SD card.
  • the storage medium of the second type is used for storing data used in the main unit 2 (e.g., application save data, etc.) and/or for storing programs to be executed on the main unit 2 (e.g., application programs, etc.).
  • the housing 11 is provided with an air inlet hole 11 d .
  • the air inlet hole 11 d is provided on the lower side surface of the housing 11 .
  • the air inlet hole 11 d is provided so as to take in (in other words, introduce) the air from the outside of the housing 11 to the inside of the housing 11 .
  • the air inlet hole 11 d is provided on the surface opposite from the surface where the air outlet hole 11 c is provided, thereby allowing for efficient discharge of the heat from inside the housing 11 .
  • the main unit 2 also includes a stand member 29 used when placing the housing upright. As shown in FIG. 3 , the stand member 29 is provided on the lower side surface of the housing 11 . The stand member 29 is rotatably connected to the housing 11 via a pivot 29 a . In FIG. 3 , the stand member 29 is accommodated in the housing 11 .
  • FIG. 4 is a diagram showing an example in which the main unit 2 is placed upright. Note that in order to facilitate understanding of elements of interest to be discussed in conjunction with the figure, some of the other elements of the main unit 2 are not shown in FIG. 4 .
  • the rod-shaped portion of the stand member 29 protrudes from the housing 11 after being rotated about the pivot 29 a . Thus, the stand member 29 is brought into a position protruding from the housing 11 , allowing the main unit 2 to be placed upright as shown in FIG. 4 .
  • the mechanism for placing the main unit 2 upright is not limited to the stand member 29 shown in FIG. 3 , but may be any other mechanism.
  • note that there is no particular limitation on the shape, the number and the arrangement of the various elements provided on the housing 11 described above.
  • some elements, such as the power button 28 and the slots 23 and 24 , may be provided on another side surface or the back surface of the housing 11 .
  • some of the elements described above may be absent on the main unit 2 .
  • FIG. 5 is a six-sided view showing an example of the left controller 3 .
  • the left controller 3 includes a generally plate-shaped housing 31 .
  • the primary surface (in other words, the front-side surface, i.e., the surface on the z-axis negative direction side shown in FIG. 1 ) of the housing 31 has a generally rectangular shape.
  • the housing 31 has a vertically-elongated shape, i.e., a shape that is elongated in the up-down direction (i.e., the y-axis direction shown in FIG. 1 ). Note that when detached from the main unit 2 , the left controller 3 may be held in a portrait position or in a landscape position (see FIG. 16 ).
  • the housing 31 does not need to be generally plate-shaped in other embodiments.
  • the housing 31 does not need to have a rectangular shape, but may have a semi-circular shape, or the like, for example.
  • the housing 31 does not need to have a vertically-elongated shape.
  • the length of the housing 31 in the up-down direction is generally equal to the length of the housing 11 of the main unit 2 in the up-down direction.
  • the thickness of the housing 31 (i.e., the length thereof in the front-rear direction; in other words, the length thereof in the z-axis direction shown in FIG. 1 ) is generally equal to the thickness of the housing 11 of the main unit 2 . Therefore, when the left controller 3 is attached to the main unit 2 (see FIG. 1 ), a user can hold the main unit 2 and the left controller 3 as if they were an integral unit.
  • the left controller 3 includes an analog stick 32 .
  • the analog stick 32 is provided on the primary surface of the housing 31 .
  • the analog stick 32 is an example of a directional input section allowing a user to input a direction.
  • the analog stick 32 includes a stick member that can be tilted in any direction (i.e., 360° directions including the upper, lower, left, right and diagonal directions) parallel to the primary surface of the housing 31 .
  • a user can tilt the stick member to make a direction input based on the tilt direction (and a magnitude input based on the tilt angle).
  • the directional input section may also be a cross-shaped key, a slide stick, or the like.
  • a slide stick is an input section including a stick member that can be slid in any direction parallel to the primary surface of the housing 31 , and a user can slide the stick member to make an input based on the slide direction (and a magnitude input based on the slide amount).
  • a user can also make an input by pressing down the stick member (in a direction vertical to the housing 31 ).
  • the analog stick 32 is an input section that allows a user to make a direction input and a magnitude input based on the tilt direction and the tilt amount, respectively, of the stick member, and also to make a push input by pressing down the stick member.
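  • as an illustration of the direction input, magnitude input and push input described above, the following minimal sketch decomposes a stick tilt into those values; the [-1.0, 1.0] value range per axis and the function name are assumptions for illustration:

        import math

        def stick_input(tilt_x: float, tilt_y: float, pressed: bool) -> dict:
            """Decompose a stick tilt into direction (radians), magnitude and push state."""
            magnitude = min(1.0, math.hypot(tilt_x, tilt_y))  # magnitude input from the tilt amount
            direction = math.atan2(tilt_y, tilt_x)            # direction input (any of 360 degrees)
            return {"direction": direction, "magnitude": magnitude, "pressed": pressed}

        # Example: a full tilt straight up reads as 90 degrees with magnitude 1.0.
        sample = stick_input(0.0, 1.0, False)
        assert abs(sample["direction"] - math.pi / 2) < 1e-9 and sample["magnitude"] == 1.0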
  • the left controller 3 includes four operation buttons 33 to 36 (specifically, a right direction button 33 , a lower direction button 34 , an upper direction button 35 and a left direction button 36 ). As shown in FIG. 5 , these four operation buttons 33 to 36 are provided below the analog stick 32 on the primary surface of the housing 31 . Note that while four operation buttons are provided on the primary surface of the left controller 3 in the present embodiment, there is no particular limitation on the number of operation buttons. These operation buttons 33 to 36 are used to give instructions in accordance with various programs executed on the main unit 2 (e.g., the OS program and application programs).
  • the operation buttons 33 to 36 can be used to make directional inputs, and the operation buttons 33 to 36 are therefore referred to as the right direction button 33 , the lower direction button 34 , the upper direction button 35 and the left direction button 36 . Note however that the operation buttons 33 to 36 may be used to give instructions other than directional inputs.
  • the left controller 3 also includes a record button 37 .
  • the record button 37 is provided on the primary surface of the housing 31 , more specifically, in a lower right area of the primary surface.
  • the record button 37 is a button for giving an instruction to save the image displayed on the display 12 of the main unit 2 .
  • a user can press the record button 37 to save the game image that is displayed at the point in time when the button is pressed in a storage section of the main unit 2 , for example.
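  • as an illustration of the record button 37 behavior described above, the following hedged sketch saves the currently displayed image to a storage section when the button is pressed; the frame representation, directory and file naming are assumptions for illustration:

        from datetime import datetime
        from pathlib import Path

        def on_record_button(displayed_frame: bytes, storage_dir: Path) -> Path:
            """Save the image displayed at the moment the record button 37 is pressed."""
            storage_dir.mkdir(parents=True, exist_ok=True)
            path = storage_dir / f"capture_{datetime.now():%Y%m%d_%H%M%S}.raw"
            path.write_bytes(displayed_frame)  # the frame as shown on the display 12
            return path

        # Example usage (hypothetical storage location):
        # on_record_button(current_frame_bytes, Path("/storage/captures"))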
  • the left controller 3 also includes a minus ( ⁇ ) button 47 .
  • the minus button 47 is provided on the primary surface of the housing 31 , more specifically, in an upper right area of the primary surface.
  • the minus button 47 is used to give instructions in accordance with various programs executed on the main unit 2 (e.g., the OS program and application programs).
  • the minus button 47 is used, for example, as a select button (e.g., a button used to move the selection through different selection items) in game applications.
  • the operation sections provided on the primary surface of the left controller 3 are operated with the thumb of the left hand, for example, of a user holding the information processing device 1 (see FIG. 14 ).
  • the operation sections are operated with the left and right thumbs, for example, of a user holding the left controller 3 (see FIG. 15 ).
  • the analog stick 32 is operated by the thumb of the left hand of the user, and the operation buttons 33 to 36 are operated with the thumb of the right hand of the user.
  • the left controller 3 includes a first L button 38 .
  • the left controller 3 also includes a ZL button 39 .
  • the first L button 38 is provided over a corner portion between the left side surface and the upper side surface of the housing 31 .
  • the ZL button 39 is provided to extend over a corner portion between the left side surface and the upper side surface of the housing 31 (strictly speaking, between the left side surface and the upper side surface as seen from the front side of the housing 31 ) while extending into the reverse surface of the housing 31 .
  • the ZL button 39 is provided on the rear side (the z-axis positive direction side shown in FIG. 1 ) of the first L button 38 .
  • the first L button 38 and the ZL button 39 each have a rounded shape in conformity with the rounded shape of the upper left corner portion of the housing 31 .
  • when the left controller 3 is attached to the main unit 2 , the first L button 38 and the ZL button 39 will be placed over the upper left portion of the information processing device 1 (see FIG. 1 ). Therefore, a user holding the information processing device 1 is allowed to operate the first L button 38 and the ZL button 39 with the index finger or the middle finger of the left hand (see FIG. 14 ).
  • a portion of the reverse surface of the housing 31 where the ZL button 39 is provided projects past other portions of the housing 31 .
  • the ZL button 39 is provided so as to project past the other portions of the housing 31 on the reverse surface. Therefore, when the main unit 2 with the left controller 3 attached thereto is placed on a flat surface in such an orientation that the reverse surface of the left controller 3 faces the horizontal flat surface, the projecting portions of the housing 31 are in contact with the flat surface. As a result, the information processing device 1 is placed so that the upper side of the main unit 2 is slightly raised from the lower side thereof. When the information processing device 1 is so placed, it is easy for a user to see the display 12 .
  • when the main unit 2 with the left controller 3 attached thereto is placed on a flat surface in such an orientation that the reverse surface of the left controller 3 faces the horizontal flat surface, the ZL button 39 may be in contact with the flat surface.
  • the ZL button 39 can be pressed down primarily in the up-down direction (the y-axis direction). That is, the ZL button 39 is supported on the housing 31 so as to move primarily in the up-down direction. Therefore, even if the information processing device 1 is placed so that the ZL button 39 is in contact with the flat surface as described above, the ZL button 39 is unlikely to be pressed down because the ZL button 39 primarily receives a force in the front-rear direction (the z-axis direction). That is, even when the information processing device 1 is placed as described above, the ZL button 39 is unlikely to be pressed down inadvertently.
  • the ZL button 39 may be provided so as not to project from the reverse surface of the housing 31 .
  • the ZL button 39 may be provided on the side surface of the housing 31 .
  • an area of the reverse surface of the housing 31 where the ZL button 39 is provided may be sunken from the remaining portion (i.e., the housing is formed to be thinner in this area) so that the ZL button 39 does not project past the remaining portion of the reverse surface.
  • the left controller 3 includes the slider 40 described above. As shown in FIG. 5 , the slider 40 is provided so as to extend in the up-down direction on the right side surface of the housing 31 .
  • the slider 40 has such a shape that it can engage with the left rail member 15 (more specifically, the groove of the left rail member 15 ) of the main unit 2 .
  • the cross section (specifically, the cross section perpendicular to the up-down direction) of the slider 40 is T-shaped. More specifically, the cross section of the slider 40 is T-shaped in conformity with the cross-sectional shape of the left rail member 15 (see FIG. 7 ). Therefore, the slider 40 in engagement with the left rail member 15 is locked so as not to come off in the direction perpendicular to the sliding direction (in other words, the direction in which the left rail member 15 extends) (see FIG. 7 to be discussed below).
  • the slider 40 is provided with the projection 41 .
  • the projection 41 is arranged at such a position that allows the projection 41 to be inserted into the engagement hole 16 when the left controller 3 is attached to the main unit 2 .
  • the projection 41 is provided on the engaging surface of the slider 40 .
  • the engaging surface of the slider 40 refers to a surface that faces the bottom surface of the left rail member 15 when the left controller 3 is attached to the main unit 2 .
  • the projection 41 is shaped so that the projection 41 can engage with the engagement hole 16 of the left rail member 15 .
  • the projection 41 is biased from the inside of the slider 40 toward the outside of the slider 40 .
  • when a force toward the inside of the slider 40 is applied to the projection 41 , the projection 41 moves toward the inside of the slider 40 (i.e., retracts into the slider 40 ).
  • the projection 41 is connected to an elastic member inside the slider 40 , and the projection 41 is placed inside a hole in the slider 40 with a portion thereof protruding past the engaging surface of the slider 40 .
  • the projection 41 may be fixed to the slider 40 .
  • the left controller 3 includes the terminal 42 for allowing the left controller 3 to communicate with the main unit 2 in wired communication.
  • the terminal 42 is located so as to be in contact with the left-side terminal 17 of the main unit 2 ( FIG. 3 ) when the left controller 3 is attached to the main unit 2 .
  • the terminal 42 is provided on the engaging surface of the slider 40 .
  • the terminal 42 is provided near the lower end on the engaging surface of the slider 40 .
  • the terminal 42 is provided below the projection 41 (in other words, on the front side with respect to the insertion of the slider 40 into the left rail member 15 ).
  • the left controller 3 also includes a second L button 43 and a second R button 44 .
  • these buttons 43 and 44 are used to give instructions in accordance with various programs executed on the main unit 2 .
  • the second L button 43 and the second R button 44 are provided on the engaging surface of the slider 40 .
  • the second L button 43 is provided on the engaging surface of the slider 40 above the center of the engaging surface with respect to the up-down direction (the y-axis direction shown in FIG. 1 ).
  • the second R button 44 is provided on the engaging surface of the slider 40 below the center of the engaging surface with respect to the up-down direction.
  • the second L button 43 and the second R button 44 are arranged at such positions that they cannot be pressed down with the left controller 3 attached to the main unit 2 . That is, the second L button 43 and the second R button 44 are buttons that are used when the left controller 3 is detached from the main unit 2 .
  • the second L button 43 and the second R button 44 are operated with the index finger or the middle finger of the left hand and the right hand of a user holding the left controller 3 detached from the main unit 2 (see FIG. 16 ).
  • the left controller 3 includes an indicator LED 45 .
  • the indicator LED 45 is an indicator section for indicating predetermined information to the user. There is no particular limitation on the information to be indicated by the indicator LED 45 .
  • the indicator LED 45 shows the user identification information of the controller when the main unit 2 communicates with a plurality of controllers.
  • the left controller 3 includes a number (herein, four) of LEDs equal to the number of controllers that the main unit 2 can be simultaneously in communication with. Then, one of the four LEDs is lit, which is associated with the number assigned to the controller. Thus, it is possible with the indicator LED 45 to indicate the number to the user.
  • the indicator LED 45 may indicate, to the user, the status of communication between the left controller 3 and the main unit 2 .
  • the indicator LED 45 may be lit while a connection with the main unit 2 is established. While the number of LEDs (in other words, the light-emitting portions) serving as the indicator LED 45 is four in the present embodiment, there is no particular limitation on the number of LEDs.
  • the indicator LED 45 is provided on the engaging surface of the slider 40 as shown in FIG. 5 .
  • the indicator LED 45 is arranged at such a position that the indicator LED 45 cannot be seen when the left controller 3 is attached to the main unit 2 . That is, the indicator LED 45 is used when the left controller 3 is detached from the main unit 2 .
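  • as an illustration of the indicator LED 45 behavior described above, the following minimal sketch lights one of the four LEDs according to the number assigned to the controller; the zero-based indexing and the function name are assumptions for illustration:

        NUM_LEDS = 4  # equal to the number of controllers the main unit 2 can talk to at once

        def indicator_led_pattern(assigned_number: int) -> list:
            """Return the on/off pattern of the four LEDs for a controller number (1 to 4)."""
            if not 1 <= assigned_number <= NUM_LEDS:
                raise ValueError("assigned number out of range")
            return [i == assigned_number - 1 for i in range(NUM_LEDS)]

        # Example: controller number 3 lights only the third LED.
        assert indicator_led_pattern(3) == [False, False, True, False]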
  • the left controller 3 includes a pairing button 46 .
  • the pairing button 46 is used to give an instruction for a setting (referred to also as pairing) process regarding wireless communication between the left controller 3 and the main unit 2 , and to give an instruction for a resetting process of resetting the left controller 3 .
  • the pairing button 46 may only serve to instruct one of the setting process and the resetting process.
  • when the pairing button 46 is pressed, the left controller 3 executes the setting process. Note that the details of the setting process will be described later.
  • the resetting process is a process of resetting the status of the left controller 3 , and is a process that should be executed, for example, when the left controller 3 freezes (e.g., when the main unit 2 is no longer able to obtain data from the left controller 3 ).
  • the resetting process may include, for example, a process of turning OFF the power of the left controller 3 and then turning it back ON, a process of disconnecting with the main unit 2 and then re-connecting with the main unit 2 , a process of re-executing a process that is executed when starting communication, and/or the setting process.
  • the left controller 3 can be reset, using the pairing button 46 , to an operative state.
  • the pairing button 46 is provided on the engaging surface of the slider 40 as shown in FIG. 5 .
  • the pairing button 46 is arranged at such a position that the pairing button 46 cannot be seen when the left controller 3 is attached to the main unit 2 . That is, the pairing button 46 is used when the left controller 3 is detached from the main unit 2 .
  • the pairing button 46 is arranged at such a position, thereby preventing the pairing button 46 from being operated in error when the left controller 3 is attached to the main unit 2 .
  • buttons provided on the engaging surface of the slider 40 are provided so as not to protrude past the engaging surface. That is, the upper surface (in other words, the surface to be pressed) of these buttons is arranged flush with the engaging surface of the slider 40 or arranged at a position sunken from the engaging surface. This allows the slider 40 to slide smoothly against the left rail member 15 when the slider 40 is engaged with the left rail member 15 of the main unit 2 .
  • FIG. 6 is a six-sided view showing an example of the right controller 4 .
  • the right controller 4 includes a generally plate-shaped housing 51 .
  • the primary surface (in other words, the front-side surface, i.e., the surface on the z-axis negative direction side shown in FIG. 1 ) of the housing 51 has a generally rectangular shape.
  • the housing 51 has a vertically-elongated shape, i.e., a shape that is elongated in the up-down direction. Note that when detached from the main unit 2 , the right controller 4 may be held in a portrait position (see FIG. 18 ) or may be held in a landscape position (see FIG. 16 ).
  • the length of the housing 51 of the right controller 4 in the up-down direction is generally equal to the length of the housing 11 of the main unit 2 in the up-down direction, and the thickness thereof is generally equal to the thickness of the housing 11 of the main unit 2 . Therefore, when the right controller 4 is attached to the main unit 2 (see FIG. 1 ), a user can hold the main unit 2 and the right controller 4 as if they were an integral unit.
  • the right-side corner portion of the primary surface of the housing 51 has a more rounded shape than the left-side corner portion thereof. That is, the connecting portion between the upper side surface and the right side surface of the housing 51 and the connecting portion between the lower side surface and the right side surface of the housing 51 are more rounded (in other words, round-cornered with a greater radius) than the connecting portion between the upper side surface and the left side surface and the connecting portion between the lower side surface and the left side surface. Therefore, when the right controller 4 is attached to the main unit 2 (see FIG. 1 ), the right side of the information processing device 1 will have a rounded shape, making it easier for a user to hold the device.
  • the right controller 4 includes an analog stick 52 as the directional input section.
  • the analog stick 52 has the same configuration as the analog stick 32 of the left controller 3 .
  • the right controller 4 includes four operation buttons 53 to 56 (specifically, the A button 53 , the B button 54 , the X button 55 and the Y button 56 ). In the present embodiment, these four operation buttons 53 to 56 are of the same mechanism as the four operation buttons 33 to 36 of the left controller 3 .
  • the analog stick 52 and the operation buttons 53 to 56 are provided on the primary surface of the housing 51 . Note that while the number of operation buttons provided on the primary surface of the right controller 4 is four in the present embodiment, there is no particular limitation on the number of operation buttons.
  • the positional relationship between the two types of operation sections (the analog stick and the operation buttons) of the right controller 4 is opposite from the positional relationship between these two types of operation sections of the left controller 3 . That is, the analog stick 52 is arranged above the operation buttons 53 to 56 on the right controller 4 , whereas the analog stick 32 is arranged below the operation buttons 33 to 36 on the left controller 3 . With such an arrangement, the left and right controllers 3 and 4 , when detached from the main unit 2 , can be operated in a similar fashion, the details of which will be described later.
  • the right controller 4 also includes a plus (+) button 57 .
  • the plus button 57 is provided on the primary surface of the housing 51 , more specifically, in an upper left area of the primary surface.
  • the plus button 57 is used to give instructions in accordance with various programs executed on the main unit 2 (e.g., the OS program and application programs).
  • the plus button 57 is used, for example, as a start button in a game application (e.g., a button used to give an instruction to start the game).
  • the right controller 4 includes a home button 58 .
  • the home button 58 is provided on the primary surface of the housing 51 , more specifically, in a lower left area of the primary surface.
  • the home button 58 is a button for displaying a predetermined menu screen on the display 12 of the main unit 2 .
  • the menu screen is, for example, a screen where a user can launch an application specified by the user, from among a plurality of applications that can be executed on the main unit 2 .
  • the menu screen may be displayed at the start-up of the main unit 2 , for example.
  • a predetermined control screen may be displayed on the display 12 (the menu screen may be displayed instead of the control screen) when the home button 58 is pressed while an application is executed on the main unit 2 (i.e., while the image of the application is displayed on the display 12 ).
  • the control screen is, for example, a screen where a user can give an instruction to end an application and display the menu screen on the display 12 , and an instruction to resume an application, etc.
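  • as an illustration of the home button 58 behavior described above, the following minimal sketch displays the menu screen normally and a control screen while an application is executed; whether the control screen or the menu screen is shown in the latter case is left open above, so the choice here is an assumption:

        def on_home_button(application_running: bool) -> str:
            """Return the screen shown on the display 12 when the home button 58 is pressed."""
            if application_running:
                # Assumed: a predetermined control screen is shown while the image of an
                # application is displayed (the menu screen may be shown instead).
                return "control screen"
            return "menu screen"

        assert on_home_button(False) == "menu screen"
        assert on_home_button(True) == "control screen"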
  • the operation sections provided on the primary surface of the right controller 4 are operated with the thumb of the right hand, for example, of a user holding the information processing device 1 when the right controller 4 is attached to the main unit 2 (see FIG. 14 ).
  • the operation sections are operated with the left and right thumbs, for example, of a user holding the right controller 4 (see FIG. 15 ).
  • the analog stick 52 is operated with the thumb of the left hand of the user
  • the operation buttons 53 to 56 are operated with the thumb of the right hand of the user.
  • the right controller 4 includes a first R button 60 .
  • the right controller 4 includes a ZR button 61 .
  • the first R button 60 is provided over a corner portion between the right side surface and the upper side surface of the housing 51 .
  • the ZR button 61 is provided to extend over a corner portion between the right side surface and the upper side surface of the housing 51 (strictly speaking, between the right side surface and the upper side surface as seen from the front side of the housing 51 ) while extending into the reverse surface of the housing 51 . That is, the ZR button 61 is provided on the rear side (the z-axis positive direction side shown in FIG. 1 ) of the first R button 60 .
  • the first R button 60 and the ZR button 61 each have a rounded shape in conformity with the rounded shape of the upper right portion of the housing 51 .
  • when the right controller 4 is attached to the main unit 2 , the first R button 60 and the ZR button 61 will be placed over the upper right portion of the information processing device 1 (see FIG. 1 ). Therefore, a user holding the information processing device 1 is allowed to operate the first R button 60 and the ZR button 61 with the index finger or the middle finger of the right hand (see FIG. 15 ).
  • the ZR button 61 of the right controller 4 is provided so as to project from the housing 51 , as is the ZL button 39 of the left controller 3 . That is, a portion of the reverse surface of the housing 51 where the ZR button 61 is provided (more specifically, at least a portion of the perimeter of the ZR button 61 ) projects past the remaining portion of the housing 51 .
  • the ZR button 61 is provided so as to project past the remaining portion of the housing 51 on the reverse surface.
  • as is the case when the left controller 3 is attached to the main unit 2 , when the main unit 2 with the right controller 4 attached thereto is placed on a flat surface in such an orientation that the reverse surface of the right controller 4 faces the horizontal flat surface, the projecting portions of the housing 51 are in contact with the flat surface. As a result, the main unit 2 is placed so that the upper side thereof is slightly raised from the lower side thereof, and it is therefore easy for a user to see the display 12 .
  • the ZR button 61 of the right controller 4 can be pressed down primarily in the up-down direction (the y-axis direction). Therefore, as with the ZL button 39 , the ZR button 61 is unlikely to be pressed down when the information processing device 1 is placed so that the ZR button 61 is in contact with a flat surface. Thus, the ZR button 61 is unlikely to be pressed in error.
  • the housing 51 may be formed so that the reverse surface of the housing 51 projects past the ZR button 61 . In other embodiments, as with the ZL button 39 of the left controller 3 , the ZR button 61 may be formed so as not to project from the reverse surface of the housing 51 .
  • the shape of the first L button 38 and the shape of the first R button 60 are not in left-right symmetry, and the shape of the ZL button 39 and the shape of the ZR button 61 are not in left-right symmetry. Note however that in other embodiments, the shape of the first L button 38 and the shape of the first R button 60 may be in left-right symmetry, and the shape of the ZL button 39 and the shape of the ZR button 61 may be in left-right symmetry.
  • the right controller 4 includes a similar slider mechanism to that of the left controller 3 . That is, the right controller 4 includes the slider 62 described above. As shown in FIG. 6 , the slider 62 is provided so as to extend in the up-down direction on the left side surface of the housing 51 .
  • the slider 62 has such a shape that it can engage with the right rail member 19 (more specifically, the groove of the right rail member 19 ) of the main unit 2 .
  • the cross section (specifically, the cross section perpendicular to the up-down direction) of the slider 62 is T-shaped. More specifically, the cross section of the slider 62 is T-shaped in conformity with the cross-sectional shape of the right rail member 19 (see FIG. 7 ). Therefore, the slider 62 in engagement with the right rail member 19 is locked so as not to come off in the direction perpendicular to the sliding direction (in other words, the direction in which the right rail member 19 extends) (see FIG. 7 ).
  • the slider 62 is provided with the projection 63 .
  • the projection 63 is arranged at such a position that allows the projection 63 to be inserted into the engagement hole 20 when the right controller 4 is attached to the main unit 2 .
  • the projection 63 is provided on the engaging surface of the slider 62 .
  • the engaging surface of the slider 62 refers to a surface that faces the bottom surface of the right rail member 19 when the right controller 4 is attached to the main unit 2 .
  • the projection 63 is shaped so that the projection 63 can engage with the engagement hole 20 of the right rail member 19 .
  • the projection 63 of the right controller 4 is biased from the inside of the slider 62 toward the outside of the slider 62 .
  • the projection 63 moves toward the inside of the slider 62 (i.e., retracts into the slider 62 ).
  • there is no particular limitation on the configuration for biasing the projection 63 as described above, and it may be a similar configuration to that of the projection 41 of the left controller 3 .
  • the right controller 4 includes the terminal 64 for allowing the right controller 4 to communicate with the main unit 2 in wired communication.
  • the terminal 64 is located so as to be in contact with the right-side terminal 21 of the main unit 2 ( FIG. 3 ) when the right controller 4 is attached to the main unit 2 .
  • the terminal 64 is provided on the engaging surface of the slider 62 .
  • the terminal 64 is provided near the lower end on the engaging surface of the slider 62 .
  • the terminal 64 is provided below the projection 63 (in other words, on the front side with respect to the insertion of the slider 62 into the right rail member 19 ).
  • the right controller 4 also includes a second L button 65 and a second R button 66 .
  • these buttons 65 and 66 are used to give instructions in accordance with various programs executed on the main unit 2 .
  • the second L button 65 and the second R button 66 are provided on the engaging surface of the slider 62 .
  • the second L button 65 is provided on the engaging surface of the slider 62 below the center of the engaging surface with respect to the up-down direction (the y-axis direction shown in FIG. 1 ).
  • the second R button 66 is provided on the engaging surface of the slider 62 above the center of the engaging surface with respect to the up-down direction.
  • the second L button 65 and the second R button 66 are arranged at such positions that they cannot be pressed down with the right controller 4 attached to the main unit 2 , and they are buttons that are used when the right controller 4 is detached from the main unit 2 .
  • the second L button 65 and the second R button 66 are operated with the index finger or the middle finger of the left hand and the right hand of a user holding the right controller 4 detached from the main unit 2 (see FIG. 16 ).
  • the right controller 4 includes an indicator LED 67 .
  • the indicator LED 67 is an indicator section for indicating predetermined information to the user.
  • the right controller 4 includes four LEDs as the indicator LED 67 , and one of the four LEDs is lit, which is associated with the number assigned to the right controller 4 .
  • the indicator LED 67 is provided on the engaging surface of the slider 62 ( FIG. 6 ).
  • the indicator LED 67 is arranged at such a position that the indicator LED 67 cannot be seen with the right controller 4 attached to the main unit 2 , and the indicator LED 67 is used when the right controller 4 is detached from the main unit 2 .
  • the right controller 4 includes a pairing button 69 .
  • the pairing button 69 is used to give an instruction for a setting (referred to also as “pairing”) process regarding wireless communication between the right controller 4 and the main unit 2 , and to give an instruction for a resetting process of resetting the right controller 4 .
  • the setting process and the resetting process are the same as those for the left controller 3 , and will not therefore be described in detail below.
  • the pairing button 69 is provided on the engaging surface of the slider 62 as shown in FIG. 6 . That is, for the same reason as for the pairing button 46 of the left controller 3 , the pairing button 69 is arranged at such a position that the pairing button 69 cannot be seen with the right controller 4 attached to the main unit 2 .
  • buttons provided on the engaging surface of the slider 62 are provided so as not to protrude past the engaging surface. This allows the slider 62 to slide smoothly against the right rail member 19 when the slider 62 is engaged with the right rail member 19 of the main unit 2 .
  • a window portion 68 is provided in the lower side surface of the housing 51 .
  • the right controller 4 includes an infrared image-capturing section (an infrared image-capturing section 123 shown in FIG. 12 ), the details of which will be described later, with which a hand movement and/or a gesture, etc., of the user can be detected.
  • the window portion 68 is provided so as to allow the camera of the infrared image-capturing section 123 arranged inside the housing 51 to capture an image around the right controller 4 .
  • the window portion 68 is provided for protecting the lens of the camera of the infrared image-capturing section 123 , and is made of a material (e.g., a transparent material) that allows light of a wavelength to be detected by the camera to pass therethrough.
  • the window portion 68 may be a hole formed in the housing 51 .
  • the infrared image-capturing section 123 itself has a filter member for suppressing the transmission therethrough of light of wavelengths other than the light (in the present embodiment, infrared light) to be detected by the camera.
  • the window portion may have a filter function.
  • note that for the controllers 3 and 4 , there is no particular limitation on the shape, the number and the arrangement of the various elements (specifically, the slider, the stick, the buttons, the LEDs, etc.) provided on the housing 31 or 51 .
  • the controllers 3 and 4 may include a directional input section of a different type from an analog stick.
  • the slider 40 or 62 may be arranged at a position that corresponds to the position of the rail member 15 or 19 provided on the main unit 2 , and may be, for example, arranged on the primary surface or the reverse surface of the housing 31 or 51 .
  • one or more of the various elements described above may be absent on the controllers 3 and 4 .
  • referring to FIG. 7 and FIGS. 8A and 8B , the action of attaching and detaching a controller to and from the main unit 2 will be described. Note that although the description below is directed to the action of attaching and detaching the left controller 3 to and from the main unit 2 , the action of attaching and detaching the right controller 4 to and from the main unit 2 can be done in the same manner as the left controller 3 .
  • FIG. 7 is a diagram showing an example of how the left rail member 15 and the slider 40 engage with each other. Note that in order to facilitate understanding of the figure, elements of the main unit 2 are shown in phantom line in FIG. 7 . As shown in FIG. 7 , the slider 40 is inserted into the left rail member 15 so that the T-shaped cross section of the slider 40 engages with (or is fitted to) the C-shaped cross section of the left rail member 15 (in other words, the groove of the left rail member 15 ).
  • the user After inserting the lower end of the slider 40 into the groove of the left rail member 15 , the user further inserts the slider 40 down the groove of the left rail member 15 . That is, the user slides the left controller 3 downward against the main unit 2 . Then, when the left controller 3 has been slid until the lower end of the slider 40 reaches the position of the stopper 18 of the main unit 2 , the left controller 3 is locked to the main unit 2 .
  • FIGS. 8A and 8B are diagrams showing an example of how the slider 40 is locked to the left rail member 15 .
  • FIGS. 8A and 8B show a cross section of the left rail member 15 perpendicular to the front-rear direction (the z-axis direction).
  • the elements are shown in FIGS. 8A and 8B with different positional relationships and different sizes from FIG. 3 , etc.
  • after the state shown in FIG. 8A , when the slider 40 is slid further down the left rail member 15 , the lower end of the slider 40 reaches the position of the stopper 18 of the main unit 2 (see FIG. 8B ). Then, as shown in FIG. 8B , the projection 41 of the slider 40 faces the engagement hole 16 of the left rail member 15 . Therefore, the projection 41 protrudes from the engaging surface of the slider 40 to be inserted into the engagement hole 16 . Thus, the projection 41 engages with the engagement hole 16 , thereby locking the left controller 3 to the main unit 2 (in other words, locking the slider 40 to the left rail member 15 ) to such a degree that a small force will not cause the left controller 3 to come off.
  • the terminal 42 of the slider 40 is located so as to face the left-side terminal 17 of the left rail member 15 as shown in FIG. 8B .
  • the terminal 42 and the left-side terminal 17 are connected together. This enables wired communication (in other words, communication via physical connection between terminals) between the left controller 3 and the main unit 2 . This also allows power to be supplied from the main unit 2 to the left controller 3 .
  • the controller-side terminal (i.e., the terminal 42 ) and/or the main unit-side terminal (i.e., the left-side terminal 17 ) protrudes only slightly from the base surface.
  • the left-side terminal 17 on the main unit side is provided to slightly protrude from the base surface (i.e., the bottom surface of the left rail member 15 ).
  • a metal portion 17 a serving as the contact point of the terminal provided to protrude is able to deform toward its base surface. Therefore, when the terminals come into contact with each other, each terminal receives a pressing force from the other terminal so as to be biased in the direction in which it protrudes, as shown in FIG. 8B . As a result, this ensures a reliable contact between the terminals.
  • the left-side terminal 17 of the main unit 2 is provided below the engagement hole 16 .
  • the terminal 42 of the left controller 3 is provided below the projection 41 . Therefore, when the slider 40 is inserted into the left rail member 15 , the projection 41 will not come into contact with the left-side terminal 17 , thereby lowering the possibility of the projection 41 damaging the left-side terminal 17 .
  • to detach the left controller 3 from the main unit 2 , a user slides the left controller 3 upward against the main unit 2 .
  • here, the left controller 3 is locked to the main unit 2 by means of the projection 41 and the engagement hole 16 .
  • however, applying a certain force or more for sliding the left controller 3 upward will dislocate the projection 41 off the position of the engagement hole 16 , thereby releasing the lock.
  • the left controller 3 can be further slid up to remove the left controller 3 from the main unit 2 .
  • the left controller 3 may include a mechanism capable of retracting the projection 41 into the slider 40 .
  • the left controller 3 may include a mechanism for retracting the projection 41 into the slider 40 in response to a user pressing a predetermined button provided on the left controller 3 , for example.
  • the user can easily release the lock of the left controller 3 to the main unit 2 by means of the projection 41 .
  • the controllers 3 and 4 can be detachably attached to the main unit 2 by means of a slide mechanism including a rail member and a slider.
  • with such a slide mechanism, the controllers 3 and 4 can be securely locked to the main unit 2 for directions other than the slide direction. Therefore, a user can easily hold the information processing device 1 with the controllers 3 and 4 attached to the main unit 2 , with little shaking of the controllers 3 and 4 .
  • the projection and the engagement hole allow the controllers 3 and 4 to be locked to the main unit 2 . This also reduces the shaking of the controllers 3 and 4 attached to the main unit 2 , thereby allowing a user to easily hold the information processing device 1 .
  • FIG. 9 is a diagram showing an overall configuration of an example information processing system according to the present embodiment.
  • the information processing system includes the information processing device 1 and the cradle 5 .
  • the information processing device 1 can be placed in the cradle 5 .
  • the cradle 5 can communicate (either via wired communication or wireless communication) with a TV 6 , which is an example of an external display device separate from the display 12 .
  • the information processing system can display the image obtained or produced by the information processing device 1 on the TV 6 , the details of which will be described later.
  • the cradle 5 has a function of charging the information processing device 1 placed therein and a function as a hub device (specifically, a USB hub), the details of which will be described later.
  • FIG. 10 is a diagram showing an external configuration of an example of the cradle 5 .
  • the cradle 5 has a housing such that the information processing device 1 can be placed therein (or attached thereto, in other words).
  • the housing includes a first support 71 provided with a groove 71 a , and a generally-planar second support 72 , as shown in FIG. 10 .
  • the groove 71 a provided in the first support 71 is shaped in conformity with the shape of the lower portion of the information processing device 1 .
  • the groove 71 a is shaped so as to allow the lower portion of the information processing device 1 to be inserted into the groove 71 a , and more specifically is shaped to generally match the shape of the lower portion of the information processing device 1 .
  • the information processing device 1 can be placed in the cradle 5 by inserting the lower portion of the information processing device 1 into the groove 71 a .
  • the second support 72 supports the surface (i.e., the surface on which the display 12 is provided) of the information processing device 1 , of which the lower portion has been inserted into the groove 71 a .
  • the cradle 5 can more stably support the information processing device 1 .
  • the shape of the housing shown in FIG. 10 is merely an example, and the housing of the cradle 5 may have any other shape in other embodiments that allows the information processing device 1 to be placed in the cradle 5 .
  • the information processing device 1 is placed in the cradle 5 so that the primary surface of the main unit 2 (i.e., the surface of the display 12 ) faces the second support 72 .
  • the information processing device 1 may be placed in the cradle 5 so that the reverse surface of the main unit 2 faces the second support 72 . That is, in the present embodiment, a user can place the information processing device 1 in the cradle 5 in the front-facing orientation (i.e., an orientation such that the display 12 can be seen), or place the information processing device 1 in the cradle 5 in the back-facing orientation (i.e., an orientation such that the display 12 is hidden).
  • the cradle 5 also includes the main body terminal 73 for allowing the cradle 5 to communicate with the information processing device 1 .
  • the main body terminal 73 is provided on the bottom surface of the groove 71 a formed in the first support 71 . More specifically, the main body terminal 73 is located so as to be in contact with the lower terminal 27 of the information processing device 1 when the information processing device 1 is attached to the cradle 5 .
  • the main body terminal 73 is a USB connector (more specifically, a male-side connector).
  • the information processing device 1 can be attached to the cradle 5 either in the front-facing orientation or in the back-facing orientation, as described above.
  • the lower terminal 27 of the information processing device 1 and the main body terminal 73 of the cradle 5 are each symmetric with respect to the depth direction (i.e., the z-axis direction shown in FIG. 1 ), allowing them to communicate with each other whether they are connected in one orientation or the other with respect to the depth direction.
  • the cradle 5 includes a sleep button 74 .
  • the sleep button 74 is a button for switching between the ON mode and the sleep mode of the main unit 2 attached to the cradle 5 .
  • the sleep button 74 may have the function of turning ON/OFF the power of the main unit 2 , in addition to (or instead of) the function of switching between the ON mode and the sleep mode of the main unit 2 .
  • the sleep button 74 is provided on the front surface of the first support 71 .
  • the sleep button 74 may be provided at any position such that the sleep button 74 can be pressed down with the information processing device 1 attached to the cradle 5 .
  • the sleep button 74 may be provided on the side surface of the housing of the cradle 5 or on the back surface of the second support 72 .
  • the sleep button 74 includes a light-emitting portion (specifically, LED).
  • the light-emitting portion of the sleep button 74 is used to indicate, to the user, the status of the main unit 2 attached to the cradle 5 . That is, the light-emitting portion varies how light is emitted (in other words, how it is lit) depending on the status of the main unit 2 attached to the cradle 5 .
  • the light-emitting portion is lit when the main unit 2 attached to the cradle 5 is in the ON mode, and the light-emitting portion is turned OFF when the main unit 2 is in the sleep mode or in the power-off mode.
  • when the main unit 2 has information that should be indicated to the user (e.g., program update information, announcement, advertisement, etc.), the light-emitting portion blinks. Note that the light-emitting portion does not need to be provided on the sleep button 74 , but it may be provided at any position on the housing of the cradle 5 , for example.
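  • as an illustration of the light-emitting portion behavior described above, the following minimal sketch maps the status of the main unit 2 to how the sleep button 74 is lit; giving the blinking state precedence over the others is an assumption for illustration:

        def sleep_button_led_mode(main_unit_status: str, has_pending_info: bool) -> str:
            """Return 'on', 'off' or 'blink' for the light-emitting portion of the sleep button 74."""
            if has_pending_info:
                # e.g. program update information, announcements, advertisements
                return "blink"
            if main_unit_status == "on":
                return "on"
            # sleep mode or power-off mode
            return "off"

        assert sleep_button_led_mode("on", False) == "on"
        assert sleep_button_led_mode("sleep", False) == "off"
        assert sleep_button_led_mode("off", True) == "blink"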
  • the cradle 5 includes a terminal (in the present embodiment, a plurality of terminals, specifically, a monitor terminal 132 , a power terminal 134 and extension terminals 137 shown in FIG. 13 ) provided on the back surface of the housing. The details of these terminals will be described later.
  • the housing may have any other shape such that the information processing device 1 can be supported by the housing.
  • Some of the terminals provided on the housing may be provided on the front surface of the housing.
  • some of the various elements described above may be absent from the cradle 5 .
  • FIG. 11 is a block diagram showing an example internal configuration of the main unit 2 .
  • the main unit 2 also includes elements 81 to 98 shown in FIG. 11 . Some of these elements 81 to 98 may be mounted, as electronic components, on an electronic circuit board and accommodated in the housing 11 .
  • the main unit 2 includes a CPU (Central Processing Unit) 81 .
  • the CPU 81 is an information processing section for executing various information processes to be executed on the main unit 2 .
  • the CPU 81 executes various information processes by executing an information processing program stored in a storage section that can be accessed by the CPU 81 (specifically, an internal storage medium such as a flash memory 84 or external storage media attached to the slots 23 and 24 , etc.).
  • the main unit 2 includes, as an example internal storage medium provided in the main unit 2 , the flash memory 84 and a DRAM (Dynamic Random Access Memory) 85 .
  • the flash memory 84 and the DRAM 85 are connected to the CPU 81 .
  • the flash memory 84 is a memory used primarily for storing various data (which may be programs) saved on the main unit 2 .
  • the DRAM 85 is a memory used for temporarily storing various data used in information processes.
  • the main unit 2 includes a first slot interface (hereinafter abbreviated as “I/F”) 91 .
  • the main unit 2 also includes a second slot I/F 92 .
  • the slot I/Fs 91 and 92 are connected to the CPU 81 .
  • the first slot I/F 91 is connected to the first slot 23 , and reads and writes data from and to a storage medium of the first type (e.g., a dedicated memory card) inserted in the first slot 23 , in response to an instruction from the CPU 81 .
  • the second slot I/F 92 is connected to the second slot 24 , and reads and writes data from and to a storage medium of the second type (e.g., an SD card) inserted in the second slot 24 , in response to an instruction from the CPU 81 .
  • the CPU 81 executes the information processes described above by reading and writing data from and to the memories 84 and 85 and the storage media.
  • the main unit 2 includes a network communication section 82 .
  • the network communication section 82 is connected to the CPU 81 .
  • the network communication section 82 communicates (specifically, in wireless communication) with an external device via a network.
  • the network communication section 82 is a Wi-Fi certified communication module, for example, and communicates with an external device via a wireless LAN.
  • the main unit 2 may have the function of implementing communication by connecting to a mobile communication network (in other words, a mobile telephone communication network), in addition to (or instead of) the function of implementing communication by connecting to a wireless LAN.
  • the main unit 2 includes a controller communication section 83 .
  • the controller communication section 83 is connected to the CPU 81 .
  • the controller communication section 83 communicates with the controllers 3 and/or 4 in wireless communication. While there is no particular limitation on the communication scheme between the main unit 2 and the controllers, the controller communication section 83 communicates with the controllers in accordance with the Bluetooth (registered trademark) standard in the present embodiment.
  • the CPU 81 is connected to the left-side terminal 17 , the right-side terminal 21 and the lower terminal 27 described above.
  • the CPU 81 transmits data to the left controller 3 via the left-side terminal 17 .
  • the CPU 81 transmits data to the right controller 4 via the right-side terminal 21 .
  • the CPU 81 transmits data to the cradle 5 via the lower terminal 27 .
  • the main unit 2 is capable of communicating with the left and right controllers 3 and 4 either in wired communication or in wireless communication. Note that the process of switching between wired communication and wireless communication will be described later.
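  • the actual process of switching between wired communication and wireless communication is described later; purely as an illustration of the two available paths described above, the following hedged sketch sends data over the terminal when a controller is attached and over the controller communication section 83 (Bluetooth) otherwise; the attachment test and function names are assumptions:

        def send_to_controller(data: bytes, controller_attached: bool,
                               send_via_terminal, send_via_bluetooth) -> None:
            """Send data over the wired path when the controller is attached, else wirelessly."""
            if controller_attached:
                # e.g. via the left-side terminal 17 or the right-side terminal 21
                send_via_terminal(data)
            else:
                # via the controller communication section 83 (Bluetooth)
                send_via_bluetooth(data)

        # Example with stand-in transport functions:
        sent = []
        send_to_controller(b"\x01", True, sent.append, lambda d: None)
        send_to_controller(b"\x02", False, lambda d: None, sent.append)
        assert sent == [b"\x01", b"\x02"]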
  • the main unit 2 can communicate with a plurality of left controllers simultaneously (in other words, in parallel).
  • the main unit 2 can communicate with a plurality of right controllers simultaneously (in other words, in parallel). Therefore, users can make inputs to the information processing device 1 by using a plurality of left controllers and a plurality of right controllers.
  • the main unit 2 includes a touch panel controller 86 , which is a circuit for controlling the touch panel 13 .
  • the touch panel controller 86 is connected to the touch panel 13 , and is connected to the CPU 81 . Based on a signal from the touch panel 13 , for example, the touch panel controller 86 generates data representing the position at which the touch input has been performed, and outputs the data to the CPU 81 .
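  • as a hedged illustration of the position data generated by the touch panel controller 86 as described above, the following sketch converts raw touch readings into pixel coordinates; the raw value range and the display resolution are assumptions for illustration and are not given in the present description:

        ADC_MAX = 4095                  # assumed raw reading range
        SCREEN_W, SCREEN_H = 1280, 720  # assumed resolution of the display 12, illustration only

        def touch_position(raw_x: int, raw_y: int) -> tuple:
            """Convert raw touch readings into pixel coordinates to be output to the CPU 81."""
            x = round(raw_x / ADC_MAX * (SCREEN_W - 1))
            y = round(raw_y / ADC_MAX * (SCREEN_H - 1))
            return x, y

        assert touch_position(0, 0) == (0, 0)
        assert touch_position(ADC_MAX, ADC_MAX) == (SCREEN_W - 1, SCREEN_H - 1)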
  • the display 12 described above is connected to the CPU 81 .
  • the CPU 81 displays images generated and/or images obtained from outside on the display 12 (e.g., by executing the information processes described above).
  • the main unit 2 includes a codec circuit 87 and a speaker (specifically, a left speaker and a right speaker) 88 .
  • the codec circuit 87 is connected to the speaker 88 and the sound input/output terminal 25 , and is also connected to the CPU 81 .
  • the codec circuit 87 is a circuit for controlling the input/output of sound data to/from the speaker 88 and the sound input/output terminal 25 . That is, when sound data is received from the CPU 81 , the codec circuit 87 outputs, to the speaker 88 or the sound input/output terminal 25 , a sound signal obtained by performing a D/A conversion on the sound data.
  • a sound is output from the speaker 88 or a sound output section (e.g., an earphone) connected to the sound input/output terminal 25 .
  • when a sound signal is input from the sound input/output terminal 25 (e.g., from a microphone attached thereto), the codec circuit 87 performs an A/D conversion on the sound signal and outputs sound data of a predetermined format to the CPU 81 .
  • the sound volume button 26 described above (specifically, the sound volume buttons 26 a and 26 b shown in FIG. 3 ) is connected to the CPU 81 .
  • the CPU 81 controls the sound volume of the output from the speaker 88 or the sound output section based on an input on the sound volume button 26 .
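  • as an illustration of the sound volume control described above, the following minimal sketch lowers the volume on the sound volume button 26 a and raises it on the sound volume button 26 b ; the volume range and step size are assumptions for illustration:

        MAX_VOLUME = 15   # assumed number of volume steps
        VOLUME_STEP = 1

        def on_volume_button(volume: int, button: str) -> int:
            """Return the new output volume after a press of '26a' (lower) or '26b' (raise)."""
            if button == "26a":
                return max(0, volume - VOLUME_STEP)
            if button == "26b":
                return min(MAX_VOLUME, volume + VOLUME_STEP)
            raise ValueError("unknown button")

        assert on_volume_button(0, "26a") == 0    # cannot go below silence
        assert on_volume_button(14, "26b") == 15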
  • the main unit 2 includes an acceleration sensor 89 .
  • the acceleration sensor 89 detects the magnitude of the linear acceleration along directions of three predetermined axes (e.g., the xyz axes shown in FIG. 1 ). Note that the acceleration sensor 89 may detect acceleration in one axis direction or two axis directions.
  • the main unit 2 includes an angular velocity sensor 90 .
  • the angular velocity sensor 90 detects the angular velocity about three predetermined axes (e.g., the xyz axes shown in FIG. 1 ). Note that the angular velocity sensor 90 may detect the angular velocity about one axis or two axes.
  • the acceleration sensor 89 and the angular velocity sensor 90 are connected to the CPU 81 , and the detection results of the acceleration sensor 89 and the angular velocity sensor 90 are output to the CPU 81 . Based on the detection results of the acceleration sensor 89 and the angular velocity sensor 90 , the CPU 81 can calculate information regarding movement and/or attitude of the main unit 2 .
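  • the description above states that the CPU 81 can calculate information regarding movement and/or attitude from the acceleration sensor 89 and the angular velocity sensor 90 , but does not specify how. The following is a minimal sketch of one common approach (a complementary filter); the sampling period, axis conventions and blending factor are illustrative assumptions and are not part of this disclosure.

```python
import math

ALPHA = 0.98          # assumed blending factor: trust the gyro short-term, the accelerometer long-term
DT = 1.0 / 200.0      # assumed sensor sampling period in seconds

def complementary_filter(pitch, roll, accel, gyro):
    """Update a pitch/roll estimate (radians) from one accel (m/s^2) and gyro (rad/s) sample."""
    ax, ay, az = accel
    gx, gy, gz = gyro

    # Attitude implied by gravity alone (only meaningful when the device is not accelerating hard).
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    accel_roll = math.atan2(ay, az)

    # Integrate angular velocity, then blend with the accelerometer estimate.
    pitch = ALPHA * (pitch + gx * DT) + (1.0 - ALPHA) * accel_pitch
    roll = ALPHA * (roll + gy * DT) + (1.0 - ALPHA) * accel_roll
    return pitch, roll

# Example: a device lying flat and motionless converges toward pitch = roll = 0.
pitch, roll = 0.1, -0.2
for _ in range(1000):
    pitch, roll = complementary_filter(pitch, roll, accel=(0.0, 0.0, 9.81), gyro=(0.0, 0.0, 0.0))
print(round(pitch, 3), round(roll, 3))
```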
  • the main unit 2 includes a power control section 97 and a battery 98 .
  • the power control section 97 is connected to the battery 98 and the CPU 81 .
  • the power control section 97 is connected to various sections of the main unit 2 (specifically, various sections receiving power supply from the battery 98 , the left-side terminal 17 and the right-side terminal 21 ).
  • the power control section 97 controls the power supply from the battery 98 to the various sections based on an instruction from the CPU 81 .
  • the power control section 97 is connected to the power button 28 .
  • the power control section 97 controls the power supply to the various sections based on an input on the power button 28 .
  • when a power-OFF operation is performed on the power button 28 , the power control section 97 stops the power supply to some or all of the various sections, and when a power-ON operation is performed on the power button 28 , the power control section 97 starts the power supply to some or all of the various sections.
  • when an instruction to switch to the sleep mode is given by means of the power button 28 , the power control section 97 stops the power supply to some elements, including the display 12 , and when an instruction to switch to the ON mode is given by means of the power button 28 , the power control section 97 starts the power supply to those elements.
  • the power control section 97 also outputs, to the CPU 81 , information representing the input on the power button 28 (specifically, information indicating whether or not the power button 28 is being pressed).
  • the battery 98 is connected to the lower terminal 27 .
  • when an external charging device (e.g., the cradle 5 ) is connected to the lower terminal 27 and power is supplied to the main unit 2 , the battery 98 is charged with the power supplied thereto.
  • the charging capacity of the battery 98 of the main unit 2 is larger than the charging capacity of the battery of the controllers 3 and 4 to be described later.
  • the main unit 2 includes a magnetic force sensor (referred to also as a magnetic sensor) 93 for detecting the intensity and/or the direction of a magnetic field.
  • the magnetic force sensor 93 is connected to the CPU 81 , and the detection results of the magnetic force sensor 93 are output to the CPU 81 .
  • the magnetic force sensor 93 is used to detect whether a protection cover (not shown) attached to the information processing device 1 is open or closed.
  • the protection cover is provided with a magnet, and the CPU 81 detects when the primary surface of the main unit 2 is covered by the protection cover based on the detection results of the magnetic force sensor 93 . Note that when such a state is detected, the CPU 81 turns OFF the display of the display 12 , for example.
  • the main unit 2 includes the ambient light sensor 94 for detecting the intensity of the ambient light around the main unit 2 .
  • the ambient light sensor 94 is connected to the CPU 81 , and the detection results of the ambient light sensor 94 are output to the CPU 81 .
  • the ambient light sensor 94 is used to adjust the brightness of the display 12 . That is, the CPU 81 controls the brightness of the display 12 based on the detection results of the ambient light sensor 94 .
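  • the mapping from ambient-light readings to display brightness is not specified above. A minimal sketch of one plausible mapping (linear interpolation between lux breakpoints) follows; the breakpoint table and value ranges are assumptions made for illustration only.

```python
# Hypothetical lux -> brightness (0..100 %) breakpoints; the values are illustrative only.
BREAKPOINTS = [(0, 10), (50, 30), (200, 60), (1000, 100)]

def brightness_for_lux(lux: float) -> float:
    """Linearly interpolate a display brightness from an ambient-light reading."""
    if lux <= BREAKPOINTS[0][0]:
        return BREAKPOINTS[0][1]
    for (lo_lux, lo_b), (hi_lux, hi_b) in zip(BREAKPOINTS, BREAKPOINTS[1:]):
        if lux <= hi_lux:
            t = (lux - lo_lux) / (hi_lux - lo_lux)
            return lo_b + t * (hi_b - lo_b)
    return BREAKPOINTS[-1][1]

print(brightness_for_lux(125))  # -> a value between 30 and 60
```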
  • the main unit 2 includes a cooling fan 96 for radiating heat from inside the main unit 2 .
  • with the cooling fan 96 in operation, air outside the housing 11 is introduced into the housing 11 through the air inlet hole 11 d and air inside the housing 11 is discharged through the air outlet hole 11 c , thereby discharging heat from inside the housing 11 .
  • the cooling fan 96 is connected to the CPU 81 , and the operation of the cooling fan 96 is controlled by the CPU 81 .
  • the main unit 2 includes a temperature sensor 95 for detecting the temperature inside the main unit 2 .
  • the temperature sensor 95 is connected to the CPU 81 , and the detection results of the temperature sensor 95 are output to the CPU 81 .
  • the CPU 81 controls the operation of the cooling fan 96 based on the detection results of the temperature sensor 95 .
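  • no control law is given above for how the CPU 81 drives the cooling fan 96 from the temperature sensor 95 readings. Below is a minimal sketch of a simple threshold controller with hysteresis; the temperature thresholds are assumptions.

```python
FAN_ON_TEMP_C = 55.0   # assumed turn-on threshold
FAN_OFF_TEMP_C = 45.0  # assumed turn-off threshold (the hysteresis band avoids rapid toggling)

def update_fan(temp_c: float, fan_on: bool) -> bool:
    """Return the new fan state given the internal temperature and the current fan state."""
    if fan_on and temp_c < FAN_OFF_TEMP_C:
        return False
    if not fan_on and temp_c > FAN_ON_TEMP_C:
        return True
    return fan_on

state = False
for t in (40, 50, 60, 50, 44):
    state = update_fan(t, state)
    print(t, state)   # the fan turns on above 55 and stays on until the temperature drops below 45
```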
  • FIG. 12 is a block diagram showing an example internal configuration of the information processing device 1 . Note that the details of the internal configuration of the information processing device 1 that are related to the main unit 2 are shown in FIG. 11 , and are therefore not shown in FIG. 12 .
  • the left controller 3 includes a communication control section 101 for communicating with the main unit 2 .
  • the communication control section 101 is connected to various elements including the terminal 42 described above.
  • the communication control section 101 is capable of communicating with the main unit 2 both in wired communication via the terminal 42 and in wireless communication without using the terminal 42 .
  • the communication control section 101 controls the communication method used by the left controller 3 for communicating with the main unit 2 . That is, when the left controller 3 is attached to the main unit 2 , the communication control section 101 communicates with the main unit 2 via the terminal 42 .
  • the communication control section 101 communicates with the main unit 2 (specifically, with the controller communication section 83 ) in wireless communication.
  • the wireless communication between the controller communication section 83 and the communication control section 101 is in accordance with the Bluetooth (registered trademark) standard.
  • the left controller 3 includes a memory 102 such as a flash memory, for example.
  • the communication control section 101 is a microcomputer (referred to also as a microprocessor), for example, and executes various processes by executing firmware stored in the memory 102 .
  • the left controller 3 includes the buttons 103 (specifically, the buttons 33 to 39 , 43 and 44 ).
  • the left controller 3 includes the analog stick (labeled “stick” in FIG. 12 ) 32 described above.
  • the buttons 103 and the analog stick 32 repeatedly output, with appropriate timing, information regarding operations performed thereon to the communication control section 101 .
  • the left controller 3 includes an acceleration sensor 104 .
  • the acceleration sensor 104 detects the magnitude of the linear acceleration along directions of three predetermined axes (e.g., the xyz axes shown in FIG. 1 ). Note that the acceleration sensor 104 may detect acceleration in one axis direction or two axis directions.
  • the left controller 3 includes an angular velocity sensor 105 .
  • the angular velocity sensor 105 detects the angular velocity about three predetermined axes (e.g., the xyz axes shown in FIG. 1 ). Note that the angular velocity sensor 105 may detect the angular velocity about one axis or two axes.
  • the acceleration sensor 104 and the angular velocity sensor 105 are connected to the communication control section 101 .
  • the detection results of the acceleration sensor 104 and the angular velocity sensor 105 are repeatedly output, with appropriate timing, to the communication control section 101 .
  • the communication control section 101 obtains input-related information (specifically, information regarding operations or detection results of sensors) from the input sections (specifically, the buttons 103 , the analog stick 32 and the sensors 104 and 105 ).
  • the communication control section 101 transmits operation data including the obtained information (or information obtained by performing a predetermined process on the obtained information) to the main unit 2 . Note that the operation data is transmitted repeatedly, once per predetermined period of time. Note that the interval at which the input-related information is transmitted to the main unit 2 may or may not be the same among different input sections.
  • the main unit 2 can know the input that has been performed on the left controller 3 . That is, the main unit 2 can determine operations performed on the buttons 103 and the analog stick 32 based on the operation data. The main unit 2 can calculate information regarding movement and/or attitude of the left controller 3 based on the operation data (specifically, the detection results of the sensors 104 and 105 ).
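  • the wire format of the operation data is not described in this disclosure. The following is a minimal sketch of one possible fixed-layout packet for button, stick and sensor data, using Python's struct module; the field names, sizes and ordering are assumptions.

```python
import struct

# Hypothetical layout: 16-bit button bitmask, stick x/y as floats,
# then 3-axis acceleration and 3-axis angular velocity as floats.
OPERATION_DATA = struct.Struct("<Hff3f3f")

def encode_operation_data(buttons, stick_x, stick_y, accel, gyro) -> bytes:
    return OPERATION_DATA.pack(buttons, stick_x, stick_y, *accel, *gyro)

def decode_operation_data(payload: bytes):
    buttons, sx, sy, ax, ay, az, gx, gy, gz = OPERATION_DATA.unpack(payload)
    return {"buttons": buttons, "stick": (sx, sy),
            "accel": (ax, ay, az), "gyro": (gx, gy, gz)}

pkt = encode_operation_data(0b0000_0101, 0.0, -0.8, (0.0, 0.0, 9.8), (0.1, 0.0, 0.0))
print(decode_operation_data(pkt))
```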
  • the indicator LED 45 described above is connected to the communication control section 101 .
  • the indicator LED 45 is controlled by an instruction from the main unit 2 . That is, when the instruction is received from the main unit 2 , the communication control section 101 outputs, to the indicator LED 45 , a control signal for controlling the lighting of the indicator LED 45 in accordance with the instruction.
  • the left controller 3 includes a vibrator 107 for indications to the user through vibration.
  • the vibrator 107 is controlled by an instruction from the main unit 2 . That is, when the instruction is received from the main unit 2 , the communication control section 101 drives the vibrator 107 in accordance with the instruction.
  • the left controller 3 includes an amplifier 106 .
  • the communication control section 101 outputs a control signal in accordance with the instruction to the amplifier 106 .
  • the amplifier 106 amplifies the control signal from the communication control section 101 and generates a driving signal for driving the vibrator 107 to give the driving signal to the vibrator 107 . This actuates the vibrator 107 .
  • the left controller 3 includes a power supply section 108 .
  • the power supply section 108 includes a battery and a power control circuit.
  • the power control circuit is connected to the battery and is connected to the various sections of the left controller 3 (specifically, various sections receiving power supply from the battery). The power control circuit controls the power supply from the battery to the various sections.
  • the battery is also connected to the terminal 42 .
  • the battery is charged with the power supply from the main unit 2 via the terminal 42 under a predetermined condition, the details of which will be described later.
  • the right controller 4 includes a communication control section 111 for communicating with the main unit 2 .
  • the right controller 4 includes a memory 112 connected to the communication control section 111 .
  • the communication control section 111 is connected to various elements including the terminal 64 described above.
  • the communication control section 111 and the memory 112 have similar functions to the communication control section 101 and the memory 102 of the left controller 3 . Therefore, the communication control section 111 is capable of communicating with the main unit 2 both in wired communication via the terminal 64 and in wireless communication without using the terminal 64 (specifically, communication in accordance with the Bluetooth (registered trademark) standard), and controls the communication method by which the right controller 4 communicates with the main unit 2 .
  • the right controller 4 includes similar input sections (specifically, the buttons 113 , the analog stick 52 , an acceleration sensor 114 and an angular velocity sensor 115 ) to those of the left controller 3 . These input sections have similar functions and operate in a similar fashion to the input sections of the left controller 3 .
  • the indicator LED 67 of the right controller 4 operates in a similar fashion to the indicator LED 45 of the left controller 3 . That is, when the instruction from the main unit 2 is received, the communication control section 111 outputs, to the indicator LED 67 , a control signal for controlling the lighting of the indicator LED 67 in accordance with the instruction.
  • the right controller 4 includes a vibrator 117 and an amplifier 116 .
  • the vibrator 117 and the amplifier 116 operate in a similar fashion to the vibrator 107 and the amplifier 106 of the left controller 3 . That is, the communication control section 111 actuates the vibrator 117 using the amplifier 116 in accordance with an instruction from the main unit 2 .
  • the right controller 4 includes a power supply section 118 .
  • the power supply section 118 has a similar function and operates in a similar fashion to the power supply section 108 of the left controller 3 . That is, the power supply section 118 controls the power supply to the various sections receiving power supply from the battery.
  • the battery is charged with the power supply from the main unit 2 via the terminal 64 under a predetermined condition.
  • the right controller 4 includes an NFC communication section 122 .
  • the NFC communication section 122 implements near-field wireless communication in accordance with the NFC (Near Field Communication) standard.
  • the NFC communication section 122 has the function of a so-called NFC reader/writer.
  • the NFC communication section 122 includes an antenna used for the near-field wireless communication, and a circuit (e.g., an NFC chip) for generating a signal (radio wave) to be transmitted from the antenna.
  • the near-field wireless communication is not limited to communication based on the NFC standard, but may be any proximity communication (referred to also as non-contact communication).
  • the proximity communication for example includes communication schemes in which one device generates an electromotive force on the other device by radio waves therefrom (e.g., through electromagnetic induction).
  • the right controller 4 includes the infrared image-capturing section 123 .
  • the infrared image-capturing section 123 includes an infrared camera for capturing an image around the right controller 4 .
  • the infrared image-capturing section 123 is used to capture an image of a hand of a user.
  • the information processing device 1 identifies an input made with the hand (e.g., a gesture input, etc.) based on information (e.g., the position, the size, the shape, etc.) of the hand of which the image has been captured.
  • the infrared image-capturing section 123 includes an illuminating section for outputting infrared light.
  • the illuminating section outputs infrared light in synchronism with the timing at which the infrared camera captures an image, for example.
  • the infrared light output from the illuminating section is reflected by an object (e.g., a hand of a user), and the reflected infrared light is received by the infrared camera, thus obtaining an image of the infrared light.
  • note that a visible light camera (a camera using a visible light image sensor) may be used instead of an infrared camera as the image-capturing device, in other embodiments.
  • the right controller 4 includes a processing section 121 .
  • the processing section 121 is connected to the communication control section 111 and also connected to the NFC communication section 122 .
  • the processing section 121 executes a management process for the NFC communication section 122 in response to an instruction from the main unit 2 .
  • the processing section 121 controls the action of the NFC communication section 122 in response to an instruction from the main unit 2 .
  • the processing section 121 also controls the activation of the NFC communication section 122 , and controls the action (specifically, reading and writing, etc.) of the NFC communication section 122 on a communication partner (e.g., an NFC tag).
  • the processing section 121 also receives information to be transmitted to the communication partner from the main unit 2 via the communication control section 111 to pass this information to the NFC communication section 122 , and obtains from the NFC communication section 122 information received from the communication partner to transmit this information to the main unit 2 via the communication control section 111 .
  • the processing section 121 executes a management process for the infrared image-capturing section 123 in response to an instruction from the main unit 2 .
  • the processing section 121 causes the infrared image-capturing section 123 to perform an image-capturing operation, and obtains information based on the image-capturing result (information of the captured image or information calculated from such information, etc.) to transmit this information to the main unit 2 via the communication control section 111 .
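  • the following is a minimal sketch of how a processing section such as the processing section 121 might dispatch instructions from the main unit 2 to the NFC and infrared image-capturing subsections and relay results back; the command names, handler signatures and stub classes are assumptions, not details from the disclosure.

```python
class FakeNFC:
    """Stand-in for the NFC communication section (reader/writer)."""
    def activate(self): print("NFC activated")
    def read_tag(self): return b"tag-data"
    def write_tag(self, data): print("wrote", data)

class FakeIRCamera:
    """Stand-in for the infrared image-capturing section."""
    def capture(self): return [[0] * 4 for _ in range(4)]   # tiny dummy image

class ProcessingSection:
    """Hypothetical dispatcher relaying main-unit instructions to the NFC and IR subsections."""

    def __init__(self, nfc, ir_camera, send_to_main_unit):
        self.nfc = nfc
        self.ir_camera = ir_camera
        self.send_to_main_unit = send_to_main_unit   # via the communication control section

    def handle_instruction(self, command, payload=b""):
        if command == "NFC_START":
            self.nfc.activate()
        elif command == "NFC_WRITE":
            self.nfc.write_tag(payload)
        elif command == "NFC_READ":
            self.send_to_main_unit(self.nfc.read_tag())
        elif command == "IR_CAPTURE":
            image = self.ir_camera.capture()
            # Send a reduced result (here: only the image dimensions) rather than raw pixels.
            self.send_to_main_unit((len(image), len(image[0])))
        else:
            raise ValueError(f"unknown instruction: {command}")

ps = ProcessingSection(FakeNFC(), FakeIRCamera(), send_to_main_unit=print)
ps.handle_instruction("NFC_READ")
ps.handle_instruction("IR_CAPTURE")
```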
  • FIG. 13 is a block diagram showing an example internal configuration of the cradle 5 . Note that the details of the internal configuration regarding the main unit 2 are shown in FIG. 11 and are therefore not shown in FIG. 13 .
  • the cradle 5 includes a conversion section 131 and the monitor terminal 132 .
  • the conversion section 131 is connected to the main body terminal 73 and the monitor terminal 132 .
  • the conversion section 131 converts a signal format regarding an image (referred to also as a video) and a sound received from the main unit 2 into a format to be output to the TV 6 .
  • the main unit 2 outputs image and sound signals to the cradle 5 as a display port signal (i.e., a signal in accordance with the DisplayPort standard).
  • communication based on the HDMI (registered trademark) standard is used for communication between the cradle 5 and the TV 6 .
  • the monitor terminal 132 is an HDMI terminal, and the cradle 5 and the TV 6 are connected together by an HDMI cable.
  • the conversion section 131 converts the display port signal (specifically, a signal representing a video and a sound) received from the main unit 2 via the main body terminal 73 into an HDMI signal.
  • the converted HDMI signal is output to the TV 6 via the monitor terminal 132 .
  • the cradle 5 includes a processing section 135 for executing various information processes on the cradle 5 .
  • the processing section 135 is connected to the sleep button 74 described above, and is connected to the main body terminal 73 via a connection processing section 136 (the details of which will be described later).
  • the processing section 135 detects an operation performed on the sleep button 74 , and notifies the main unit 2 of the operation. When receiving such a notification, the main unit 2 switches between the ON mode and the sleep mode thereof.
  • the ON mode and the sleep mode of the information processing device 1 are switched from one to another in response to the sleep button 74 being pressed down.
  • the cradle 5 includes a power control section 133 and the power terminal 134 .
  • the power terminal 134 is a terminal for connecting a charging device (not shown) (e.g., an AC adaptor, etc.). In the present embodiment, it is assumed that an AC adaptor is connected to the power terminal 134 , and the cradle 5 is receiving commercial power supply.
  • the power control section 133 supplies power from the power terminal 134 to the main unit 2 via the main body terminal 73 .
  • the battery 98 of the main unit 2 is charged as described above.
  • the power terminal 134 is a connector (i.e., a female-side USB terminal) of the same shape as the lower terminal 27 of the information processing device 1 . Therefore, in the present embodiment, the charging device may be connected to the cradle 5 to charge the information processing device 1 via the cradle 5 , or the charging device may be connected directly to the main unit 2 to charge the information processing device 1 .
  • the cradle 5 includes the connection processing section 136 and the extension terminals 137 .
  • the extension terminals 137 are terminals for connecting other devices.
  • the cradle 5 includes a plurality (more specifically, three) of USB terminals as the extension terminals 137 .
  • the connection processing section 136 is connected to the main body terminal 73 and the extension terminals 137 .
  • the connection processing section 136 has the function of a USB hub, and manages the communication between devices connected to the extension terminals 137 and the main unit 2 connected to the main body terminal 73 (i.e., a signal from one device is appropriately distributed among the other devices).
  • the information processing device 1 is capable of communicating with other devices via the cradle 5 .
  • the connection processing section 136 may be capable of converting the communication speed or supply power to devices connected to the extension terminals 137 .
  • the left and right controllers 3 and 4 can be attached to and detached from the information processing device 1 .
  • the information processing device 1 is capable of outputting images (and sound) to the TV 6 . Therefore, the information processing device 1 can be used in various modes of use as will be described below. The operation of the information processing system in primary modes of use will now be described.
  • FIG. 14 is a diagram showing an example of how the information processing device 1 is used with the controllers 3 and 4 attached to the main unit 2 (referred to as the “attached state”). As shown in FIG. 14 , when the controllers 3 and 4 are attached to the main unit 2 , the information processing device 1 can be used as a portable device (e.g., a portable game device).
  • the main unit 2 and the controllers 3 and 4 communicate with each other in wired communication (i.e., communication between devices connected together via terminals). That is, the main unit 2 receives operation data from the controllers 3 and 4 attached thereto, and executes an information process based on the received operation data (specifically, by using operation data as an input).
  • the main unit 2 and the controllers may communicate with each other in wireless communication in the attached state.
  • wireless communication may not work properly because the main unit 2 and the controllers are too close to each other.
  • the main unit 2 and the controllers communicate with each other in wired communication in the attached state, thereby improving the reliability of communication.
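  • a minimal sketch of the communication-method selection described above (wired via the terminal when attached, wireless when detached) is shown below; the class and method names are assumptions.

```python
class CommunicationControlSection:
    """Hypothetical sketch: choose a wired or wireless transport based on the attachment state."""

    def __init__(self, wired_link, wireless_link):
        self.wired_link = wired_link        # via the terminal (e.g., terminal 42 or 64)
        self.wireless_link = wireless_link  # e.g., a Bluetooth connection
        self.attached = False

    def on_attachment_changed(self, attached: bool) -> None:
        self.attached = attached

    def send_operation_data(self, data: bytes) -> None:
        # Wired communication is preferred in the attached state for reliability.
        link = self.wired_link if self.attached else self.wireless_link
        link(data)

ccs = CommunicationControlSection(
    wired_link=lambda d: print("wired:", d),
    wireless_link=lambda d: print("wireless:", d),
)
ccs.send_operation_data(b"\x01")          # detached -> wireless
ccs.on_attachment_changed(True)
ccs.send_operation_data(b"\x02")          # attached -> wired
```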
  • the four operation buttons 33 to 36 of the left controller 3 may be used to give a directional input (in other words, a directional instruction). Then, a user can make a directional input by using the analog stick 32 or by using the operation buttons 33 to 36 . A user is thus allowed to make a directional input by using a preferred operation mechanism, thereby improving the controllability. Note however that the instruction to which each operation button is assigned can be freely determined by the program executed on the main unit 2 .
  • the arrangement of the analog stick and the four operation buttons is reversed between the left controller 3 and the right controller 4 . That is, in the attached state, the analog stick 32 is arranged above the four operation buttons 33 to 36 on the left controller 3 , whereas the four operation buttons 53 to 56 are arranged above the analog stick 52 on the right controller 4 . Therefore, as shown in FIG. 14 , when the information processing device 1 is held with the left hand and the right hand at the same height (in other words, at the same position with respect to the up-down direction), it will be easier for one hand to operate the analog stick and for the other hand to operate the four operation buttons. That is, in the present embodiment, it is possible to provide an information processing device, with which it is easy to operate an analog stick and four operation buttons.
  • controllers of different configurations may be provided as the left controller and/or the right controller (see "[5-1. Other types of controllers]"), the details of which will be described later.
  • it is thus possible to provide an information processing device that can be operated in a different fashion from the present embodiment (i.e., the information processing device 1 shown in FIG. 14 ) by attaching a left controller and/or a right controller having a different configuration to the main unit 2 , instead of the left controller 3 and/or the right controller 4 of the present embodiment.
  • the information processing device 1 can be used with the left and right controllers 3 and 4 detached from the main unit 2 (referred to as the "detached state").
  • Possible modes in which the information processing device 1 is used in the detached state include at least two modes, one in which a single user uses two controllers 3 and 4 , and another in which two users each use one controller.
  • FIG. 15 is a diagram showing an example of a single user holding two controllers 3 and 4 to use the information processing device 1 in the detached state. As shown in FIG. 15 , in the detached state, a user can operate one set of two controllers 3 and 4 by holding them respectively in the left hand and the right hand.
  • the information processing device 1 can accept, as an input, the operation of moving the controller itself.
  • a user is allowed to not only perform an operation on the operation sections (the operation buttons and the analog stick) of the controller but also perform an operation of moving the controller itself.
  • although the information processing device 1 is a portable device, a user is allowed to perform an operation of moving the controller (without moving the display).
  • although the information processing device 1 is a portable device, a user is allowed to perform an operation away from the display 12 .
  • the information processing device 1 can calculate information regarding movement and/or attitude of the information processing device 1 based on the detection results of the acceleration sensor 89 and/or the angular velocity sensor 90 provided in the main unit 2 not only in the detached state but also in the attached state.
  • the main unit 2 and the controllers 3 and 4 communicate with each other using wireless communication. That is, the main unit 2 receives operation data from the controllers 3 and 4 with which a wireless connection has been established (i.e., which have been paired with the main unit 2 ), and executes an information process based on the received operation data (specifically, using the operation data as an input).
  • the main unit 2 distinguishes between two controllers which are communication partners. That is, the main unit 2 identifies whether the received operation data is from the left controller 3 or from the right controller 4 .
  • the method for distinguishing between controllers will be described later.
  • FIG. 16 is a diagram showing an example of two users each holding one controller to use the information processing device 1 in the detached state.
  • two users can each perform an operation.
  • one user (referred to as the “first user”) can perform an operation by using the left controller 3 while the other user (referred to as the “second user”) can perform an operation by using the right controller 4 .
  • the information processing device 1 executes an information process of controlling the action of a first object (e.g., the player character) in a virtual space based on an operation performed on the left controller 3 , and controlling the action of a second object in the virtual space based on an operation performed on the right controller 4 .
  • a user can operate the operation sections provided on the controller and/or perform an operation of moving the controller itself.
  • the positional relationship between the analog stick 52 and the operation buttons 53 to 56 on the right controller 4 is opposite from the positional relationship between these two types of operation sections on the left controller 3 . Therefore, if two users hold the left controller 3 and the right controller 4 in the same orientation, as shown in FIG. 16 , for example, the positional relationship between the two types of operation sections on one of the two controllers will be the same as that on the other controller. That is, in the present embodiment, users can use the left controller 3 and the right controller 4 detached from the main unit 2 in the same fashion with respect to the two types of operation sections. Thus, it is possible to improve the controllability of the controller.
  • the four operation buttons 33 to 36 of the left controller 3 may be used for the same function as the four operation buttons 53 to 56 of the right controller 4 (in other words, they may be used for giving the same instructions).
  • the right direction button 33 may be used for the same function as the Y button 56
  • the lower direction button 34 for the same function as the X button 55
  • the upper direction button 35 for the same function as the B button 54
  • the left direction button 36 for the same function as the A button 53 .
  • the function of the operation buttons 33 to 36 may vary between the attached state and the detached state. Note however that the instruction to which each operation button is assigned can be freely determined by the program executed on the main unit 2 (one possible correspondence is sketched below).
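  • a minimal sketch of the detached-state button correspondence listed above, expressed as a lookup table, is shown below; the button identifiers come from the description, while the table structure and function names are illustrative assumptions.

```python
# Detached-state correspondence described above: the left-controller direction buttons
# map to the same logical actions as the right-controller A/B/X/Y buttons.
DETACHED_LEFT_TO_LOGICAL = {
    "right_direction_button_33": "Y",
    "lower_direction_button_34": "X",
    "upper_direction_button_35": "B",
    "left_direction_button_36": "A",
}

RIGHT_TO_LOGICAL = {
    "Y_button_56": "Y",
    "X_button_55": "X",
    "B_button_54": "B",
    "A_button_53": "A",
}

def logical_action(controller: str, button: str) -> str:
    """Resolve a physical button press on either detached controller to a logical action."""
    table = DETACHED_LEFT_TO_LOGICAL if controller == "left" else RIGHT_TO_LOGICAL
    return table[button]

print(logical_action("left", "upper_direction_button_35"))   # -> "B"
print(logical_action("right", "B_button_54"))                 # -> "B"
```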
  • the information processing device 1 divides the display area of the display 12 into two display areas to display an image for the first user (e.g., an image including the first object) in one of the display areas and an image for the second user (e.g., an image including the second object) in the other display area. Note however that depending on the application running on the information processing device 1 , the information processing device 1 may display an image for two users (e.g., an image including both the first object and the second object) without dividing the display area of the display 12 .
  • the main unit 2 and the controllers 3 and 4 communicate with each other in wireless communication. Then, the main unit 2 distinguishes between controllers each being a communication partner.
  • the right controller 4 includes the infrared image-capturing section 123 . Therefore, with the right controller 4 detached from the main unit 2 , the information processing device 1 may execute an information process based on the image-capturing results (referred to also as detection results) from the infrared image-capturing section 123 . Examples of the information process include the following processes.
  • the main unit 2 can detect an operation performed on the operation section based on the image-capturing results from the infrared image-capturing section 123 . Therefore, the main unit 2 can execute an information process in accordance with an operation performed on the operation section by using the image-capturing results.
  • the extension controller can be attached to and detached from the right controller 4 , and includes buttons and a movable operation section such as a stick, for example.
  • the extension controller includes, inside the housing, a movable portion that moves (herein, the movement includes rotation) in response to an operation performed on an operation section.
  • the movable portion is a member configured to move in response to a button, which is an operation section, being pressed down.
  • the extension controller is attached to the right controller 4 so that the infrared image-capturing section 123 can capture an image of the movable portion inside the housing.
  • the housing of the extension controller is provided with a window portion, and the infrared image-capturing section 123 can capture an image of the movable portion via the window portion, with the extension controller attached to the right controller 4 . Then, the main unit 2 can identify an operation performed on an operation section of the extension controller based on the position and/or the attitude of the movable portion in the image captured by the infrared image-capturing section 123 . Thus, by using captured images, the main unit 2 may execute an information process in response to an operation performed on the operation section.
  • the main unit 2 may detect a gesture input by using a hand of a user based on the image-capturing results from the infrared image-capturing section 123 so as to execute an information process in accordance with the gesture input.
  • a user holds the right controller 4 in one hand, and an image is captured of the other hand of the user using the infrared camera of the infrared image-capturing section 123 .
  • the infrared camera is arranged so as to capture an image of an area under the right controller 4 . Therefore, a user makes a gesture input by placing the other hand under the right controller 4 .
  • the main unit 2 obtains information based on the image-capturing results from the right controller 4 so as to identify a gesture input made by a user. Then, the main unit 2 executes an information process based on the gesture input.
  • a gesture input may be any input made by using an operation object to be operated (moved) by a user.
  • An operation object may be a body of a user (which may be a part of the body, such as a hand or the face, or may be the whole body), may be an object held by a user, or may include both.
  • the information processing device 1 may recognize the shape of the operation object as a gesture input, may recognize the position, the orientation and/or the movement of the operation object as a gesture input, or may recognize a combination thereof as a gesture input.
  • a user can make a gesture input by using the shape of a hand, the movement of a hand, the position of a hand (with respect to the right controller 4 ), the orientation (attitude) of a hand, etc.
  • the infrared image-capturing section 123 may be used to calculate the position and/or the orientation of the right controller 4 with respect to a predetermined marker. For example, a user may place a marker at an intended position (e.g., around the display 12 or around the TV 6 ), and perform an operation of moving the right controller 4 within such a range that the infrared camera can capture an image of the marker.
  • the marker may be made of a material that reflects infrared light, for example.
  • the information processing device 1 can calculate the position and/or the orientation of the right controller 4 with respect to the marker based on the image-capturing results from the infrared image-capturing section 123 .
  • the information processing device 1 can execute an information process by using the calculated information as a user's operation input.
  • the main unit 2 can also capture an image of a hand of a user (specifically, the vein pattern of the hand) by means of the infrared image-capturing section 123 , to thereby authenticate (specifically, by vein authentication) a user based on the captured image. Moreover, the main unit 2 can also measure the pulse of a user by capturing an image of a hand of the user by means of the infrared image-capturing section 123 . That is, where infrared light output onto and reflected by a hand of a user is detected by the infrared image-capturing section 123 , the main unit 2 can calculate the pulse of the user based on changes in the reflected waves.
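  • the disclosure does not say how the pulse is derived from changes in the reflected infrared light. The following is a minimal sketch of one crude approach (counting threshold crossings of an intensity time series); the sampling rate and thresholding are assumptions.

```python
import math

def estimate_pulse_bpm(intensity, sample_rate_hz):
    """Count rising mean-crossings in a reflected-IR intensity series and convert to beats per minute."""
    mean = sum(intensity) / len(intensity)
    beats = 0
    above = intensity[0] > mean
    for value in intensity[1:]:
        now_above = value > mean
        if now_above and not above:   # a rising crossing of the mean counts as one beat (crude)
            beats += 1
        above = now_above
    duration_min = len(intensity) / sample_rate_hz / 60.0
    return beats / duration_min

# Synthetic 10-second signal at 30 Hz oscillating at 1.2 Hz -> roughly 72 BPM.
signal = [100 + 5 * math.sin(2 * math.pi * 1.2 * (i / 30.0)) for i in range(300)]
print(round(estimate_pulse_bpm(signal, 30)))
```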
  • the information processing device 1 can also execute an information process based on the image-capturing results from the infrared image-capturing section 123 with the right controller 4 attached to the main unit 2 .
  • the main unit 2 is provided with input sections (specifically, the touch panel 13 , the acceleration sensor 89 , the angular velocity sensor 90 , etc.). Therefore, a user can use only the main unit 2 with the controllers 3 and 4 detached therefrom. Then, a user can use the information processing device 1 while the weight thereof is reduced.
  • the information processing device 1 may be used with one of the left controller 3 and the right controller 4 detached from the main unit 2 and with the other controller attached to the main unit 2 . Then, the main unit 2 may communicate with the detached controller in wireless communication and communicate with the attached controller in wired communication.
  • the main unit 2 can communicate with a plurality of left controllers.
  • the main unit 2 can also communicate with a plurality of right controllers. Therefore, in the present embodiment, three or more controllers can be used simultaneously.
  • FIGS. 17A and 17B are diagrams showing example modes of use where three or more controllers are used.
  • FIGS. 17A and 17B show cases in which a total of four controllers, i.e., two left controllers 3 a and 3 b and two right controllers 4 a and 4 b , are used. Note that it is assumed here that the controllers are detached from the main unit 2 .
  • four controllers one can possibly conceive at least a mode in which four users each use one controller ( FIG. 17A ), and a mode in which two users each use two controllers (specifically, a pair of left and right controllers) ( FIG. 17B ).
  • each user uses one of the four controllers 3 a , 3 b , 4 a and 4 b . That is, in the present embodiment, where four controllers are provided, four users, User A to User D, are allowed to perform operations each using a controller. For example, based on an operation performed on a controller, the information processing device 1 executes an information process of controlling the action of an object associated with that controller. Note that also in the modes of use shown in FIGS. 17A and 17B , as in the modes of use shown in FIG. 15 and FIG. 16 , each user is allowed to perform an operation on an operation section provided on the controller and/or to perform an operation of moving the controller itself.
  • the main unit 2 communicates, in wireless communication, with each of the four controllers 3 a , 3 b , 4 a and 4 b .
  • the main unit 2 distinguishes the four controllers 3 a , 3 b , 4 a and 4 b from one another. That is, the main unit 2 identifies from which of the four controllers the operation data has been received.
  • the main unit 2 distinguishes between the left controller 3 a and the left controller 3 b , and distinguishes between the right controller 4 a and the right controller 4 b . Note that the method for distinguishing each controller will be described later.
  • each user uses a pair of controllers from the four controllers 3 a , 3 b , 4 a and 4 b . That is, User A uses a pair of the left controller 3 a and the right controller 4 a , and User B uses a pair of the left controller 3 b and the right controller 4 b .
  • two users can each operate a pair of controllers (which can also be referred to as a set of controllers).
  • the information processing device 1 executes an information process using, as a set of data, two pieces of operation data received from a pair of controllers. For example, based on operations performed on a pair of controllers, the information processing device 1 executes an information process of controlling the action of an object associated with the pair of controllers. Specifically, the action of the first object is controlled based on operations performed on the left controller 3 a and the right controller 4 a , and the action of the second object is controlled based on operations performed on the left controller 3 b and the right controller 4 b.
  • the main unit 2 registers each pair of a left controller and a right controller used by a single user.
  • the main unit 2 executes an information process by using operation data from the two controllers included in the registered pair as a set of data (e.g., as operation data for controlling one control object).
  • a left controller and a right controller can be attached to the main unit 2 so as to register the controllers as a pair, in the present embodiment. That is, the main unit 2 registers, as a pair of controllers, a left controller and a right controller that are attached thereto at the same time.
  • for example, when registering the pairs of controllers shown in FIG. 17B , a user first attaches the left controller 3 a and the right controller 4 a to the main unit 2 and detaches these two controllers, i.e., the left controller 3 a and the right controller 4 a , from the main unit 2 , after which the other controllers, i.e., the left controller 3 b and the right controller 4 b , are attached to the main unit 2 . Then, a pair of the left controller 3 a and the right controller 4 a and a pair of the left controller 3 b and the right controller 4 b can be registered with the main unit 2 . Note that the details of the pairing process will be described later.
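  • a minimal sketch of the pair-registration bookkeeping described above (a left controller and a right controller attached at the same time are recorded as one pair) follows; the class and method names are assumptions.

```python
class PairRegistry:
    """Hypothetical sketch of pair registration on the main unit: a left and a right
    controller attached at the same time are recorded as one pair."""

    def __init__(self):
        self.attached = {"left": None, "right": None}
        self.pairs = []   # list of (left_id, right_id) tuples, e.g. one per user

    def on_attach(self, side: str, controller_id: str) -> None:
        self.attached[side] = controller_id
        left, right = self.attached["left"], self.attached["right"]
        if left is not None and right is not None:
            pair = (left, right)
            if pair not in self.pairs:
                self.pairs.append(pair)

    def on_detach(self, side: str) -> None:
        self.attached[side] = None

registry = PairRegistry()
registry.on_attach("left", "3a"); registry.on_attach("right", "4a")   # first user's pair
registry.on_detach("left"); registry.on_detach("right")
registry.on_attach("left", "3b"); registry.on_attach("right", "4b")   # second user's pair
print(registry.pairs)   # [('3a', '4a'), ('3b', '4b')]
```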
  • the information processing system can be used in various other modes, other than the modes of use shown in FIGS. 17A and 17B .
  • the information processing system can be used in a mode in which a user uses a pair of left and right controllers while another user uses one controller.
  • a user can use a controller or controllers attached to the main unit 2 while another user uses a controller or controllers detached from the main unit 2 .
  • FIG. 18 is a diagram showing an example mode of use where images are displayed on a TV.
  • the information processing system in the present embodiment can use the TV 6 as a display device (and a sound output device).
  • FIGS. 19A, 19B, 19C, and 19D are diagrams showing an example operation flow in which images are displayed on a TV.
  • the operation of switching from a mode in which the information processing device 1 is used as a portable device to a mode in which it is used as a console device (i.e., using the TV 6 as a display device) will be described.
  • the cradle 5 is connected to the TV 6 in advance.
  • a charging device (not shown) (e.g., an AC adaptor) is connected to the power terminal 134 of the cradle 5 , and the cradle 5 is receiving commercial power supply.
  • a user uses the information processing device 1 as a portable device, i.e., in a state in which it is not attached to the cradle (( 1 ) shown in FIG. 19A ).
  • the user attaches the information processing device 1 to the cradle 5 (( 2 ) shown in FIG. 19B ).
  • the lower terminal 27 of the information processing device 1 and the main body terminal 73 of the cradle 5 are connected together.
  • the information processing device 1 with the controllers 3 and 4 attached thereto may be attached to the cradle 5 , or the information processing device 1 with the controllers 3 and 4 detached therefrom (i.e., the main unit 2 ) may be attached to the cradle 5 .
  • upon detecting the attachment of the information processing device 1 to the cradle 5 , the information processing device 1 turns off the display 12 , the details of which will be described later.
  • the display 12 of the main unit 2 is not used when attached to the cradle 5 .
  • the main unit 2 may still display images on the display 12 when attached to the cradle 5 .
  • upon detecting the detachment of the information processing device 1 from the cradle 5 , the information processing device 1 turns on the display 12 .
  • the information processing device 1 can be attached to the cradle 5 either in the front-facing orientation or in the back-facing orientation. This allows a user to place the information processing device 1 in any orientation, thereby facilitating the attachment to the cradle 5 .
  • the cradle 5 may be capable of supporting the information processing device 1 in the back-facing orientation (i.e., in an orientation in which the display 12 is hidden facing the second support 72 ) and not capable of supporting the information processing device 1 in the front-facing orientation (i.e., in an orientation in which the display 12 is not hidden).
  • the display 12 is turned off when the information processing device 1 is placed in the cradle 5 . Although turning off the display 12 is the intended operation by the information processing device 1 , a user may possibly mistake the display 12 being turned off for a breakdown or a malfunction. This can be avoided by not allowing the information processing device 1 to be placed in the cradle 5 in the front-facing orientation.
  • upon detecting the attachment of the information processing device 1 to the cradle 5 , the cradle 5 starts charging the information processing device 1 . That is, upon detecting the attachment of the information processing device 1 to the cradle 5 , the processing section 135 instructs the power control section 133 to perform an operation of supplying power from the power terminal 134 to the main unit 2 . As the power control section 133 starts this operation, the cradle 5 starts charging the main unit 2 . That is, the battery 98 in the main unit 2 is charged by the power supplied from the cradle 5 via the lower terminal 27 .
  • when a controller (specifically, the left controller 3 and/or the right controller 4 ) is attached to the main unit 2 while the main unit 2 is charged by the cradle 5 , the main unit 2 charges the controller attached thereto. That is, in such a case, the power control section 97 of the main unit 2 supplies the power from the cradle 5 via the lower terminal 27 to the controller via a terminal corresponding to the controller attached to the main unit 2 (specifically, the left-side terminal 17 and/or the right-side terminal 21 ). Thus, the controller is charged. That is, when the left controller 3 is charged, the battery of the power supply section 108 is charged by the power supplied via the terminal 42 . When the right controller 4 is charged, the battery of the power supply section 118 is charged by the power supplied via the terminal 64 .
  • the cradle 5 may charge the main unit 2 under a predetermined condition.
  • the power control section 133 of the cradle 5 may perform charging on the condition that the remaining charge of the battery 98 of the main unit 2 attached to the cradle 5 is less than or equal to a predetermined level.
  • the main unit 2 may charge a controller under a predetermined condition.
  • the power control section 97 of the main unit 2 may perform charging on the condition that the remaining charge of the battery of a controller attached to the main unit 2 is less than or equal to a predetermined level.
  • the above operation regarding charging may be executed even when the power of the information processing device 1 is off. That is, even when the information processing device 1 is attached to the cradle 5 when the power of the information processing device 1 is off, the cradle 5 may charge the information processing device 1 (i.e., charge the main unit 2 and/or controllers).
  • the main unit 2 may charge controllers when the main unit 2 is not attached to the cradle 5 . This reduces the possibility that a user can no longer play because the battery of the controller has run out even though the battery 98 of the main unit 2 is still charged sufficiently. As described above, in the present embodiment, the battery 98 of the main unit 2 has a greater charging capacity than the battery of a controller. Thus, it is possible to further reduce such a possibility described above.
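  • a minimal sketch of the conditional charging described above (charge only when the relevant battery is at or below a threshold) is shown below; the threshold values and function names are assumptions, since the disclosure leaves the predetermined conditions unspecified.

```python
MAIN_UNIT_CHARGE_THRESHOLD = 0.90    # assumed: charge the main unit when at or below 90 %
CONTROLLER_CHARGE_THRESHOLD = 0.90   # assumed: charge a controller when at or below 90 %

def cradle_should_charge_main_unit(main_attached_to_cradle: bool, main_battery: float) -> bool:
    return main_attached_to_cradle and main_battery <= MAIN_UNIT_CHARGE_THRESHOLD

def main_unit_should_charge_controller(controller_attached: bool, controller_battery: float) -> bool:
    # In this sketch the main unit charges an attached controller regardless of whether it is
    # itself on the cradle, matching the note above about avoiding drained controllers.
    return controller_attached and controller_battery <= CONTROLLER_CHARGE_THRESHOLD

print(cradle_should_charge_main_unit(True, 0.75))        # True
print(main_unit_should_charge_controller(True, 0.95))    # False (above the threshold)
```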
  • when the information processing device 1 is attached to the cradle 5 , under a predetermined condition, the information processing device 1 outputs images (and sound) to the TV 6 and causes the TV 6 to output the images (and the sound) (( 3 ) shown in FIG. 19C ). That is, the information processing device 1 transmits data of the images and the sound to be output to the cradle 5 under a predetermined condition.
  • in the present embodiment, "the images and the sound to be output" refer to the images and the sound that are produced or obtained by a program (e.g., an OS program or an application program) running at the point in time when the information processing device 1 is attached to the cradle 5 .
  • for example, if a game application is running at such a point in time, the information processing device 1 outputs, to the cradle 5 , data of the images and the sound that are produced by the game application. If an application for obtaining and reproducing video images from the Internet is running at such a point in time, the information processing device 1 transmits, to the cradle 5 , data of the images and the sound that are obtained by the application.
  • the image output condition is the satisfaction of Conditions 1 to 3 below.
  • Condition 1: the cradle 5 is connected to the TV 6 .
  • Condition 2: the cradle 5 is receiving power supply.
  • Condition 3: the cradle 5 is an authorized product (or a genuine product) (in other words, the cradle 5 is a product that is authorized by the manufacturer of the information processing device 1 ).
  • when Conditions 1 to 3 above are all satisfied, the information processing device 1 determines that the image output condition is satisfied. In this case, the information processing device 1 transmits, to the cradle 5 , the image and the sound to be output. Note that in other embodiments, the information processing device 1 may use, as the image output condition, one or two conditions of Conditions 1 to 3 above, or may use other conditions, as the image output condition, in addition to (or instead of) Conditions 1 to 3 above.
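  • a minimal sketch of the image output condition (the conjunction of Conditions 1 to 3) and of the resulting output-target decision follows; the data structure and names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class CradleStatus:
    connected_to_tv: bool      # Condition 1
    receiving_power: bool      # Condition 2
    authorized_product: bool   # Condition 3

def image_output_condition_satisfied(status: CradleStatus) -> bool:
    return status.connected_to_tv and status.receiving_power and status.authorized_product

def select_output(attached_to_cradle: bool, status: CradleStatus) -> str:
    if attached_to_cradle and image_output_condition_satisfied(status):
        return "TV"        # images and sound are sent to the cradle and output on the TV 6
    # In this embodiment the built-in display is turned off while docked, even if the condition fails.
    return "display_off" if attached_to_cradle else "built_in_display"

print(select_output(True, CradleStatus(True, True, True)))    # -> "TV"
print(select_output(True, CradleStatus(True, False, True)))   # -> "display_off"
```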
  • the cradle 5 Upon receiving data of the images and the sound from the information processing device 1 , the cradle 5 transmits data of the images and the sound to the TV 6 .
  • “the images and the sound to be output” are output from the TV 6 (( 3 ) shown in FIG. 19C ).
  • a user can then perform operations by using a controller or controllers (( 4 ) shown in FIG. 19D ).
  • even if the image output condition is not satisfied, the information processing device 1 turns off the image display on the display 12 while attached to the cradle 5 . Note however that in other embodiments, if the image output condition is not satisfied, the information processing device 1 may resume the image display on the display 12 .
  • the image display output can be switched from the display 12 to the TV 6 by attaching the information processing device 1 to the cradle 5 while images are displayed on the display 12 of the information processing device 1 . That is, in the present embodiment, a user can easily (and seamlessly) switch the image display output only by placing the information processing device 1 on the cradle 5 .
  • the information processing device 1 can be used in a mode of use in which the images are displayed on the display 12 and a mode of use in which the images are displayed on the TV 6 .
  • the information processing device 1 changes its operation mode depending on these two modes of use. That is, the information processing device 1 can operate at least in two different modes of the portable mode and the console mode. In the portable mode, some of the functions of the information processing device 1 are limited, the details of which will be described later. Note that the switching of the operation mode will be described later (see “[3-5. Changing operation mode]” and “[4-3. Mode setting process]”).
  • the information processing device 1 may be allowed to communicate directly with the TV 6 (i.e., with no cradle 5 interposed therebetween). Then, the information processing device 1 can transmit the image and/or the sound directly to the TV 6 .
  • the communication method used between the information processing device 1 and the TV 6 may be, for example, wired communication using a cable (e.g., an HDMI cable) or wireless communication.
  • the cradle 5 may be used for the purpose of charging the information processing device 1 , for example.
  • the information processing device 1 may transmit the image and/or the sound to the TV 6 at least on the condition that the information processing device 1 is attached to the cradle 5 .
  • a user can perform operations by using a controller or controllers (( 4 ) shown in FIGS. 19A-19D ).
  • in this mode of use, a user performs operations by using a controller or controllers detached from the main unit 2 .
  • a user detaches the left controller 3 and/or the right controller 4 from the main unit 2 as necessary, and performs operations.
  • the controllers 3 and 4 can be detached from the main unit 2 by sliding them upward against the main unit 2 . Therefore, a user can conveniently detach the controllers 3 and 4 from the main unit 2 while the main unit 2 remains attached to the cradle 5 .
  • where controllers other than the controllers 3 and 4 attached to the main unit 2 can communicate with the main unit 2 in wireless communication, such other controllers may be used.
  • when a controller detached from the main unit 2 is used in a mode of use in which images are displayed on the TV 6 , the controller and the main unit 2 communicate with each other in wireless communication.
  • the mode of use where the TV 6 is used as a display device is similar to the modes of use described above in “[3-2. Mode in which a pair of controllers are used detached]” and “[3-3. Mode in which three or more controllers are used]”, except that the TV 6 is used instead of the display 12 . That is, in the present embodiment, also in the case where the TV 6 is used as a display device, as in the case in which the display 12 is used, one user may perform operations using a pair of controllers (see FIG. 18 ), or two users may perform operations each using one controller. When controllers other than the left controller 3 and the right controller 4 are provided, three or more users may perform operations each using one controller, or two or more users may perform operations each using a pair of controllers.
  • the information processing device 1 can be used in a mode of use in which images are displayed on the display 12 and a mode of use in which images are displayed on the TV 6 .
  • the information processing device 1 changes its operation mode depending on these two modes of use. That is, the information processing device 1 can operate at least in two different modes of the portable mode and the console mode.
  • the portable mode is a mode in which the information processing device 1 is used as a portable device.
  • images obtained or produced by the information processing device 1 are displayed on the display 12 .
  • Sound obtained or produced by the information processing device 1 is output from the speaker 88 .
  • settings of the information processing device 1 are changed as shown in (a) and (b) below.
  • (a) the clock frequency range over which the CPU 81 is allowed to operate is limited.
  • the clock frequency at which the CPU 81 is allowed to operate can be specified within a predetermined range by the program executed on the main unit 2 .
  • the range over which a clock frequency can be specified by the program is limited as compared with that in the console mode.
  • the range over which a clock frequency can be specified in the console mode is X 1 [Hz] or less, whereas it is limited to X 2 (<X 1 ) [Hz] in the portable mode.
  • the main unit 2 includes a GPU (Graphics Processing Unit) in addition to a CPU, the range of processing power (i.e., clock frequency) may be limited for the CPU and/or for the GPU.
  • the image rendering capacity (which can also be referred to as the image generating capacity) is limited.
  • specifically, the resolution (in other words, the number of pixels) of images rendered by the main unit 2 is lower than that in the console mode.
  • the amount of computation to be executed on the main unit 2 is limited in portable mode, and it is therefore possible to reduce the heat generation and the power consumption of the main unit 2 .
  • (b) the operation of the cooling fan 96 is limited.
  • the rotations per minute (rpm) at which the cooling fan 96 is allowed to operate (in other words, the fan speed) is limited to be lower than the highest rpm at which the cooling fan 96 is allowed to operate in the console mode.
  • the cooling fan 96 is controlled to operate at a predetermined rpm or less, which is lower than the maximum rpm.
  • the console mode is a mode in which the information processing device 1 is used as a console-type device.
  • in the console mode, images obtained or produced by the information processing device 1 are displayed on the TV 6 .
  • the sound obtained or produced by the information processing device 1 is output from the speaker of the TV 6 .
  • In the console mode, the functional limitations imposed in the portable mode are lifted. That is, in the console mode, the limitations (a) and (b) described above are lifted. Therefore, in the console mode, a program running on the information processing device 1 can make better use of the processing power of the CPU 81 .
  • the information processing device 1 can display, on the TV 6 , images of a higher resolution than in the portable mode.
  • the information processing device 1 can be cooled by the cooling fan 96 more effectively than in the portable mode.
  • the switching between the portable mode and the console mode is done as follows. That is, when the information processing device 1 is attached to the cradle 5 and the image output condition described above is satisfied, the information processing device 1 switches the operation mode to the console mode. On the other hand, when the information processing device 1 is detached from the cradle 5 , the information processing device 1 switches the operation mode to the portable mode. Note that the switching of the operation mode will be described later (see FIG. 24 ).
  • the information processing device 1 may impose only one of the limitations (a) and (b) described above in the portable mode.
  • the information processing system may use different communication methods for the communication with external devices via a network (e.g., the Internet) (in other words, may switch between different communication methods) in the portable mode and in the console mode.
  • In the portable mode, the information processing device 1 may communicate with the network using the network communication section 82 .
  • In the console mode, the information processing device 1 may communicate with the network via the cradle 5 and the network connection device.
  • settings in the application may be changed between the portable mode and the console mode.
  • For example, particulars of the game (e.g., the rules, the operation method, etc., of the game) may differ between the portable mode and the console mode (see FIG. 25 ).
  • the operation mode (the portable mode and the console mode) is of a different level from the ON mode and the sleep mode. That is, the main unit 2 may be either in the ON mode or in the sleep mode when the operation mode is the portable mode. Or, the main unit 2 may be either in the ON mode or in the sleep mode when the operation mode is the console mode.
  • the main unit 2 and the controllers 3 and 4 communicate with each other.
  • the main unit 2 executes a registration process for registering controllers so that the main unit 2 can identify controllers to communicate with the main unit 2 .
  • those controllers offered (specifically, sold) together with the main unit 2 may be pre-registered with the main unit 2 at the time of offering.
  • FIG. 20 is a flow chart showing an example flow of a registration process executed on the main unit 2 .
  • the registration process is executed in response to a controller being attached to the main unit 2 . That is, the main unit 2 detects the attachment of the left controller 3 or the right controller 4 , and starts executing the registration process shown in FIG. 20 in response to detecting the attachment.
  • the sensing method may be a method based on a signal state (e.g., a voltage state) at a predetermined pin included in the terminal of the main unit 2 and/or controller.
  • the sensing method may be a method in which the subject device mechanically senses the connection of a terminal of another device thereto.
  • the main unit 2 and a controller may both have the sensing function described above, or either one of them may have the sensing function. Where only one of the devices has the sensing function, that device may indicate, to the other device, that the connection has been sensed, as necessary.
  • Some of the processes executed by the main unit 2 may be executed by another information processing device capable of communicating with the main unit 2 (e.g., a server capable of communicating with the main unit 2 via a network). That is, the processes shown in FIG. 20 , FIG. 23 to FIG. 27 may be executed through a cooperation of a plurality of information processing devices including the main unit 2 .
  • In step S 1 , the CPU 81 determines whether or not the controller sensed to have been attached to the main unit 2 has been registered.
  • the main unit 2 stores, in a storage section thereof (e.g., the flash memory 84 ), registration information representing controllers that have been registered. The determination of step S 1 is based on this registration information.
  • FIG. 21 is a diagram showing an example of registration information.
  • the registration information is information in which number information, identification information and wireless communication information are associated together, as shown in FIG. 21 .
  • the number information is a number assigned to a registered controller.
  • the indicator LED described above on the controller may be controlled to represent a value based on this number.
  • the identification information is information representing a value (e.g., an ID) that is uniquely assigned to a controller. With this identification information, it is possible to uniquely identify a controller.
  • identification information includes information representing whether a controller is a left controller or a right controller. That is, the main unit 2 can determine whether a controller is a left controller or a right controller based on the identification information assigned to the controller. Note that in other embodiments, the identification information does not need to include information representing whether a controller is a left controller or a right controller. In such a case, the registration information may include information representing whether a controller is a left controller or a right controller (separately from the identification information). Note that the CPU 81 can determine whether a controller is a left controller or a right controller based on whether the identification information of the controller has been obtained through the left-side terminal 17 or through the right-side terminal 21 .
  • the wireless communication information indicates whether or not the connection setting (i.e., pairing) of the device for wireless communication with the main unit 2 has been done. That is, if the pairing between the main unit 2 and the controller has been done, information representing “registered” is stored as the wireless communication information of the controller. On the other hand, if the pairing between the main unit 2 and the controller has not been done, information representing “unregistered” is stored as the wireless communication information of the controller.
  • the main unit 2 may store information regarding the connection setting for wireless communication (separately from the registration information) so as to eliminate the need to perform pairing again for a controller for which pairing has been done.
  • a part of the registration information may be deleted or changed in response to an instruction from a user.
  • the main unit 2 may delete information regarding a specified controller or change a number assigned to the controller, in response to an instruction from a user.
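  • As a concrete illustration of the registration information of FIG. 21 , it can be modeled as a table keyed by a controller's identification information, holding the number information, the left/right type, and the wireless communication information. The following Python sketch uses hypothetical field and method names; it is not the embodiment's actual data layout.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class RegistrationEntry:
    number: int           # number information assigned to the registered controller
    controller_id: str    # identification information uniquely assigned to the controller
    is_left: bool         # whether the controller is a left controller or a right controller
    paired: bool = False  # wireless communication information ("registered" / "unregistered")

class RegistrationTable:
    """Simplified stand-in for the registration information stored in the flash memory 84."""

    def __init__(self) -> None:
        self._entries: Dict[str, RegistrationEntry] = {}

    def is_registered(self, controller_id: str) -> bool:
        return controller_id in self._entries

    def is_paired(self, controller_id: str) -> bool:
        entry = self._entries.get(controller_id)
        return entry is not None and entry.paired

    def register(self, controller_id: str, is_left: bool) -> RegistrationEntry:
        """Assign the smallest unused number; pairing starts out as "unregistered"."""
        used = {e.number for e in self._entries.values()}
        number = next(n for n in range(1, len(used) + 2) if n not in used)
        entry = RegistrationEntry(number, controller_id, is_left)
        self._entries[controller_id] = entry
        return entry

    def mark_paired(self, controller_id: str) -> None:
        self._entries[controller_id].paired = True
```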
  • In step S 1 , the CPU 81 first obtains identification information from a controller sensed to have been attached.
  • a memory of a controller (e.g., the memories 102 and 112 ) stores, in advance, identification information assigned to the controller.
  • a controller transmits the identification information thereof to the main unit 2 in response to the controller being connected to the main unit 2 or in response to a request from the CPU 81 .
  • the CPU 81 determines whether or not a controller sensed to have been attached has been registered based on whether or not the obtained identification information is included in the registration information. If the determination result of step S 1 is negative, the process of step S 2 is executed. On the other hand, if the determination result of step S 1 is affirmative, the process of step S 3 is executed, skipping the process of step S 2 .
  • In step S 2 , the CPU 81 registers a controller sensed to have been attached. That is, the CPU 81 updates the registration information stored in the storage section so that the number information, the identification information and the wireless communication information of the controller are associated together and added to the registration information.
  • the number information may be, for example, information representing a number that has not been used for any other registered controller.
  • the identification information may be identification information obtained from the controller in step S 1 described above.
  • the wireless communication information is set to “unregistered” since pairing has not been done at this point in time.
  • In step S 3 , the CPU 81 determines whether or not both a left controller and a right controller are attached to the main unit 2 . That is, it is determined whether or not the attachment of the left controller 3 and the right controller 4 to the main unit 2 can be sensed at the present point in time. If the determination result of step S 3 is affirmative, the process of step S 4 is executed. On the other hand, if the determination result of step S 3 is negative, the CPU 81 ends the registration process, skipping the process of step S 4 .
  • In step S 4 , the CPU 81 registers the left controller 3 and the right controller 4 attached to the main unit 2 as a pair.
  • the main unit 2 stores pairing information representing a pair of a left controller and a right controller in the storage section (e.g., the flash memory 84 ).
  • FIG. 22 is a diagram showing an example of pairing information.
  • the pairing information is information in which left identification information and right identification information are associated together.
  • the left identification information is identification information of a left controller registered as a pair, from among registered controllers (in other words, controllers represented by identification information included in the registration information).
  • the right identification information is identification information of a right controller registered as a pair, from among registered controllers (in other words, controllers represented by identification information included in the registration information).
  • Left identification information and right identification information that are associated together in the pairing information represent a pair of a left controller and a right controller.
  • In step S 4 described above, the CPU 81 first obtains identification information of the left controller and the right controller attached to the main unit 2 . Note that for a controller whose identification information has been obtained in step S 1 described above, the identification information does not need to be obtained again. Next, the CPU 81 adds, to the pairing information, the obtained identification information of the left controller and that of the right controller, while they are associated together. That is, the CPU 81 updates the pairing information so as to add the obtained pair of identification information to the pairing information (strictly speaking, to the pre-update pairing information). Then, the updated pairing information is stored in the storage section. Note that if information representing the pair of the two obtained identification information is already included in the pairing information, the CPU 81 does not need to update the pairing information.
  • Where the pairing information already includes a pair containing one of the two controllers to be registered as a new pair, the CPU 81 deletes such a pair from the pairing information. Thus, any pair including one of two controllers that are included in a pair to be registered is unregistered (i.e., deleted from the pairing information). After step S 4 , the CPU 81 ends the registration process.
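  • The flow of steps S 1 to S 4 can thus be read as: obtain the attached controller's identification information, add it to the registration information if it is unknown, and, when a left controller and a right controller are attached at the same time, record them as a pair while unregistering any older pair that contains either controller. The Python sketch below reuses the hypothetical RegistrationTable above and represents the pairing information of FIG. 22 as a simple list of (left, right) identification pairs; these choices are assumptions for illustration only.

```python
from typing import List, Optional, Tuple

Pair = Tuple[str, str]  # (left identification information, right identification information)

def registration_process(table: "RegistrationTable",
                         pairing_info: List[Pair],
                         attached_left_id: Optional[str],
                         attached_right_id: Optional[str],
                         triggering_id: str,
                         triggering_is_left: bool) -> None:
    """Sketch of FIG. 20, executed in response to a controller being attached."""
    # Steps S1/S2: register the newly attached controller if it has not been registered.
    if not table.is_registered(triggering_id):
        table.register(triggering_id, is_left=triggering_is_left)

    # Step S3: are a left controller and a right controller both attached at this point?
    if attached_left_id is None or attached_right_id is None:
        return

    # Step S4: register the two attached controllers as a pair.
    new_pair = (attached_left_id, attached_right_id)
    if new_pair in pairing_info:
        return  # the pair is already included in the pairing information
    # Any pair including one of the two controllers is unregistered first.
    pairing_info[:] = [p for p in pairing_info
                       if attached_left_id not in p and attached_right_id not in p]
    pairing_info.append(new_pair)
```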
  • In response to a controller being attached to the main unit 2 , the controller is registered with the main unit 2 .
  • a user can easily register controllers.
  • Since the registration process is executed when the main unit 2 and the controllers communicate with each other in wired communication, it is possible to reduce the possibility that the main unit 2 cannot obtain information (specifically, identification information) from controllers, and to reduce the possibility that registration fails.
  • an accessory (see FIG. 31 ) may be provided to which a left controller and a right controller can be attached at the same time, the details of which will be described later. Then, when a left controller and a right controller are both attached to the accessory, the main unit 2 registers these two controllers as a pair, as when a left controller and a right controller are both attached to the main unit 2 .
  • the details of such a pairing process will be described in “[5-2. Accessory for controllers]” below.
  • the main unit 2 may register, as a pair, two controllers attached to the main unit 2 at the same time as in the embodiment described above, or may register, as a pair, two controllers that are not attached to the main unit 2 at the same time, unlike in the embodiment described above.
  • the main unit 2 may register, as a pair, a left controller last attached thereto and a right controller last attached thereto.
  • any other pairing method may be used for registering pairs of controllers.
  • the main unit 2 may register, as a pair, two controllers on which a predetermined operation is performed within a predetermined period of time.
  • the main unit 2 prompts a user to perform a predetermined operation in response to satisfaction of a predetermined condition (e.g., a pairing instruction has been received from a user).
  • the main unit 2 registers, as a pair, two controllers on which the predetermined operation is performed within a predetermined period of time from the prompting, from among two or more controllers being in communication with the main unit 2 .
  • the “predetermined operation” may be, for example, an operation of pressing the A button, or may be an operation of bumping two controllers against each other (this operation can be determined, for example, based on detection results of an acceleration sensor).
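  • The alternative pairing method described above amounts to collecting, within a time window that starts at the prompt, the controllers on which the predetermined operation is observed, and pairing the first two of them. The Python sketch below assumes the operations have already been detected and reported as (timestamp, controller id) events; the event format and the ten-second window are hypothetical.

```python
from typing import Iterable, Optional, Tuple

def pair_by_operation(events: Iterable[Tuple[float, str]],
                      prompt_time: float,
                      window_seconds: float = 10.0) -> Optional[Tuple[str, str]]:
    """Return the first two distinct controllers whose predetermined operation falls
    within the window following the prompt, or None if fewer than two are observed."""
    selected = []
    for timestamp, controller_id in sorted(events):
        if prompt_time <= timestamp <= prompt_time + window_seconds:
            if controller_id not in selected:
                selected.append(controller_id)
            if len(selected) == 2:
                return selected[0], selected[1]
    return None

# Two controllers perform the operation 3 s and 7 s after the prompt; a third is too late.
print(pair_by_operation([(103.0, "L-001"), (107.0, "R-042"), (130.0, "R-099")], prompt_time=100.0))
```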
  • the main unit 2 and the controllers 3 and 4 may communicate with each other in wireless communication. Therefore, in the present embodiment, the information processing device 1 performs a registration (referred to also as pairing) for enabling wireless communication between the main unit 2 and the controllers. Note that those controllers offered (specifically, sold) together with the main unit 2 may be pre-registered with the main unit 2 for wireless communication at the time of offering.
  • FIG. 23 is a flow chart showing an example flow of a wireless setting process executed on the main unit 2 .
  • the wireless setting process is executed in response to a controller being removed from the main unit 2 . That is, the main unit 2 senses the removal of the left controller 3 or the right controller 4 , and starts executing the wireless setting process shown in FIG. 23 in response to the removal (referred to as “detachment”).
  • For sensing the detachment, a method similar to the method for sensing the attachment of a controller to the main unit 2 may be used, for example.
  • In step S 11 , the CPU 81 determines whether or not the controller sensed to have been detached from the main unit 2 has been paired for wireless communication. This determination can be done by using the registration information described above. That is, if wireless communication information included in the registration information indicates “registered” for the controller sensed to have been detached, the CPU 81 determines that the controller has been paired for wireless communication. On the other hand, if wireless communication information included in the registration information indicates “unregistered” for the controller sensed to have been detached, the CPU 81 determines that the controller has not been paired for wireless communication. If the determination result of step S 11 is negative, the process of step S 12 is executed. On the other hand, if the determination result of step S 11 is affirmative, the CPU 81 ends the wireless setting process, skipping the process of step S 12 .
  • In step S 12 , the CPU 81 executes a pairing process for enabling wireless communication with a controller sensed to have been detached.
  • the pairing process in the present embodiment may be similar to a pairing process of a conventional communication technique in accordance with the Bluetooth (registered trademark) standard.
  • a controller executes the pairing process in response to being removed from the main unit 2 , the details of which will be described later. Thus, the main unit 2 and the controller are paired together, establishing a wireless connection therebetween.
  • Following step S 12 , the process of step S 13 is executed.
  • In step S 13 , the CPU 81 determines whether or not the pairing with the controller sensed to have been detached has succeeded. Note that the process of step S 13 is executed after lapse of a predetermined period of time from the execution of step S 12 . If the determination result of step S 13 is affirmative, the process of step S 14 is executed. On the other hand, if the determination result of step S 13 is negative, the CPU 81 ends the wireless setting process, skipping the process of step S 14 .
  • In step S 14 , the CPU 81 updates the registration information. That is, the CPU 81 changes the wireless communication information included in the registration information to “registered” for the controller sensed to have been detached. Following step S 14 , the CPU 81 ends the wireless setting process.
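  • Steps S 11 to S 14 reduce to: on detachment, check the controller's wireless communication information, run the pairing procedure only if it still reads "unregistered", and record a successful result. The Python sketch below reuses the hypothetical RegistrationTable above; try_pairing is a placeholder for the Bluetooth-style pairing exchange and is not part of the embodiment.

```python
def try_pairing(controller_id: str) -> bool:
    """Placeholder for the pairing exchange with the detached controller."""
    return True  # assume success for the purposes of the illustration

def wireless_setting_process(table: "RegistrationTable", detached_id: str) -> None:
    """Sketch of FIG. 23, executed in response to a controller being detached."""
    # Step S11: nothing to do if the controller has already been paired.
    if table.is_paired(detached_id):
        return
    # Steps S12/S13: attempt pairing and check the result after a short wait.
    if try_pairing(detached_id):
        # Step S14: change the wireless communication information to "registered".
        table.mark_paired(detached_id)
```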
  • the controllers 3 and 4 execute a process similar to that of the main unit 2 (referred to as a “controller-side wireless setting process”).
  • a controller stores registration information in a storage section (e.g., the memories 102 and/or 112 ), wherein identification information of the main unit to which the controller is attached and wireless communication information indicating whether or not the controller has been paired with the main unit are associated together in the registration information.
  • a controller (specifically, the communication control section 101 or 111 ) determines whether or not it has been paired with the main unit 2 in response to the removal from the main unit 2 . If it has not been paired, the controller executes the pairing process.
  • the controller determines whether or not the pairing has succeeded, and updates the registration information if the pairing has succeeded.
  • the main unit 2 and the controller each perform the pairing process, and a wireless connection is established upon successful pairing.
  • a setting process for wireless communication between the main unit 2 and the controller is executed to establish a wireless connection.
  • a controller communicates in wired communication while it is attached to the main unit 2 , and in wireless communication while it is removed from the main unit 2 . Therefore, by executing the setting process when the controller is removed from the main unit 2 , it is possible to establish a wireless connection with appropriate timing. Note that in the present embodiment, since a controller is registered when the controller is attached to the main unit 2 for the first time, the setting process is executed when the controller is removed from the main unit 2 for the first time.
  • It is thus possible to avoid a situation in which “the wireless connection is not established when a registered controller is removed from the main unit 2 ”, thereby providing an information processing device 1 that is easy to use for a user.
  • a user can easily cause the information processing device 1 to execute the setting process described above by removing a controller from the main unit 2 .
  • the condition on which the setting process for wireless communication is executed on the information processing device 1 may be any condition, and it is not limited to the removal of a controller from the main unit 2 .
  • the setting process may be executed in response to a controller being attached to the main unit 2 .
  • the setting process may be executed in response to an instruction from a user.
  • the main unit 2 may execute the wireless setting process ( FIG. 23 ) in response to a predetermined instruction from a user.
  • the predetermined instruction may be given by a user selecting a predetermined item displayed on a menu screen, or may be given by a user pressing a predetermined button provided on the main unit 2 .
  • Similarly, in response to a predetermined instruction from a user, a controller may execute the “controller-side wireless setting process” described above. This predetermined instruction may be given by pressing down a predetermined button provided on the controller.
  • the main unit 2 and the controller each determine whether or not the controller and the main unit 2 have been paired together, so that the pairing process is executed if they have not been paired together.
  • the main unit 2 and the controller may execute the pairing process, upon removal of the controller from the main unit 2 , without performing the determination process.
  • While a controller is attached to the main unit 2 , the controller regards the main unit 2 as its communication partner. That is, the controller transmits operation data in wired communication to the main unit 2 , to which the controller is attached.
  • the controller regards the other main unit 2 as its communication partner. That is, the controller transmits operation data in wireless communication to the other main unit 2 , with which the controller has been paired.
  • the controller transmits operation data to the main unit 2 , irrespective of whether it is attached to the main unit 2 .
  • the controller regards one main unit that satisfies a predetermined condition as its communication partner.
  • There is no particular limitation on the predetermined condition, but it may be, for example, “being the main unit with which the controller has last been paired”. Note that if a controller is not attached to the main unit 2 and there is no main unit 2 with which the controller has been paired, the controller does not transmit operation data.
  • If a controller is able to communicate with a main unit 2 , the controller repeatedly transmits operation data to the main unit 2 , as its communication partner, at a rate of once per a predetermined period of time. If there is no main unit 2 with which a controller is able to communicate (i.e., where the main unit 2 as its communication partner has been turned OFF or set in the sleep mode), the controller stops transmitting operation data to a main unit 2 . If there is newly a main unit 2 with which a controller is able to communicate (i.e., where the main unit 2 as its communication partner has been turned ON or set in the ON mode), the controller resumes transmitting operation data to the main unit 2 .
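  • The controller-side behavior described above can be pictured as a small loop: choose a communication partner (the attached main unit if any, otherwise a paired main unit satisfying the predetermined condition), transmit operation data once per period while that partner is reachable, and pause while it is not. The Python sketch below uses hypothetical names and plain strings in place of real wired and wireless transports.

```python
from typing import Optional, Set

class ControllerLink:
    """Simplified view of a controller choosing where to send its operation data."""

    def __init__(self) -> None:
        self.attached_main_unit: Optional[str] = None     # main unit it is physically attached to
        self.last_paired_main_unit: Optional[str] = None  # main unit with which it was last paired
        self.reachable: Set[str] = set()                  # main units currently in the ON mode

    def communication_partner(self) -> Optional[str]:
        # Attached: the wired partner takes priority; detached: fall back to the paired unit.
        if self.attached_main_unit is not None:
            return self.attached_main_unit
        return self.last_paired_main_unit

    def maybe_send_operation_data(self) -> Optional[str]:
        """Return the destination of this period's operation data, or None if suspended."""
        partner = self.communication_partner()
        if partner is None or partner not in self.reachable:
            return None  # no partner, or the partner is OFF / in the sleep mode
        return partner   # in the real system this is a wired or wireless transfer

link = ControllerLink()
link.last_paired_main_unit = "main-unit-A"
link.reachable.add("main-unit-A")
print(link.maybe_send_operation_data())  # the detached controller transmits wirelessly
```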
  • the information processing device 1 operates in the portable mode and in the console mode. Therefore, the information processing device 1 executes a process (referred to as the “mode setting process”) of switching the operation mode between the portable mode and the console mode.
  • FIG. 24 is a flow chart showing an example flow of a mode setting process executed on the main unit 2 .
  • a series of processes of steps S 21 to S 29 shown in FIG. 24 is executed repeatedly. Note that in the present embodiment, it is assumed that at the start of the process (i.e., when the power is turned ON), the operation mode of the main unit 2 is set to the portable mode.
  • In step S 21 , the CPU 81 determines whether or not the main unit 2 is attached to the cradle 5 .
  • the CPU 81 may perform the sensing based on a signal state (e.g., a voltage state) at a predetermined pin included in the terminal of the main unit 2 and/or the cradle 5 . If the determination result of step S 21 is affirmative, the process of step S 22 is executed. On the other hand, if the determination result of step S 21 is negative, the process of step S 21 is executed again.
  • step S 21 is executed at a rate of once per a predetermined period of time, for example.
  • In step S 22 , the CPU 81 turns off the display 12 . Specifically, the CPU 81 stops supplying power to the display 12 . The CPU 81 also stops outputting image data to the display 12 . Following step S 22 , the process of step S 23 is executed.
  • the main unit 2 turns OFF the display 12 at a point when the main unit 2 is attached to the cradle 5 .
  • the main unit 2 may turn OFF the display 12 at a point when the image is output from the main unit 2 to the TV 6 , rather than turning OFF the display 12 (i.e., the display 12 is left ON) at the point when the main unit 2 is attached to the cradle 5 .
  • the main unit 2 may not turn OFF the display 12 even when the image is output from the main unit 2 to the TV 6 so that the image is displayed both on the display 12 and on the TV 6 .
  • the cradle 5 may be configured so that the display 12 can be viewed while the main unit 2 is attached to the cradle 5 .
  • the main unit 2 may generate two different images so that different images are displayed on the TV 6 and on the display 12 .
  • In step S 23 , the CPU 81 determines whether or not the image output condition is satisfied. That is, the CPU 81 determines whether or not Conditions 1 to 3 described above are satisfied. Note that Conditions 1 to 3 can be determined based on information obtained from the cradle 5 . Specifically, for “Condition 1: the cradle 5 is connected to the TV 6 ”, the CPU 81 may obtain information from the cradle 5 that indicates whether or not the cradle 5 is connected to the TV 6 , and make the determination based on the obtained information.
  • the CPU 81 may obtain information from the cradle 5 that indicates whether or not power is supplied from the power terminal 134 to the cradle 5 , and make the determination based on the obtained information.
  • the CPU 81 may obtain predetermined information with which it is possible to identify an authorized product, and make the determination based on the obtained information.
  • the predetermined information may, for example, be identification information assigned to the cradle or information that shows predetermined content if the product is an authorized product (in other words, information that shows different content if the product is not an authorized product).
  • the CPU 81 obtains the information at a predetermined timing from the cradle 5 , and determines whether or not the image output condition is satisfied based on the obtained information.
  • There is no particular limitation on the predetermined timing, but it may be, for example, when the main unit 2 is attached to the cradle 5 or when step S 23 is executed.
  • If the determination result of step S 23 is affirmative, the process of step S 25 to be described later is executed. On the other hand, if the determination result of step S 23 is negative, the process of step S 24 is executed. Note that in this case, no image will be displayed on the TV 6 even though a user has attached the information processing device 1 to the cradle 5 . Therefore, the information processing device 1 may indicate to the user that no image will be displayed on the TV 6 . For example, the information processing device 1 may indicate by outputting a sound from the speaker 88 or by displaying an image on the display 12 .
  • If an indicator light-emitting portion (e.g., an LED provided on the sleep button 74 ) is provided, the information processing device 1 may indicate by controlling the light-emitting portion. If an indicator light-emitting portion is provided on the cradle 5 , the information processing device 1 may control the light-emitting portion so as to indicate whether the operation mode is the portable mode or the console mode.
  • In step S 24 , the CPU 81 determines whether or not the main unit 2 has been removed from the cradle 5 . If the determination result of step S 24 is affirmative, the process of step S 29 to be described later is executed. On the other hand, if the determination result of step S 24 is negative, the process of step S 23 described above is executed again. That is, while the main unit 2 is attached to the cradle 5 and the image output condition is not satisfied, the CPU 81 repeatedly executes a series of processes of steps S 23 and S 24 . The series of processes is repeatedly executed at a rate of once per a predetermined period of time.
  • In step S 25 , the CPU 81 outputs images to the TV 6 . That is, the CPU 81 outputs data of “the images and the sound to be output” to the cradle 5 via the lower terminal 27 .
  • the cradle 5 transfers the data to the TV 6 . That is, the format of the data is converted through the conversion section 131 of the cradle 5 , and the data is output to the TV 6 via the monitor terminal 132 . Thus, the images and the sound are output from the TV 6 .
  • After the process of step S 25 , the CPU 81 repeatedly executes the process of outputting the images to the TV 6 .
  • Following step S 25 , the process of step S 26 is executed.
  • In step S 26 , the CPU 81 changes the operation mode to the console mode. That is, the CPU 81 changes the settings of the main unit 2 so as to lift the functional limitations of the main unit 2 in the portable mode as described in “[3-5. Changing operation mode]” above.
  • Following step S 26 , the process of step S 27 is executed.
  • In step S 27 , the CPU 81 determines whether or not the main unit 2 has been removed from the cradle 5 . If the determination result of step S 27 is affirmative, the process of step S 28 is executed. On the other hand, if the determination result of step S 27 is negative, the process of step S 27 is executed again. That is, while the operation mode is the console mode, the CPU 81 repeatedly executes the process of step S 27 .
  • the process of step S 27 is executed at a rate of once per a predetermined period of time, for example.
  • In step S 28 , the CPU 81 changes the operation mode to the portable mode. That is, the CPU 81 changes the settings of the main unit 2 so as to impose functional limitations on the main unit 2 as described in “[3-5. Changing operation mode]” above. The CPU 81 also stops the image output to the TV 6 . Following step S 28 , the process of step S 29 is executed.
  • In step S 29 , the CPU 81 outputs images to the display 12 . That is, the CPU 81 displays the “images to be output” on the display 12 . The CPU 81 also outputs the “sound to be output” from the speaker 88 (or the sound input/output terminal 25 ). After the process of step S 29 , the CPU 81 repeatedly executes the process of outputting images to the display 12 . Following step S 29 , the process of step S 21 is executed again.
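  • Steps S 21 to S 29 form a small state machine: wait for attachment to the cradle 5 , check the image output condition, and either enter the console mode and output to the TV 6 , or keep outputting to the display 12 . The Python sketch below expresses one evaluation of that loop; the boolean inputs stand in for the hardware checks and are assumptions for illustration.

```python
def mode_setting_step(state: dict, attached_to_cradle: bool, image_output_condition: bool) -> str:
    """One pass through the FIG. 24 loop; returns 'tv', 'display' or 'none' (no image output)."""
    if state["mode"] == "console":
        # Steps S27/S28/S29: leave the console mode as soon as the main unit is removed.
        if not attached_to_cradle:
            state["mode"] = "portable"   # re-impose the portable-mode limitations
            return "display"
        return "tv"

    # Portable mode: steps S21 to S26.
    if not attached_to_cradle:
        return "display"                 # S21 keeps waiting for attachment to the cradle
    if not image_output_condition:
        return "none"                    # S23/S24: display 12 is off (S22) and nothing goes to the TV
    state["mode"] = "console"            # S26: lift the portable-mode limitations
    return "tv"                          # S25: output the images and the sound to the TV 6

state = {"mode": "portable"}
print(mode_setting_step(state, attached_to_cradle=True, image_output_condition=True))    # 'tv'
print(mode_setting_step(state, attached_to_cradle=False, image_output_condition=False))  # 'display'
```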
  • the information processing device 1 can switch between the portable mode, in which images are displayed on the display 12 , and the console mode, in which images are displayed on the TV 6 .
  • the condition for changing the operation mode to the console mode may be any condition.
  • the condition may be as follows.
  • the image output condition may include a condition that “there has been a user instruction to output images to the TV 6 ”.
  • the cradle 5 may include an output instruction button for giving an instruction to output the images obtained or produced by the information processing device 1 to the TV 6 . Then, the information processing device 1 may use, as one of the image output conditions, a condition that “the output instruction button has been pressed while the information processing device 1 is attached to the cradle 5 ”.
  • the image output condition may include a condition that “the TV 6 is capable of displaying the images from the cradle 5 ”. Specifically, this condition is that the power of the TV 6 is ON and the input select setting thereof is such that input images from the cradle 5 are displayed.
  • the information processing device 1 may further perform a control, over the TV 6 , to turn ON the power of the TV 6 , and/or to switch the input select so as to display an input from the cradle 5 . Then, a user does not need to operate the TV 6 so as to satisfy the condition described above (i.e., that the TV 6 is able to display the images from the cradle 5 ), and it is possible to easily display the image on the TV 6 .
  • the information processing device 1 can perform a control as described above by causing the cradle 5 to transmit a control signal (e.g., a CEC command of the HDMI standard) to the TV 6 .
  • the information processing device 1 may change the image output condition in response to an instruction from a user.
  • the information processing device 1 may present a plurality of different conditions, as the image output conditions, on a predetermined setting changing screen, and allow a user to select one of the conditions. This is convenient for a user because the user can choose a condition that suits the user's way of using the device.
  • the information processing system may use a switching-type button as the output instruction button so as to allow a user to change the condition. That is, a first state and a second state can be switched from one to another by pressing the output instruction button of the cradle 5 . Then, if the output instruction button is in the first state, the information processing device 1 interprets this to mean that there is a user instruction to output the images on the TV 6 , whereas if the output instruction button is in the second state, the information processing device 1 interprets this to mean that there is no user instruction to output the images on the TV 6 .
  • a user can cause the images to be output on the TV 6 (under a certain condition) in response to the attachment of the main unit 2 to the cradle, by leaving the output instruction button in the first state.
  • the information processing device 1 determines whether a user has attached the information processing device 1 to the cradle 5 merely for charging, or has attached the information processing device 1 to the cradle 5 to display the images on the TV 6 .
  • the information processing device 1 may make the determination above by any other method.
  • the information processing device 1 may make the determination described above based on the state of the information processing device 1 regarding the execution of an application.
  • the state of the information processing device 1 regarding the execution of an application is, for example, a state regarding whether an application is running on the information processing device 1 or a state regarding whether the images produced or obtained by a running application are displayed on the display 12 .
  • For example, where an application is running (or where the images produced or obtained by a running application are displayed on the display 12 ) when the main unit 2 is attached to the cradle 5 , the information processing device 1 may determine that a user has attached the information processing device 1 to the cradle 5 in order to display the images on the TV 6 .
  • the information processing device 1 may output the images to the TV 6 on the condition that the image output condition is satisfied. That is, the image output condition may include a condition that “an application is running or the images produced or obtained by a running application are displayed on the display 12 ”.
  • the information processing device 1 may store, in a storage section (e.g., the flash memory 84 ), the operation mode at the point in time when the power of the information processing device 1 is turned OFF. Then, when the power of the information processing device 1 is turned ON, if the operation mode stored in the storage section indicates the portable mode, the information processing device 1 may execute the mode setting process from step S 21 , whereas if the operation mode stored in the storage section indicates the console mode, the information processing device 1 may execute the mode setting process from step S 27 .
  • the operation mode of the information processing device 1 at the time of power-off can be carried over when the power is next turned ON.
  • the information processing device 1 operates in one of the operation modes including the portable mode and the console mode. Therefore, in the information processing device 1 , information processes executed by various programs executed on the main unit 2 (e.g., the OS program and application programs) may be of different content between the portable mode and the console mode.
  • An example information process, in which a different process is executed depending on the operation mode, will now be described.
  • FIG. 25 is a flow chart showing an example flow of an information process executed on the main unit 2 .
  • a series of information processes shown in FIG. 25 is performed as the CPU 81 executes a program (e.g., an application program or an OS program) that is executable by the information processing device 1 , for example.
  • Where a program to be executed is an application program, the series of information processes shown in FIG. 25 is started in response to a user instruction to launch the application program.
  • the series of information processes shown in FIG. 25 is started in response to the power of the main unit 2 being turned ON.
  • In step S 31 , the CPU 81 obtains operation data from each controller.
  • the CPU 81 may obtain operation data from any controller with which the CPU 81 can communicate whether in wired communication or wireless communication.
  • the main unit 2 may obtain operation data with substantially the same frequency from controllers with which it is communicating in wired communication and from controllers with which it is communicating in wireless communication. Then, since there is no difference between controllers depending on their communication method, it is possible to improve the controllability of controllers.
  • Following step S 31 , the process of step S 32 is executed.
  • the CPU 81 may obtain operation data from controllers that satisfy a predetermined condition and not obtain operation data from controllers that do not satisfy the condition.
  • the predetermined condition may be a communication-related condition. That is, the CPU 81 may obtain operation data from controllers that communicate with the main unit 2 in wired communication and not obtain operation data from controllers that communicate with the main unit 2 in wireless communication.
  • the predetermined condition may be a condition regarding the function of a controller.
  • the CPU 81 may obtain operation data from controllers having a predetermined function (e.g., controllers having an analog stick, controllers having the NFC communication function, controllers having an acceleration sensor, etc.) and not obtain operation data from controllers not having these predetermined functions. If there is a controller that is capable of communication but from which operation data is not obtained, the main unit 2 may indicate to the user that the controller is not available for use on the running program.
  • the CPU 81 may not use the operation data obtained from the controllers that do not satisfy the predetermined condition in the process of step S 33 or S 34 to be described later.
  • In step S 32 , the CPU 81 determines whether or not the operation mode of the information processing device 1 is the portable mode (i.e., whether it is the portable mode or the console mode).
  • the information processing device 1 has information indicating the current operation mode stored in a predetermined storage section (e.g., the flash memory 84 ). The determination of step S 32 is made based on this information. If the determination result of step S 32 is affirmative (i.e., if the operation mode is the portable mode), the process of step S 33 is executed. On the other hand, if the determination result of step S 32 is negative (i.e., if the operation mode is the console mode), the process of step S 34 is executed.
  • In step S 33 , the CPU 81 executes an information process based on operation data in accordance with settings in the portable mode.
  • the processing power of the main unit 2 is limited. That is, the CPU 81 executes the information process under such a limitation.
  • the CPU 81 executes a predetermined process (e.g., a game process) using, as an input, the operation data obtained in step S 31 , and produces images representing the process results, which are displayed on the display 12 .
  • In step S 34 , the CPU 81 executes an information process based on operation data in accordance with settings in the console mode.
  • the CPU 81 may execute a predetermined process (e.g., a game process) using, as an input, the operation data obtained in step S 31 , and produce images representing the process results, which are displayed on the TV 6 .
  • the CPU 81 may vary the content of the predetermined process to be executed in steps S 33 and S 34 described above depending on the operation mode.
  • For example, if the program is a game program (i.e., if a game application is executed), the information processing device 1 may use different game content depending on the operation mode.
  • levels that can be played may differ between the portable mode and the console mode, and there may be levels that can only be played in the portable mode or there may be levels that can only be played in the console mode.
  • the game mode may differ in the portable mode and in the console mode.
  • the information processing device 1 may use different screen display settings for each operation mode.
  • the “screen display settings” refer, for example, to those settings that are directed to the size of icons on the menu screen and various information to be displayed together with images representing the game space (e.g., images representing the status of the player character, the map image, etc.). Therefore, in the portable mode, for example, since the images are displayed on the display 12 whose screen size is assumed to be small, graphical symbols such as icons are displayed to be larger with respect to the displayed images (in other words, the display screen), whereas in the console mode, since the images are displayed on the TV 6 whose screen size is assumed to be large, those graphical symbols may be displayed to be smaller with respect to the displayed images.
  • In step S 35 , the CPU 81 determines whether or not to end the series of information processes shown in FIG. 25 . This determination is made based on whether or not there has been a user instruction to end, for example. If the determination result of step S 35 is negative, the process of step S 31 is executed again. Thereafter, the series of processes of steps S 31 to S 35 is repeatedly executed until it is determined in step S 35 that the series of information processes is to be ended. On the other hand, if the determination result of step S 35 is affirmative, the CPU 81 ends the series of information processes shown in FIG. 25 .
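  • The loop of steps S 31 to S 35 can be summarized as: gather operation data from every controller the main unit can communicate with, branch on the stored operation mode, and run the predetermined process with mode-specific settings. The Python sketch below is illustrative only; the particular settings values (output target, resolution, icon scale) are invented.

```python
def information_process_step(operation_data: list, operation_mode: str) -> dict:
    """One iteration of FIG. 25 (steps S31 to S34), returning where and how to present results."""
    # Step S32: branch on the current operation mode stored by the main unit.
    if operation_mode == "portable":
        # Step S33: portable-mode settings (smaller screen assumed, so larger icons; display 12).
        settings = {"output": "display", "resolution": (1280, 720), "icon_scale": 1.5}
    else:
        # Step S34: console-mode settings (larger screen assumed; output to the TV 6).
        settings = {"output": "tv", "resolution": (1920, 1080), "icon_scale": 1.0}
    # The predetermined process (e.g., a game process) would consume the operation data here.
    settings["inputs_used"] = len(operation_data)
    return settings

print(information_process_step([{"stick": (0, 1)}], "portable"))
print(information_process_step([{"stick": (0, 1)}], "console"))
```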
  • the information processing device 1 may be configured so that the OS or an application may execute a different process depending on the operation mode. Then, for example, it is possible to give the user a different kind of entertainment (e.g., a different game level or a different game mode) for each operation mode. It is also possible to save the user the trouble of changing settings by automatically changing settings (e.g., the screen display settings) depending on the operation mode.
  • the information processing device 1 can be used both in a state in which a controller is removed from the main unit 2 and in a state in which it is attached to the main unit 2 . Therefore, the information processing device 1 may be configured so that the content of information processes to be executed by various programs executed on the main unit 2 (e.g., the OS program and application programs) may be different depending on the attached state of a controller to the main unit 2 (i.e., whether the controller is attached to or removed from the main unit 2 ). Examples of information processes depending on the attached state of a controller will now be described below.
  • FIG. 26 is a flow chart showing an example flow of an information process executed on the main unit 2 .
  • the series of information processes shown in FIG. 26 is executed by an application program or an OS program executable on the information processing device 1 , for example.
  • the series of information processes shown in FIG. 26 is also started in response to a user instruction to launch an application program or in response to the power of the main unit 2 being turned ON.
  • In step S 41 , the CPU 81 determines whether or not a controller is attached to the main unit 2 .
  • the CPU 81 determines whether or not a controller is attached to the main unit 2 for each of the left and right controllers 3 and 4 . If it is determined that at least one of the left and right controllers 3 and 4 is attached to the main unit 2 , the determination result of step S 41 is affirmative. If the left and right controllers 3 and 4 are both removed from the main unit 2 , the determination result of step S 41 is negative. If the determination result of step S 41 is affirmative, the process of step S 42 is executed. On the other hand, if the determination result of step S 41 is negative, the process of step S 43 to be described later is executed.
  • In step S 42 , the CPU 81 obtains operation data from a controller attached to the main unit 2 via a wired connection. That is, the CPU 81 obtains operation data received via the left-side terminal 17 and the right-side terminal 21 . Note that where only one of the left and right controllers 3 and 4 is attached, the CPU 81 obtains operation data only from the terminal corresponding to that controller. Note that the obtained operation data is stored in a predetermined storage section (e.g., the DRAM 85 ). Following step S 42 , the process of step S 43 is executed.
  • In step S 43 , the CPU 81 obtains operation data from a controller removed from the main unit 2 via a wireless connection. That is, the CPU 81 obtains operation data received via the controller communication section 83 . Note that if there are no controllers that are capable of wireless communication (in other words, controllers that are transmitting operation data to the main unit 2 ) at the point of step S 43 , the CPU 81 may skip the process of step S 43 since no operation data is received by the controller communication section 83 . Following step S 43 , the process of step S 44 is executed.
  • In step S 44 , the CPU 81 executes a predetermined information process based on the operation data obtained in steps S 42 and S 43 .
  • the CPU 81 may move a cursor across the menu screen based on the operation data, or identify an application selected by a user based on the operation data to launch the identified application.
  • Where the running program is an application program, the CPU 81 may execute a process according to the application by using the operation data as an input to the application.
  • Following step S 44 , the process of step S 45 is executed.
  • In step S 45 , the CPU 81 outputs an image representing the result of the information process of step S 44 to a display device (the display 12 or the TV 6 ).
  • Where the running program is an OS program, the menu screen described above is displayed on the display device.
  • Where the running program is an application program, an image of the application (e.g., a game image in the game application) is displayed on the display device.
  • Following step S 45 , the process of step S 46 is executed.
  • In step S 46 , the CPU 81 determines whether or not to end the series of information processes shown in FIG. 26 . This determination is made based on whether or not there has been a user instruction to end, for example. If the determination result of step S 46 is negative, the process of step S 41 is executed again. Thereafter, the series of processes of steps S 41 to S 46 is repeatedly executed until it is determined in step S 46 that the series of information processes is to be ended. On the other hand, if the determination result of step S 46 is affirmative, the CPU 81 ends the series of information processes shown in FIG. 26 .
  • the main unit 2 varies the content of the process to be executed depending on the attached state of the controller. That is, if no controller is attached to the main unit 2 , the main unit 2 skips the process of obtaining operation data via wired communication. Thus, it is possible to simplify the process to be executed on the main unit 2 .
  • the main unit 2 executes the process of obtaining operation data via wireless communication (step S 43 ), irrespective of the attached state of the controller.
  • Note that the process of step S 43 does not always need to be executed. That is, depending on the running application, if a controller is attached, the main unit 2 may receive operation data only from the controller attached thereto while not accepting operation data from the other, unattached controller (via wireless communication).
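  • Steps S 41 to S 43 boil down to: read the wired terminals only when at least one controller is attached, collect wireless operation data as well (unless the running application chooses not to), and merge the results for the information process of step S 44 . The Python sketch below uses hypothetical read functions in place of the terminals and the controller communication section 83 .

```python
from typing import Callable, List

def obtain_operation_data(any_controller_attached: bool,
                          read_wired: Callable[[], List[dict]],
                          read_wireless: Callable[[], List[dict]],
                          accept_wireless: bool = True) -> List[dict]:
    """Sketch of steps S41 to S43 of FIG. 26."""
    data: List[dict] = []
    # Steps S41/S42: only read the wired terminals when a controller is attached.
    if any_controller_attached:
        data.extend(read_wired())
    # Step S43: wireless data is normally collected regardless of the attached state,
    # but an application may choose to ignore controllers that are not attached.
    if accept_wireless:
        data.extend(read_wireless())
    return data

# Example with stubbed data sources.
print(obtain_operation_data(True,
                            read_wired=lambda: [{"src": "wired-left"}],
                            read_wireless=lambda: [{"src": "wireless-right"}]))
```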
  • FIG. 27 is a flow chart showing an example flow of an information process executed on the main unit 2 .
  • the series of information processes shown in FIG. 27 is executed by an application program or an OS program executable on the information processing device 1 , for example.
  • the series of information processes shown in FIG. 27 is also started in response to a user instruction to launch an application program or in response to the power of the main unit 2 being turned ON.
  • In step S 51 , the CPU 81 determines whether or not a controller is attached to the main unit 2 .
  • the CPU 81 determines whether the left and right controllers 3 and 4 are both attached to the main unit 2 or at least one of the left and right controllers 3 and 4 is removed from the main unit 2 .
  • the CPU 81 may determine whether at least one of the left and right controllers 3 and 4 is attached to the main unit 2 or the left and right controllers 3 and 4 are both removed from the main unit 2 . If the determination result of step S 51 is affirmative, the process of step S 52 is executed. On the other hand, if the determination result of step S 51 is negative, the process of step S 54 is executed.
  • In step S 52 , the CPU 81 sets the control mode of the information process to the single-player control mode. That is, if step S 52 is executed, since a controller is attached to the main unit 2 , it is assumed that a single user performs operations by using the attached controller.
  • the single-player control mode is a mode in which there is one control object, for example. For example, in the game program, the CPU 81 starts a game in which there is one player character. In an application in which operations are performed by using a cursor displayed on the screen, the CPU 81 displays a cursor. Following step S 52 , the process of step S 53 is executed.
  • In step S 53 , the CPU 81 executes a predetermined process in the single-player control mode. That is, the CPU 81 obtains operation data from the left and right controllers 3 and 4 , and executes a predetermined process based on the operation data. Note that there is no particular limitation on the content of the predetermined process.
  • In the single-player control mode, one control object (e.g., one player character or one cursor) is controlled based on the operation data obtained from the left and right controllers 3 and 4 .
  • In step S 53 , the predetermined process described above is repeatedly executed. When a condition for ending the predetermined process is satisfied (e.g., when there is a user instruction to end the application), the CPU 81 ends the process of step S 53 , thereby ending the series of information processes shown in FIG. 27 .
  • In step S 54 , the CPU 81 sets the control mode of the information process to the two-player control mode. That is, if step S 54 is executed, it means that (at least one) controller is removed from the main unit 2 , and it is therefore assumed that a total of two users are performing operations, wherein one user uses the removed controller while the other user uses the other controller.
  • the two-player control mode is a mode in which there are two control objects, for example. For example, in the game program, the CPU 81 starts a game in which there are two player characters. In an application in which operations are performed by using a cursor displayed on the screen, the CPU 81 displays two cursors. Following step S 54 , the process of step S 55 is executed.
  • In step S 55 , the CPU 81 executes a predetermined process in the two-player control mode. That is, the CPU 81 obtains operation data from the left and right controllers 3 and 4 , and executes a predetermined process based on the operation data. Note that there is no particular limitation on the content of the predetermined process.
  • In the two-player control mode, a first control object (e.g., one player character or one cursor) is controlled based on operation data obtained from the left controller 3 , and a second control object is controlled based on operation data obtained from the right controller 4 .
  • In step S 55 , as in step S 53 , the predetermined process is repeatedly executed. When a condition for ending the predetermined process is satisfied (e.g., when there is a user instruction to end the application), the CPU 81 ends the process of step S 55 , thereby ending the series of information processes shown in FIG. 27 .
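  • The control-mode selection of FIG. 27 amounts to deriving the number of control objects from the attached state: with a controller attached, one object driven by both controllers; with a controller removed, two objects, one per controller. The Python sketch below illustrates that mapping under those assumptions; the dictionary layout is hypothetical.

```python
def set_control_mode(both_controllers_attached: bool) -> dict:
    """Sketch of steps S51, S52 and S54: derive the control mode from the attached state."""
    if both_controllers_attached:
        # Single-player control mode: one control object (e.g., one player character or cursor).
        return {"mode": "single-player",
                "object_inputs": {"object-1": ["left controller 3", "right controller 4"]}}
    # Two-player control mode: the first object follows the left controller, the second the right.
    return {"mode": "two-player",
            "object_inputs": {"object-1": ["left controller 3"],
                              "object-2": ["right controller 4"]}}

print(set_control_mode(True)["mode"])   # single-player control mode
print(set_control_mode(False)["mode"])  # two-player control mode
```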
  • When a program is launched, the attached state of controllers is sensed, and a control mode is set depending on the attached state (steps S 52 , S 54 ).
  • the main unit 2 determines the attached state of controllers at the start of the execution of a program (step S 51 ).
  • the main unit 2 may determine the attached state of controllers at a predetermined timing during the execution of a program. Then, in response to the attached state of controllers being changed, the main unit 2 may change the control mode.
  • There is no particular limitation on the predetermined timing. For example, where a game application is executed, the main unit 2 may determine the attached state of controllers at the point in time when one play of the game is complete, or may determine the attached state of controllers at the point in time when a predetermined instruction is given by a user.
  • the main unit 2 changes the number of users to perform operations depending on the attached state of controllers.
  • There is no particular limitation on the content of the process to be changed depending on the attached state of controllers.
  • the interpretation (specifically, the interpretation by the main unit 2 ) of an operation on a controller may be changed depending on the attached state of controllers.
  • For example, where the left controller 3 is attached to the main unit 2 , the CPU 81 may interpret the operation of tilting the analog stick 32 in the upper direction (i.e., in the y-axis direction shown in FIG. 1 ) as being an input of the upper direction.
  • On the other hand, where the left controller 3 is removed from the main unit 2 , the CPU 81 may interpret the operation of tilting the analog stick 32 in the upper direction as being an input of the left direction (assuming that the controller is held in a landscape position as shown in FIG. 16 ).
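  • The direction remapping described above is essentially a rotation of the stick input: when the detached left controller is held in a landscape position, a tilt along what was the y-axis in the attached orientation is reported as an input of the left direction. The Python sketch below rotates a two-dimensional stick vector by 90 degrees depending on the attached state; the sign convention is an assumption.

```python
def interpret_stick(x: float, y: float, attached: bool) -> tuple:
    """Map a raw analog stick tilt (x to the right, y upward in the attached orientation)
    to a screen-space input, rotating by 90 degrees when the controller is held sideways."""
    if attached:
        return (x, y)    # attached: tilting the stick up is an input of the upper direction
    return (-y, x)       # detached, landscape position: tilting up becomes an input of the left direction

print(interpret_stick(0.0, 1.0, attached=True))   # (0.0, 1.0): upper direction
print(interpret_stick(0.0, 1.0, attached=False))  # (-1.0, 0.0): left direction
```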
  • each of the three series of processes shown in FIG. 25 to FIG. 27 may be executed in any phase on the main unit 2 . That is, these three series of processes may each be executed in a different phase (i.e., in the execution of a different program). Two or three of these series of processes may be executed simultaneously (e.g., within a single program).
  • the main unit 2 may execute a process that combines together processes of FIG. 26 and FIG. 27 when a certain application is executed.
  • the CPU 81 may execute the predetermined process described above based on operation data obtained from the processes of steps S 41 to S 43 shown in FIG. 26 .
  • the controllers 3 and 4 can be attached and detached to and from the main unit 2 . Therefore, there may be provided a left controller different from the left controller 3 shown in FIG. 5 , and the other left controller may be attached to the main unit 2 . Moreover, there may be provided a right controller different from the right controller 4 shown in FIG. 6 , and the other right controller may be attached to the main unit 2 . That is, in the present embodiment, controllers attached to the main unit 2 may be detached therefrom and other controllers may be attached thereto.
  • FIG. 28 and FIG. 29 are diagrams each showing another example left controller.
  • A left controller 201 shown in FIG. 28 , as compared with the left controller 3 shown in FIG. 5 , includes a cross-shaped key 202 instead of the operation buttons 33 to 36 .
  • A left controller 203 shown in FIG. 29 , as compared with the left controller 3 shown in FIG. 5 , includes a cross-shaped key 204 instead of the analog stick 32 .
  • another controller may have a different function from the controllers 3 and 4 shown in FIG. 5 and FIG. 6 .
  • the information processing system may include other controllers having different functions from the controllers 3 and 4 , in addition to (or instead of) the controllers 3 and 4 .
  • FIG. 30 is a diagram showing an example information processing device with a right controller that is different from that of FIG. 1 .
  • Another right controller 205 , instead of the right controller 4 shown in FIG. 6 , is attached to the main unit 2 .
  • In the right controller 205 , the arrangement of the analog stick 52 and the four operation buttons 53 to 56 is reversed from that of the right controller 4 .
  • the arrangement of the analog stick and the four operation buttons is symmetric between the left controller and the right controller. That is, in the present embodiment, the arrangement of the operation section on the information processing device 1 can be changed by changing the controller to be attached to the main unit 2 .
  • the information processing system may include other controllers having different arrangements from the controllers 3 and 4 , in addition to (or instead of) the controllers 3 and 4 .
  • Thus, the information processing system may include different kinds of controllers having different functions and/or arrangements. Then, a user can use any of the different kinds of controllers by attaching the controller to the main unit 2 .
  • Thus, it is possible to provide a portable device with which it is possible to change the controller device (i.e., the controller) in accordance with, for example, the user preference or the content of the application to be executed on the information processing device 1 .
  • the information processing device 1 may include an accessory configured so that the controllers 3 and 4 , having been removed from the main unit 2 , can be attached to the accessory.
  • FIG. 31 is a diagram showing an example accessory to which controllers can be attached.
  • An extension grip 210 , which is an example accessory, is an accessory to be used by a user for performing operations.
  • the extension grip 210 includes a housing 211 .
  • the left controller 3 can be attached to the housing 211 , and the right controller 4 can be attached thereto. Therefore, the extension grip 210 allows a user to perform operations while holding two controllers 3 and 4 , having been removed from the main unit 2 , as an integral unit.
  • the extension grip 210 includes mechanisms similar to those of the main unit 2 (specifically, the left rail member 15 , the engagement hole 16 , the stopper 18 , the right rail member 19 , the engagement hole 20 and the stopper 22 ).
  • the controllers 3 and 4 can be attached to the extension grip 210 , as they are attached to the main unit 2 .
  • the extension grip 210 includes grip portions to be held by a user. Specifically, the extension grip 210 accommodates the left controller 3 attached thereto on the left side of the center of the extension grip 210 in the left-right direction (the x-axis direction shown in FIG. 31 ).
  • the extension grip 210 includes a left grip portion 212 provided on the left side (in other words, on the outer side in the left-right direction) of the area where the left controller 3 is attached.
  • the extension grip 210 accommodates the right controller 4 attached thereto on the right side of the extension grip 210 in the left-right direction.
  • the extension grip 210 includes a right grip portion 213 provided on the right side (in other words, on the outer side in the left-right direction) of the area where the right controller 4 is attached. Therefore, by holding the grip portions 212 and 213 , a user can easily operate the controllers 3 and 4 attached to the extension grip 210 .
  • the extension grip 210 includes a left-side terminal similar to the left-side terminal 17 of the main unit 2 at such a location that allows the left-side terminal to be connected to the terminal 42 of the left controller 3 attached to the extension grip 210 .
  • the extension grip 210 also includes a right-side terminal similar to the right-side terminal 21 of the main unit 2 at such a location that allows the right-side terminal to be connected to the terminal 64 of the right controller 4 attached to the extension grip 210 . Therefore, when the controllers 3 and 4 are attached to the extension grip 210 , the extension grip 210 and the controllers 3 and 4 are electrically connected to each other. Therefore, when the controllers and the extension grip 210 are connected to each other, they can communicate with each other and can supply power to each other.
  • the extension grip 210 includes a power terminal similar to the power terminal 134 of the cradle 5 . Therefore, by connecting a charging device (not shown) (e.g., an AC adaptor) to the power terminal, the extension grip 210 can receive power supply from the charging device via the power terminal. Then, the power supplied from the power terminal is supplied by the extension grip 210 to the left controller 3 and the right controller 4 via the left-side terminal and the right-side terminal. Therefore, by connecting the extension grip 210 to a charging device, it is possible to charge the left controller 3 and the right controller 4 attached to the extension grip 210 .
  • the extension grip 210 has the function of charging controllers attached thereto. That is, the extension grip 210 includes a charging control section (i.e., the power terminal, and the left-side terminal or the right-side terminal), which uses power supplied to the extension grip 210 to charge controllers attached to the extension grip 210 . Then, controllers can be charged while being attached to the extension grip. This improves the usability of the controllers.
  • Note that in other embodiments, the extension grip 210 may not have the charging function. Then, the extension grip 210 does not need to include electronic members such as terminals. By using no electronic members, it is possible to simplify the configuration of the extension grip 210 .
  • When the controllers 3 and 4 are attached to the extension grip 210 , the controllers 3 and 4 and the main unit 2 communicate with each other in wireless communication (as when the controllers 3 and 4 are removed from the main unit 2 ). That is, even when attached to the extension grip 210 , the controllers 3 and 4 can communicate with the main unit 2 .
  • a controller transmits, to the main unit 2 , attachment information regarding its attachment to another device.
  • the attachment information indicates the device (and/or the type of the device) to which the controller is attached, and when a controller is not attached to another device, the attachment information indicates that the controller is not attached to another device.
  • the attachment information may be transmitted together with the operation data (or while being included in the operation data), for example.
  • When a controller is attached to the extension grip 210 , the controller transmits, to the main unit 2 via wireless communication, a notification indicating that the controller has been attached to the extension grip 210 .
  • the controller transmits, to the main unit 2 , the attachment information indicating that the controller is attached to the extension grip 210 .
  • Specifically, in response to being attached to another device, a controller obtains identification information identifying the other device from the other device (herein, the extension grip 210 ).
  • the identification information is, for example, identification information assigned to each device (more specifically, identification information unique to the device).
  • the identification information includes information indicating the type of the device so that it is possible to identify the type of the device from the identification information.
  • Based on the identification information, a controller determines (or senses) that the device to which the controller has been attached is the extension grip 210 , i.e., that the controller has been attached to the extension grip 210 .
  • Moreover, when the controller is removed from the extension grip 210 , the controller transmits, to the main unit 2 via wireless communication, a notification indicating that the controller has been removed from the extension grip 210 . That is, the controller transmits, to the main unit 2 , the attachment information indicating that the controller is not attached to another device.
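  • The attachment information described above could be carried alongside the operation data roughly as sketched below; the field names and the idea of bundling everything into a single structure are assumptions made for illustration, not details of the embodiment:

    from dataclasses import dataclass, field
    from typing import Optional, Tuple

    @dataclass
    class AttachmentInfo:
        # Identification information obtained from the device the controller is
        # attached to (e.g., the extension grip 210); None means the controller
        # is not attached to another device.
        device_id: Optional[int] = None
        device_type: Optional[str] = None   # e.g., "extension_grip", derived from device_id

    @dataclass
    class OperationData:
        stick: Tuple[float, float] = (0.0, 0.0)   # analog stick tilt
        buttons: int = 0                          # bitmask of pressed buttons
        attachment: AttachmentInfo = field(default_factory=AttachmentInfo)

    def on_attached_to(other_device_id: int, other_device_type: str) -> AttachmentInfo:
        # Called when the controller senses that it has been attached to another
        # device and has obtained that device's identification information.
        return AttachmentInfo(device_id=other_device_id, device_type=other_device_type)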
  • the main unit 2 may change the interpretation of operations performed on a controller depending on the device to which the controller is attached. That is, the main unit 2 may change the method of identifying the content of input (in other words, the content of a user instruction) based on the operation data from a controller depending on the device to which the controller is attached.
  • the left and right controllers 3 and 4 are attached to the extension grip 210 while being slightly inclined with respect to the up-down direction. Therefore, when the analog stick is tilted in the directly upward direction with respect to the extension grip 210 (the y-axis positive direction shown in FIG. 31 ), the tilt is not in the directly upward direction with respect to the controller.
  • In view of this, when a controller is attached to the extension grip 210 , the main unit 2 preferably changes the interpretation of directional inputs on the analog stick, as compared with a case where the controller is not attached to the extension grip 210 (e.g., when the controller is attached to the main unit 2 ).
  • the main unit 2 may change the direction of tilt of the analog stick that is interpreted as being an input in the directly upward direction.
  • the main unit 2 changes the interpretation of a directional input on the analog stick, as compared with a case in which the controller is attached to the main unit 2 .
  • the main unit 2 modifies the tilt direction of the analog stick represented by the operation data by rotating the tilt direction by a predetermined angle.
  • the predetermined angle is, for example, the angle of inclination of the controller attached to the extension grip 210 with respect to the attitude of the controller when attached to the main unit 2 .
  • the predetermined angle may be preset based on the structure of the extension grip 210 .
  • the main unit 2 uses the modified tilt direction as the direction input by a user.
  • the main unit 2 executes an information process using the modified tilt direction as an input.
  • Then, when the analog stick is tilted in the directly upward direction with respect to the extension grip 210 , the input direction is regarded as being the directly upward direction, allowing an information process to be executed appropriately.
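  • A sketch of this correction follows, assuming a placeholder inclination angle (the actual angle would be preset based on the structure of the extension grip 210 , with opposite signs for the left and right controllers):

    import math

    GRIP_INCLINATION_DEG = 15.0   # placeholder value, not taken from the embodiment

    def correct_stick_for_grip(x: float, y: float, angle_deg: float = GRIP_INCLINATION_DEG):
        # Rotate the reported tilt direction by the predetermined angle so that a
        # tilt that is directly upward with respect to the extension grip 210 is
        # treated as an input of the directly upward direction.
        a = math.radians(angle_deg)
        return (x * math.cos(a) - y * math.sin(a),
                x * math.sin(a) + y * math.cos(a))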
  • the main unit 2 modifies the values of the detection results of the acceleration sensor and the angular velocity sensor of the controller. That is, the main unit 2 modifies the detection result of the acceleration sensor by rotating the direction of acceleration (i.e., the detection result of the acceleration sensor) by the above predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller (the z-axis direction shown in FIG. 1 ).
  • the main unit 2 modifies the detection result of the angular velocity sensor by rotating the axis of rotation of the angular velocity (i.e., the detection result of the angular velocity sensor) by the above predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller (the z-axis direction shown in FIG. 1 ). Then, the main unit 2 calculates the movement and/or the attitude of the controller (in other words, the extension grip 210 ) based on the modified values of the acceleration and/or the angular velocity.
  • the main unit 2 may calculate the movement and/or the attitude of the controller without modifying the values of the detection results of the acceleration sensor and the angular velocity sensor, and modify the calculated movement and/or the calculated attitude. That is, when the main unit 2 calculates the movement based on the detection results of the acceleration sensor and/or the angular velocity sensor, the main unit 2 modifies the calculated movement by rotating the calculated movement direction by the above predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller.
  • the main unit 2 modifies the calculated attitude by rotating the calculated attitude by a predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller.
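  • The same kind of rotation, applied about the controller's front-rear (z) axis, could be used for the sensor detection results; the following is only a sketch, with the angle passed in as a parameter:

    import math

    def rotate_about_z(vec, angle_deg: float):
        # Rotate a 3-component vector, expressed in the controller's coordinate
        # system, about the front-rear (z) axis by the predetermined angle.
        a = math.radians(angle_deg)
        x, y, z = vec
        return (x * math.cos(a) - y * math.sin(a),
                x * math.sin(a) + y * math.cos(a),
                z)

    def correct_imu_for_grip(acceleration, angular_velocity, angle_deg: float):
        # Apply the correction both to the acceleration vector and to the angular
        # velocity (whose axis of rotation is rotated by the same angle), before
        # the movement and/or attitude of the controller is calculated.
        return rotate_about_z(acceleration, angle_deg), rotate_about_z(angular_velocity, angle_deg)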
  • the left controller 3 is attached to the extension grip 210 while being inclined in the first direction from a predetermined reference attitude (i.e., the attitude in which the up-down direction of the left controller 3 coincides with the up-down direction of the extension grip 210 ; or the attitude when the left controller 3 is attached to the main unit 2 ) (see FIG. 31 ).
  • the right controller 4 is attached to the extension grip 210 while being inclined in a direction opposite to the first direction from a predetermined reference attitude (i.e., the attitude where the up-down direction of the right controller 4 coincides with the up-down direction of the extension grip 210 ).
  • controllers can be attached to the accessory with an easy-to-operate inclination.
  • the main unit 2 executes a predetermined information process by modifying the operation data from the attached controller and/or information obtained from the operation data. For example, the main unit 2 performs a modifying process for modifying the inclination of the controller with respect to the predetermined reference attitude. More specifically, the main unit 2 performs a modifying process so that the result of the information process when a predetermined operation is performed on a controller attached to the extension grip 210 is generally equal to that when the predetermined operation is performed on the controller being in the predetermined reference attitude.
  • the main unit 2 modifies a part of the operation data representing an operation performed on a directional input section (i.e., an analog stick).
  • an analog stick can be operated in a similar fashion whether the controller is attached to the accessory while being inclined from the predetermined reference attitude or the controller is not attached to the accessory (or attached to the main unit 2 ).
  • When the left and right controllers 3 and 4 are both attached to the extension grip 210 , as when the left and right controllers 3 and 4 are both attached to the main unit 2 , these two controllers are registered as a pair. Specifically, if one of the left and right controllers has been attached to the extension grip 210 , when an indication that the other controller is attached to the extension grip 210 is received, the main unit 2 registers these two controllers as a pair. Then, a user can register two controllers as a pair by attaching them to the extension grip 210 , as well as by attaching them to the main unit 2 .
  • the extension grip 210 may include a communication section capable of communicating with the main unit 2 in wireless communication. Then, the extension grip 210 may obtain operation data from the controllers 3 and 4 attached to the extension grip 210 , and transmit the obtained operation data to the main unit 2 via the communication section.
  • the main unit 2 may communicate with devices that cannot be attached to the main unit 2 (e.g., the extension grip 210 ) in wireless communication, as well as communicating with controllers that can be attached to the main unit 2 in wireless communication.
  • the main unit 2 may receive via wireless communication operation data from a controller device that cannot be attached to the main unit 2 , and execute an information process based on the operation data. Then, operations can be performed on the main unit 2 by using a wider variety of controller devices.
  • the extension grip 210 may include an operation section (e.g., buttons and an analog stick, etc.). Then, the extension grip 210 allows for a wider variety of operations using controllers. Note that information representing an operation performed on the operation section may be obtained by a controller attached to the extension grip 210 (if there are a plurality of controllers attached to the extension grip 210 , one of the controllers), and transmitted from the controller to the main unit 2 . Such information may be transmitted together with (or being included in) operation data of the controller. If the extension grip 210 includes a communication section capable of communicating with the main unit 2 in wireless communication, information representing an operation performed on the operation section may be transmitted directly from the extension grip 210 to the main unit 2 (i.e., without passing through a controller).
  • FIG. 32 is a diagram showing another example accessory.
  • an attachment 220 includes a housing 221 to which the left controller 3 and the right controller 4 can be attached, as with the extension grip 210 shown in FIG. 31 .
  • the attachment 220 has similar mechanisms to those of the extension grip 210 for allowing controllers to be attached thereto.
  • the attachment 220 shown in FIG. 32 includes no grip portion. Therefore, a user holds the left controller 3 and the right controller 4 attached to the attachment 220 . This also enables the user to hold the two controllers 3 and 4 removed from the main unit 2 as an integral unit.
  • the information processing system may include an accessory to which the main unit 2 can be attached.
  • An HMD accessory to be described below as an example accessory can be used as a so-called HMD (head mounted display) with the main unit 2 attached thereto.
  • FIG. 33 is a diagram showing an example HMD accessory to which the main unit 2 can be attached.
  • An HMD accessory 230 shown in FIG. 33 includes a housing 231 and belts 232 a and 232 b .
  • One end of the belt 232 a is attached to one end of the housing 231
  • one end of the belt 232 b is attached to the other end of the housing 231 .
  • the other end of the belt 232 a can be removably connected to the other end of the belt 232 b .
  • the housing 231 can be mounted on the head of the user by connecting together the two belts 232 a and 232 b around the head of the user. Note that there is no particular limitation on the mechanism for allowing the HMD accessory 230 to be mounted on the head of the user.
  • the housing 231 includes two openings 231 a .
  • the openings 231 a are located so as to face the eyes of the user with the housing 231 mounted on the head of the user.
  • the HMD accessory 230 includes a lens provided in each of the openings 231 a.
  • the housing 231 includes an insertion slot 231 b for receiving the main unit 2 attached thereto (or inserted thereinto). That is, the main unit 2 can be attached to the HMD accessory 230 by inserting the main unit 2 into the insertion slot 231 b . If the main unit 2 is attached so that the display 12 is facing the openings 231 a , the display 12 can be viewed through the openings 231 a via the lenses. That is, the housing 231 supports the display 12 so that the user can view the display 12 of the main unit 2 .
  • a user can view images on the display 12 when the HMD accessory 230 with the main unit 2 attached thereto is mounted on the head of the user. That is, the HMD accessory 230 with the main unit 2 attached thereto functions as a so-called HMD.
  • the HMD accessory 230 may present images of a wide viewing angle to a user by enlarging the viewing angle of the images on the display 12 through the lenses. This can enhance the sense of immersion for a user looking at the images.
  • the main unit 2 may perform a predetermined conversion process on the images displayed on the display 12 so that appropriate images are viewed through the lenses.
  • the main unit 2 includes the acceleration sensor 89 and the angular velocity sensor 90 , and can calculate the movement and/or the attitude of the main unit 2 based on the detection results of these sensors. Therefore, the main unit 2 can calculate the movement and/or the attitude of the HMD accessory 230 to which the main unit 2 is attached, and perform a predetermined process in accordance with the movement and/or the attitude.
  • The predetermined process is, for example, a process of controlling a virtual camera for producing images to be displayed on the display 12 based on the movement and/or the attitude of the HMD accessory 230 , and more specifically, a process of changing the line-of-sight direction of the virtual camera depending on the attitude.
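  • As a rough sketch of such a process (a deliberately simplified one: only yaw and pitch are integrated from the angular velocity sensor 90 , and drift correction using the acceleration sensor 89 is omitted), the line-of-sight direction of the virtual camera could follow the attitude of the HMD accessory 230 as follows:

    import math

    class HmdCamera:
        def __init__(self):
            self.yaw = 0.0     # rotation about the vertical axis, in radians
            self.pitch = 0.0   # rotation about the horizontal axis, in radians

        def update(self, angular_velocity, dt: float):
            # angular_velocity = (wx, wy, wz) in rad/s from the angular velocity
            # sensor of the main unit 2; dt is the time since the previous sample.
            wx, wy, wz = angular_velocity
            self.yaw += wy * dt
            self.pitch += wx * dt

        def line_of_sight(self):
            # Unit vector of the virtual camera's viewing direction.
            cp = math.cos(self.pitch)
            return (cp * math.sin(self.yaw), math.sin(self.pitch), cp * math.cos(self.yaw))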
  • the HMD accessory 230 may include the sensors. Then, the HMD accessory 230 may transmit the detection results of the sensors (or information obtained by performing a predetermined process on the detection results) to the main unit 2 . The main unit 2 may calculate the movement and/or the attitude of the HMD accessory 230 based on the information transmitted from the HMD accessory 230 .
  • the controllers 3 and 4 may be used while they are removed from the main unit 2 . Therefore, even with the main unit 2 attached to the HMD accessory 230 , the controllers 3 and 4 can be used as controller devices. That is, a user can operate the controllers using the hands while the HMD accessory 230 with the main unit 2 attached thereto is mounted on the head of the user.
  • the information processing device 1 of the present embodiment can be used in a mode where it is used as an HMD.
  • Since the controllers can be removed from the main unit 2 , it is possible to reduce the weight of the device or devices to be mounted on the head of the user. A user can perform operations using the removed controllers.
  • the information processing device 1 includes the main unit 2 , the left controller (referred to also as the first controller device) 3 , and the right controller (referred to also as the second controller device) 4 . Since the information processing device 1 includes a plurality of devices, it can be referred to also as an information processing system.
  • the main unit includes a display (i.e., the display 12 ).
  • the left controller 3 is configured to be removably attached to the main unit 2
  • the right controller 4 is configured to be removably attached to the main unit 2 ( FIG. 2 ).
  • the left controller 3 transmits first operation data representing an operation performed on the left controller 3 to the main unit 2 , whether it is attached to the main unit 2 or not.
  • the right controller 4 transmits second operation data representing an operation performed on the right controller 4 to the main unit 2 , whether it is attached to the main unit 2 or not.
  • the main unit is configured to display, on the display, the execution result of an information process (step S 44 ) based on the first operation data transmitted from the left controller 3 and the second operation data transmitted from the right controller 4 (step S 45 ).
  • the main unit 2 is capable of displaying, on the display, images based on operations performed on the left controller 3 and the right controller 4 ( FIG. 14 ).
  • the main unit 2 is also capable of displaying, on the display, images based on operations performed on the left controller 3 and the right controller 4 when the left controller 3 and the right controller 4 are removed from the main unit 2 ( FIG. 15 ).
  • the information processing device 1 can be used both in the mode in which the controllers 3 and 4 are attached to the main unit 2 and in the mode in which the controllers 3 and 4 are removed from the main unit. Thus, since a plurality of modes of use are realized with a single information processing device 1 , the information processing device 1 can be used in a wider variety of manners.
  • the “image based on operations” may be images obtained by an information process that is performed based on an operation (e.g., images obtained by an operation performed on an application used for obtaining and viewing information from the Internet) or images produced by an information process that is performed based on an operation (e.g., game images produced in accordance with a game operation performed on a game application).
  • the main unit 2 may be used in the mode in which the left controller 3 and the right controller 4 are attached to the main unit 2 and in the mode in which the left controller 3 and the right controller 4 are removed from the main unit 2 , and it is not necessary that both of these modes be available under certain conditions.
  • only one of the two modes may be available in a predetermined application running on the main unit 2 . That is, the main unit 2 may run an application that is available only in the mode in which the left controller 3 and the right controller 4 are attached to the main unit 2 , and may run another application that is available only in the mode in which the left controller 3 and the right controller 4 are removed from the main unit 2 .
  • the main unit 2 may include an information processing unit (e.g., the CPU 81 ) instead of a display.
  • the left controller 3 includes a first input section (e.g., the analog stick 32 ) and a second input section (e.g., the operation buttons 33 to 36 ).
  • the right controller 4 includes a third input section (e.g., the analog stick 52 ) of the same type as the first input section and a fourth input section (e.g., the operation buttons 53 to 56 ) of the same type as the second input section.
  • an “input section” is any means that outputs information representing a user input and/or information with which it is possible to calculate (or estimate) a user input.
  • an input section may be a button, a directional input section such as an analog stick, a touch panel, a microphone, a camera, a sensor capable of calculating the movement of the controller, for example, (e.g., an acceleration sensor and an angular velocity sensor), and the like.
  • the left controller 3 and the right controller 4 include two sets of input sections of the same type. Therefore, using the input sections described above, a user can use two controllers in a similar manner. For example, if a single user uses a controller, it is convenient because the user can perform operations in a similar manner using either one of two controllers. Also conveniently, two users can each use one controller, for example.
  • “input sections being of the same type” is not limited to cases where the two input sections are the same input sections, but may also include cases where two input sections have functions and/or applications of the same type.
  • For example, where the first input section is an analog stick that can be tilted up, down, left and right, the third input section may be a slide stick that can be slid up, down, left and right or a cross-shaped key capable of making an input of up, down, left and right.
  • the input mechanism of the first input section and the input mechanism of the third input section are substantially the same.
  • the input mechanism of the second input section and the input mechanism of the fourth input section are substantially the same.
  • the two controllers will have two types of input sections that can be operated in a similar fashion. Therefore, a user can use two controllers in a similar fashion, thereby improving the controllability of the controllers.
  • the first input section has substantially the same shape as the third input section.
  • the second input section has substantially the same shape as the fourth input section.
  • the two controllers will have two types of input sections that can be operated in a similar fashion. Therefore, a user can use two controllers in a similar fashion, thereby improving the controllability of the controllers.
  • the positional relationship between the first input section and the second input section of the left controller 3 placed in a certain orientation is the same as the positional relationship between the third input section and the fourth input section of the right controller 4 placed in a certain orientation.
  • Specifically, in an orientation in which the left side surface of the left controller 3 is facing a user and the right side surface of the right controller 4 is facing another user, as shown in FIG. 16 and FIGS. 17A and 17B , the positional relationship between the first input section (i.e., the analog stick 32 ) and the second input section (i.e., the operation buttons 33 to 36 ) is the same as the positional relationship between the third input section (i.e., the analog stick 52 ) and the fourth input section (i.e., the operation buttons 53 to 56 ).
  • On the other hand, in the orientation in which the controllers 3 and 4 are attached to the main unit 2 , the positional relationship between the first input section (i.e., the analog stick 32 ) and the second input section (i.e., the operation buttons 33 to 36 ) is opposite from the positional relationship between the third input section (i.e., the analog stick 52 ) and the fourth input section (i.e., the operation buttons 53 to 56 ) (see FIG. 1 ).
  • Then, when the controllers 3 and 4 are attached to the main unit 2 , if a user holds the controllers 3 and 4 respectively with the left hand and the right hand, the user is allowed to easily operate different input sections with the left hand and with the right hand. Thus, it is possible to improve the controllability of the information processing device 1 with controllers attached thereto.
  • this configuration has a problem in that when users hold controllers removed from the main unit 2 , the direction of the rounded side surface of one controller will be opposite to that of the other controller, which makes it less easy for a user to understand the appropriate orientation in which to hold a controller.
  • Where sub-buttons (the second L button and/or the second R button in the embodiment described above) are provided, the sub-buttons will be provided on the rounded side surface (as in the present embodiment) for one controller, whereas they will be provided on the opposite side surface from the rounded side surface for the other controller. Then, the sub-buttons of one controller will be exposed even when the controller is attached to the main unit 2 , and the sub-buttons may possibly be operated in error in the attached state.
  • the first input section and the third input section are each a directional input section for receiving a directional input.
  • the directional input section may include an operation member (e.g., a stick member) that can be tilted or slid in a predetermined direction. Then, a user can make directional inputs by using the left controller 3 or the right controller 4 .
  • the second input section and the fourth input section are each a button that can be pressed.
  • a user can make button inputs by using the left controller 3 or the right controller 4 .
  • a controller is attached integrally to the main unit 2 with a predetermined surface of the housing of the controller (e.g., the right side surface for the left controller 3 ) facing a predetermined surface of the main unit 2 (see FIG. 2 ).
  • the left controller 3 is attached to the main unit 2 so as to be facing one of a left side surface and a right side surface of the main unit 2 (specifically, the left side surface).
  • the right controller 4 is attached to the main unit 2 so as to be facing the other one of the left side surface and the right side surface of the main unit 2 (i.e., the right side surface). Then, a user can operate the two controllers attached to the main unit 2 respectively with the left hand and the right hand, thereby providing the information processing device 1 with a good controllability.
  • the controller includes a light-emitting portion (e.g., the indicator LED 45 for the left controller 3 ) provided on the predetermined surface for notifying a user of predetermined information. Then, predetermined information (e.g., the status of the information processing device 1 ) can be indicated to the user using the controller.
  • the light-emitting portion may indicate the communication status between the main unit 2 and the controller.
  • the indicator LED indicates a number that is assigned to the controller as a result of communication (more specifically, the number represented by the number information described above).
  • the indicator LED may indicate the status of wireless communication between the information processing device 1 and the controller. Specifically, the indicator LED may indicate whether or not wireless communication is available at that point in time or may indicate whether or not pairing has been done.
  • the controller includes operation sections provided on the predetermined surface (e.g., the second L button 43 and the second R button 44 for the left controller 3 ).
  • the “operation section” for example means any input section operated by a user, such as buttons, a stick, etc.
  • more operation sections are available when the controller is removed from the main unit 2 than when the controller is attached to the main unit 2 . Therefore, a user is allowed to perform a variety of operations even when the controller is removed from the main unit 2 , and it is possible to improve the controllability of the controller.
  • the main unit 2 may include operation sections (e.g., buttons) provided on the engaged surface (i.e., the left side surface or the right side surface of the main unit 2 ) to which the left controller 3 or the right controller 4 is attached.
  • the operation section may be a button having a particular function, which may specifically be a power button, a home button or a sleep button.
  • the power button 28 may be provided on the left side surface or the right side surface of the main unit 2 . Then, the power button 28 cannot be operated when the left controller 3 or the right controller 4 is attached, thereby preventing an erroneous operation by a user.
  • the operation section may be provided on the rail member or on another portion of the housing other than the rail member.
  • an operation section having the same function as the operation section described above may be provided on the left controller 3 and/or the right controller 4 .
  • a power button having the same function as the power button 28 may be provided on the left controller 3 and/or the right controller 4 . Then, the function of the input section is available to the user even when the left controller 3 and the right controller 4 are in the attached state.
  • the main unit 2 may include a terminal for connecting other devices provided on the engaged surface (i.e., the left side surface or the right side surface of the main unit 2 ) to which the left controller 3 or the right controller 4 is attached.
  • the first slot 23 and/or the second slot 24 may be provided on the left side surface or the right side surface of the main unit 2 . Then, with the controller attached to the main unit 2 , the terminals can be protected by the controller.
  • Where a slot is provided on the left side surface or the right side surface of the main unit 2 , it is possible to prevent a device inserted in the slot (e.g., a card storage medium) from coming off of the main unit 2 when the controller is attached to the main unit 2 .
  • In the embodiment described above, connecting portions (i.e., the upper left portion and the lower left portion) between the first side surface (i.e., the left side surface) of the four side surfaces of the left controller 3 and side surfaces adjacent thereto (i.e., the upper side surface and the lower side surface) have a more rounded shape than connecting portions (i.e., the upper right portion and the lower right portion) between the second side surface (i.e., the right side surface) opposite from the first side surface and side surfaces adjacent thereto (see FIG. 5 ).
  • Similarly, connecting portions between the third side surface (i.e., the right side surface) of the four side surfaces of the right controller 4 and side surfaces adjacent thereto (i.e., the upper side surface and the lower side surface) have a more rounded shape than connecting portions (i.e., the upper left portion and the lower left portion) between the fourth side surface (i.e., the left side surface) opposite from the third side surface and side surfaces adjacent thereto (see FIG. 6 ).
  • the controller has a rounded shape on one side thereof, and it is therefore easy for a user to understand the orientation in which to hold the controller removed from the main unit 2 . This also makes it easier for a user to hold the controller removed from the main unit 2 .
  • the left controller 3 is attached to the main unit 2 with the second side surface of the left controller 3 facing the fifth side surface (i.e., the left side surface) of the four side surfaces of the main unit 2 (see FIG. 2 ).
  • the right controller 4 is attached to the main unit 2 with the fourth side surface of the right controller 4 facing the sixth side surface (i.e., the right side surface) opposite from the fifth side surface of the main unit 2 (see FIG. 2 ).
  • the information processing device 1 will have an overall shape such that the left side and the right side thereof are rounded, making it easier for a user to hold.
  • the left side of the left controller 3 is more rounded than the right side thereof, whereas the right side of the right controller 4 is more rounded than the left side thereof ( FIG. 5 and FIG. 6 ).
  • the left controller 3 has a different overall shape than that of the right controller 4 , it is possible to reduce the possibility that a user mistakes the left and right controllers for each other when attaching them to the main unit 2 .
  • the right controller 4 includes an input section having a first function that the left controller 3 does not have (in the embodiment described above, the plus button 57 , the home button 58 and the infrared image-capturing section 123 ).
  • the left controller 3 includes one or more input sections having a predetermined number of types of functions ( 13 in the embodiment described above, including nine buttons, the analog stick 32 , the acceleration sensor 104 and the angular velocity sensor 105 ).
  • the right controller 4 includes one or more input sections having a number (different from the predetermined number) of types of functions (15 in the embodiment described above, including 11 buttons, the analog stick 52 , the acceleration sensor 114 , the angular velocity sensor 115 and the infrared image-capturing section 123 ).
  • the “input section having the first function” may be an image-capturing device (e.g., the infrared image-capturing section 123 ) or a button.
  • the button may be a button having a particular function (e.g., a power button or a home button), for example.
  • the left controller 3 includes input sections having the second function different from the first function (e.g., the analog stick 32 and the buttons 33 to 38 in the embodiment described above), and the right controller 4 includes input sections having the second function (e.g., the analog stick 52 and the buttons 53 to 56 , 60 and 61 in the embodiment described above). Then, a user can use the second function on either controller, thereby improving the controllability, whereas the first function is omitted for one of the controllers, thereby simplifying the configuration of the controller.
  • communication between the main unit 2 and a controller when the controller is attached to the main unit 2 uses a first communication scheme (specifically, wired communication), and communication between the main unit 2 and a controller when the controller is removed from the main unit 2 uses a second communication scheme (specifically, wireless communication) different from the first communication scheme. Then, by changing the communication scheme between when the controller is attached to the main unit 2 and when the controller is removed from the main unit 2 , the controller can communicate in either case.
  • the controller can easily communicate with the main unit 2 in wired communication when it is attached to the main unit 2 , and the controller can communicate with the main unit 2 in wireless communication when it is removed from the main unit 2 .
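  • A sketch of this switching is shown below; the transport callbacks send_wired and send_wireless are assumptions standing in for the terminal-based and wireless channels:

    class ControllerLink:
        def __init__(self, send_wired, send_wireless):
            self.attached = False
            self._send_wired = send_wired
            self._send_wireless = send_wireless

        def on_attached(self):
            # The controller's terminal is now in contact with the main unit's terminal.
            self.attached = True

        def on_detached(self):
            self.attached = False

        def send(self, operation_data: bytes):
            # First communication scheme (wired) when attached to the main unit 2,
            # second communication scheme (wireless) when removed from it.
            if self.attached:
                self._send_wired(operation_data)
            else:
                self._send_wireless(operation_data)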
  • the wired communication between the main unit and the controller is a communication connection via a wired communication channel formed by an electrical connection between the first terminal of the main unit 2 (specifically, the left-side terminal 17 or the right-side terminal 21 ) and the second terminal of the controller (specifically, the terminal 42 or 64 ).
  • The wired communication as used herein means not only communication via a cable connection between devices, but also communication via a connection between a terminal (e.g., a connector) of one device and a terminal (e.g., a connector) of the other device.
  • When the controller is attached to the main unit 2 , the first terminal of the main unit 2 (i.e., the left-side terminal 17 or the right-side terminal 21 ) and the second terminal of the controller (i.e., the terminal 42 or 64 ) are electrically connected to each other by being in contact with each other. Then, when the controller is attached to the main unit 2 , wired communication and/or power supply are implemented via the terminals connected together.
  • That is, when the controller is attached to the main unit 2 , the first terminal of the main unit 2 and the second terminal of the controller are electrically connected together, and the communication between the main unit 2 and the controller and the power supply from the main unit 2 to the controller are implemented via the first terminal and the second terminal. Then, it is possible to increase the opportunity to charge the controller and thus to reduce the possibility of the controller running out of battery.
  • Thus, the controllers can be charged without the user being aware of it. Therefore, the user does not need to perform a separate operation for charging the controller, thus saving the trouble for the user.
  • the information processing device 1 includes a first sensing section (e.g., the CPU 81 executing step S 3 ) for sensing the attachment of the left controller 3 to the main unit 2 , and a second sensing section (e.g., the CPU 81 executing step S 3 ) for sensing the attachment of the right controller 4 to the main unit 2 .
  • the main unit 2 registers a pair of a left controller and a right controller based on the sensing results from the first sensing section and the second sensing section (see step S 4 ). Therefore, a user can register a pair through a straightforward, easy operation of attaching two controllers, to be used as a pair, to the main unit 2 .
  • the main unit 2 registers the left controller 3 and the right controller 4 attached thereto as a pair (see step S 3 , S 4 ).
  • a user can register a pair of controllers through a straightforward operation.
  • The first sensing section and the second sensing section may or may not simultaneously sense the attachment of the controllers to the main unit 2 . That is, two controllers that are not attached to the main unit 2 at the same time may be registered as a pair.
  • the main unit 2 may execute a predetermined information process based on operation data received from two pairs of controllers. For example, the main unit 2 receives operation data from each of a left controller and a right controller that are registered as a first pair, and receives operation data from each of a left controller and a right controller that are registered as a second pair. The main unit 2 may execute a predetermined information process using, as a set of data, operation data received from controllers that are registered as a first pair, and using, as another set of data, operation data received from controllers that are registered as a second pair. Then, the information processing device can use operation data from a plurality of controllers while distinguishing between operation data from different registered pairs.
  • the main unit 2 executes a predetermined information process using the two pieces of operation data received as a set of data (see FIG. 17B ).
  • a user can perform operations using controllers registered as a pair. For example, a user can control one object by using a pair of controllers.
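  • The pair handling described above could look roughly like the sketch below; the class and method names are illustrative assumptions, not details of the embodiment:

    class PairRegistry:
        def __init__(self):
            self._pairs = []   # list of (left_controller_id, right_controller_id)

        def register_pair(self, left_id, right_id):
            # Called when the attachment of a left controller and a right
            # controller has been sensed (cf. steps S 3, S 4).
            self._pairs.append((left_id, right_id))

        def group_by_pair(self, operation_data_by_controller):
            # operation_data_by_controller: {controller_id: operation_data}.
            # Returns {pair_index: (left_data, right_data)} so that operation data
            # from controllers registered as a pair can be used as one set of data.
            grouped = {}
            for index, (left_id, right_id) in enumerate(self._pairs):
                grouped[index] = (operation_data_by_controller.get(left_id),
                                  operation_data_by_controller.get(right_id))
            return grouped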
  • the main unit 2 includes the housing 11 provided with a first engagement portion (which can also be referred to as the attachment and detachment mechanism; specifically, the left rail member 15 ) to be engaged with the housing (specifically, the slider 40 provided on the housing 31 ) of the left controller 3 , and a second engagement portion (specifically, the right rail member 19 ) to be engaged with the housing (specifically, the slider 62 provided on the housing 51 ) of the right controller 4 .
  • the first engagement portion and the second engagement portion are to be engaged with controller housings (including members provided on the housings), and are not to be engaged with connectors of the controllers. That is, in the present embodiment, the main unit 2 is configured so that controllers are configured to be removably attached to the main unit 2 by the method of engaging the engagement portions with the controllers, which is different from the method of connecting the connectors of the main unit 2 with those of the controllers (the engagement method and the method of connecting the connectors may be both used at the same time). Then, the main unit 2 and the controllers can be firmly connected together.
  • the main unit 2 may include only one engagement portion to be engaged with the housing of the controller or may include three or more engagement portions.
  • the left controller 3 includes the housing 31 provided with a third engagement portion (specifically, the slider 40 ) to be engaged with the first engagement portion of the main unit 2 .
  • the right controller 4 includes the housing 51 provided with a fourth engagement portion (specifically, the slider 62 ) to be engaged with the second engagement portion of the main unit 2 .
  • Then, members to be engaged with the engagement portions on the main unit 2 side are provided also on the controller side, and it is therefore possible to more firmly connect the main unit 2 and the controllers together.
  • the main unit 2 selectively outputs the execution result of the information process to either the display (the display 12 ) or a display device (the TV 6 ) separate from the main unit 2 (see FIG. 24 ).
  • This enables two different modes of use, including a mode in which images are displayed on the display of the main unit 2 , and another mode in which images are displayed on a display device separate from the main unit 2 .
  • the engagement portion of the main unit 2 is a rail member (referred to also as the first slide member) provided on a surface of the housing 11 of the main unit 2 .
  • the controller includes a slider (referred to also as the second slide member) which slidably and detachably engages with the rail member.
  • the rail member and the slider together form a slide mechanism (see FIG. 7 ).
  • the slide mechanism allows for firm locking between the main unit 2 and the controllers and allows for easy attachment and detachment of the controllers.
  • the rail member is formed so that the slider can slidably engage with the rail member in a predetermined direction (specifically, the y-axis direction shown in FIG. 1 ), and so that the slider can be inserted and detached into and from the rail member via one end thereof in the predetermined direction (see FIG. 2 ).
  • the rail member is provided so as to extend in the up-down direction of the main unit 2 so that the slider can be inserted and detached into and from the rail member via the upper end thereof (see FIG. 2 ).
  • controllers can be conveniently attached and detached to and from the main unit 2 while the main unit 2 is placed upright.
  • controllers can be attached and detached to and from the main unit 2 while the main unit 2 is attached to the cradle 5 .
  • the slide member of the main unit 2 is provided so as to extend generally over the entirety (e.g., so that the length of the slide member is 80% or more of the length of the housing 11 of the main unit 2 ) of a surface of the housing 11 of the main unit 2 in a predetermined direction (specifically, the up-down direction) (see FIG. 3 ).
  • Then, when a controller is attached to the main unit 2 , the controller is connected generally over the entirety of the aforementioned surface of the main unit 2 , thereby allowing the controller to be firmly connected to the main unit 2 .
  • Note that the first slide member of the main unit 2 (i.e., the rail member) may have a T-shaped cross section, and the second slide member of the controller (i.e., the slider) may have a C-shaped cross section.
  • the controller includes a terminal (e.g., the terminal 42 or 64 ) for communication with the main unit 2 .
  • the main unit 2 includes a terminal (i.e., the left-side terminal 17 or the right-side terminal 21 ) provided on the housing 11 at such a location that allows the terminal to be connected to a terminal of a controller when the controller is attached to the main unit 2 .
  • Then, when a controller is attached to the main unit 2 , the terminals are connected together, enabling wired communication.
  • the information processing device 1 can be said to be a game system capable of executing game applications.
  • When the left controller 3 and the right controller 4 are removed from the main unit 2 (in other words, when the two controllers are both removed from the main unit 2 ), the left controller 3 and the right controller 4 each transmit operation data representing operations performed on the left controller 3 and the right controller 4 to the main unit 2 via wireless communication.
  • the information processing device 1 can be said to be a hand-held information processing device including a main section (i.e., the main unit 2 ) having a display (i.e., the display 12 ), a first controller section (i.e., the left controller 3 ) and a second controller section (i.e., the right controller 4 ) for performing a predetermined information process in response to an operation performed on either the first controller section or the second controller section.
  • the main unit 2 performs a predetermined information process based on operation data representing an operation performed on either the first controller section or the second controller section, and displays the results of the information process on the display.
  • the first controller section and the second controller section are arranged so that when the first controller section is attached to the main section, a user (i.e., the user holding the information processing device) can operate the first controller section with one hand and operate the second controller section with the other hand (see FIG. 14 ).
  • the information processing system of the present embodiment includes the main unit 2 , and controllers (specifically, the left controller 3 and the right controller 4 ; referred to also as controller devices) that are configured to be removably attached to the main unit 2 .
  • the main unit 2 includes the display 12 (referred to also as the display).
  • the main unit 2 can display, on the display 12 , images that are obtained based on operations performed on the controllers.
  • When the controllers are removed from the main unit 2 , the main unit 2 can display, on an external display device (specifically, the TV 6 ) separate from the main unit 2 , images that are obtained based on operations performed on the controllers.
  • the information processing device 1 can be used both in a mode in which controllers are attached to the main unit 2 and in a mode in which the controllers are removed from the main unit. Thus, since a plurality of modes of use are realized with a single information processing device 1 , the information processing device 1 can be used in a wider variety of manners.
  • an external display device can be used as the display device.
  • a user can provide, as the external display device, a display device having a larger screen size than the display 12 , so that it is possible to display the images on a larger screen.
  • the information processing system can communicate with the TV 6 , and further includes the cradle 5 (referred to also as an add-on device) to and from which the main unit 2 can be attached and detached.
  • the main unit 2 detects the attachment of the main unit 2 to the cradle 5 , and determines whether the images obtained based on operations performed on the controllers are displayed on the display 12 or displayed on the TV 6 based at least on the detection results (step S 21 , S 25 , S 29 ).
  • the main unit 2 can determine the image display output based on whether or not the main unit 2 is attached to the cradle 5 .
  • the main unit 2 may select the TV 6 as the image display output at least on the condition that the main unit 2 is attached to the cradle 5 .
  • the main unit 2 may select the TV 6 as the image display output when the main unit 2 is attached to the cradle 5 . That is, the main unit 2 may output the images to the TV 6 via the cradle 5 in response to the attachment of the main unit 2 to the cradle 5 .
  • the main unit 2 may be capable of communicating directly with the TV 6 .
  • the main unit 2 and the TV 6 may be capable of communicating with each other in wireless communication.
  • the main unit 2 may determine the image display output based on the attached state of the main unit 2 to the cradle 5 .
  • the add-on device (e.g., the cradle) may be any add-on device to and from which the main unit 2 can be attached and detached.
  • the add-on device may or may not have the function of charging the main unit 2 , as in the present embodiment.
  • the cradle 5 is enabled to communicate with the main unit 2 at least on the condition that the main unit 2 is attached thereto. If the main unit 2 has determined to display the images based on operations performed on the controllers on the TV 6, the main unit 2 outputs the images to the TV 6 via the cradle 5. Thus, the main unit 2 can display the images on the TV 6 by outputting the images to the TV 6 via the cradle 5. Therefore, the main unit 2 does not need to communicate directly with the TV 6, thereby simplifying the configuration of the main unit 2.
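By way of illustration only, the display-output decision described in the preceding paragraphs can be sketched as follows. The identifiers (MainUnitState, select_display_output, and so on) are assumptions made for this sketch and are not names used in this disclosure; the sketch simply assumes that attachment to the cradle 5 can be sensed and that the cradle 5 reports whether a TV 6 is connected.

```python
from dataclasses import dataclass
from enum import Enum, auto


class DisplayOutput(Enum):
    BUILT_IN_DISPLAY = auto()   # the display 12 on the main unit 2
    EXTERNAL_TV = auto()        # the TV 6, reached through the cradle 5


@dataclass
class MainUnitState:
    attached_to_cradle: bool      # sensed attachment of the main unit 2 to the cradle 5
    tv_connected_to_cradle: bool  # whether a TV 6 is connected to the cradle 5


def select_display_output(state: MainUnitState) -> DisplayOutput:
    """Choose where images are displayed, based at least on cradle attachment."""
    if state.attached_to_cradle and state.tv_connected_to_cradle:
        # Output the images to the TV 6 via the cradle 5.
        return DisplayOutput.EXTERNAL_TV
    # Otherwise keep displaying on the main unit's own display 12.
    return DisplayOutput.BUILT_IN_DISPLAY


print(select_display_output(MainUnitState(True, True)))    # DisplayOutput.EXTERNAL_TV
print(select_display_output(MainUnitState(False, True)))   # DisplayOutput.BUILT_IN_DISPLAY
```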
  • the information processing system of the embodiment described above includes the main unit 2 , the left controller 3 (referred to also as the first controller device), the right controller 4 (referred to also as the second controller device), and an accessory (e.g., the extension grip 210 or the attachment 220 ).
  • the left controller 3 is configured to be removably attached to the main unit 2 or the accessory.
  • the right controller 4 is configured to be removably attached to the main unit 2 or the accessory.
  • the main unit 2 includes the display 12 (referred to also as the display), and displays, on the display 12 , the execution result of a predetermined information process based on operations performed on the left controller 3 and the right controller 4 .
  • the left controller 3 and the right controller 4 can be attached to the accessory at the same time (see FIG. 31 ).
  • a user can hold the two controllers 3 and 4 , removed from the main unit 2 , as an integral unit. That is, it is possible to improve the controllability of the controllers when removed from the main unit 2 .
  • the left controller 3 is attached to the accessory on the left side of the center of the accessory
  • the right controller 4 is attached to the accessory on the right side of the center of the accessory (see FIG. 31 , FIG. 32 ).
  • a user can operate the left controller 3 attached to the accessory with the left hand, and operate the right controller 4 attached to the accessory with the right hand. That is, a user can operate the controllers in a similar fashion to that when the controllers are not attached to the accessory, thereby providing an accessory having good controllability.
  • the accessory includes a first grip portion (i.e., the left grip portion 212) provided on the left side and a second grip portion (i.e., the right grip portion 213) provided on the right side. Then, a user can operate the controllers while holding the grip portions respectively with the left hand and the right hand, thereby providing an accessory having good controllability.
  • the first grip portion is provided on the left side of the area where the left controller 3 is attached.
  • the second grip portion is provided on the right side of the area where the right controller 4 is attached (see FIG. 31 ). Therefore, by holding the grip portions, a user can easily operate the controllers 3 and 4 attached to the accessory.
  • an accessory to which a first controller device and a second controller device are removably attachable, the accessory including:
  • a first slide member configured for engaging with a slide member of the first controller device
  • a second slide member configured for engaging with a slide member of the second controller device
  • the first slide member may include a first engagement hole configured for engaging with a projection of the first controller device.
  • the second slide member may include a second engagement hole configured for engaging with a projection of the second controller device.
  • the first slide member may be configured to allow the first controller device to be attached to the accessory on the first side of a center of the accessory with respect to the predetermined direction.
  • the second slide member may be configured to allow the second controller device to be attached to the accessory on the second side of the center of the accessory with respect to the predetermined direction.
  • the first grip portion may be on the first side, with respect to the predetermined direction, of an area where the first controller device is attached to the accessory.
  • the second grip portion may be on the second side, with respect to the predetermined direction, of an area where the second controller device is attached to the accessory.
  • the accessory may further include:
  • a charging unit configured to use power supplied to the accessory through the power supply terminal to charge the first controller device through the first connection terminal and charge the second controller device through the second connection terminal.
  • an accessory to which at least one controller device is removably attachable, the accessory including:
  • a slide member configured for engaging with a slide member of the controller device
  • a connection terminal at a position allowing the connection terminal to be connected to a terminal of the controller device attached to the accessory;
  • a charging unit configured to use power supplied to the accessory through the power supply terminal to charge the controller device through the connection terminal.
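The charging behavior described for the accessory can be illustrated, under the same caveat, with a minimal sketch: charging through a connection terminal is enabled only while power is supplied to the accessory through its power supply terminal and a controller device is attached. All names below are hypothetical and chosen only for this sketch.

```python
from dataclasses import dataclass


@dataclass
class ControllerPort:
    controller_attached: bool  # a controller device's terminal is connected to this port
    charging: bool = False


def update_charging(external_power_present: bool, ports: list[ControllerPort]) -> None:
    """Enable charging on a port only while power reaches the accessory through its
    power supply terminal and a controller device is attached to that port."""
    for port in ports:
        port.charging = external_power_present and port.controller_attached


ports = [ControllerPort(controller_attached=True), ControllerPort(controller_attached=False)]
update_charging(external_power_present=True, ports=ports)
print([port.charging for port in ports])  # [True, False]
```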
  • a game controller which is removably attachable to a main unit of a game device, the game controller including:
  • a slide member configured for slidably and detachably engaging with a rail member of the main unit
  • the projection being configured to engage with the rail member of the main unit when the game controller is attached to the main unit, thereby locking the game controller to the main unit.
  • the game controller may further include a release mechanism configured to retract the projection into the slide member in response to an operation by a user, thereby releasing the locking of the game controller to the main unit.
  • the release mechanism may include a user-actuable button.
  • the projection may be configured to be biased from an inside toward an outside of the slide member.
  • the slide member may be configured for insertion into the rail member of the main unit downwardly from an upper end of the rail member.
  • the game controller may further include a terminal below the projection on the slide member, the terminal being configured for electrically connecting to a terminal of the main unit.
  • a game controller which is removably attachable to a main unit of a game device, the game controller including:
  • a slide member configured for slidably and removably engaging with a rail member of the main unit
  • a first operation section and/or a light-emitting portion on a surface of the slide member, the surface facing the main unit when the game controller is attached to the main unit.
  • the first operation section may be at a position recessed relative to the surface.
  • the first operation section is configured so as not to protrude past the surface.
  • a plurality of the first operation sections may be on the surface.
  • the first operation section may include a first button on a first side of a center of the slide member with respect to a slide direction and a second button on a second side of the center of the slide member with respect to the slide direction.
  • a plurality of light-emitting portions may be on the surface.
  • the game controller may further include:
  • a housing having a primary surface and a side surface, the slide member being on the side surface;
  • a game controller including:
  • a housing having a shape elongated in a predetermined direction, the housing having a primary surface, at least one first side surface extending in the predetermined direction, and at least one second side surface different from the at least one first side surface;
  • a first operation section and a second operation section on one of the at least one first side surface, wherein the first operation section is on one side of a center of the first side surface with respect to the predetermined direction and the second operation section is on the other side of the center of the first side surface with respect to the predetermined direction;
  • the housing may have a shape elongated in an up-down direction.
  • the first operation section and the second operation section may be on a right side surface of the housing, the first operation section being on an upper side of a center of the right side surface with respect to the up-down direction and the second operation section being on a lower side of the center of the right side surface with respect to the up-down direction.
  • the third operation section and the fourth operation section may be on a corner portion between a left side surface and an upper side surface of the housing.
  • the housing may have a shape elongated in an up-down direction.
  • the first operation section and the second operation section may be on a left side surface of the housing, the first operation section being on an upper side of a center of the left side surface with respect to the up-down direction and the second operation section being on a lower side of the center of the left side surface with respect to the up-down direction.
  • the third operation section and the fourth operation section may be on a corner portion between a right side surface and an upper side surface of the housing.
  • the housing may have a rounded shape in the corner portion on which the third operation section and the fourth operation section are provided.
  • the third operation section and the fourth operation section may each have a rounded shape in conformity with the rounded shape of the housing.
  • the fourth operation section may be on a rear side of the third operation section with respect to a front-rear direction defined perpendicular to the primary surface of the housing.
  • the fourth operation section may be configured so as to project past a reverse surface of the housing.
  • a game controller including:
  • a housing having a shape elongated in a predetermined direction
  • a directional input section on a primary surface of the housing
  • a first operation section on a first side surface of the housing at such a position allowing operation by an index finger of a user, when the housing is held in a portrait position in one hand of the user so as to allow operation of the directional input section by a thumb;
  • a second operation section and a third operation section on a second side surface of the housing at such positions allowing operation of the second operation section and the third operation section respectively by a left and a right index finger of the user, when the housing is held in a landscape position by both hands of the user so as to allow operation of the directional input section by a thumb.
  • a game controller capable of wireless communication, the game controller including:
  • a housing having a shape elongated in a predetermined direction
  • the second operation section, the third operation section and the pairing button may be configured so as not to protrude past the side surface.
  • a game controller comprising:
  • the rail along a side surface of the housing, the rail having a T-shaped cross-section including a stem extending perpendicular to the side surface and a crossbar extending perpendicular to the stem;
  • the one or more game control buttons may be on a surface of the crossbar.
  • the game controller may further comprise one or more lights on the rail.
  • the one or more lights may be on a surface of the crossbar.
  • the one or more game control buttons may include a first button and a second button.
  • the one or more lights may be between the first button and the second button.
  • the game controller may further comprise an electrical connector on the rail.
  • an information processing apparatus comprising:
  • processing circuitry configured to generate images for the display
  • a first channel formed in and extending in a lengthwise direction along a first side surface of the housing
  • a second channel formed in and extending in a lengthwise direction along a second side surface of the housing opposite to the first side of the housing;
  • a game controller comprising:
  • the rail on a side surface of the housing, the rail having a T-shaped cross-section including a stem extending perpendicular to the side surface and a crossbar extending perpendicular to the stem;
  • the electrical connector may be on a surface of the crossbar.
  • a game controller comprising:
  • the rail located on a side surface of the housing, the rail having a T-shaped cross-section including a stem extending perpendicular to the side surface and a crossbar extending perpendicular to the stem;
  • the retractable projection may be on a surface of the crossbar.
  • the embodiment described above is applicable to, for example, an information processing device such as a game device and an information processing system such as a game system, with the aim of providing an information processing device that can be used in different modes.


Abstract

An example information processing system includes a main unit, a first controller device and a second controller device. The main unit includes a display. The first controller device is configured to be removably attached to the main unit, and the first controller device transmits first operation data representing an operation performed on the first controller device to the main unit, irrespective of whether the first controller device is attached to the main unit. The second controller device is configured to be removably attached to the main unit, and the second controller device transmits second operation data representing an operation performed on the second controller device to the main unit, irrespective of whether the second controller device is attached to the main unit. The main unit displays, on the display, an execution result of an information process based on the first and the second operation data.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 16/529,532, filed Aug. 1, 2019, which is a continuation of U.S. application Ser. No. 15/179,022, filed Jun. 10, 2016, now U.S. Pat. No. 10,583,356, which claims priority of Japanese Patent Application No. 2015-119707, filed Jun. 12, 2015. The entire contents of each of these applications are hereby incorporated by reference in this application.
  • FIELD
  • The present technique relates to an information processing system, an information processing device, a controller device and an accessory.
  • BACKGROUND AND SUMMARY
  • There are conventional portable information processing devices including a display section and an operation section.
  • Desirably, an information processing device can be used in different modes.
  • Thus, the present application discloses an information processing device that can be used in different modes. Also, the present application discloses a novel information processing device.
  • An example information processing system described herein includes a main unit, a first controller device and a second controller device. The main unit includes a display. The first controller device is configured to be removably attached to the main unit, and the first controller device is configured to transmit first operation data representing an operation performed on the first controller device to the main unit, irrespective of whether the first controller device is attached to the main unit. The second controller device is configured to be removably attached to the main unit, and the second controller device is configured to transmit second operation data representing an operation performed on the second controller device to the main unit, irrespective of whether the second controller device is attached to the main unit. The main unit is configured to display, on the display, an execution result of an information process based on the first operation data transmitted from the first controller device and the second operation data transmitted from the second controller device.
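As a non-authoritative sketch of the data flow just described, the following example assumes two controller devices that each report operation data regardless of attachment, and a main unit that runs one information process on both sets of data and presents the result on its display. The class and field names are illustrative only and are not taken from this disclosure.

```python
from dataclasses import dataclass


@dataclass
class OperationData:
    buttons: frozenset[str]                   # buttons currently pressed on the controller
    stick: tuple[float, float] = (0.0, 0.0)   # directional input, if the controller has one


class MainUnit:
    def process(self, first: OperationData, second: OperationData) -> str:
        # Stand-in for the "predetermined information process" (e.g. one game step).
        return f"left buttons={sorted(first.buttons)}, right buttons={sorted(second.buttons)}"

    def display(self, frame: str) -> None:
        # Stand-in for presenting the execution result on the built-in display.
        print(frame)


main_unit = MainUnit()
left_data = OperationData(buttons=frozenset({"L", "ZL"}))   # from the first controller device
right_data = OperationData(buttons=frozenset({"A"}))        # from the second controller device
main_unit.display(main_unit.process(left_data, right_data))
```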
  • The first controller device may include a first input section and a second input section. The second controller device may include a third input section of the same type as the first input section and a fourth input section of the same type as the second input section.
  • The first input section may have substantially the same input mechanism as the third input section. The second input section may have substantially the same input mechanism as the fourth input section.
  • The first input section may have substantially the same shape as the third input section. The second input section may have substantially the same shape as the fourth input section.
  • When the first controller device and the second controller device are detached from the main unit, a positional relationship between the first input section and the second input section of the first controller device placed in a predetermined orientation may be the same as a positional relationship between the third input section and the fourth input section of the second controller device placed in a predetermined orientation.
  • When the first controller device and the second controller device are attached to the main unit, a positional relationship between the first input section and the second input section may be opposite from a positional relationship between the third input section and the fourth input section.
  • The first input section and the third input section may each be a directional input section configured to accept a directional input.
  • The directional input section may include an operation member configured to be tilted or slid in a predetermined direction.
  • The second input section and the fourth input section may each be a button which can be pressed.
  • The first controller device may be configured to be integrally attached to the main unit with a predetermined surface of a housing of the first controller device facing a predetermined surface of the main unit.
  • The first controller device may be configured to be integrally attached to the main unit so as to be facing one of a left side surface and a right side surface of the main unit. The second controller device may be configured to be attached to the main unit so as to be facing the other one of the left side surface and the right side surface of the main unit.
  • The first controller device may include a light-emitting portion on the predetermined surface for notifying a user of predetermined information.
  • The first controller device may include an operation section on the predetermined surface.
  • A connecting portion between a first side surface of four side surfaces of the first controller device and a side surface adjacent to the first side surface may have a more rounded shape than a connecting portion between a second side surface of the four side surfaces, which is opposite from the first side surface, and a side surface adjacent to the second side surface. A connecting portion between a third side surface of four side surfaces of the second controller device and a side surface adjacent to the third side surface may have a more rounded shape than a connecting portion between a fourth side surface of the four side surfaces, which is opposite from the third side surface, and a side surface adjacent to the fourth side surface.
  • The first controller device may be attached to the main unit with the second side surface of the first controller device facing a fifth side surface of four side surfaces of the main unit. The second controller device may be attached to the main unit with the fourth side surface of the second controller device facing a sixth side surface, which is opposite from the fifth side surface, of the main unit.
  • The second controller device may include an input section having a first function, the first controller device not having the first function.
  • The second controller device may include an image-capturing device as an input section having the first function.
  • The second controller device may include a button as an input section having the first function.
  • The first controller device may include an input section having a second function different from the first function. The second controller device may include an input section having the second function.
  • The first controller device may include one or more input sections having a predetermined number of types of functions. The second controller device may include one or more input sections having a number of types of functions, the number being different from the predetermined number.
  • Communication between the main unit and the first controller device when the first controller device is attached to the main unit may use a first communication scheme, and communication between the main unit and the first controller device when the first controller device is detached from the main unit may use a second communication scheme different from the first communication scheme.
  • Communication between the main unit and the first controller device when the first controller device is detached from the main unit may be wireless communication.
  • Communication between the main unit and the first controller device when the first controller device is attached to the main unit may be wired communication.
  • The wired communication between the main unit and the first controller device is communication through a wired communication channel formed by an electrical connection between a first terminal of the main unit and a second terminal of the first controller device.
  • When the first controller device is attached to the main unit, a first terminal of the main unit and a second terminal of the first controller device may be electrically connected to each other, and communication between the main unit and the first controller device and power supply from the main unit to the first controller device may be implemented through the first terminal and the second terminal.
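A minimal sketch of this attachment-dependent behavior, assuming only that attachment can be sensed, is shown below; the enum and function names are assumptions made for illustration and are not interfaces defined in this disclosure.

```python
from enum import Enum, auto


class CommScheme(Enum):
    WIRED_TERMINAL = auto()  # over the first and second terminals while they are in contact
    WIRELESS = auto()        # e.g. a short-range radio link while detached


def comm_scheme(controller_attached: bool) -> CommScheme:
    """First communication scheme when attached, second scheme when detached."""
    return CommScheme.WIRED_TERMINAL if controller_attached else CommScheme.WIRELESS


def power_supplied_over_terminals(controller_attached: bool) -> bool:
    # Power can be supplied from the main unit only while the terminals are connected.
    return controller_attached


print(comm_scheme(True), power_supplied_over_terminals(True))     # wired, powered
print(comm_scheme(False), power_supplied_over_terminals(False))   # wireless, not powered
```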
  • The information processing system may include: a first sensor configured to sense attachment of the first controller device to the main unit; and a second sensor configured to sense attachment of the second controller device to the main unit. The main unit may register, as a set, the first controller device and the second controller device based on a sensing result from the first sensor and a sensing result from the second sensor.
  • When a first controller device and a second controller device are both attached to the main unit, the main unit may register, as a set, the first controller device and the second controller device attached thereto.
  • If operation data is received from each of a first controller device and a second controller device registered as a set, with the first controller device and the second controller device detached from the main unit, the main unit may execute a predetermined information process using the two received operation data as a set.
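The set-registration idea can be sketched as follows, again with hypothetical identifiers: when both sensors report an attached controller device, the pair is recorded as a set, and operation data later received from either detached member can be routed to the same information process.

```python
class PairRegistry:
    """Records two controller devices as a set while both are sensed as attached."""

    def __init__(self) -> None:
        self._sets: list[tuple[str, str]] = []

    def on_sensors(self, first_id: str | None, second_id: str | None) -> None:
        """Called with controller IDs reported by the first and second sensors (None if absent)."""
        if first_id is not None and second_id is not None:
            pair = (first_id, second_id)
            if pair not in self._sets:
                self._sets.append(pair)

    def set_for(self, controller_id: str) -> tuple[str, str] | None:
        """Find the registered set that a detached controller's operation data belongs to."""
        return next((s for s in self._sets if controller_id in s), None)


registry = PairRegistry()
registry.on_sensors("L-0001", "R-0002")   # both attached at the same time -> registered as a set
print(registry.set_for("R-0002"))         # ('L-0001', 'R-0002')
```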
  • The main unit may include a housing including a first engagement portion configured for engagement with a housing of the first controller device when the first controller device is attached to the main unit, and a second engagement portion configured for engagement with a housing of the second controller device when the second controller device is attached to the main unit.
  • The first controller device may include a housing including a third engagement portion configured for engagement with the first engagement portion of the main unit. The second controller device may include a housing including a fourth engagement portion configured for engagement with the second engagement portion of the main unit.
  • The main unit may selectively output an execution result of the information process either to the display or to a display device separate from the main unit.
  • Another example information processing device described herein includes a first attachment and detachment mechanism, a second attachment and detachment mechanism and a display. The first attachment and detachment mechanism is configured to allow a first controller device to be removably attached thereto. The second attachment and detachment mechanism is configured to allow a second controller device to be removably attached thereto. The information processing device displays, on the display, a result of an information process based on operation data transmitted from the first controller device and the second controller device, irrespective of whether the first controller device and the second controller device are attached to the information processing device.
  • Another example information processing system described herein includes a main unit and a first controller device.
  • The main unit includes: a display; and a housing including a main unit-side engagement portion configured for engagement with a housing of the first controller device. The first controller device is configured to be removably attached to the main unit by means of the main unit-side engagement portion.
  • The first controller device may include a housing including a controller device-side engagement portion configured for engagement with the main unit-side engagement portion.
  • The main unit-side engagement portion may include a first slide member provided on a surface of the housing of the main unit. The first controller device may include a second slide member configured for slidable and detachable engagement with the first slide member. The first slide member and the second slide member may together comprise a slide mechanism.
  • The first slide member may be configured for slidable engagement with the second slide member in a predetermined direction, and the first slide member may be configured to allow the second slide member to be inserted into and detached from the first slide member via one end of the first slide member in the predetermined direction.
  • The first slide member may extend in an up-down direction of the main unit and may be configured to allow the second slide member to be inserted into and detached from the first slide member via an upper end of the first slide member.
  • The first slide member may be configured to extend generally over an entirety of a surface of the housing of the main unit in a predetermined direction.
  • The first slide member may have a C-shaped cross section. The second slide member may have a T-shaped cross section.
  • The first controller device may include a terminal for communicating with the main unit. The main unit may include a terminal on the housing thereof at such a location to allow the terminal to be connected to the terminal of the first controller device when the first controller device is attached to the main unit.
  • The first controller device may include an input section on the predetermined surface.
  • Another example information processing system described herein includes a main unit and a controller device configured to be removably attached to the main unit. The main unit includes a display. Communication between the main unit and the controller device when the controller device is attached to the main unit may use a first communication scheme. Communication between the main unit and the controller device when the controller device is detached from the main unit may use a second communication scheme different from the first communication scheme.
  • The first communication scheme may be wired communication.
  • When the controller device is attached to the main unit, the first terminal of the main unit and the second terminal of the controller device may be electrically connected to each other via contact.
  • The second communication scheme may be wireless communication.
  • Another example information processing system described herein includes a main unit, a first controller device, a second controller device and an accessory. The first controller device is configured to be removably attached to the main unit and to be removably attached to the accessory. The second controller device is configured to be removably attached to the main unit and to be removably attached to the accessory. The main unit includes a display and displays, on the display, an execution result of an information process based on an operation performed on the first controller device and the second controller device. The accessory is configured to allow the first controller device and the second controller device to be attached thereto at the same time.
  • The accessory may include a charging control section configured to use power supplied to the accessory to charge one or both of the first controller device and the second controller device attached to the accessory.
  • The first controller device may be configured to be attached to the accessory on a left side of a center of the accessory. The second controller device may be configured to be attached to the accessory on a right side of the center of the accessory.
  • The accessory may include a first grip portion on a left side of a center of the accessory and a second grip portion on a right side of the center of the accessory.
  • The first grip portion may be on a left side of an area where the first controller device is attached. The second grip portion may be on a right side of an area where the second controller device is attached.
  • When the first controller device is attached to the accessory, communication between the main unit and the first controller device may be wireless communication. When the second controller device is attached to the accessory, communication between the main unit and the second controller device may be wireless communication.
  • The first controller device may be configured to be attached to the accessory with a tilt in a first direction from a reference attitude. The second controller device may be configured to be attached to the accessory with a tilt in a direction opposite to the first direction from a reference attitude.
  • When at least one of the first controller device and the second controller device is attached to the accessory, the main unit may execute the information process while modifying operation data obtained from the at least one controller device attached to the accessory and/or modifying information obtained based on the operation data.
  • The main unit may be configured to modify, of all the operation data, data from a directional input section.
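One plausible reading of this modification, sketched below purely for illustration, is that the main unit rotates the reported directional input by the known mounting tilt so that the direction the user intends is preserved; the tilt angle and function name here are assumptions, not values or interfaces from this disclosure.

```python
import math


def rotate_stick(x: float, y: float, angle_degrees: float) -> tuple[float, float]:
    """Rotate a directional input (x, y) counter-clockwise by angle_degrees."""
    theta = math.radians(angle_degrees)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return (x * cos_t - y * sin_t, x * sin_t + y * cos_t)


# Suppose a controller sits in the accessory at a 20-degree tilt, so a straight "up"
# push on the stick is reported as roughly (0.342, 0.940). Rotating the reported
# vector by +20 degrees recovers approximately (0.0, 1.0), i.e. plain "up".
print(rotate_stick(0.342, 0.940, angle_degrees=20.0))
```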
  • An example accessory described herein is an accessory to which a first controller device and a second controller device are removably attachable.
  • The first controller device is configured to be removably attached to a main unit separate from the accessory. The second controller device is configured to be removably attached to the main unit. The main unit includes a display and is configured to display, on the display, an execution result of an information process based on an operation performed on the first controller device and the second controller device. The accessory includes: a first engagement portion configured for detachable engagement with the first controller device; and a second engagement portion configured for detachable engagement with the second controller device.
  • An example game system described herein includes a main unit, a first controller device and a second controller device. The main unit includes a display. The first controller device is configured to be removably attached to the main unit. The second controller device is configured to be removably attached to the main unit. At least when the first controller device and the second controller device are detached from the main unit, each of the first controller device and the second controller device transmits, to the main unit, via wireless communication, operation data representing an operation performed thereon.
  • An example information processing device described herein is a hand-held information processing device. The information processing device includes a main section including a display, a first controller section and a second controller section, and the information processing device is configured to perform an information process in response to an operation performed on either the first controller section or the second controller section. The first controller section is configured to be removably attached to the main section. The second controller section is configured to be removably attached to the main section. When the first controller section and the second controller section are detached from the main section, the main section is configured to perform the information process based on operation data representing an operation performed on at least one of the first controller section and the second controller section, and to display a result of the information process on the display.
  • When the first controller section and the second controller section are attached to the main section, the first controller section can be operated by one hand of a user and the second controller section can be operated by the other hand of the user.
  • The present specification also discloses an example of an information processing device, a controller device or an accessory of the information processing system. The present specification also discloses an example of a method to be carried out in the information processing system. The present specification also discloses an example of a computer-readable storage medium storing therein an information processing program which causes a computer to execute some of the processes to be executed on the information processing system or the information processing device (in other words, causes a computer to function as some of the various units of the information processing system or the information processing device).
  • With the information processing system, the information processing device and the accessory set forth above, an information processing device and/or an information processing system can be used in different modes.
  • These and other objects, features, aspects and advantages will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram showing an example information processing device according to the present embodiment;
  • FIG. 2 is a diagram showing an example where controllers are detached from a main unit;
  • FIG. 3 is a six-sided view showing an example main unit;
  • FIG. 4 is a diagram showing an example where the main unit is placed upright;
  • FIG. 5 is a six-sided view showing an example left controller;
  • FIG. 6 is a six-sided view showing an example right controller;
  • FIG. 7 is a diagram showing an example of how a left rail member and a slider engage with each other;
  • FIGS. 8A and 8B are diagrams showing an example of how the slider is locked to the left rail member;
  • FIG. 9 is a diagram showing an overall configuration of an example information processing system according to the present embodiment;
  • FIG. 10 is a diagram showing an external configuration of an example cradle;
  • FIG. 11 is a block diagram showing an example internal configuration of the main unit;
  • FIG. 12 is a block diagram showing an example internal configuration of the information processing device;
  • FIG. 13 is a block diagram showing an example internal configuration of the cradle;
  • FIG. 14 is a diagram showing an example of how the information processing device is used with the controllers attached to the main unit;
  • FIG. 15 is a diagram showing an example of a single user holding two controllers to use the information processing device in a detached state;
  • FIG. 16 is a diagram showing an example of two users each holding one controller to use the information processing device in a detached state;
  • FIGS. 17A and 17B are diagrams showing example modes of use where three or more controllers are used;
  • FIG. 18 is a diagram showing an example mode of use where the image is displayed on a TV;
  • FIGS. 19A, 19B, 19C, and 19D are diagrams showing an example operation flow where the image is displayed on a TV;
  • FIG. 20 is a flow chart showing an example flow of a registration process executed on the main unit;
  • FIG. 21 is a diagram showing an example of registration information;
  • FIG. 22 is a diagram showing an example of pairing information;
  • FIG. 23 is a flow chart showing an example flow of a wireless setting process executed on the main unit;
  • FIG. 24 is a flow chart showing an example flow of a mode setting process executed on the main unit;
  • FIG. 25 is a flow chart showing an example flow of an information process executed on the main unit;
  • FIG. 26 is a flow chart showing an example flow of an information process executed on the main unit;
  • FIG. 27 is a flow chart showing an example flow of an information process executed on the main unit;
  • FIG. 28 is a diagram showing another example left controller;
  • FIG. 29 is a diagram showing another example left controller;
  • FIG. 30 is a diagram showing an example information processing device with a right controller different from that of FIG. 1 attached thereto;
  • FIG. 31 is a diagram showing an example accessory to which controllers can be attached;
  • FIG. 32 is a diagram showing another example accessory; and
  • FIG. 33 is a diagram showing an example accessory to which the main unit can be attached.
  • DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS
  • An information processing system, an information processing device, a controller device and an accessory according to an example of the present embodiment will now be described. In the present embodiment, the information processing system includes an information processing device 1 and a cradle 5 (see FIG. 9). The information processing device 1 of the present embodiment includes a main unit 2 and controllers 3 and 4, which can be attached to and detached from each other, and the controllers 3 and 4 can be used separately from the main unit 2 (see FIG. 2). The information processing device 1 can be used both in a mode of use in which the image is displayed on the main unit 2 and in another mode of use in which the image is displayed on a separate display device such as a TV. The information processing device 1 is used as a portable device (e.g., a portable game device) in the former mode, and the information processing device 1 is used as a console-type device (e.g., a console-type game device) in the latter mode.
  • [1. External Configuration of System]
  • [1-1. Configuration of Information Processing Device]
  • FIG. 1 is a diagram showing an example information processing device 1 according to the present embodiment. As shown in FIG. 1, the information processing device 1 includes a main unit 2, a left controller 3 and a right controller 4. The main unit 2, including a display 12, executes various processes of the information processing device 1. The controllers 3 and 4 each include an operation section allowing a user to provide an input.
  • FIG. 2 is a diagram showing an example where the controllers 3 and 4 are detached from the main unit 2. As shown in FIG. 1 and FIG. 2, the controllers 3 and 4 can be attached to and detached from the main unit 2. The left controller 3 can be attached to the left side of the main unit 2 (the x-axis positive direction side shown in FIG. 1). The right controller 4 can be attached to the right side of the main unit 2 (the x-axis negative direction side shown in FIG. 1). Note that the left controller and the right controller may be referred to generally as “controllers”. A specific example configuration of the main unit 2 and the controllers 3 and 4 will now be described.
  • [1-1-1. Configuration of Main Unit]
  • FIG. 3 is a six-sided view showing an example main unit. As shown in FIG. 3, the main unit 2 includes a generally plate-shaped housing 11. In the present embodiment, the primary surface (in other words, the front-side surface, i.e., the surface on which the display 12 is provided) of the housing 11 has a generally rectangular shape. In the present embodiment, the housing 11 has a horizontally-elongated shape. That is, in the present embodiment, the longitudinal direction of the primary surface of the housing 11 (i.e., the x-axis direction shown in FIG. 1) is denoted as the horizontal direction (also referred to as the left-right direction), the width direction of the primary surface (i.e., the y-axis direction shown in FIG. 1) is denoted as the vertical direction (also referred to as the up-down direction), and the direction perpendicular to the primary surface (i.e., the z-axis direction shown in FIG. 1) is denoted as the depth direction (also referred to as the front-rear direction). Note that the main unit 2 may be used in a landscape position or may be used in a portrait position.
  • Note that there is no particular limitation on the shape and the size of the housing 11. For example, in other embodiments, the housing 11 may include a projection or a grip portion for making it easier for a user to hold the device.
  • (Elements Provided on Primary Surface of Housing 11)
  • As shown in FIG. 3, the main unit 2 includes the display 12 provided on the primary surface of the housing 11. The display 12 displays an image (which may be a still image or a video image) obtained or produced by the main unit 2. While the display 12 is assumed to be a liquid crystal display device (LCD) in the present embodiment, it may be any type of a display device.
  • The main unit 2 includes a touch panel 13 on the screen of the display 12. In the present embodiment, the touch panel 13 is of a type (e.g., the capacitive type) that enables a multi-touch input. Note however that there is no particular limitation on the type of the touch panel 13, and the touch panel 13 may be of a type (e.g., the resistive type) that enables a single-touch input, for example.
  • The main unit 2 includes a speaker (i.e., a speaker 88 shown in FIG. 11) inside the housing 11. As shown in FIG. 3, speaker holes 11 a and 11 b are formed in the primary surface of the housing 11. Output sounds from the speaker 88 are output through these speaker holes 11 a and 11 b. In the present embodiment, the main unit 2 includes two speakers, and speaker holes are located respectively for the left speaker and the right speaker. The speaker hole 11 a for the left speaker is formed in a left portion of the display 12. The speaker hole 11 b for the right speaker is formed in a right portion of the display 12.
  • The main unit 2 also includes an ambient light sensor (i.e., an ambient light sensor 94 shown in FIG. 11) inside the housing 11. As shown in FIG. 3, a window portion 14 is provided in the primary surface of the housing 11 so as to allow light from outside the housing 11 to be received by the ambient light sensor 94. The window portion 14 is provided for example as a transparent member that allows light to pass therethrough, or a filter member that allows light of a predetermined wavelength that can be sensed by the ambient light sensor 94 to pass therethrough.
  • Note that there is no particular limitation on the position, the shape and the number of the speaker holes 11 a and 11 b and the window portion 14. For example, in other embodiments, the speaker holes 11 a and 11 b may be provided on the side surface or the back surface of the housing 11. While the window portion 14 is provided on the lower left side of the display 12 in the present embodiment, it may be provided in any other position on the primary surface of the housing 11 or may be provided on the side surface of the housing 11.
  • (Elements Provided on Left Side Surface of Housing 11)
  • As shown in FIG. 3, the main unit 2 includes a left rail member 15 on the left side surface of the housing 11. The left rail member 15 is a member that allows the left controller 3 to be detachably attached to the main unit 2. The left rail member 15 is provided so as to extend in the up-down direction on the left side surface of the housing 11. The left rail member 15 has such a shape that it can engage with a slider of the left controller 3 (i.e., a slider 40 shown in FIG. 5). The left rail member 15 and the slider 40 together form a slide mechanism, the details of which will be described later. This slide mechanism allows the left controller 3 to be slidably and detachably attached to the main unit 2.
  • In the present embodiment, the left rail member 15 has a shape with a groove. In other words, the cross section (specifically, the cross section perpendicular to the up-down direction) of the left rail member 15 is C-shaped. More specifically, the cross section of the left rail member 15 is such that the end portions of the cross section extend in the outside-to-center direction. Therefore, the slider 40 in engagement with the left rail member 15 is securely locked so as not to come off in the direction perpendicular to the sliding direction (in other words, the direction in which the left rail member 15 extends) (see FIG. 7 to be discussed below).
  • As shown in FIG. 3, the left rail member 15 is provided with an engagement hole 16. The engagement hole 16 is located so as to face a projection 41 provided on the slider 40 when the left controller 3 is attached to the main unit 2. There is no particular limitation on the specific position of the engagement hole 16. In the present embodiment, the engagement hole 16 is provided on the bottom surface of the left rail member 15 (in other words, the bottom surface of the groove of the left rail member 15). The engagement hole 16 is shaped so that the projection (i.e., the projection 41 shown in FIG. 5) can engage with the engagement hole 16. When the left controller 3 is attached to the main unit 2, the projection 41 is inserted into and engages with the engagement hole 16, thereby locking the left controller 3 to the main unit 2, the details of which will be described later. Note that in other embodiments, the left rail member 15 may be provided with a projection and the slider 40 may be provided with an engagement hole.
  • The main unit 2 includes a left-side terminal 17. The left-side terminal 17 allows the main unit 2 to communicate with the left controller 3 in wired communication. The left-side terminal 17 is located so as to be in contact with the terminal of the left controller 3 (a terminal 42 shown in FIG. 5) when the left controller 3 is attached to the main unit 2. There is no particular limitation on the specific position of the left-side terminal 17. In the present embodiment, as shown in FIG. 3, the left-side terminal 17 is provided on the bottom surface of the left rail member 15. In the present embodiment, the left-side terminal 17 is provided near the lower end on the bottom surface of the left rail member 15. The left-side terminal 17 is provided below the engagement hole 16 (in other words, on the far side with respect to the direction in which the slider 40 is inserted into the left rail member 15).
  • A stopper 18 is provided on the left side surface of the housing 11. As shown in FIG. 3, the stopper 18 is provided near the end (in the present embodiment, near the lower end) of the left rail member 15. The stopper 18 is provided inside the groove of the left rail member 15. The stopper 18 is provided in order to limit the slide of the slider 40 in engagement with the left rail member 15, the details of which will be described later.
  • (Elements Provided on Right Side Surface of Housing 11)
  • As shown in FIG. 3, similar elements to those provided on the left side surface of the housing 11 are provided on the right side surface of the housing 11. That is, the main unit 2 includes a right rail member 19 on the right side surface of the housing 11. The right rail member 19 is provided so as to extend in the up-down direction on the right side surface of the housing 11. The right rail member 19 has such a shape that it can engage with a slider of the right controller 4 (i.e., a slider 62 shown in FIG. 6). The right rail member 19 and the slider 62 together form a slide mechanism, the details of which will be described later. This slide mechanism allows the right controller 4 to be slidably and detachably attached to the main unit 2.
  • In the present embodiment, the right rail member 19 has a similar shape to that of the left rail member 15. That is, the right rail member 19 has a shape with a groove whose cross-sectional shape is similar to that of the left rail member 15. Note however that the right rail member 19 does not need to have exactly the same shape as that of the left rail member 15. For example, in other embodiments, the groove of the left rail member 15 and the groove of the right rail member 19 may differ from each other in terms of the size and/or the shape so that the slider 62 of the right controller 4 cannot engage with the left rail member 15 (and/or so that the slider 40 of the left controller 3 cannot engage with the right rail member 19).
  • As shown in FIG. 3, the right rail member 19 is provided with an engagement hole 20. The engagement hole 20 is located so as to face a projection 63 provided on the slider 62 when the right controller 4 is attached to the main unit 2. There is no particular limitation on the specific position of the engagement hole 20. In the present embodiment, the engagement hole 20 is provided on the bottom surface of the right rail member 19 (in other words, the bottom surface of the groove of the right rail member 19). The engagement hole 20 is shaped so that the projection (i.e., the projection 63 shown in FIG. 6) can engage with the engagement hole 20. When the right controller 4 is attached to the main unit 2, the projection 63 is inserted into and engages with the engagement hole 20, thereby locking the right controller 4 to the main unit 2, the details of which will be described later. Note that in other embodiments, the right rail member 19 may be provided with a projection and the slider 62 with an engagement hole.
  • The main unit 2 includes a right-side terminal 21. The right-side terminal 21 allows the main unit 2 to communicate with the right controller 4 in wired communication. The right-side terminal 21 is located so as to be in contact with the terminal of the right controller 4 (a terminal 64 shown in FIG. 6) when the right controller 4 is attached to the main unit 2. There is no particular limitation on the specific position of the right-side terminal 21. In the present embodiment, as shown in FIG. 3, the right-side terminal 21 is provided on the bottom surface of the right rail member 19. In the present embodiment, the right-side terminal 21 is provided near the lower end on the bottom surface of the right rail member 19. The right-side terminal 21 is provided below the engagement hole 20 (in other words, on the far side with respect to the direction in which the slider 62 is inserted into the right rail member 19).
  • A stopper 22 is provided on the right side surface of the housing 11. As shown in FIG. 3, the stopper 22 is provided near the end (in the present embodiment, near the lower end) of the right rail member 19. The stopper 22 is provided inside the groove of the right rail member 19. The stopper 22 is provided in order to limit the slide of the slider 62 in engagement with the right rail member 19, the details of which will be described later.
  • As described above, in the present embodiment, the housing 11 of the main unit 2 is provided with the left rail member 15 and the right rail member 19. Thus, the housing 11 is configured on the assumption that controllers are attached thereto. Note that there is no particular limitation on the position, the shape and the size of the rail members 15 and 19. For example, in other embodiments, the rail members 15 and 19 may be provided on the left and right end portions, respectively, on the primary surface and/or the reverse surface of the housing 11. There is no particular limitation on the mechanism for allowing the controllers 3 and 4 to be detachably attached to the main unit 2, and a slider mechanism different from that of the present embodiment may be used, or a mechanism different from a slider mechanism may be used.
  • (Elements Provided on Upper Side Surface of Housing 11)
  • As shown in FIG. 3, the main unit 2 includes a first slot 23. The first slot 23 is provided on the upper side surface of the housing 11. The first slot 23 is shaped so as to accommodate a storage medium of a first type. Note that in the present embodiment, a cover that can be opened/closed is provided for the opening of the first slot 23, and a storage medium of the first type can be inserted into the first slot 23 with the cover being open. A storage medium of the first type is, for example, a dedicated storage medium (e.g., a dedicated memory card) for the information processing device 1 or other information processing devices of the same type. The storage medium of the first type is used, for example, for storing data used in the main unit 2 (e.g., application save data, etc.) and/or for storing programs to be executed on the main unit 2 (e.g., application programs, etc.).
  • The main unit 2 also includes a power button 28. As shown in FIG. 3, the power button 28 is provided on the upper side surface of the housing 11. The power button 28 is a button for turning ON/OFF the power of the main unit 2. Note that in the present embodiment, the power button 28 can be used to switch between the ON mode and the sleep mode. The ON mode is a mode in which the screen display of the display 12 is turned on, for example, and the sleep mode is a mode in which the screen display of the display 12 is turned off, for example. In the sleep mode, in addition to (or instead of) turning off the screen display of the display 12, a predetermined process of the application (e.g., a game process of a game application) may be stopped. When a long-press operation is performed on the power button 28 (specifically, when the power button 28 is held down for a predetermined period of time or longer), the main unit 2 executes a process of turning ON/OFF the power of the main unit 2. On the other hand, when a short-press operation is performed on the power button 28 (specifically, when the power button 28 is held down for a period of time that is shorter than the predetermined period of time), the main unit 2 executes a process of switching between the ON mode and the sleep mode.
  • As described above, in the present embodiment, the power button 28 can be used to turn the power ON/OFF and to switch between the ON mode and the sleep mode. Note that in other embodiments, the main unit 2 may be provided with a button only for the function of turning the power ON/OFF or only for the function of switching between the ON mode and the sleep mode.
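The long-press/short-press behavior can be sketched as follows; the threshold duration and function names are assumptions made for illustration and are not values given in this disclosure.

```python
LONG_PRESS_SECONDS = 3.0  # hypothetical threshold; the actual duration is not specified here


def on_power_button_released(held_seconds: float, powered_on: bool, sleeping: bool) -> tuple[bool, bool]:
    """Return the new (powered_on, sleeping) state after the power button is released."""
    if held_seconds >= LONG_PRESS_SECONDS:
        return (not powered_on, False)   # long press: turn the power ON/OFF
    if powered_on:
        return (True, not sleeping)      # short press: switch between ON mode and sleep mode
    return (False, False)                # short press while powered off: nothing to switch


print(on_power_button_released(0.2, powered_on=True, sleeping=False))   # (True, True)  -> sleep mode
print(on_power_button_released(4.0, powered_on=True, sleeping=True))    # (False, False) -> power off
```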
  • The main unit 2 includes a sound input/output terminal (specifically, an earphone jack) 25. That is, the main unit 2 allows a microphone or an earphone to be attached to the sound input/output terminal 25. As shown in FIG. 3, the sound input/output terminal 25 is provided on the upper side surface of the housing 11.
  • The main unit 2 includes sound volume buttons 26 a and 26 b. As shown in FIG. 3, the sound volume buttons 26 a and 26 b are provided on the upper side surface of the housing 11. The sound volume buttons 26 a and 26 b are buttons for giving instructions to adjust the volume of the sound output from the main unit 2. That is, the sound volume button 26 a is a button for giving an instruction to lower the sound volume, and the sound volume button 26 b is a button for giving an instruction to raise the sound volume.
  • The housing 11 is provided with an air outlet hole 11 c. As shown in FIG. 3, the air outlet hole 11 c is provided on the upper side surface of the housing 11. The air outlet hole 11 c is provided so as to radiate (in other words, discharge) the heat generated inside the housing 11 to the outside of the housing 11.
  • (Elements Provided on Lower Side Surface of Housing 11)
  • The main unit 2 includes a lower terminal 27. The lower terminal 27 is a terminal for allowing the main unit 2 to communicate with the cradle 5 to be described later. As shown in FIG. 3, the lower terminal 27 is provided on the lower side surface of the housing 11. The lower terminal 27 is connected to a terminal of the cradle 5 (a main body terminal 73 shown in FIG. 10) when the main unit 2 is attached to the cradle 5, the details of which will be described later. In the present embodiment, the lower terminal 27 is a USB connector (more specifically, a female-side connector).
  • The main unit 2 also includes a second slot 24. In the present embodiment, the second slot 24 is provided on the lower side surface of the housing 11. Note however that in other embodiments, the second slot 24 may be provided on the same surface as the first slot 23. The second slot 24 is shaped so as to accommodate a storage medium of a second type, which is different from the first type. Note that in the present embodiment, a cover that can be opened/closed is provided for the opening of the second slot 24, and a storage medium of the second type can be inserted into the second slot 24 with the cover being open. A storage medium of the second type may be, for example, a general-purpose storage medium, e.g., an SD card. As is the storage medium of the first type, the storage medium of the second type is used for storing data used in the main unit 2 (e.g., application save data, etc.) and/or for storing programs to be executed on the main unit 2 (e.g., application programs, etc.).
  • The housing 11 is provided with an air inlet hole 11 d. As shown in FIG. 3, the air inlet hole 11 d is provided on the lower side surface of the housing 11. The air inlet hole 11 d is provided so as to take in (in other words, introduce) the air from the outside of the housing 11 to the inside of the housing 11. In the present embodiment, the air inlet hole 11 d is provided on the surface opposite from the surface where the air outlet hole 11 c is provided, thereby allowing for efficient discharge of the heat from inside the housing 11.
  • The main unit 2 also includes a stand member 29 used when placing the housing upright. As shown in FIG. 3, the stand member 29 is provided on the lower side surface of the housing 11. The stand member 29 is rotatably connected to the housing 11 via a pivot 29 a. In FIG. 3, the stand member 29 is accommodated in the housing 11.
  • FIG. 4 is a diagram showing an example in which the main unit 2 is placed upright. Note that in order to facilitate understanding of elements of interest to be discussed in conjunction with the figure, some of the other elements of the main unit 2 are not shown in FIG. 4. The rod-shaped portion of the stand member 29 protrudes from the housing 11 after being rotated about the pivot 29 a. Thus, the stand member 29 is brought into a position protruding from the housing 11, allowing the main unit 2 to be placed upright as shown in FIG. 4. Note that the mechanism for placing the main unit 2 upright is not limited to the stand member 29 shown in FIG. 3, but may be any other mechanism.
  • There is no particular limitation on the shape, the number and the arrangement of the various elements (specifically, the buttons, the slots, the terminals, etc.) provided on the housing 11 described above. For example, in other embodiments, one or more of the power button 28 and the slots 23 and 24 may be provided on another side surface or the back surface of the housing 11. In other embodiments, some of the elements described above may be absent on the main unit 2.
  • [1-1-2. Configuration of Left Controller]
  • FIG. 5 is a six-sided view showing an example of the left controller 3. As shown in FIG. 5, the left controller 3 includes a generally plate-shaped housing 31. In the present embodiment, the primary surface (in other words, the front-side surface, i.e., the surface on the z-axis negative direction side shown in FIG. 1) of the housing 31 has a generally rectangular shape. In the present embodiment, the housing 31 has a vertically-elongated shape, i.e., a shape that is elongated in the up-down direction (i.e., the y-axis direction shown in FIG. 1). Note that when detached from the main unit 2, the left controller 3 may be held in a portrait position (see FIG. 18) or may be held in a landscape position (see FIG. 16). Note that there is no particular limitation on the shape of the housing 31, and the housing 31 does not need to be generally plate-shaped in other embodiments. The housing 31 does not need to have a rectangular shape, but may have a semi-circular shape, or the like, for example. The housing 31 does not need to have a vertically-elongated shape.
  • The length of the housing 31 in the up-down direction is generally equal to the length of the housing 11 of the main unit 2 in the up-down direction. The thickness of the housing 31 (i.e., the length thereof in the front-rear direction; in other words, the length thereof in the z-axis direction shown in FIG. 1) is generally equal to the thickness of the housing 11 of the main unit 2. Therefore, when the left controller 3 is attached to the main unit 2 (see FIG. 1), a user can hold the main unit 2 and the left controller 3 as if they were an integral unit.
  • As shown in FIG. 5, the left-side corner portion of the primary surface of the housing 31 has a more rounded shape than the right-side corner portion thereof. That is, the connecting portion between the upper side surface and the left side surface of the housing 31 and the connecting portion between the lower side surface and the left side surface of the housing 31 are more rounded (in other words, round-cornered with a greater radius) than the connecting portion between the upper side surface and the right side surface and the connecting portion between the lower side surface and the right side surface. Therefore, when the left controller 3 is attached to the main unit 2 (see FIG. 1), the left side of the information processing device 1 will have a rounded shape, making it easier for a user to hold the device.
  • The left controller 3 includes an analog stick 32. As shown in FIG. 5, the analog stick 32 is provided on the primary surface of the housing 31. The analog stick 32 is an example of a directional input section allowing a user to input a direction. The analog stick 32 includes a stick member that can be tilted in any direction (i.e., 360° directions including the upper, lower, left, right and diagonal directions) parallel to the primary surface of the housing 31. A user can tilt the stick member to make a direction input based on the tilt direction (and a magnitude input based on the tilt angle). Note that the directional input section may also be a cross-shaped key, a slide stick, or the like. A slide stick is an input section including a stick member that can be slid in any direction parallel to the primary surface of the housing 31, and a user can slide the stick member to make an input based on the slide direction (and a magnitude input based on the slide amount). In the present embodiment, a user can also make an input by pressing down the stick member (in a direction vertical to the housing 31). That is, the analog stick 32 is an input section that allows a user to make a direction input and a magnitude input based on the tilt direction and the tilt amount, respectively, of the stick member, and also to make a push input by pressing down the stick member.
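  • The following C sketch illustrates the kind of input such a directional input section provides: a direction derived from the tilt direction, a magnitude derived from the tilt amount, and a separate push input. The raw axis range, the deadzone value and the function name are assumptions for illustration only.

    /* Minimal sketch of reading a tiltable, pressable stick member. */
    #include <math.h>
    #include <stdbool.h>

    struct stick_input {
        float direction_rad; /* tilt direction (0 = right, counter-clockwise) */
        float magnitude;     /* tilt amount, 0.0 (neutral) to 1.0 (full tilt) */
        bool  pressed;       /* push input (stick pressed toward the housing) */
    };

    struct stick_input read_stick(float raw_x, float raw_y, bool raw_click)
    {
        struct stick_input in;
        float len = sqrtf(raw_x * raw_x + raw_y * raw_y);

        in.magnitude     = (len > 1.0f) ? 1.0f : len;                    /* clamp */
        in.direction_rad = (len > 0.05f) ? atan2f(raw_y, raw_x) : 0.0f;  /* deadzone */
        in.pressed       = raw_click;
        return in;
    }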
  • The left controller 3 includes four operation buttons 33 to 36 (specifically, a right direction button 33, a lower direction button 34, an upper direction button 35 and a left direction button 36). As shown in FIG. 5, these four operation buttons 33 to 36 are provided below the analog stick 32 on the primary surface of the housing 31. Note that while four operation buttons are provided on the primary surface of the left controller 3 in the present embodiment, there is no particular limitation on the number of operation buttons. These operation buttons 33 to 36 are used to give instructions in accordance with various programs executed on the main unit 2 (e.g., the OS program and application programs). Note that in the present embodiment, the operation buttons 33 to 36 can be used to make directional inputs, and the operation buttons 33 to 36 are therefore referred to as the right direction button 33, the lower direction button 34, the upper direction button 35 and the left direction button 36. Note however that the operation buttons 33 to 36 may be used to give instructions other than directional inputs.
  • The left controller 3 also includes a record button 37. As shown in FIG. 5, the record button 37 is provided on the primary surface of the housing 31, more specifically, in a lower right area of the primary surface. The record button 37 is a button for giving an instruction to save the image displayed on the display 12 of the main unit 2. For example, when a game image is displayed on the display 12, a user can press the record button 37 to save the game image that is displayed at the point in time when the button is pressed in a storage section of the main unit 2, for example.
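  • A minimal C sketch of this record-button behavior, assuming hypothetical capture and save helpers, could look as follows; the helper names and the buffer size are illustrative assumptions, not details of the embodiment.

    /* Hedged sketch: save the currently displayed image when the record button is pressed. */
    #include <stddef.h>
    #include <stdint.h>

    /* assumed helpers, named for illustration only */
    extern size_t capture_display_image(uint8_t *buf, size_t max_len);
    extern int    save_to_storage_section(const uint8_t *buf, size_t len);

    int on_record_button_pressed(void)
    {
        /* buffer sized for an assumed display resolution (illustrative only) */
        static uint8_t image_buf[1280 * 720 * 4];

        size_t len = capture_display_image(image_buf, sizeof image_buf);
        return save_to_storage_section(image_buf, len);
    }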
  • The left controller 3 also includes a minus (−) button 47. As shown in FIG. 5, the minus button 47 is provided on the primary surface of the housing 31, more specifically, in an upper right area of the primary surface. The minus button 47 is used to give instructions in accordance with various programs executed on the main unit 2 (e.g., the OS program and application programs). The minus button 47 is used, for example, as a select button (e.g., a button used to move the selection through different selection items) in game applications.
  • When the left controller 3 is attached to the main unit 2, the operation sections provided on the primary surface of the left controller 3 (specifically, the analog stick 32 and the buttons 33 to 37 and 47) are operated with the thumb of the left hand, for example, of a user holding the information processing device 1 (see FIG. 14). When the left controller 3 is used detached from the main unit 2, the operation sections are operated with the left and right thumbs, for example, of a user holding the left controller 3 (see FIG. 15). Specifically, in such a case, the analog stick 32 is operated by the thumb of the left hand of the user, and the operation buttons 33 to 36 are operated with the thumb of the right hand of the user.
  • The left controller 3 includes a first L button 38. The left controller 3 also includes a ZL button 39. As are the operation buttons 33 to 36, these operation buttons 38 and 39 are used to give instructions in accordance with various programs executed on the main unit 2. As shown in FIG. 5, the first L button 38 is provided over a corner portion between the left side surface and the upper side surface of the housing 31. The ZL button 39 is provided to extend over a corner portion between the left side surface and the upper side surface of the housing 31 (strictly speaking, between the left side surface and the upper side surface as seen from the front side of the housing 31) while extending into the reverse surface of the housing 31. That is, the ZL button 39 is provided on the rear side (the z-axis positive direction side shown in FIG. 1) of the first L button 38. In the present embodiment, since the upper left corner portion of the housing 31 has a rounded shape, the first L button 38 and the ZL button 39 each have a rounded shape in conformity with the rounded shape of the upper left corner portion of the housing 31.
  • When the left controller 3 is attached to the main unit 2, the first L button 38 and the ZL button 39 will be placed over the upper left portion of the information processing device 1 (see FIG. 1). Therefore, a user holding the information processing device 1 is allowed to operate the first L button 38 and the ZL button 39 with the index finger or the middle finger of the left hand (see FIG. 14).
  • As shown in FIG. 5, a portion of the reverse surface of the housing 31 where the ZL button 39 is provided (more specifically, at least a portion of the perimeter of the ZL button 39) projects past other portions of the housing 31. The ZL button 39 is provided so as to project past the other portions of the housing 31 on the reverse surface. Therefore, when the main unit 2 with the left controller 3 attached thereto is placed on a flat surface in such an orientation that the reverse surface of the left controller 3 faces the horizontal flat surface, the projecting portions of the housing 31 are in contact with the flat surface. As a result, the information processing device 1 is placed so that the upper side of the main unit 2 is slightly raised from the lower side thereof. When the information processing device 1 is so placed, it is easy for a user to see the display 12.
  • Note that in other embodiments, when the main unit 2 with the left controller 3 attached thereto is placed on a flat surface in such an orientation that the reverse surface of the left controller 3 faces the horizontal flat surface, the ZL button 39 may be in contact with the flat surface. Now, in the present embodiment, the ZL button 39 can be pressed down primarily in the up-down direction (the y-axis direction). That is, the ZL button 39 is supported on the housing 31 so as to move primarily in the up-down direction. Therefore, even if the information processing device 1 is placed so that the ZL button 39 is in contact with the flat surface as described above, the ZL button 39 is unlikely to be pressed down because the ZL button 39 primarily receives a force in the front-rear direction (the z-axis direction). That is, even when the information processing device 1 is placed as described above, the ZL button 39 is unlikely to be pressed down inadvertently.
  • In other embodiments, the ZL button 39 may be provided so as not to project from the reverse surface of the housing 31. For example, the ZL button 39 may be provided on the side surface of the housing 31. For example, an area of the reverse surface of the housing 31 where the ZL button 39 is provided may be sunken from the remaining portion (i.e., the housing is formed to be thinner in this area) so that the ZL button 39 does not project past the remaining portion of the reverse surface.
  • The left controller 3 includes the slider 40 described above. As shown in FIG. 5, the slider 40 is provided so as to extend in the up-down direction on the right side surface of the housing 31. The slider 40 has such a shape that it can engage with the left rail member 15 (more specifically, the groove of the left rail member 15) of the main unit 2. Specifically, the cross section (specifically, the cross section perpendicular to the up-down direction) of the slider 40 is T-shaped. More specifically, the cross section of the slider 40 is T-shaped in conformity with the cross-sectional shape of the left rail member 15 (see FIG. 7). Therefore, the slider 40 in engagement with the left rail member 15 is locked so as not to come off in the direction perpendicular to the sliding direction (in other words, the direction in which the left rail member 15 extends) (see FIG. 7 to be discussed below).
  • As shown in FIG. 5, the slider 40 is provided with the projection 41. The projection 41 is arranged at such a position that allows the projection 41 to be inserted into the engagement hole 16 when the left controller 3 is attached to the main unit 2. There is no particular limitation on the specific position of the projection 41. In the present embodiment, the projection 41 is provided on the engaging surface of the slider 40. Note that the engaging surface of the slider 40 refers to a surface that faces the bottom surface of the left rail member 15 when the left controller 3 is attached to the main unit 2. The projection 41 is shaped so that the projection 41 can engage with the engagement hole 16 of the left rail member 15.
  • In the present embodiment, the projection 41 is biased from the inside of the slider 40 toward the outside of the slider 40. Thus, when a force from the outside of the slider 40 toward the inside of the slider 40 is applied on the projection 41, the projection 41 moves toward the inside of the slider 40 (i.e., retracts into the slider 40). There is no particular limitation on the configuration for biasing the projection 41 as described above. For example, in the present embodiment, the projection 41 is connected to an elastic member inside the slider 40, and the projection 41 is placed inside a hole in the slider 40 with a portion thereof protruding past the engaging surface of the slider 40. Note that in other embodiments, the projection 41 may be fixed to the slider 40.
  • The left controller 3 includes the terminal 42 for allowing the left controller 3 to communicate with the main unit 2 in wired communication. The terminal 42 is located so as to be in contact with the left-side terminal 17 of the main unit 2 (FIG. 3) when the left controller 3 is attached to the main unit 2. There is no particular limitation on the specific position of the terminal 42. In the present embodiment, as shown in FIG. 5, the terminal 42 is provided on the engaging surface of the slider 40. In the present embodiment, the terminal 42 is provided near the lower end on the engaging surface of the slider 40. The terminal 42 is provided below the projection 41 (in other words, on the front side with respect to the insertion of the slider 40 into the left rail member 15).
  • The left controller 3 also includes a second L button 43 and a second R button 44. As are the other operation buttons 33 to 36, these buttons 43 and 44 are used to give instructions in accordance with various programs executed on the main unit 2. As shown in FIG. 5, the second L button 43 and the second R button 44 are provided on the engaging surface of the slider 40. The second L button 43 is provided on the engaging surface of the slider 40 above the center of the engaging surface with respect to the up-down direction (the y-axis direction shown in FIG. 1). The second R button 44 is provided on the engaging surface of the slider 40 below the center of the engaging surface with respect to the up-down direction. The second L button 43 and the second R button 44 are arranged at such positions that they cannot be pressed down with the left controller 3 attached to the main unit 2. That is, the second L button 43 and the second R button 44 are buttons that are used when the left controller 3 is detached from the main unit 2. For example, the second L button 43 and the second R button 44 are operated with the index finger or the middle finger of the left hand and the right hand of a user holding the left controller 3 detached from the main unit 2 (see FIG. 16).
  • The left controller 3 includes an indicator LED 45. The indicator LED 45 is an indicator section for indicating predetermined information to the user. There is no particular limitation on the information to be indicated by the indicator LED 45. In the present embodiment, the indicator LED 45 shows, to the user, identification information of the controller that is used when the main unit 2 communicates with a plurality of controllers. Specifically, as the indicator LED 45, the left controller 3 includes a number (herein, four) of LEDs equal to the number of controllers with which the main unit 2 can simultaneously be in communication. Then, of the four LEDs, the one associated with the number assigned to the controller is lit. Thus, the indicator LED 45 can indicate the assigned number to the user.
  • In other embodiments, the indicator LED 45 may indicate, to the user, the status of communication between the left controller 3 and the main unit 2. For example, the indicator LED 45 may be lit while a connection with the main unit 2 is established. While the number of LEDs (in other words, the light-emitting portions) serving as the indicator LED 45 is four in the present embodiment, there is no particular limitation on the number of LEDs.
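  • As a hedged illustration of the two uses just described, the following C sketch lights only the LED associated with the assigned number and, borrowing the alternative idea above, does so only while a connection with the main unit is established. The GPIO-style helper and the numbering scheme are assumptions, not part of any documented interface.

    /* Sketch of driving the four indicator LEDs. */
    #include <stdbool.h>

    #define NUM_INDICATOR_LEDS 4

    /* assumed low-level helper, named for illustration only */
    extern void set_indicator_led(int led_index, bool lit);

    void show_assigned_number(int assigned_number /* 1..4 */, bool connected)
    {
        for (int i = 0; i < NUM_INDICATOR_LEDS; i++) {
            /* only the LED associated with the assigned number is lit,
             * and only while a connection with the main unit is established */
            set_indicator_led(i, connected && (i == assigned_number - 1));
        }
    }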
  • In the present embodiment, the indicator LED 45 is provided on the engaging surface of the slider 40 as shown in FIG. 5. Thus, the indicator LED 45 is arranged at such a position that the indicator LED 45 cannot be seen when the left controller 3 is attached to the main unit 2. That is, the indicator LED 45 is used when the left controller 3 is detached from the main unit 2.
  • The left controller 3 includes a pairing button 46. In the present embodiment, the pairing button 46 is used to give an instruction for a setting (referred to also as pairing) process regarding wireless communication between the left controller 3 and the main unit 2, and to give an instruction for a resetting process of resetting the left controller 3. Note that in other embodiments, the pairing button 46 may only serve to instruct one of the setting process and the resetting process.
  • That is, when a short-press operation is performed on the pairing button 46 (specifically, when the pairing button 46 is pressed down for a shorter period of time than a predetermined period of time), the left controller 3 executes the setting process. Note that the details of the setting process will be described later.
  • When a long-press operation is performed on the pairing button 46 (specifically, when the pairing button 46 is held down for the predetermined period of time or longer), the left controller 3 executes the resetting process. The resetting process is a process of resetting the status of the left controller 3, and is a process that should be executed, for example, when the left controller 3 freezes (e.g., when the main unit 2 is no longer able to obtain data from the left controller 3). Although there is no particular limitation on the specific details of the resetting process, the resetting process may include, for example, a process of turning OFF the power of the left controller 3 and then turning it back ON, a process of disconnecting with the main unit 2 and then re-connecting with the main unit 2, a process of re-executing a process that is executed when starting communication, and/or the setting process. In the present embodiment, even when the left controller 3 freezes for some reason, the left controller 3 can be reset, using the pairing button 46, to an operative state.
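  • The short-press/long-press dispatch of the pairing button described above can be sketched in C as follows; the threshold value and the controller-side routine names are assumptions made for this example.

    /* Hedged sketch of the pairing-button handling: short press starts the
     * setting (pairing) process, long press starts the resetting process. */
    #include <stdint.h>

    #define PAIRING_LONG_PRESS_MS 3000u  /* assumed "predetermined period of time" */

    /* assumed controller-side routines, named for illustration only */
    extern void execute_setting_process(void);    /* pairing with the main unit    */
    extern void execute_resetting_process(void);  /* e.g. power-cycle and re-connect */

    void on_pairing_button_released(uint32_t held_ms)
    {
        if (held_ms >= PAIRING_LONG_PRESS_MS) {
            execute_resetting_process();   /* long-press operation  */
        } else {
            execute_setting_process();     /* short-press operation */
        }
    }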
  • In the present embodiment, the pairing button 46 is provided on the engaging surface of the slider 40 as shown in FIG. 5. Thus, the pairing button 46 is arranged at such a position that the pairing button 46 cannot be seen when the left controller 3 is attached to the main unit 2. That is, the pairing button 46 is used when the left controller 3 is detached from the main unit 2. In the present embodiment, it is assumed that the pairing button 46 is pressed down when the left controller 3 is detached from the main unit 2, and that the pairing button 46 is unlikely to be pressed down when the left controller 3 is attached to the main unit 2. Arranging the pairing button 46 at such a position thereby prevents the pairing button 46 from being operated in error while the left controller 3 is attached to the main unit 2.
  • Note that in the present embodiment, the buttons provided on the engaging surface of the slider 40 (specifically, the second L button 43, the second R button 44 and the pairing button 46) are provided so as not to protrude past the engaging surface. That is, the upper surface (in other words, the surface to be pressed) of these buttons is arranged flush with the engaging surface of the slider 40 or arranged at a position sunken from the engaging surface. This allows the slider 40 to slide smoothly against the left rail member 15 when the slider 40 is engaged with the left rail member 15 of the main unit 2.
  • [1-1-3. Configuration of Right Controller]
  • FIG. 6 is a six-sided view showing an example of the right controller 4. As shown in FIG. 6, the right controller 4 includes a generally plate-shaped housing 51. In the present embodiment, the primary surface (in other words, the front-side surface, i.e., the surface on the z-axis negative direction side shown in FIG. 1) of the housing 51 has a generally rectangular shape. In the present embodiment, the housing 51 has a vertically-elongated shape, i.e., a shape that is elongated in the up-down direction. Note that when detached from the main unit 2, the right controller 4 may be held in a portrait position (see FIG. 18) or may be held in a landscape position (see FIG. 16).
  • As with the housing 31 of the left controller 3, the length of the housing 51 of the right controller 4 in the up-down direction is generally equal to the length of the housing 11 of the main unit 2 in the up-down direction, and the thickness thereof is generally equal to the thickness of the housing 11 of the main unit 2. Therefore, when the right controller 4 is attached to the main unit 2 (see FIG. 1), a user can hold the main unit 2 and the right controller 4 as if they were an integral unit.
  • As shown in FIG. 6, the right-side corner portion of the primary surface of the housing 51 has a more rounded shape than the left-side corner portion thereof. That is, the connecting portion between the upper side surface and the right side surface of the housing 51 and the connecting portion between the lower side surface and the right side surface of the housing 51 are more rounded (in other words, round-cornered with a greater radius) than the connecting portion between the upper side surface and the left side surface and the connecting portion between the lower side surface and the left side surface. Therefore, when the right controller 4 is attached to the main unit 2 (see FIG. 1), the right side of the information processing device 1 will have a rounded shape, making it easier for a user to hold the device.
  • As does the left controller 3, the right controller 4 includes an analog stick 52 as the directional input section. In the present embodiment, the analog stick 52 has the same configuration as the analog stick 32 of the left controller 3. As does the left controller 3, the right controller 4 includes four operation buttons 53 to 56 (specifically, the A button 53, the B button 54, the X button 55 and the Y button 56). In the present embodiment, these four operation buttons 53 to 56 are of the same mechanism as the four operation buttons 33 to 36 of the left controller 3. As shown in FIG. 6, the analog stick 52 and the operation buttons 53 to 56 are provided on the primary surface of the housing 51. Note that while the number of operation buttons provided on the primary surface of the right controller 4 is four in the present embodiment, there is no particular limitation on the number of operation buttons.
  • In the present embodiment, the positional relationship between the two types of operation sections (the analog stick and the operation buttons) of the right controller 4 is opposite from the positional relationship between these two types of operation sections of the left controller 3. That is, the analog stick 52 is arranged above the operation buttons 53 to 56 on the right controller 4, whereas the analog stick 32 is arranged below the operation buttons 33 to 36 on the left controller 3. With such an arrangement, the left and right controllers 3 and 4, when detached from the main unit 2, can be operated in a similar fashion, the details of which will be described later.
  • The right controller 4 also includes a plus (+) button 57. As shown in FIG. 6, the plus button 57 is provided on the primary surface of the housing 51, more specifically, in an upper left area of the primary surface. As are the other operation buttons 53 to 56, the plus button 57 is used to give instructions in accordance with various programs executed on the main unit 2 (e.g., the OS program and application programs). The plus button 57 is used, for example, as a start button in a game application (e.g., a button used to give an instruction to start the game).
  • The right controller 4 includes a home button 58. As shown in FIG. 6, the home button 58 is provided on the primary surface of the housing 51, more specifically, in a lower left area of the primary surface. The home button 58 is a button for displaying a predetermined menu screen on the display 12 of the main unit 2. The menu screen is, for example, a screen where a user can launch an application specified by the user, from among a plurality of applications that can be executed on the main unit 2. The menu screen may be displayed at the start-up of the main unit 2, for example. In the present embodiment, a predetermined control screen may be displayed on the display 12 (the menu screen may be displayed instead of the control screen) when the home button 58 is pressed while an application is executed on the main unit 2 (i.e., while the image of the application is displayed on the display 12). Note that the control screen is, for example, a screen where a user can give an instruction to end an application and display the menu screen on the display 12, and an instruction to resume an application, etc.
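  • A minimal C sketch of this home-button behavior, assuming hypothetical screen-display helpers and a query of whether an application image is currently displayed, might look as follows; all names here are illustrative assumptions.

    /* Hedged sketch: show the control screen while an application image is
     * displayed, otherwise show the menu screen. */
    #include <stdbool.h>

    /* assumed helpers, named for illustration only */
    extern bool application_image_displayed(void);
    extern void display_menu_screen(void);
    extern void display_control_screen(void);  /* end/resume the application, etc. */

    void on_home_button_pressed(void)
    {
        if (application_image_displayed())
            display_control_screen();
        else
            display_menu_screen();
    }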
  • The operation sections provided on the primary surface of the right controller 4 (specifically, the analog stick 52 and the buttons 53 to 59) are operated with the thumb of the right hand, for example, of a user holding the information processing device 1 when the right controller 4 is attached to the main unit 2 (see FIG. 14). When the right controller 4 is used detached from the main unit 2, the operation sections are operated with the left and right thumbs, for example, of a user holding the right controller 4 (see FIG. 15). Specifically, in such a case, the analog stick 52 is operated with the thumb of the left hand of the user, and the operation buttons 53 to 56 are operated with the thumb of the right hand of the user.
  • The right controller 4 includes a first R button 60. The right controller 4 includes a ZR button 61. As shown in FIG. 6, the first R button 60 is provided over a corner portion between the right side surface and the upper side surface of the housing 51. The ZR button 61 is provided to extend over a corner portion between the right side surface and the upper side surface of the housing 51 (strictly speaking, between the right side surface and the upper side surface as seen from the front side of the housing 51) while extending into the reverse surface of the housing 51. That is, the ZR button 61 is provided on the rear side (the z-axis positive direction side shown in FIG. 1) of the first R button 60. In the present embodiment, since the upper right corner portion of the housing 51 has a rounded shape, the first R button 60 and the ZR button 61 each have a rounded shape in conformity with the rounded shape of the upper right portion of the housing 51.
  • When the right controller 4 is attached to the main unit 2, the first R button 60 and the ZR button 61 will be placed over the upper right portion of the information processing device 1 (see FIG. 1). Therefore, a user holding the information processing device 1 is allowed to operate the first R button 60 and the ZR button 61 with the index finger or the middle finger of the right hand (see FIG. 15).
  • As shown in FIG. 6, the ZR button 61 of the right controller 4 is provided so as to project from the housing 51, as is the ZL button 39 of the left controller 3. That is, a portion of the reverse surface of the housing 51 where the ZR button 61 is provided (more specifically, at least a portion of the perimeter of the ZR button 61) projects past the remaining portion of the housing 51. The ZR button 61 is provided so as to project past the remaining portion of the housing 51 on the reverse surface. Therefore, when the right controller 4 is attached to the main unit 2, as is the case when the left controller 3 is attached to the main unit 2, when the main unit 2 is placed on a flat surface in such an orientation that the reverse surface of the right controller 4 faces the horizontal flat surface, the projecting portions of the housing 51 are in contact with the flat surface. As a result, the main unit 2 is placed so that the upper side thereof is slightly raised from the lower side thereof, and it is therefore easy for a user to see the display 12.
  • Note that in the present embodiment, as with the ZL button 39 of the left controller 3, the ZR button 61 of the right controller 4 can be pressed down primarily in the up-down direction (the y-axis direction). Therefore, as with the ZL button 39, the ZR button 61 is unlikely to be pressed down when the information processing device 1 is placed so that the ZR button 61 is in contact with a flat surface. Thus, the ZR button 61 is unlikely to be pressed in error. Note that in other embodiments, the housing 51 may be formed so that the reverse surface of the housing 51 projects past the ZR button 61. In other embodiments, as is the ZL button 39 of the left controller 3, the ZR button 61 may be formed so as not to project from the reverse surface of the housing 51.
  • In the present embodiment, the shape of the first L button 38 and the shape of the first R button 60 are not in left-right symmetry, and the shape of the ZL button 39 and the shape of the ZR button 61 are not in left-right symmetry. Note however that in other embodiments, the shape of the first L button 38 and the shape of the first R button 60 may be in left-right symmetry, and the shape of the ZL button 39 and the shape of the ZR button 61 may be in left-right symmetry.
  • The right controller 4 includes a similar slider mechanism to that of the left controller 3. That is, the right controller 4 includes the slider 62 described above. As shown in FIG. 6, the slider 62 is provided so as to extend in the up-down direction on the left side surface of the housing 51. The slider 62 has such a shape that it can engage with the right rail member 19 (more specifically, the groove of the right rail member 19) of the main unit 2. Specifically, the cross section (specifically, the cross section perpendicular to the up-down direction) of the slider 62 is T-shaped. More specifically, the cross section of the slider 62 is T-shaped in conformity with the cross-sectional shape of the right rail member 19 (see FIG. 7). Therefore, the slider 62 in engagement with the right rail member 19 is locked so as not to come off in the direction perpendicular to the sliding direction (in other words, the direction in which the right rail member 19 extends) (see FIG. 7).
  • The slider 62 is provided with the projection 63. The projection 63 is arranged at such a position that allows the projection 63 to be inserted into the engagement hole 20 when the right controller 4 is attached to the main unit 2. There is no particular limitation on the specific position of the projection 63. In the present embodiment, the projection 63 is provided on the engaging surface of the slider 62. Note that the engaging surface of the slider 62 refers to a surface that faces the bottom surface of the right rail member 19 when the right controller 4 is attached to the main unit 2. The projection 63 is shaped so that the projection 63 can engage with the engagement hole 20 of the right rail member 19.
  • In the present embodiment, as is the projection 41 of the left controller 3, the projection 63 of the right controller 4 is biased from the inside of the slider 62 toward the outside of the slider 62. Thus, when a force from the outside of the slider 62 toward the inside of the slider 62 is applied on the projection 63, the projection 63 moves toward the inside of the slider 62 (i.e., retracts into the slider 62). Note that there is no particular limitation on the configuration for biasing the projection 63 as described above, and it may be a similar configuration to the projection 41 of the left controller 3.
  • The right controller 4 includes the terminal 64 for allowing the right controller 4 to communicate with the main unit 2 in wired communication. The terminal 64 is located so as to be in contact with the right-side terminal 21 of the main unit 2 (FIG. 3) when the right controller 4 is attached to the main unit 2. There is no particular limitation on the specific position of the terminal 64. In the present embodiment, as shown in FIG. 6, the terminal 64 is provided on the engaging surface of the slider 62. In the present embodiment, the terminal 64 is provided near the lower end on the engaging surface of the slider 62. The terminal 64 is provided below the projection 63 (in other words, on the front side with respect to the insertion of the slider 62 into the right rail member 19).
  • As does the left controller 3, the right controller 4 also includes a second L button 65 and a second R button 66. As are the operation buttons 53 to 56, these buttons 65 and 66 are used to give instructions in accordance with various programs executed on the main unit 2. As shown in FIG. 6, the second L button 65 and the second R button 66 are provided on the engaging surface of the slider 62. The second L button 65 is provided on the engaging surface of the slider 62 below the center of the engaging surface with respect to the up-down direction (the y-axis direction shown in FIG. 1). The second R button 66 is provided on the engaging surface of the slider 62 above the center of the engaging surface with respect to the up-down direction. As are the second L button 43 and the second R button 44 of the left controller 3, the second L button 65 and the second R button 66 are arranged at such positions that they cannot be pressed down with the right controller 4 attached to the main unit 2, and they are buttons that are used when the right controller 4 is detached from the main unit 2. For example, the second L button 65 and the second R button 66 are operated with the index finger or the middle finger of the left hand and the right hand of a user holding the right controller 4 detached from the main unit 2 (see FIG. 16).
  • The right controller 4 includes an indicator LED 67. As is the indicator LED 45 of the left controller 3, the indicator LED 67 is an indicator section for indicating predetermined information to the user. As does the left controller 3, the right controller 4 includes four LEDs as the indicator LED 67, and one of the four LEDs is lit, which is associated with the number assigned to the right controller 4. In the present embodiment, as is the indicator LED 45, the indicator LED 67 is provided on the engaging surface of the slider 62 (FIG. 6). Thus, the indicator LED 67 is arranged at such a position that the indicator LED 67 cannot be seen with the right controller 4 attached to the main unit 2, and the indicator LED 67 is used when the right controller 4 is detached from the main unit 2.
  • The right controller 4 includes a pairing button 69. As is the pairing button 46 of the left controller 3, the pairing button 69 is used to give an instruction for a setting (referred to also as “pairing”) process regarding wireless communication between the right controller 4 and the main unit 2, and to give an instruction for a resetting process of resetting the right controller 4. The setting process and the resetting process are the same as those for the left controller 3, and will therefore not be described in detail below. In the present embodiment, the pairing button 69 is provided on the engaging surface of the slider 62 as shown in FIG. 6. That is, for the same reason as for the pairing button 46 of the left controller 3, the pairing button 69 is arranged at such a position that the pairing button 69 cannot be seen with the right controller 4 attached to the main unit 2.
  • With the right controller 4, as with the left controller 3, the buttons provided on the engaging surface of the slider 62 (specifically, the second L button 65, the second R button 66 and the pairing button 69) are provided so as not to protrude past the engaging surface. This allows the slider 62 to slide smoothly against the right rail member 19 when the slider 62 is engaged with the right rail member 19 of the main unit 2.
  • A window portion 68 is provided in the lower side surface of the housing 51. The right controller 4 includes an infrared image-capturing section (the infrared image-capturing section 123 shown in FIG. 12), the details of which will be described later, and uses the infrared image-capturing section 123 to detect a hand movement and/or a gesture, etc., of the user. The window portion 68 is provided so as to allow the camera of the infrared image-capturing section 123 arranged inside the housing 51 to capture an image around the right controller 4. The window portion 68 is provided for protecting the lens of the camera of the infrared image-capturing section 123, and is made of a material (e.g., a transparent material) that allows light of the wavelength to be detected by the camera to pass therethrough. Note that the window portion 68 may be a hole formed in the housing 51. Note that in the present embodiment, the infrared image-capturing section 123 itself has a filter member for suppressing the transmission therethrough of light of wavelengths other than the light (in the present embodiment, infrared light) to be detected by the camera. Note however that in other embodiments, the window portion may have a filter function.
  • Note that for the controllers 3 and 4, there is no particular limitation on the shape, the number and the arrangement of the various elements (specifically, the slider, the stick, the buttons, the LEDs, etc.) provided on the housing 31 or 51. For example, in other embodiments, the controllers 3 and 4 may include a directional input section of a different type from an analog stick. The slider 40 or 62 may be arranged at a position that corresponds to the position of the rail member 15 or 19 provided on the main unit 2, and may be, for example, arranged on the primary surface or the reverse surface of the housing 31 or 51. In other embodiments, one or more of the various elements described above may be absent on the controllers 3 and 4.
  • [1-1-4. Attachment Action]
  • Next, referring to FIG. 7 and FIGS. 8A and 8B, the action of attaching and detaching a controller to and from the main unit 2 will be described. Note that although the description below is directed to the action of attaching and detaching the left controller 3 to and from the main unit 2, the action of attaching and detaching the right controller 4 to and from the main unit 2 can be done in the same manner as the left controller 3.
  • When the left controller 3 is attached to the main unit 2, a user first inserts the lower end of the slider 40 of the left controller 3 into a groove of the left rail member 15 of the main unit 2 via the upper end of the left rail member 15. FIG. 7 is a diagram showing an example of how the left rail member 15 and the slider 40 engage with each other. Note that in order to facilitate understanding of the figure, elements of the main unit 2 are shown in phantom line in FIG. 7. As shown in FIG. 7, the slider 40 is inserted into the left rail member 15 so that the T-shaped cross section of the slider 40 engages with (or is fitted to) the C-shaped cross section of the left rail member 15 (in other words, the groove of the left rail member 15).
  • After inserting the lower end of the slider 40 into the groove of the left rail member 15, the user further inserts the slider 40 down the groove of the left rail member 15. That is, the user slides the left controller 3 downward against the main unit 2. Then, when the left controller 3 has been slid until the lower end of the slider 40 reaches the position of the stopper 18 of the main unit 2, the left controller 3 is locked to the main unit 2.
  • FIGS. 8A and 8B are diagrams showing an example of how the slider 40 is locked to the left rail member 15. Note that FIGS. 8A and 8B show a cross section of the left rail member 15 perpendicular to the front-rear direction (the z-axis direction). In order to facilitate understanding of elements of interest to be discussed in conjunction with the figure, the elements are shown in FIGS. 8A and 8B with different positional relationships and different sizes from FIG. 3, etc.
  • As shown in FIG. 8A, when the left controller 3 is not completely attached to the main unit 2 (i.e., when the lower end of the slider 40 has not reached the stopper 18 of the main unit 2), the projection 41 of the slider 40 is retracted inside the slider 40 by being in contact with the bottom surface of the rail member 15.
  • When, after the state shown in FIG. 8A, the slider 40 is slid further down the left rail member 15, the lower end of the slider 40 reaches the position of the stopper 18 of the main unit 2 (see FIG. 8B). Then, as shown in FIG. 8B, the projection 41 of the slider 40 faces the engagement hole 16 of the left rail member 15. Therefore, the projection 41 protrudes from the engaging surface of the slider 40 to be inserted into the engagement hole 16. Thus, the projection 41 engages with the engagement hole 16, thereby locking the left controller 3 to the main unit 2 (in other words, locking the slider 40 to the left rail member 15) to such a degree that a small force will not cause the left controller 3 to come off.
  • When the left controller 3 is locked to the main unit 2, the terminal 42 of the slider 40 is located so as to face the left-side terminal 17 of the left rail member 15 as shown in FIG. 8B. Thus, the terminal 42 and the left-side terminal 17 are connected together. This enables wired communication (in other words, communication via physical connection between terminals) between the left controller 3 and the main unit 2. This also allows power to be supplied from the main unit 2 to the left controller 3.
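  • The idea that the controller communicates over the connected terminals while attached, and wirelessly while detached, can be sketched as follows. The terminal-contact detection and the send helpers are assumptions made purely for illustration, not the embodiment's actual mechanism.

    /* Hedged sketch of selecting the communication path for controller data. */
    #include <stdbool.h>
    #include <stddef.h>

    /* assumed helpers, named for illustration only */
    extern bool controller_terminals_connected(void);
    extern int  send_over_terminals(const void *data, size_t len);
    extern int  send_wirelessly(const void *data, size_t len);

    int send_controller_data(const void *data, size_t len)
    {
        /* while the controller is attached, the connected terminals are used;
         * otherwise, data is sent in wireless communication */
        return controller_terminals_connected() ? send_over_terminals(data, len)
                                                : send_wirelessly(data, len);
    }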
  • Note that one or both of the controller-side terminal (i.e., the terminal 42) and the main unit-side terminal (i.e., the left-side terminal 17) protrudes (only slightly) from the base surface. In the present embodiment, as shown in FIG. 8A, the left-side terminal 17 on the main unit side is provided so as to slightly protrude from the base surface (i.e., the bottom surface of the left rail member 15). A metal portion 17 a, which serves as the contact point of the protruding terminal, is able to deform toward its base surface. Therefore, when the terminals come into contact with each other, each terminal receives a pressing force from the other terminal and is biased in the direction in which it protrudes, as shown in FIG. 8B. As a result, a reliable contact between the terminals is ensured.
  • In the present embodiment, the left-side terminal 17 of the main unit 2 is provided below the engagement hole 16. The terminal 42 of the left controller 3 is provided below the projection 41. Therefore, when the slider 40 is inserted into the left rail member 15, the projection 41 will not come into contact with the left-side terminal 17, thereby lowering the possibility of the projection 41 damaging the left-side terminal 17.
  • When detaching the left controller 3 from the main unit 2, a user slides the left controller 3 upward against the main unit 2. Note that when the left controller 3 is attached to the main unit 2, the left controller 3 is locked to the main unit 2 by means of the projection 41 and the engagement hole 16. Note however that sliding the left controller 3 upward with a certain amount of force or more will dislocate the projection 41 from the engagement hole 16, thereby releasing the lock. After the lock is released, the left controller 3 can be further slid up to remove the left controller 3 from the main unit 2.
  • Note that in other embodiments, the left controller 3 may include a mechanism capable of retracting the projection 41 into the slider 40. The left controller 3 may include a mechanism for retracting the projection 41 into the slider 40 in response to a user pressing a predetermined button provided on the left controller 3, for example. Thus, by performing the operation described above, the user can easily release the lock of the left controller 3 to the main unit 2 by means of the projection 41.
  • As described above, in the present embodiment, the controllers 3 and 4 can be detachably attached to the main unit 2 by means of a slide mechanism including a rail member and a slider. With a slide mechanism, the controllers 3 and 4 can be securely locked to the main unit 2 for directions other than the slide direction. Therefore, a user can easily hold the information processing device 1 with the controllers 3 and 4 attached to the main unit 2, with little shaking of the controllers 3 and 4. In the present embodiment, the projection and the engagement hole also allow the controllers 3 and 4 to be locked to the main unit 2 in the slide direction. This also reduces the shaking of the controllers 3 and 4 attached to the main unit 2, thereby allowing a user to easily hold the information processing device 1.
  • [1-2. Configuration of Cradle]
  • FIG. 9 is a diagram showing an overall configuration of an example information processing system according to the present embodiment. As described above, the information processing system includes the information processing device 1 and the cradle 5. As shown in FIG. 9, the information processing device 1 can be placed in the cradle 5. The cradle 5 can communicate (either via wired communication or wireless communication) with a TV 6, which is an example of an external display device separate from the display 12. When the information processing device 1 is placed in the cradle 5, the information processing system can display the image obtained or produced by the information processing device 1 on the TV 6, the details of which will be described later. In the present embodiment, the cradle 5 has a function of charging the information processing device 1 placed therein and a function as a hub device (specifically, a USB hub), the details of which will be described later.
  • FIG. 10 is a diagram showing an external configuration of an example of the cradle 5. The cradle 5 has a housing such that the information processing device 1 can be placed therein (or attached thereto, in other words). In the present embodiment, the housing includes a first support 71 provided with a groove 71 a, and a generally-planar second support 72, as shown in FIG. 10.
  • As shown in FIG. 10, the groove 71 a provided in the first support 71 is shaped in conformity with the shape of the lower portion of the information processing device 1. Specifically, the groove 71 a is shaped so as to allow the lower portion of the information processing device 1 to be inserted into the groove 71 a, and more specifically is shaped to generally match the shape of the lower portion of the information processing device 1. Thus, the information processing device 1 can be placed in the cradle 5 by inserting the lower portion of the information processing device 1 into the groove 71 a. The second support 72 supports the surface (i.e., the surface on which the display 12 is provided) of the information processing device 1, of which the lower portion has been inserted into the groove 71 a. With the second support 72, the cradle 5 can more stably support the information processing device 1. Note that the shape of the housing shown in FIG. 10 is merely an example, and the housing of the cradle 5 may have any other shape in other embodiments that allows the information processing device 1 to be placed in the cradle 5.
  • Note that in FIG. 10, the information processing device 1 is placed in the cradle 5 so that the primary surface of the main unit 2 (i.e., the surface of the display 12) faces the second support 72. Note however that in the present embodiment, the information processing device 1 may be placed in the cradle 5 so that the reverse surface of the main unit 2 faces the second support 72. That is, in the present embodiment, a user can place the information processing device 1 in the cradle 5 in the front-facing orientation (i.e., an orientation such that the display 12 can be seen), or place the information processing device 1 in the cradle 5 in the back-facing orientation (i.e., an orientation such that the display 12 is hidden).
  • As shown in FIG. 10, the cradle 5 also includes the main body terminal 73 for allowing the cradle 5 to communicate with the information processing device 1. As shown in FIG. 10, the main body terminal 73 is provided on the bottom surface of the groove 71 a formed in the first support 71. More specifically, the main body terminal 73 is located so as to be in contact with the lower terminal 27 of the information processing device 1 when the information processing device 1 is attached to the cradle 5. In the present embodiment, the main body terminal 73 is a USB connector (more specifically, a male-side connector). Note that in the present embodiment, the information processing device 1 can be attached to the cradle 5 either in the front-facing orientation or in the back-facing orientation, as described above. Thus, the lower terminal 27 of the information processing device 1 and the main body terminal 73 of the cradle 5 are each symmetric with respect to the depth direction (i.e., the z-axis direction shown in FIG. 1), allowing them to communicate with each other whether they are connected in one orientation or the other with respect to the depth direction.
  • As shown in FIG. 10, the cradle 5 includes a sleep button 74. The sleep button 74 is a button for switching between the ON mode and the sleep mode of the main unit 2 attached to the cradle 5. Note that in other embodiments, the sleep button 74 may have the function of turning ON/OFF the power of the main unit 2, in addition to (or instead of) the function of switching between the ON mode and the sleep mode of the main unit 2.
  • Note that in the present embodiment, the sleep button 74 is provided on the front surface of the first support 71. The sleep button 74 may be provided at any position such that the sleep button 74 can be pressed down with the information processing device 1 attached to the cradle 5. For example, the sleep button 74 may be provided on the side surface of the housing of the cradle 5 or on the back surface of the second support 72.
  • In the present embodiment, the sleep button 74 includes a light-emitting portion (specifically, an LED). The light-emitting portion of the sleep button 74 is used to indicate, to the user, the status of the main unit 2 attached to the cradle 5. That is, the light-emitting portion varies how light is emitted (in other words, how it is lit) depending on the status of the main unit 2 attached to the cradle 5. For example, in the present embodiment, the light-emitting portion is lit when the main unit 2 attached to the cradle 5 is in the ON mode, and the light-emitting portion is turned OFF when the main unit 2 is in the sleep mode or in the power-off mode. Moreover, when the main unit 2 has information that should be indicated to the user (e.g., program update information, announcement, advertisement, etc.), the light-emitting portion blinks. Note that the light-emitting portion does not need to be provided on the sleep button 74, but it may be provided at any position on the housing of the cradle 5, for example.
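  • The mapping from the status of the main unit 2 to the behavior of the light-emitting portion described above can be sketched as a small C function. Treating a pending user notification as taking priority over the lit/unlit states is an assumption made for this example, as are the names used.

    /* Hedged sketch of deciding how the light-emitting portion is driven. */
    enum main_unit_status { STATUS_POWER_OFF, STATUS_SLEEP, STATUS_ON };
    enum led_pattern { LED_OFF, LED_LIT, LED_BLINKING };

    enum led_pattern sleep_button_led(enum main_unit_status status,
                                      int has_user_notification)
    {
        if (has_user_notification)
            return LED_BLINKING;          /* e.g. program update information */
        return (status == STATUS_ON) ? LED_LIT : LED_OFF;
    }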
  • Although not shown in FIG. 10, the cradle 5 includes a terminal (in the present embodiment, a plurality of terminals, specifically, a monitor terminal 132, a power terminal 134 and extension terminals 137 shown in FIG. 13) provided on the back surface of the housing. The details of these terminals will be described later.
  • There is no particular limitation on the shape, the number and the arrangement of the various elements provided on the cradle 5 (specifically, the housing, the terminals and the button) described above. For example, in other embodiments, the housing may have any other shape such that the information processing device 1 can be supported by the housing. Some of the terminals provided on the housing may be provided on the front surface of the housing. In other embodiments, some of the various elements described above may be absent from the cradle 5.
  • [2. Internal Configuration of Various Units]
  • [2-1. Internal Configuration of Main Unit 2]
  • FIG. 11 is a block diagram showing an example internal configuration of the main unit 2. In addition to the elements shown in FIG. 3, the main unit 2 also includes elements 81 to 98 shown in FIG. 11. Some of these elements 81 to 98 may be mounted, as electronic components, on an electronic circuit board and accommodated in the housing 11.
  • (Elements Regarding Execution of Information Process)
  • The main unit 2 includes a CPU (Central Processing Unit) 81. The CPU 81 is an information processing section for executing various information processes to be executed on the main unit 2. The CPU 81 executes various information processes by executing an information processing program stored in a storage section that can be accessed by the CPU 81 (specifically, an internal storage medium such as a flash memory 84 or external storage media attached to the slots 23 and 24, etc.).
  • The main unit 2 includes, as an example internal storage medium provided in the main unit 2, the flash memory 84 and a DRAM (Dynamic Random Access Memory) 85. The flash memory 84 and the DRAM 85 are connected to the CPU 81. The flash memory 84 is a memory used primarily for storing various data (which may be programs) saved on the main unit 2. The DRAM 85 is a memory used for temporarily storing various data used in information processes.
  • The main unit 2 includes a first slot interface (hereinafter abbreviated as “I/F”) 91. The main unit 2 also includes a second slot I/F 92. The slot I/Fs 91 and 92 are connected to the CPU 81. The first slot I/F 91 is connected to the first slot 23, and reads and writes data from and to a storage medium of the first type (e.g., an SD card) inserted in the first slot 23, in response to an instruction from the CPU 81. The second slot I/F 92 is connected to the second slot 24, and reads and writes data from and to a storage medium of the second type (e.g., a dedicated memory card) inserted in the second slot 24, in response to an instruction from the CPU 81.
  • The CPU 81 executes the information processes described above by reading and writing data from and to the memories 84 and 85 and the storage media.
  • (Elements Regarding Communication)
  • The main unit 2 includes a network communication section 82. The network communication section 82 is connected to the CPU 81. The network communication section 82 communicates (specifically, in wireless communication) with an external device via a network. In the present embodiment, the network communication section 82 is a Wi-Fi certified communication module, for example, and communicates with an external device via a wireless LAN. Note that in other embodiments, the main unit 2 may have the function of implementing communication by connecting to a mobile communication network (in other words, a mobile telephone communication network), in addition to (or instead of) the function of implementing communication by connecting to a wireless LAN.
  • The main unit 2 includes a controller communication section 83. The controller communication section 83 is connected to the CPU 81. The controller communication section 83 communicates with the controllers 3 and/or 4 in wireless communication. While there is no particular limitation on the communication scheme between the main unit 2 and the controllers, the controller communication section 83 communicates with the controllers in accordance with the Bluetooth (registered trademark) standard in the present embodiment.
  • The CPU 81 is connected to the left-side terminal 17, the right-side terminal 21 and the lower terminal 27 described above. When communicating with the left controller 3 in wired communication, the CPU 81 transmits data to the left controller 3 via the left-side terminal 17. When communicating with the right controller 4 in wired communication, the CPU 81 transmits data to the right controller 4 via the right-side terminal 21. When communicating with the cradle 5, the CPU 81 transmits data to the cradle 5 via the lower terminal 27.
  • Thus, in the present embodiment, the main unit 2 is capable of communicating with the left and right controllers 3 and 4 either in wired communication or in wireless communication. Note that the process of switching between wired communication and wireless communication will be described later.
  • The main unit 2 can communicate with a plurality of left controllers simultaneously (in other words, in parallel). The main unit 2 can communicate with a plurality of right controllers simultaneously (in other words, in parallel). Therefore, users can make inputs to the information processing device 1 by using a plurality of left controllers and a plurality of right controllers.
  • (Elements Regarding Input/Output to/From Main Unit 2)
  • The main unit 2 includes a touch panel controller 86, which is a circuit for controlling the touch panel 13. The touch panel controller 86 is connected to the touch panel 13, and is connected to the CPU 81. Based on a signal from the touch panel 13, for example, the touch panel controller 86 generates data representing the position at which the touch input has been performed, and outputs the data to the CPU 81.
  • The display 12 described above is connected to the CPU 81. The CPU 81 displays, on the display 12, images it has generated and/or images obtained from outside (e.g., by executing the information processes described above).
  • The main unit 2 includes a codec circuit 87 and a speaker (specifically, a left speaker and a right speaker) 88. The codec circuit 87 is connected to the speaker 88 and the sound input/output terminal 25, and is also connected to the CPU 81. The codec circuit 87 is a circuit for controlling the input/output of sound data to/from the speaker 88 and the sound input/output terminal 25. That is, when sound data is received from the CPU 81, the codec circuit 87 outputs, to the speaker 88 or the sound input/output terminal 25, a sound signal obtained by performing a D/A conversion on the sound data. Thus, a sound is output from the speaker 88 or a sound output section (e.g., an earphone) connected to the sound input/output terminal 25. When a sound signal is received from the sound input/output terminal 25, the codec circuit 87 performs an A/D conversion on the sound signal and outputs sound data of a predetermined format to the CPU 81.
  • The sound volume button 26 described above (specifically, the sound volume buttons 26 a and 26 b shown in FIG. 3) is connected to the CPU 81. The CPU 81 controls the sound volume of the output from the speaker 88 or the sound output section based on an input on the sound volume button 26.
  • The main unit 2 includes an acceleration sensor 89. In the present embodiment, the acceleration sensor 89 detects the magnitude of the linear acceleration along directions of three predetermined axes (e.g., the xyz axes shown in FIG. 1). Note that the acceleration sensor 89 may detect acceleration in one axis direction or two axis directions.
  • The main unit 2 includes an angular velocity sensor 90. In the present embodiment, the angular velocity sensor 90 detects the angular velocity about three predetermined axes (e.g., the xyz axes shown in FIG. 1). Note that the angular velocity sensor 90 may detect the angular velocity about one axis or two axes.
  • The acceleration sensor 89 and the angular velocity sensor 90 are connected to the CPU 81, and the detection results of the acceleration sensor 89 and the angular velocity sensor 90 are output to the CPU 81. Based on the detection results of the acceleration sensor 89 and the angular velocity sensor 90, the CPU 81 can calculate information regarding movement and/or attitude of the main unit 2.
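  • As an illustration of how such movement and/or attitude information could be derived, the following is a minimal sketch (not the actual firmware of the main unit 2) that blends the angular velocity sensor's output with the tilt implied by the acceleration sensor using a complementary filter; the sample period, the blending coefficient and the function names are assumptions made for the example.

    import math

    def estimate_pitch(prev_pitch_deg, accel_xyz, gyro_x_dps, dt, alpha=0.98):
        # Tilt angle implied by gravity, as measured by the acceleration sensor.
        ax, ay, az = accel_xyz
        accel_pitch = math.degrees(math.atan2(ay, az))
        # Angle obtained by integrating the angular velocity sensor's output.
        gyro_pitch = prev_pitch_deg + gyro_x_dps * dt
        # Trust the gyro over short intervals and the accelerometer over long ones.
        return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

    pitch = 0.0
    for accel, gyro in [((0.0, 0.1, 9.8), 2.0), ((0.0, 0.3, 9.8), 2.0)]:
        pitch = estimate_pitch(pitch, accel, gyro, dt=0.005)  # 5 ms sample period
    print(round(pitch, 3))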
  • (Elements Regarding Power)
  • The main unit 2 includes a power control section 97 and a battery 98. The power control section 97 is connected to the battery 98 and the CPU 81. Although not shown in the figure, the power control section 97 is connected to various sections of the main unit 2 (specifically, various sections receiving power supply from the battery 98, the left-side terminal 17 and the right-side terminal 21). The power control section 97 controls the power supply from the battery 98 to the various sections based on an instruction from the CPU 81. The power control section 97 is connected to the power button 28. The power control section 97 controls the power supply to the various sections based on an input on the power button 28. That is, when the power-OFF operation is performed on the power button 28, the power control section 97 stops the power supply to some or all of the various sections, and when the power-ON operation is performed on the power button 28, the power control section 97 starts the power supply to some or all of the various sections. When an instruction to switch to the sleep mode is given to the power button 28, the power control section 97 stops the power supply to some elements, including the display 12, and when an instruction to switch to the ON mode is given to the power button 28, the power control section 97 starts the power supply to the elements. The power control section 97 also outputs, to the CPU 81, information representing the input on the power button 28 (specifically, information indicating whether or not the power button 28 is being pressed).
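  • A minimal sketch of the mode switching described above is shown below; it is not the actual logic of the power control section 97, and the press-duration threshold and the lists of powered sections are assumptions made purely for illustration.

    from enum import Enum

    class PowerMode(Enum):
        OFF = "power-off"
        SLEEP = "sleep"
        ON = "on"

    # Sections receiving power supply in each mode (illustrative assumption).
    POWERED_SECTIONS = {
        PowerMode.OFF: set(),
        PowerMode.SLEEP: {"cpu", "dram", "network"},            # display 12 stays off
        PowerMode.ON: {"cpu", "dram", "network", "display", "speaker"},
    }

    def next_mode(current, press_seconds, long_press_threshold=3.0):
        # A long press is treated as the power-OFF operation (assumption).
        if press_seconds >= long_press_threshold:
            return PowerMode.OFF
        # A short press toggles between the ON mode and the sleep mode.
        return PowerMode.SLEEP if current is PowerMode.ON else PowerMode.ON

    mode = PowerMode.ON
    mode = next_mode(mode, press_seconds=0.2)   # short press: ON -> sleep
    print(mode.value, sorted(POWERED_SECTIONS[mode]))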
  • The battery 98 is connected to the lower terminal 27. Where an external charging device (e.g., the cradle 5) is connected to the lower terminal 27, and power is supplied to the main unit 2 via the lower terminal 27, the battery 98 is charged with the power supplied thereto. Note that in the present embodiment, the charging capacity of the battery 98 of the main unit 2 is larger than the charging capacity of the battery of the controllers 3 and 4 to be described later.
  • (Other Elements)
  • The main unit 2 includes a magnetic force sensor (referred to also as a magnetic sensor) 93 for detecting the intensity and/or the direction of a magnetic field. The magnetic force sensor 93 is connected to the CPU 81, and the detection results of the magnetic force sensor 93 are output to the CPU 81. In the present embodiment, the magnetic force sensor 93 is used to detect whether a protection cover (not shown) attached to the information processing device 1 is open or closed. For example, the protection cover is provided with a magnet, and the CPU 81 detects when the primary surface of the main unit 2 is covered by the protection cover based on the detection results of the magnetic force sensor 93. Note that when such a state is detected, the CPU 81 turns OFF the display of the display 12, for example.
  • The main unit 2 includes the ambient light sensor 94 for detecting the intensity of the ambient light around the main unit 2. The ambient light sensor 94 is connected to the CPU 81, and the detection results of the ambient light sensor 94 are output to the CPU 81. In the present embodiment, the ambient light sensor 94 is used to adjust the brightness of the display 12. That is, the CPU 81 controls the brightness of the display 12 based on the detection results of the ambient light sensor 94.
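  • For illustration only, the following sketch maps the ambient light sensor's output to a display brightness level; the linear mapping and the constants are assumptions, as the embodiment only states that the brightness of the display 12 is controlled based on the sensor's detection results.

    def display_brightness(lux, min_pct=10, max_pct=100, lux_at_max=400.0):
        # Clamp the illuminance and map it linearly onto the brightness range.
        level = min_pct + (max_pct - min_pct) * min(lux, lux_at_max) / lux_at_max
        return round(level)

    for lux in (0, 50, 200, 1000):
        print(lux, "lux ->", display_brightness(lux), "%")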
  • The main unit 2 includes a cooling fan 96 for radiating heat from inside the main unit 2. With the cooling fan 96 in operation, the air outside the housing 11 is introduced into the housing 11 through the air inlet hole 11 d and the air inside the housing 11 is discharged through the air outlet hole 11 c, thereby discharging heat from inside the housing 11. The cooling fan 96 is connected to the CPU 81, and the operation of the cooling fan 96 is controlled by the CPU 81. The main unit 2 includes a temperature sensor 95 for detecting the temperature inside the main unit 2. The temperature sensor 95 is connected to the CPU 81, and the detection results of the temperature sensor 95 are output to the CPU 81. The CPU 81 controls the operation of the cooling fan 96 based on the detection results of the temperature sensor 95.
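  • The fan control could, for example, be implemented with simple hysteresis as sketched below; the temperature thresholds are assumptions, since the embodiment only states that the CPU 81 controls the cooling fan 96 based on the detection results of the temperature sensor 95.

    def update_fan(fan_on, temp_c, on_threshold=55.0, off_threshold=45.0):
        if not fan_on and temp_c >= on_threshold:
            return True       # housing getting hot: start discharging air
        if fan_on and temp_c <= off_threshold:
            return False      # sufficiently cooled: stop the fan
        return fan_on         # otherwise keep the current state

    fan = False
    for temp in (40.0, 50.0, 57.0, 52.0, 44.0):
        fan = update_fan(fan, temp)
        print(temp, "degC ->", "fan on" if fan else "fan off")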
  • [2-2. Internal Configuration of Left Controller 3]
  • FIG. 12 is a block diagram showing an example internal configuration of the information processing device 1. Note that the details of the internal configuration of the information processing device 1 that are related to the main unit 2 are shown in FIG. 11, and are therefore not shown in FIG. 12.
  • (Elements Regarding Communication)
  • The left controller 3 includes a communication control section 101 for communicating with the main unit 2. As shown in FIG. 12, the communication control section 101 is connected to various elements including the terminal 42 described above. In the present embodiment, the communication control section 101 is capable of communicating with the main unit 2 both in wired communication via the terminal 42 and in wireless communication without using the terminal 42. The communication control section 101 controls the communication method used by the left controller 3 for communicating with the main unit 2. That is, when the left controller 3 is attached to the main unit 2, the communication control section 101 communicates with the main unit 2 via the terminal 42. When the left controller 3 is detached from the main unit 2, the communication control section 101 communicates with the main unit 2 (specifically, with the controller communication section 83) in wireless communication. The wireless communication between the controller communication section 83 and the communication control section 101 is in accordance with the Bluetooth (registered trademark) standard.
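  • A minimal sketch of this selection between wired and wireless communication is shown below; the function names are illustrative stand-ins, as the embodiment only states that the communication control section 101 uses the terminal 42 when the left controller 3 is attached and Bluetooth wireless communication when it is detached.

    def send_to_main_unit(payload, attached, send_via_terminal, send_via_bluetooth):
        # Route the data over the wired terminal when attached, otherwise wirelessly.
        if attached:
            send_via_terminal(payload)
        else:
            send_via_bluetooth(payload)

    send_to_main_unit(
        b"\x01\x02",
        attached=False,
        send_via_terminal=lambda data: print("wired (terminal 42):", data),
        send_via_bluetooth=lambda data: print("wireless (Bluetooth):", data),
    )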
  • The left controller 3 includes a memory 102 such as a flash memory, for example. The communication control section 101 is a microcomputer (referred to also as a microprocessor), for example, and executes various processes by executing firmware stored in the memory 102.
  • (Elements Regarding Input)
  • The left controller 3 includes the buttons 103 (specifically, the buttons 33 to 39, 43 and 44). The left controller 3 includes the analog stick (labeled “stick” in FIG. 12) 32 described above. The buttons 103 and the analog stick 32 repeatedly output, with appropriate timing, information regarding operations performed thereon to the communication control section 101.
  • The left controller 3 includes an acceleration sensor 104. In the present embodiment, the acceleration sensor 104 detects the magnitude of the linear acceleration along directions of three predetermined axes (e.g., the xyz axes shown in FIG. 1). Note that the acceleration sensor 104 may detect acceleration in one axis direction or two axis directions.
  • The left controller 3 includes an angular velocity sensor 105. In the present embodiment, the angular velocity sensor 105 detects the angular velocity about three predetermined axes (e.g., the xyz axes shown in FIG. 1). Note that the angular velocity sensor 105 may detect the angular velocity about one axis or two axes.
  • The acceleration sensor 104 and the angular velocity sensor 105 are connected to the communication control section 101. The detection results of the acceleration sensor 104 and the angular velocity sensor 105 are repeatedly output, with appropriate timing, to the communication control section 101.
  • The communication control section 101 obtains input-related information (specifically, information regarding operations or detection results of sensors) from the input sections (specifically, the buttons 103, the analog stick 32 and the sensors 104 and 105). The communication control section 101 transmits operation data including the obtained information (or information obtained by performing a predetermined process on the obtained information) to the main unit 2. Note that the operation data is transmitted repeatedly, once every predetermined period of time. Note that the interval at which the input-related information is transmitted to the main unit 2 may or may not be the same among different input sections.
  • As the operation data is transmitted to the main unit 2, the main unit 2 can know the input that has been performed on the left controller 3. That is, the main unit 2 can determine operations performed on the buttons 103 and the analog stick 32 based on the operation data. The main unit 2 can calculate information regarding movement and/or attitude of the left controller 3 based on the operation data (specifically, the detection results of the sensors 104 and 105).
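  • As an illustration of operation data being assembled from the input sections and transmitted periodically, the following is a minimal sketch; the byte layout, the field set and the 5 ms period are assumptions, as the embodiment does not specify a data format.

    import struct
    import time

    def pack_operation_data(buttons_bitmap, stick_x, stick_y, accel_xyz, gyro_xyz):
        # Buttons as a bitmap, stick as two signed integers, sensors as floats.
        return struct.pack("<Hhh3f3f", buttons_bitmap, stick_x, stick_y,
                           *accel_xyz, *gyro_xyz)

    def report_loop(read_inputs, transmit, period_s=0.005, reports=3):
        # Transmit operation data repeatedly, once every predetermined period.
        for _ in range(reports):
            transmit(pack_operation_data(*read_inputs()))
            time.sleep(period_s)

    report_loop(
        read_inputs=lambda: (0b0101, 120, -30, (0.0, 0.1, 9.8), (0.5, 0.0, 0.0)),
        transmit=lambda packet: print(len(packet), "bytes sent"),
    )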
  • (Elements Regarding Output)
  • The indicator LED 45 described above is connected to the communication control section 101. In the present embodiment, the indicator LED 45 is controlled by an instruction from the main unit 2. That is, when the instruction is received from the main unit 2, the communication control section 101 outputs, to the indicator LED 45, a control signal for controlling the lighting of the indicator LED 45 in accordance with the instruction.
  • The left controller 3 includes a vibrator 107 for giving indications to the user through vibration. In the present embodiment, the vibrator 107 is controlled by an instruction from the main unit 2. That is, when the instruction is received from the main unit 2, the communication control section 101 drives the vibrator 107 in accordance with the instruction. The left controller 3 includes an amplifier 106. When the instruction is received, the communication control section 101 outputs a control signal in accordance with the instruction to the amplifier 106. The amplifier 106 amplifies the control signal from the communication control section 101, generates a driving signal for driving the vibrator 107, and gives the driving signal to the vibrator 107. This actuates the vibrator 107.
  • (Elements Regarding Power)
  • The left controller 3 includes a power supply section 108. In the present embodiment, the power supply section 108 includes a battery and a power control circuit. Although not shown in the figure, the power control circuit is connected to the battery and is connected to the various sections of the left controller 3 (specifically, various sections receiving power supply from the battery). The power control circuit controls the power supply from the battery to the various sections.
  • The battery is also connected to the terminal 42. In the present embodiment, when the left controller 3 is attached to the main unit 2, the battery is charged with the power supply from the main unit 2 via the terminal 42 under a predetermined condition, the details of which will be described later.
  • [2-3. Internal Configuration of Right Controller 4]
  • (Elements Regarding Communication)
  • As shown in FIG. 12, the right controller 4 includes a communication control section 111 for communicating with the main unit 2. The right controller 4 includes a memory 112 connected to the communication control section 111. The communication control section 111 is connected to various elements including the terminal 64 described above. The communication control section 111 and the memory 112 have similar functions to the communication control section 101 and the memory 102 of the left controller 3. Therefore, the communication control section 111 is capable of communicating with the main unit 2 both in wired communication via the terminal 64 and in wireless communication without using the terminal 64 (specifically, communication in accordance with the Bluetooth (registered trademark) standard), and controls the communication method by which the right controller 4 communicates with the main unit 2.
  • (Elements Regarding Input)
  • The right controller 4 includes similar input sections (specifically, the buttons 113, the analog stick 52, an acceleration sensor 114 and an angular velocity sensor 115) to those of the left controller 3. These input sections have similar functions and operate in a similar fashion to the input sections of the left controller 3.
  • (Elements Regarding Output)
  • The indicator LED 67 of the right controller 4 operates in a similar fashion to the indicator LED 45 of the left controller 3. That is, when the instruction from the main unit 2 is received, the communication control section 111 outputs, to the indicator LED 67, a control signal for controlling the lighting of the indicator LED 67 in accordance with the instruction.
  • The right controller 4 includes a vibrator 117 and an amplifier 116. The vibrator 117 and the amplifier 116 operate in a similar fashion to the vibrator 107 and the amplifier 106 of the left controller 3. That is, the communication control section 111 actuates the vibrator 117 using the amplifier 116 in accordance with an instruction from the main unit 2.
  • (Elements Regarding Power)
  • The right controller 4 includes a power supply section 118. The power supply section 118 has a similar function and operates in a similar fashion to the power supply section 108 of the left controller 3. That is, the power supply section 118 controls the power supply to the various sections receiving power supply from the battery. When the right controller 4 is attached to the main unit 2, the battery is charged with the power supply from the main unit 2 via the terminal 64 under a predetermined condition.
  • (Other Elements)
  • The right controller 4 includes an NFC communication section 122. The NFC communication section 122 implements near-field wireless communication in accordance with the NFC (Near Field Communication) standard. The NFC communication section 122 has the function of a so-called NFC reader/writer. For example, the NFC communication section 122 includes an antenna used for the near-field wireless communication, and a circuit (e.g., an NFC chip) for generating a signal (radio wave) to be transmitted from the antenna. Note that the near-field wireless communication is not limited to communication based on the NFC standard, but may be any proximity communication (referred to also as non-contact communication). Proximity communication includes, for example, communication schemes in which one device generates an electromotive force in the other device by means of radio waves emitted therefrom (e.g., through electromagnetic induction).
  • The right controller 4 includes the infrared image-capturing section 123. The infrared image-capturing section 123 includes an infrared camera for capturing an image around the right controller 4. In the present embodiment, the infrared image-capturing section 123 is used to capture an image of a hand of a user. The information processing device 1 identifies an input made with the hand (e.g., a gesture input, etc.) based on information (e.g., the position, the size, the shape, etc.) about the hand whose image has been captured. The infrared image-capturing section 123 includes an illuminating section for outputting infrared light. The illuminating section outputs infrared light in synchronism with the timing at which the infrared camera captures an image, for example. The infrared light output from the illuminating section is reflected by an object (e.g., a hand of a user), and the reflected infrared light is received by the infrared camera, thus obtaining an image of the infrared light. This makes it possible to obtain a clearer infrared light image. Note that while the infrared image-capturing section 123 including an infrared camera is used in the present embodiment, a visible light camera (a camera using a visible light image sensor) may be used, instead of an infrared camera, as an image-capturing device, in other embodiments.
  • The right controller 4 includes a processing section 121. The processing section 121 is connected to the communication control section 111 and also connected to the NFC communication section 122. The processing section 121 executes a management process for the NFC communication section 122 in response to an instruction from the main unit 2. For example, the processing section 121 controls the action of the NFC communication section 122 in response to an instruction from the main unit 2. The processing section 121 also controls the activation of the NFC communication section 122, and controls the action (specifically, reading and writing, etc.) of the NFC communication section 122 on a communication partner (e.g., an NFC tag). The processing section 121 also receives information to be transmitted to the communication partner from the main unit 2 via the communication control section 111 to pass this information to the NFC communication section 122, and obtains from the NFC communication section 122 information received from the communication partner to transmit this information to the main unit 2 via the communication control section 111.
  • The processing section 121 executes a management process for the infrared image-capturing section 123 in response to an instruction from the main unit 2. For example, the processing section 121 causes the infrared image-capturing section 123 to perform an image-capturing operation, and obtains information based on the image-capturing result (information of the captured image or information calculated from such information, etc.) to transmit this information to the main unit 2 via the communication control section 111.
  • [2-4. Internal Configuration of Cradle 5]
  • FIG. 13 is a block diagram showing an example internal configuration of the cradle 5. Note that the details of the internal configuration regarding the main unit 2 are shown in FIG. 11 and are therefore not shown in FIG. 13.
  • (Elements Regarding Image Conversion)
  • As shown in FIG. 13, the cradle 5 includes a conversion section 131 and the monitor terminal 132. The conversion section 131 is connected to the main body terminal 73 and the monitor terminal 132. The conversion section 131 converts a signal format regarding an image (referred to also as a video) and a sound received from the main unit 2 into a format to be output to the TV 6. In the present embodiment, the main unit 2 outputs image and sound signals to the cradle 5 as a display port signal (i.e., a signal in accordance with the DisplayPort standard). In the present embodiment, communication based on the HDMI (registered trademark) standard is used for communication between the cradle 5 and the TV 6. That is, the monitor terminal 132 is an HDMI terminal, and the cradle 5 and the TV 6 are connected together by an HDMI cable. Thus, the conversion section 131 converts the display port signal (specifically, a signal representing a video and a sound) received from the main unit 2 via the main body terminal 73 into an HDMI signal. The converted HDMI signal is output to the TV 6 via the monitor terminal 132.
  • The cradle 5 includes a processing section 135 for executing various information processes on the cradle 5. The processing section 135 is connected to the sleep button 74 described above, and is connected to the main body terminal 73 via a connection processing section 136 (the details of which will be described later). The processing section 135 detects an operation performed on the sleep button 74, and notifies the main unit 2 of the operation. When receiving such a notification, the main unit 2 switches between the ON mode and the sleep mode thereof. Thus, in the present embodiment, when the main unit 2 is attached to the cradle 5, the ON mode and the sleep mode of the information processing device 1 are switched from one to another in response to the sleep button 74 being pressed down.
  • (Elements Regarding Charging)
  • The cradle 5 includes a power control section 133 and the power terminal 134. The power terminal 134 is a terminal for connecting a charging device (not shown) (e.g., an AC adaptor, etc.). In the present embodiment, it is assumed that an AC adaptor is connected to the power terminal 134, and the cradle 5 is receiving commercial power supply.
  • When the main unit 2 is attached to the cradle 5, the power control section 133 supplies power from the power terminal 134 to the main unit 2 via the main body terminal 73. Thus, the battery 98 of the main unit 2 is charged as described above.
  • Note that in the present embodiment, the power terminal 134 is a connector (i.e., a female-side USB terminal) of the same shape as the lower terminal 27 of the information processing device 1. Therefore, in the present embodiment, the charging device may be connected to the cradle 5 to charge the information processing device 1 via the cradle 5, or the charging device may be connected directly to the main unit 2 to charge the information processing device 1.
  • (Other Elements)
  • The cradle 5 includes the connection processing section 136 and the extension terminals 137. The extension terminals 137 are terminals for connecting other devices. In the present embodiment, the cradle 5 includes a plurality (more specifically, three) of USB terminals as the extension terminals 137. The connection processing section 136 is connected to the main body terminal 73 and the extension terminals 137. The connection processing section 136 has the function of a USB hub, and manages the communication between devices connected to the extension terminals 137 and the main unit 2 connected to the main body terminal 73 (i.e., a signal from one device is appropriately distributed among the other devices). As described above, in the present embodiment, the information processing device 1 is capable of communicating with other devices via the cradle 5. Note that the connection processing section 136 may be capable of converting the communication speed, or of supplying power to devices connected to the extension terminals 137.
  • [3. Outline of Operation of Information Processing System]
  • As described above, in the present embodiment, the left and right controllers 3 and 4 can be attached to and detached from the information processing device 1. By being attached to the cradle 5, the information processing device 1 is capable of outputting images (and sound) to the TV 6. Therefore, the information processing device 1 can be used in various modes of use as will be described below. The operation of the information processing system in primary modes of use will now be described.
  • [3-1. Mode in Which Controller is Used Attached to Main Unit]
  • FIG. 14 is a diagram showing an example of how the information processing device 1 is used with the controllers 3 and 4 attached to the main unit 2 (referred to as the “attached state”). As shown in FIG. 14, when the controllers 3 and 4 are attached to the main unit 2, the information processing device 1 can be used as a portable device (e.g., a portable game device).
  • In the attached state, the main unit 2 and the controllers 3 and 4 communicate with each other in wired communication (i.e., communication between devices connected together via terminals). That is, the main unit 2 receives operation data from the controllers 3 and 4 attached thereto, and executes an information process based on the received operation data (specifically, by using operation data as an input).
  • Note that in other embodiments, the main unit 2 and the controllers may communicate with each other in wireless communication in the attached state. In the attached state, wireless communication may not work properly because the main unit 2 and the controllers are too close to each other. In contrast, in the present embodiment, the main unit 2 and the controllers communicate with each other in wired communication in the attached state, thereby improving the reliability of communication.
  • Note that in the attached state, the four operation buttons 33 to 36 of the left controller 3 may be used to give a directional input (in other words, a directional instruction). Then, a user can make a directional input by using either the analog stick 32 or the operation buttons 33 to 36. Allowing a user to make a directional input with a preferred operation mechanism improves the controllability. Note however that the instruction assigned to each operation button can be freely determined by the program executed on the main unit 2.
  • In the present embodiment, the arrangement of the analog stick and the four operation buttons (i.e., A, B, X and Y buttons) is reversed between the left controller 3 and the right controller 4. That is, in the attached state, the analog stick 32 is arranged above the four operation buttons 33 to 36 on the left controller 3, whereas the four operation buttons 53 to 56 are arranged above the analog stick 52 on the right controller 4. Therefore, as shown in FIG. 14, when the information processing device 1 is held with the left hand and the right hand at the same height (in other words, at the same position with respect to the up-down direction), it will be easier for one hand to operate the analog stick and for the other hand to operate the four operation buttons. That is, in the present embodiment, it is possible to provide an information processing device, with which it is easy to operate an analog stick and four operation buttons.
  • Note that controllers of different configurations (e.g., configurations having different functions, configurations where operation sections are arranged differently, etc.) from the present embodiment may be provided as the left controller and/or the right controller (see “[5-1. Other types of controllers]”), the details of which will be described later. Then, it is possible to provide an information processing device that can be operated in a different fashion from the present embodiment (i.e., the information processing device 1 shown in FIG. 14) by attaching the left controller and/or the right controller having a different configuration to the main unit 2, instead of the left controller 3 and/or the right controller 4 of the present embodiment.
  • [3-2. Mode in Which a Pair of Controllers are Used Detached]
  • As described above, in the present embodiment, the information processing device 1 can be used with the left and right controllers 3 and 4 detached from the main unit 2 (referred to as the “detached state”). Possible modes in which the information processing device 1 is used in the detached state include at least two modes, one in which a single user uses the two controllers 3 and 4, and another in which two users each use one controller.
  • (Mode in Which Single User Uses Two Controllers)
  • FIG. 15 is a diagram showing an example of a single user holding two controllers 3 and 4 to use the information processing device 1 in the detached state. As shown in FIG. 15, in the detached state, a user can operate one set of two controllers 3 and 4 by holding them respectively in the left hand and the right hand.
  • Note that in the present embodiment, it is possible to calculate information regarding the movement and/or the attitude of a controller based on the detection results of the acceleration sensor and/or the angular velocity sensor provided in the controller. Therefore, the information processing device 1 can accept, as an input, the operation of moving the controller itself. A user is allowed not only to perform an operation on the operation sections (the operation buttons and the analog stick) of the controller but also to perform an operation of moving the controller itself. That is, in the present embodiment, the information processing device 1, being a portable device, can allow a user to perform an operation of moving the controller (without moving the display). Also, the information processing device 1, being a portable device, can allow a user to perform an operation away from the display 12.
  • Note that the information processing device 1 can calculate information regarding movement and/or attitude of the information processing device 1 based on the detection results of the acceleration sensor 89 and/or the angular velocity sensor 90 provided in the main unit 2 not only in the detached state but also in the attached state.
  • In the detached state, the main unit 2 and the controllers 3 and 4 communicate with each other using wireless communication. That is, the main unit 2 receives operation data from the controllers 3 and 4 with which a wireless connection has been established (i.e., which have been paired with the main unit 2), and executes an information process based on the received operation data (specifically, using the operation data as an input).
  • In the present embodiment, for wireless communication, the main unit 2 distinguishes between the two controllers that are its communication partners. That is, the main unit 2 identifies whether the received operation data is from the left controller 3 or from the right controller 4. The method for distinguishing between controllers will be described later.
  • (Mode in Which Two Users Each Use One Controller)
  • FIG. 16 is a diagram showing an example of two users each holding one controller to use the information processing device 1 in the detached state. As shown in FIG. 16, in the detached state, two users can each perform an operation. Specifically, one user (referred to as the “first user”) can perform an operation by using the left controller 3 while the other user (referred to as the “second user”) can perform an operation by using the right controller 4. For example, the information processing device 1 executes an information process of controlling the action of a first object (e.g., the player character) in a virtual space based on an operation performed on the left controller 3, and controlling the action of a second object in the virtual space based on an operation performed on the right controller 4. Note that as in the mode shown in FIG. 15, also in the mode shown in FIG. 16, a user can operate the operation sections provided on the controller and/or perform an operation of moving the controller itself.
  • In the present embodiment, the positional relationship between the analog stick 52 and the operation buttons 53 to 56 on the right controller 4 is opposite from the positional relationship between these two types of operation sections on the left controller 3. Therefore, if two users hold the left controller 3 and the right controller 4 in the same orientation, as shown in FIG. 16, for example, the positional relationship between the two types of operation sections on one of the two controllers will be the same as that on the other controller. That is, in the present embodiment, users can use the left controller 3 and the right controller 4 detached from the main unit 2 in the same fashion with respect to the two types of operation sections. Thus, it is possible to improve the controllability of the controller.
  • In the detached state, the four operation buttons 33 to 36 of the left controller 3 may be used for the same functions as the four operation buttons 53 to 56 of the right controller 4 (in other words, they may be used for giving the same instructions). Specifically, the right direction button 33 may be used for the same function as the Y button 56, the lower direction button 34 for the same function as the X button 55, the upper direction button 35 for the same function as the B button 54, and the left direction button 36 for the same function as the A button 53. Thus, in the present embodiment, the functions of the operation buttons 33 to 36 may vary between the attached state and the detached state. Note however that the instruction assigned to each operation button can be freely determined by the program executed on the main unit 2.
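  • The button assignment described above can be expressed as a simple lookup, sketched below for illustration; the table mirrors the example given in the text, and an actual program is free to assign the buttons differently.

    # Function of the left controller's four operation buttons, keyed by state.
    LEFT_BUTTON_FUNCTION = {
        "attached": {          # used for directional instructions
            "right direction button 33": "right",
            "lower direction button 34": "down",
            "upper direction button 35": "up",
            "left direction button 36": "left",
        },
        "detached": {          # mirrors the right controller's A/B/X/Y buttons
            "right direction button 33": "Y",
            "lower direction button 34": "X",
            "upper direction button 35": "B",
            "left direction button 36": "A",
        },
    }

    def button_function(button, attached):
        return LEFT_BUTTON_FUNCTION["attached" if attached else "detached"][button]

    print(button_function("right direction button 33", attached=False))  # -> Y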
  • Note that in FIG. 16, the information processing device 1 divides the display area of the display 12 into two display areas to display an image for the first user (e.g., an image including the first object) in one of the display areas and an image for the second user (e.g., an image including the second object) in the other display area. Note however that depending on the application running on the information processing device 1, the information processing device 1 may display an image for two users (e.g., an image including both the first object and the second object) without dividing the display area of the display 12.
  • As in the mode shown in FIG. 15, also in the mode shown in FIG. 16, the main unit 2 and the controllers 3 and 4 communicate with each other in wireless communication. Then, the main unit 2 distinguishes between the controllers that are its communication partners.
  • (Other Modes)
  • In the present embodiment, the right controller 4 includes the infrared image-capturing section 123. Therefore, with the right controller 4 detached from the main unit 2, the information processing device 1 may execute an information process based on the image-capturing results (referred to also as detection results) from the infrared image-capturing section 123. Examples of the information process include the following processes.
  • For example, when an attachment including an operation section (hereinafter referred to as an “extension controller”) is attached to the right controller 4, the main unit 2 can detect an operation performed on the operation section based on the image-capturing results from the infrared image-capturing section 123. Therefore, the main unit 2 can execute an information process in accordance with an operation performed on the operation section by using the image-capturing results.
  • Specifically, the extension controller can be attached and detached to and from the right controller 4, and include buttons and a movable operation section such as a stick, for example. The extension controller includes, inside the housing, a movable portion that moves (herein, the movement includes rotation) in response to an operation performed on an operation section. For example, the movable portion is a member configured to move in response to a button, which is an operation section, being pressed down. Here, the extension controller is attached to the right controller 4 so that the infrared image-capturing section 123 can capture an image of the movable portion inside the housing. For example, the housing of the extension controller is provided with a window portion, and the infrared image-capturing section 123 can capture an image of the movable portion via the window portion, with the extension controller attached to the right controller 4. Then, the main unit 2 can identify an operation performed on an operation section of the extension controller based on the position and/or the attitude of the movable portion in the image captured by the infrared image-capturing section 123. Thus, by using captured images, the main unit 2 may execute an information process in response to an operation performed on the operation section.
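  • For illustration, the sketch below decides whether such a button is pressed from where the movable portion appears in the captured image; the reference positions and the tolerance are assumptions, since the embodiment only states that the operation is identified from the position and/or the attitude of the movable portion.

    def button_pressed(movable_portion_y_px, pressed_y_px=110.0, tolerance_px=10.0):
        # The movable portion appears near a known position when the button is down.
        return abs(movable_portion_y_px - pressed_y_px) <= tolerance_px

    for y in (78.0, 95.0, 108.0):
        print(y, "->", "pressed" if button_pressed(y) else "released")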
  • The main unit 2 may detect a gesture input by using a hand of a user based on the image-capturing results from the infrared image-capturing section 123 so as to execute an information process in accordance with the gesture input. For example, a user holds the right controller 4 in one hand, and an image is captured of the other hand of the user using the infrared camera of the infrared image-capturing section 123. Note that in the present embodiment, the infrared camera is arranged so as to capture an image of an area under the right controller 4. Therefore, a user makes a gesture input by placing the other hand under the right controller 4. The main unit 2 obtains information based on the image-capturing results from the right controller 4 so as to identify a gesture input made by a user. Then, the main unit 2 executes an information process based on the gesture input.
  • A gesture input may be any input made by using an operation object to be operated (moved) by a user. An operation object may be a body of a user (which may be a part of the body, such as a hand or the face, or may be the whole body), may be an object held by a user, or may include both. The information processing device 1 may recognize the shape of the operation object as a gesture input, may recognize the position, the orientation and/or the movement of the operation object as a gesture input, or may recognize a combination thereof as a gesture input. For example, a user can make a gesture input by using the shape of a hand, the movement of a hand, the position of a hand (with respect to the right controller 4), the orientation (attitude) of a hand, etc.
  • In addition to (or instead of) identifying a gesture input, the infrared image-capturing section 123 may be used to calculate the position and/or the orientation of the right controller 4 with respect to a predetermined marker. For example, a user may place a marker at an intended position (e.g., around the display 12 or around the TV 6), and perform an operation of moving the right controller 4 within such a range that the infrared camera can capture an image of the marker. Note that the marker may be made of a material that reflects infrared light, for example. Then, the information processing device 1 can calculate the position and/or the orientation of the right controller 4 with respect to the marker based on the image-capturing results from the infrared image-capturing section 123. Moreover, the information processing device 1 can execute an information process by using the calculated information as a user's operation input.
  • The main unit 2 can also capture an image of a hand of a user (specifically, the vein pattern of the hand) by means of the infrared image-capturing section 123, to thereby authenticate (specifically, by vein authentication) a user based on the captured image. Moreover, the main unit 2 can also measure the pulse of a user by capturing an image of a hand of the user by means of the infrared image-capturing section 123. That is, where infrared light output onto and reflected by a hand of a user is detected by the infrared image-capturing section 123, the main unit 2 can calculate the pulse of the user based on changes in the reflected waves.
  • Note that the above description has been directed to an example where the infrared image-capturing section 123 is used when the right controller 4 is detached from the main unit 2. Now, the information processing device 1 can also execute an information process based on the image-capturing results from the infrared image-capturing section 123 with the right controller 4 attached to the main unit 2.
  • In the present embodiment, the main unit 2 is provided with input sections (specifically, the touch panel 13, the acceleration sensor 89, the angular velocity sensor 90, etc.). Therefore, a user can use only the main unit 2 with the controllers 3 and 4 detached therefrom. Then, a user can use the information processing device 1 while the weight thereof is reduced.
  • In the present embodiment, the information processing device 1 may be used with one of the left controller 3 and the right controller 4 detached from the main unit 2 and with the other controller attached to the main unit 2. Then, the main unit 2 may communicate with the detached controller in wireless communication and with the attached controller in wired communication.
  • [3-3. Mode in Which Three or More Controllers are Used]
  • As described above, in the present embodiment, the main unit 2 can communicate with a plurality of left controllers. The main unit 2 can also communicate with a plurality of right controllers. Therefore, in the present embodiment, three or more controllers can be used simultaneously.
  • FIGS. 17A and 17B are diagrams showing example modes of use where three or more controllers are used. FIGS. 17A and 17B show cases in which a total of four controllers, i.e., two left controllers 3 a and 3 b and two right controllers 4 a and 4 b, are used. Note that it is assumed here that the controllers are detached from the main unit 2. When four controllers are used, at least two modes of use are conceivable: a mode in which four users each use one controller (FIG. 17A), and a mode in which two users each use two controllers (specifically, a pair of left and right controllers) (FIG. 17B).
  • (Mode in Which Each User Uses One Controller)
  • In FIG. 17A, each user uses one of the four controllers 3 a, 3 b, 4 a and 4 b. That is, in the present embodiment, where four controllers are provided, four users, User A to User D, are allowed to perform operations each using a controller. For example, based on an operation performed on a controller, the information processing device 1 executes an information process of controlling the action of an object associated with that controller. Note that also in the modes of use shown in FIGS. 17A and 17B, as in the modes of use shown in FIG. 15 and FIG. 16, each user is allowed to perform an operation on an operation section provided on the controller and/or to perform an operation of moving the controller itself.
  • In FIG. 17A, the main unit 2 communicates, in wireless communication, with each of the four controllers 3 a, 3 b, 4 a and 4 b. In the present embodiment, the main unit 2 distinguishes the four controllers 3 a, 3 b, 4 a and 4 b from one another. That is, the main unit 2 identifies from which of the four controllers the operation data has been received. Note that in the case of FIG. 17A, the main unit 2 distinguishes between the left controller 3 a and the left controller 3 b, and distinguishes between the right controller 4 a and the right controller 4 b. Note that the method for distinguishing each controller will be described later.
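  • A minimal sketch of how received operation data could be dispatched per controller is shown below; the identifiers and the table are assumptions for illustration, and the actual method for distinguishing controllers is described later in the text.

    # Each wireless connection is tagged with the controller it belongs to.
    CONNECTIONS = {
        "conn-1": {"controller": "left 3 a", "user": "A"},
        "conn-2": {"controller": "left 3 b", "user": "B"},
        "conn-3": {"controller": "right 4 a", "user": "C"},
        "conn-4": {"controller": "right 4 b", "user": "D"},
    }

    def dispatch(connection_id, operation_data):
        entry = CONNECTIONS[connection_id]
        print("operation data from", entry["controller"],
              "used as input of user", entry["user"], ":", operation_data)

    dispatch("conn-2", {"buttons": 0b0001})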
  • (Mode in Which Each User Uses a Pair of Controllers)
  • In FIG. 17B, each user uses a pair of controllers from the four controllers 3 a, 3 b, 4 a and 4 b. That is, User A uses a pair of the left controller 3 a and the right controller 4 a, and User B uses a pair of the left controller 3 b and the right controller 4 b. Thus, in the present embodiment, where four controllers are provided, two users can each operate a pair of controllers (which can also be referred to as a set of controllers).
  • The information processing device 1 executes an information process using, as a set of data, two pieces of operation data received from a pair of controllers. For example, based on operations performed on a pair of controllers, the information processing device 1 executes an information process of controlling the action of an object associated with the pair of controllers. Specifically, the action of the first object is controlled based on operations performed on the left controller 3 a and the right controller 4 a, and the action of the second object is controlled based on operations performed on the left controller 3 b and the right controller 4 b.
  • Now, in the present embodiment, in the mode of use shown in FIG. 17B, the main unit 2 registers each pair of a left controller and a right controller used by a single user. The main unit 2 executes an information process by using operation data from the two controllers included in the registered pair as a set of data (e.g., as operation data for controlling one control object).
  • While there is no particular limitation on the method for registering a pair of controllers, a left controller and a right controller can be attached to the main unit 2 so as to register the controllers as a pair, in the present embodiment. That is, the main unit 2 registers, as a pair of controllers, a left controller and a right controller that are attached thereto at the same time. For example, when registering pairs of controllers shown in FIG. 17B, a user first attaches the left controller 3 a and the right controller 4 a to the main unit 2 and detaches these two controllers, i.e., the left controller 3 a and the right controller 4 a from the main unit 2, after which the other controllers, i.e., the left controller 3 b and the right controller 4 b, are attached to the main unit 2. Then, a pair of the left controller 3 a and the right controller 4 a and a pair of the left controller 3 b and the right controller 4 b can be registered with the main unit 2. Note that the details of the pairing process will be described later.
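  • A minimal sketch of the pair registration described above is shown below; the identifiers are illustrative, and only the rule that a left controller and a right controller attached at the same time are registered as a pair is taken from the text.

    registered_pairs = []   # each entry: (left controller, right controller)

    def on_both_controllers_attached(left_id, right_id):
        # Register the pair of controllers currently attached to the main unit.
        pair = (left_id, right_id)
        if pair not in registered_pairs:
            registered_pairs.append(pair)

    # User A attaches and detaches controllers 3 a and 4 a, then User B does the
    # same with controllers 3 b and 4 b.
    on_both_controllers_attached("left 3 a", "right 4 a")
    on_both_controllers_attached("left 3 b", "right 4 b")
    print(registered_pairs)   # operation data from each pair is handled as one set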
  • Note that where three or more controllers are used, the information processing system can be used in various other modes, other than the modes of use shown in FIGS. 17A and 17B. For example, the information processing system can be used in a mode in which a user uses a pair of left and right controllers while another user uses one controller. For example, a user can use a controller or controllers attached to the main unit 2 while another user uses a controller or controllers detached from the main unit 2.
  • [3-4. Mode in Which Image is Displayed on TV]
  • As described above, in the present embodiment, when the information processing device 1 is attached to the cradle 5, the information processing device 1 can output images (and sound) to the TV 6 via the cradle 5. FIG. 18 is a diagram showing an example mode of use where images are displayed on a TV. As shown in FIG. 18, the information processing system in the present embodiment can use the TV 6 as a display device (and a sound output device).
  • [3-4-1. Operation When Image is Displayed on TV]
  • FIGS. 19A, 19B, 19C, and 19D are diagrams showing an example operation flow in which images are displayed on a TV. The operation of switching from a mode in which the information processing device 1 is used as a portable device to a mode in which it is used as a console device (i.e., using the TV 6 as a display device) will be described. Note that it is assumed here that the cradle 5 is connected to the TV 6 in advance. It is also assumed that a charging device (not shown) (e.g., an AC adaptor) is connected to the power terminal 134 of the cradle 5, and the cradle 5 is receiving commercial power supply.
  • First, a user uses the information processing device 1 as a portable device, i.e., in a state in which it is not attached to the cradle ((1) shown in FIG. 19A). In this state, when switching to a mode in which the information processing device 1 is used as a console device, the user attaches the information processing device 1 to the cradle 5 ((2) shown in FIG. 19B). Thus, the lower terminal 27 of the information processing device 1 and the main body terminal 73 of the cradle 5 are connected together. In this process, the information processing device 1 with the controllers 3 and 4 attached thereto may be attached to the cradle 5, or the information processing device 1 with the controllers 3 and 4 detached therefrom (i.e., the main unit 2) may be attached to the cradle 5.
  • Note that in the present embodiment, upon detecting the attachment of the information processing device 1 to the cradle 5, the information processing device 1 turns off the display 12, the details of which will be described later. Thus, in the present embodiment, the display 12 of the main unit 2 is not used when attached to the cradle 5. Note that in other embodiments, the main unit 2 may still display images on the display 12 when attached to the cradle 5. In the present embodiment, upon detecting the detachment of the information processing device 1 from the cradle 5, the information processing device 1 turns on the display 12.
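  • A minimal sketch of this display handling is shown below; the callback name is an illustrative stand-in for the embodiment's internal operations.

    def on_cradle_attachment_changed(attached, set_display_power):
        # The built-in display 12 is turned off while on the cradle and
        # turned back on when the device is detached from the cradle.
        set_display_power(not attached)

    on_cradle_attachment_changed(
        attached=True,
        set_display_power=lambda on: print("display 12", "on" if on else "off"),
    )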
  • As described above, in the present embodiment, the information processing device 1 can be attached to the cradle 5 either in the front-facing orientation or in the back-facing orientation. This allows a user to place the information processing device 1 in any orientation, thereby facilitating the attachment to the cradle 5.
  • Note that in other embodiments, the cradle 5 may be capable of supporting the information processing device 1 in the back-facing orientation (i.e., in an orientation in which the display 12 is hidden facing the second support 72) and not capable of supporting the information processing device 1 in the front-facing orientation (i.e., in an orientation in which the display 12 is not hidden). In the present embodiment, the display 12 is turned off when the information processing device 1 is placed in the cradle 5. Although turning off the display 12 is the intended operation by the information processing device 1, a user may possibly mistake the display 12 being turned off for a breakdown or a malfunction. This can be avoided by not allowing the information processing device 1 to be placed in the cradle 5 in the front-facing orientation.
  • In the present embodiment, upon detecting the attachment of the information processing device 1 to the cradle 5, the cradle 5 starts charging the information processing device 1. That is, upon detecting the attachment of the information processing device 1 to the cradle 5, the processing section 135 instructs the power control section 133 to perform an operation of supplying power from the power terminal 134 to the main unit 2. As the power control section 133 starts this operation, the cradle 5 starts charging the main unit 2. That is, the battery 98 in the main unit 2 is charged by the power supplied from the cradle 5 via the lower terminal 27.
  • If a controller (specifically, the left controller 3 and/or the right controller 4) is attached to the main unit 2 while the main unit 2 is charged by the cradle 5, the main unit 2 charges the controller attached thereto. That is, in such a case, the power control section 97 of the main unit 2 supplies the power from the cradle 5 via the lower terminal 27 to the controller via a terminal corresponding to the controller attached to the main unit 2 (specifically, the left-side terminal 17 and/or the right-side terminal 21). Thus, the controller is charged. That is, when the left controller 3 is charged, the battery of the power supply section 108 is charged by the power supplied via the terminal 42. When the right controller 4 is charged, the battery of the power supply section 118 is charged by the power supplied via the terminal 64.
  • Note that in other embodiments, the cradle 5 may charge the main unit 2 under a predetermined condition. For example, the power control section 133 of the cradle 5 may perform charging on the condition that the remaining charge of the battery 98 of the main unit 2 attached to the cradle 5 is less than or equal to a predetermined level. Similarly, the main unit 2 may charge a controller under a predetermined condition. For example, the power control section 97 of the main unit 2 may perform charging on the condition that the remaining charge of the battery of a controller attached to the main unit 2 is less than or equal to a predetermined level.
  • The above operation regarding charging may be executed even when the power of the information processing device 1 is off. That is, even when the information processing device 1 is attached to the cradle 5 when the power of the information processing device 1 is off, the cradle 5 may charge the information processing device 1 (i.e., charge the main unit 2 and/or controllers).
  • Moreover, in other embodiments, the main unit 2 may charge controllers when the main unit 2 is not attached to the cradle 5. This reduces the possibility that a user can no longer play because the battery of a controller has run out even though the battery 98 of the main unit 2 still has a sufficient charge. As described above, in the present embodiment, the battery 98 of the main unit 2 has a greater charging capacity than the battery of a controller, which further reduces this possibility.
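  • As a rough illustration of the charging behavior described above, the following Python sketch expresses the conditional charging decisions; the names (Battery, CHARGE_THRESHOLD, and so on) and the threshold value are assumptions introduced only for illustration, not values from the present embodiment.

      # Hypothetical sketch; the threshold and type names are assumptions.
      from dataclasses import dataclass

      CHARGE_THRESHOLD = 0.9  # assumed "predetermined level" of remaining charge

      @dataclass
      class Battery:
          remaining: float  # 0.0 (empty) to 1.0 (full)

      def cradle_should_charge_main_unit(main_unit_battery: Battery,
                                         attached_to_cradle: bool) -> bool:
          # The cradle charges the main unit only while the main unit is attached and,
          # in the conditional variant, only when the remaining charge is at or below
          # the predetermined level.
          return attached_to_cradle and main_unit_battery.remaining <= CHARGE_THRESHOLD

      def main_unit_should_charge_controller(controller_battery: Battery,
                                             controller_attached: bool) -> bool:
          # The main unit charges an attached controller under the same kind of
          # condition, whether or not the main unit itself is on the cradle.
          return controller_attached and controller_battery.remaining <= CHARGE_THRESHOLD

      if __name__ == "__main__":
          print(cradle_should_charge_main_unit(Battery(0.4), attached_to_cradle=True))        # True
          print(main_unit_should_charge_controller(Battery(0.95), controller_attached=True))  # False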
  • When the information processing device 1 is attached to the cradle 5, under a predetermined condition, the information processing device 1 outputs images (and sound) to the TV 6 and causes the TV 6 to output the images (and the sound) ((3) shown in FIG. 19C). That is, the information processing device 1 transmits data of the images and the sound to be output to the cradle 5 under a predetermined condition. Note that in the present embodiment, "the images and the sound to be output" refer to the images and the sound that are produced or obtained by a program (e.g., an OS program or an application program) running at the point in time when the information processing device 1 is attached to the cradle 5. For example, if a game application is running at such a point in time, the information processing device 1 outputs, to the cradle 5, data of the images and the sound that are produced by the game application. For example, if an application for obtaining and reproducing video images from the Internet is running at such a point in time, the information processing device 1 transmits, to the cradle 5, data of the images and the sound that are obtained by the application.
  • Note that there is no particular limitation on the condition under which the images and the sound to be output are transmitted to the cradle 5 (referred to as the "image output condition"), but the image output condition in the present embodiment is the satisfaction of all of Conditions 1 to 3 below.
  • Condition 1: the cradle 5 is connected to the TV 6.
  • Condition 2: the cradle 5 is receiving power supply.
  • Condition 3: the cradle 5 is an authorized product (or a genuine product) (in other words, the cradle 5 is a product that is authorized by the manufacturer of the information processing device 1).
  • When these three conditions are satisfied, the information processing device 1 determines that the image output condition is satisfied. In this case, the information processing device 1 transmits, to the cradle 5, the image and the sound to be output. Note that in other embodiments, the information processing device 1 may use, as the image output condition, one or two conditions of Conditions 1 to 3 above, or may use other conditions, as the image output condition, in addition to (or instead of) Conditions 1 to 3 above.
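  • The check of the image output condition can be illustrated with the following minimal Python sketch; the CradleStatus structure and its field names are assumptions introduced only for illustration.

      # Hypothetical sketch; the structure and field names are assumptions.
      from dataclasses import dataclass

      @dataclass
      class CradleStatus:
          connected_to_tv: bool         # Condition 1
          receiving_power: bool         # Condition 2
          is_authorized_product: bool   # Condition 3

      def image_output_condition_satisfied(status: CradleStatus) -> bool:
          # All of Conditions 1 to 3 must hold before the images and the sound
          # to be output are transmitted to the cradle 5.
          return (status.connected_to_tv
                  and status.receiving_power
                  and status.is_authorized_product)

      if __name__ == "__main__":
          print(image_output_condition_satisfied(CradleStatus(True, True, True)))   # True
          print(image_output_condition_satisfied(CradleStatus(True, False, True)))  # False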
  • Upon receiving data of the images and the sound from the information processing device 1, the cradle 5 transmits data of the images and the sound to the TV 6. Thus, “the images and the sound to be output” are output from the TV 6 ((3) shown in FIG. 19C). When the TV 6 is ready to output the images and the sound, a user is allowed to perform operations using controllers ((4) shown in FIG. 19D).
  • Note that when no image is displayed on the display 12 of the information processing device 1 (i.e., when the power is off, or when in the sleep mode), no image is displayed on the TV 6 even if the information processing device 1 is attached to the cradle 5.
  • In the present embodiment, even if the image output condition is not satisfied, the information processing device 1 turns off the image display on the display 12 while attached to the cradle 5. Note however that in other embodiments, if the image output condition is not satisfied, the information processing device 1 may resume the image display on the display 12.
  • As described above, in the present embodiment, the image display output can be switched from the display 12 to the TV 6 by attaching the information processing device 1 to the cradle 5 while images are displayed on the display 12 of the information processing device 1. That is, in the present embodiment, a user can easily (and seamlessly) switch the image display output only by placing the information processing device 1 on the cradle 5.
  • Note that as described above, in the present embodiment, the information processing device 1 can be used in a mode of use in which the images are displayed on the display 12 and a mode of use in which the images are displayed on the TV 6. Now, in the present embodiment, the information processing device 1 changes its operation mode depending on these two modes of use. That is, the information processing device 1 can operate at least in two different modes of the portable mode and the console mode. In the portable mode, some of the functions of the information processing device 1 are limited, the details of which will be described later. Note that the switching of the operation mode will be described later (see “[3-5. Changing operation mode]” and “[4-3. Mode setting process]”).
  • Note that in other embodiments, the information processing device 1 may be allowed to communicate directly with the TV 6 (i.e., with no cradle 5 interposed therebetween). Then, the information processing device 1 can transmit the image and/or the sound directly to the TV 6. Note that there is no particular limitation on the communication method used between the information processing device 1 and the TV 6, and it may be, for example, wired communication using a cable (e.g., an HDMI cable) or wireless communication. If the information processing device 1 communicates directly with the TV 6, the cradle 5 may be used for the purpose of charging the information processing device 1, for example. Also in such a case, as in the case of the embodiment described above, the information processing device 1 may transmit the image and/or the sound to the TV 6 at least on the condition that the information processing device 1 is attached to the cradle 5.
  • [3-4-2. Mode of Use Where Image is Displayed on TV]
  • Where the image and the sound are output from the TV 6, a user can perform operations by using a controller or controllers ((4) shown in FIGS. 19A-19D). In the case described above, since the main unit 2 is attached to the cradle 5, it is difficult to use the controllers 3 and 4 while they remain attached to the main unit 2. Therefore, in such a case, a user may perform operations by using a controller or controllers detached from the main unit 2. For example, a user detaches the left controller 3 and/or the right controller 4 from the main unit 2 as necessary, and performs operations. Now, in the present embodiment, the controllers 3 and 4 can be detached from the main unit 2 by sliding them upward relative to the main unit 2. Therefore, a user can conveniently detach the controllers 3 and 4 from the main unit 2 while the main unit 2 remains attached to the cradle 5.
  • Note that if controllers other than the controllers 3 and 4 attached to the main unit 2 can communicate with the main unit 2 in wireless communication, such other controllers may be used.
  • If a controller detached from the main unit 2 is used in a mode of use in which images are displayed on the TV 6, the controller and the main unit 2 communicate with each other in wireless communication.
  • The mode of use where the TV 6 is used as a display device is similar to the modes of use described above in “[3-2. Mode in which a pair of controllers are used detached]” and “[3-3. Mode in which three or more controllers are used]”, except that the TV 6 is used instead of the display 12. That is, in the present embodiment, also in the case where the TV 6 is used as a display device, as in the case in which the display 12 is used, one user may perform operations using a pair of controllers (see FIG. 18), or two users may perform operations each using one controller. When controllers other than the left controller 3 and the right controller 4 are provided, three or more users may perform operations each using one controller, or two or more users may perform operations each using a pair of controllers.
  • [3-5. Changing Operation Mode]
  • As described above, in the present embodiment, the information processing device 1 can be used in a mode of use in which images are displayed on the display 12 and a mode of use in which images are displayed on the TV 6. In the present embodiment, the information processing device 1 changes its operation mode depending on these two modes of use. That is, the information processing device 1 can operate at least in two different modes of the portable mode and the console mode.
  • The portable mode is a mode in which the information processing device 1 is used as a portable device. In the portable mode, images obtained or produced by the information processing device 1 are displayed on the display 12. Sound obtained or produced by the information processing device 1 is output from the speaker 88. In the portable mode, settings of the information processing device 1 are changed as shown in (a) and (b) below.
  • (a) Setting for limiting processing power of main unit 2
  • In the present embodiment, in the portable mode, the clock frequency range over which the CPU 81 is allowed to operate is limited. In the present embodiment, the clock frequency at which the CPU 81 is allowed to operate can be specified within a predetermined range by the program executed on the main unit 2. In the portable mode, the range over which a clock frequency can be specified by the program is limited as compared with that in the console mode. For example, the range over which a clock frequency can be specified in the console mode is X1 [Hz] or less, whereas it is limited to X2 (<X1) [Hz] in the portable mode. Note that if the main unit 2 includes a GPU (Graphics Processing Unit) in addition to a CPU, the range of processing power (i.e., clock frequency) may be limited for the CPU and/or for the GPU.
  • In the present embodiment, in the portable mode, the image rendering capacity (which can also be referred to as the image generating capacity) is limited. Specifically, in the portable mode, the resolution (in other words, the number of pixels) of images generated by the main unit 2 is lower than that in the console mode.
  • With the limitation (a) above, the amount of computation to be executed on the main unit 2 is limited in the portable mode, and it is therefore possible to reduce the heat generation and the power consumption of the main unit 2.
  • (b) Setting for limiting operation of cooling fan 96
  • In the present embodiment, in the portable mode, the operation of the cooling fan 96 is limited. Specifically, in the portable mode, the rotations per minute (rpm) at which the cooling fan 96 is allowed to operate (in other words, the fan speed) is limited to be lower than the highest rpm at which the cooling fan 96 is allowed to operate in the console mode. For example, in the portable mode, the cooling fan 96 is controlled to operate at a predetermined rpm or less, which is lower than the maximum rpm.
  • With the limitation (b) above, it is possible to reduce the operation noise of the cooling fan 96. In the portable mode, it is assumed that a user uses the main unit 2 close to the user. In view of this, in the present embodiment, since the operation noise of the cooling fan 96 can be reduced in the portable mode, it is possible to reduce the possibility that the user may feel the operation noise is too loud.
  • On the other hand, the console mode is a mode in which the information processing device 1 is used as a console-type device. In the console mode, images obtained or produced by the information processing device 1 are displayed on the TV 6. The sound obtained or produced by the information processing device 1 is output from the speaker of the TV 6. In the console mode, the functional limitations imposed in the portable mode are lifted. That is, in the console mode, the limitations (a) and (b) described above are lifted. Therefore, in the console mode, a program running on the information processing device 1 can make better use of the processing power of the CPU 81. The information processing device 1 can display, on the TV 6, images of a higher resolution than in the portable mode. The information processing device 1 can also be cooled by the cooling fan 96 more effectively than in the portable mode.
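  • The following Python sketch illustrates, under assumed placeholder values, how the limitations (a) and (b) and their lifting in the console mode might be represented; the concrete clock frequencies, resolutions and fan speeds are not values from the present embodiment.

      # Hypothetical sketch; the numeric values are placeholders, not embodiment values.
      from dataclasses import dataclass

      @dataclass
      class ModeSettings:
          max_cpu_clock_hz: int          # upper bound a program may specify (X1 vs. X2)
          render_resolution: tuple       # resolution of images generated by the main unit 2
          max_fan_rpm: int               # upper bound for the cooling fan 96

      CONSOLE_MODE = ModeSettings(max_cpu_clock_hz=1_000_000_000,
                                  render_resolution=(1920, 1080),
                                  max_fan_rpm=6000)
      PORTABLE_MODE = ModeSettings(max_cpu_clock_hz=500_000_000,    # X2 < X1 (limitation (a))
                                   render_resolution=(1280, 720),   # lower than in the console mode
                                   max_fan_rpm=3000)                # limitation (b): quieter fan

      def clamp_requested_clock(requested_hz: int, settings: ModeSettings) -> int:
          # A program may specify a clock frequency, but only within the range
          # allowed by the current operation mode.
          return min(requested_hz, settings.max_cpu_clock_hz)

      if __name__ == "__main__":
          print(clamp_requested_clock(800_000_000, PORTABLE_MODE))  # limited to 500000000
          print(clamp_requested_clock(800_000_000, CONSOLE_MODE))   # 800000000 is allowed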
  • In the present embodiment, the switching between the portable mode and the console mode is done as follows. That is, when the information processing device 1 is attached to the cradle 5 and the image output condition described above is satisfied, the information processing device 1 switches the operation mode to the console mode. On the other hand, when the information processing device 1 is detached from the cradle 5, the information processing device 1 switches the operation mode to the portable mode. Note that the switching of the operation mode will be described later (see FIG. 24).
  • Note that differences between settings in the portable mode and those in the console mode are not limited to those described above. For example, in other embodiments, the information processing device 1 may impose only one of the limitations (a) and (b) described above in the portable mode. For example, in other embodiments, the information processing system may use different communication methods for the communication with external devices via a network (e.g., the Internet) (in other words, may switch between different communication methods) in the portable mode and in the console mode. For example, consider a case in which a predetermined network connection device is connected to the cradle 5, the predetermined network connection device having a function of connecting to and communicating with the network. In such a case, in the portable mode, the information processing device 1 may communicate with the network using the network communication section 82. On the other hand, in the console mode, the information processing device 1 may communicate with the network via the cradle 5 and the network connection device.
  • For example, in an application executed by the information processing device 1, settings in the application may be changed between the portable mode and the console mode. For example, in a game application, particulars of the game (e.g., the rules, the operation method, etc., of the game) may be changed between the portable mode and the console mode (see FIG. 25).
  • Note that the operation mode (the portable mode or the console mode) is set independently of the ON mode and the sleep mode. That is, the main unit 2 may be either in the ON mode or in the sleep mode when the operation mode is the portable mode, and likewise, the main unit 2 may be either in the ON mode or in the sleep mode when the operation mode is the console mode.
  • [4. Specific Example of Process of Information Processing System]
  • Next, some processes to be executed when the information processing system operates as described above will be described by way of more specific examples.
  • [4-1. Registration Process]
  • As described above, in the present embodiment, the main unit 2 and the controllers 3 and 4 communicate with each other. Thus, the main unit 2 executes a registration process for registering controllers so that the main unit 2 can identify controllers to communicate with the main unit 2. Note that those controllers offered (specifically, sold) together with the main unit 2 may be pre-registered with the main unit 2 at the time of offering.
  • FIG. 20 is a flow chart showing an example flow of a registration process executed on the main unit 2. In the present embodiment, the registration process is executed in response to a controller being attached to the main unit 2. That is, the main unit 2 detects the attachment of the left controller 3 or the right controller 4, and starts executing the registration process shown in FIG. 20 in response to detecting the attachment.
  • There is no particular limitation on the method for detecting a controller being attached to the main unit 2. For example, the sensing method may be a method based on a signal state (e.g., a voltage state) at a predetermined pin included in the terminal of the main unit 2 and/or controller. For example, the sensing method may be a method in which the subject device mechanically senses the connection of a terminal of another device thereto. The main unit 2 and a controller may both have the sensing function described above, or either one of them may have the sensing function. Where only one of the devices has the sensing function, that device may indicate, to the other device, that the connection has been sensed, as necessary.
  • Note that the processes of the steps of the flow chart shown in FIG. 20 (this similarly applies to the flow charts in FIG. 23 to FIG. 27 to be discussed later) are merely illustrative, and the order of steps may be switched around as long as similar results are obtained, and other processes may be executed in addition to (or instead of) the process of any step. While the present embodiment is described assuming that the processes of the steps of the flow chart are executed by the CPU 81 of the main unit 2, processes of some of the steps of the flow chart may be executed by a processor or a dedicated circuit other than the CPU 81. Some of the processes executed by the main unit 2 may be executed by another information processing device capable of communicating with the main unit 2 (e.g., a server capable of communicating with the main unit 2 via a network). That is, the processes shown in FIG. 20, FIG. 23 to FIG. 27 may be executed through a cooperation of a plurality of information processing devices including the main unit 2.
  • In the registration process shown in FIG. 20, first, in step S1, the CPU 81 determines whether or not the controller sensed to have been attached to the main unit 2 has been registered. Now, in the present embodiment, the main unit 2 stores, in a storage section thereof (e.g., the flash memory 84), registration information representing controllers that have been registered. The determination of step S1 is based on this registration information.
  • FIG. 21 is a diagram showing an example of registration information. In the present embodiment, the registration information is information in which number information, identification information and wireless communication information are associated together, as shown in FIG. 21. The number information is a number assigned to a registered controller. The indicator LED described above on the controller may be controlled to represent a value based on this number.
  • The identification information is information representing a value (e.g., an ID) that is uniquely assigned to a controller. With this identification information, it is possible to uniquely identify a controller. In the present embodiment, identification information includes information representing whether a controller is a left controller or a right controller. That is, the main unit 2 can determine whether a controller is a left controller or a right controller based on the identification information assigned to the controller. Note that in other embodiments, the identification information does not need to include information representing whether a controller is a left controller or a right controller. In such a case, the registration information may include information representing whether a controller is a left controller or a right controller (separately from the identification information). Note that the CPU 81 can determine whether a controller is a left controller or a right controller based on whether the identification information of the controller has been obtained through the left-side terminal 17 or through the right-side terminal 21.
  • The wireless communication information indicates whether or not the connection setting (i.e., pairing) of the device for wireless communication with the main unit 2 has been done. That is, if the pairing between the main unit 2 and the controller has been done, information representing “registered” is stored as the wireless communication information of the controller. On the other hand, if the pairing between the main unit 2 and the controller has not been done, information representing “unregistered” is stored as the wireless communication information of the controller. Note that the main unit 2 may store information regarding the connection setting for wireless communication (separately from the registration information) so as to eliminate the need to perform pairing again for a controller for which pairing has been done.
  • Note that a part of the registration information may be deleted or changed in response to an instruction from a user. For example, the main unit 2 may delete information regarding a specified controller or change a number assigned to the controller, in response to an instruction from a user.
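  • For illustration, the registration information of FIG. 21 could be represented by a data structure such as the following Python sketch; the field names and example values are hypothetical.

      # Hypothetical sketch; field names and example values are assumptions.
      from dataclasses import dataclass

      @dataclass
      class RegistrationEntry:
          number: int          # number information: number assigned to the controller
          controller_id: str   # identification information: unique value for the controller
          is_left: bool        # whether it is a left controller (may instead be encoded
                               # in the identification information itself)
          paired: bool         # wireless communication information: True = "registered"

      # The registration information as a whole, e.g., as kept in the flash memory 84.
      registration_info = [
          RegistrationEntry(number=1, controller_id="L-0001", is_left=True, paired=True),
          RegistrationEntry(number=2, controller_id="R-0001", is_left=False, paired=False),
      ]

      def is_registered(controller_id: str) -> bool:
          # Step S1 of FIG. 20: a controller is registered if its identification
          # information appears in the registration information.
          return any(e.controller_id == controller_id for e in registration_info)

      if __name__ == "__main__":
          print(is_registered("L-0001"))  # True
          print(is_registered("L-0002"))  # False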
  • Referring back to FIG. 20, in step S1 described above, the CPU 81 first obtains identification information from a controller sensed to have been attached. Note that it is assumed that a memory of a controller (e.g., the memories 102 and 112) stores, in advance, identification information assigned to the controller. A controller transmits the identification information thereof to the main unit 2 in response to the controller being connected to the main unit 2 or in response to a request from the CPU 81. The CPU 81 determines whether or not a controller sensed to have been attached has been registered based on whether or not the obtained identification information is included in the registration information. If the determination result of step S1 is negative, the process of step S2 is executed. On the other hand, if the determination result of step S1 is affirmative, the process of step S3 is executed, skipping the process of step S2.
  • In step S2, the CPU 81 registers a controller sensed to have been attached. That is, the CPU 81 updates the registration information stored in the storage section so that the number information, the identification information and the wireless communication information of the controller are associated together and added to the registration information. The number information may be, for example, information representing a number that has not been used for any other registered controller. The identification information may be identification information obtained from the controller in step S1 described above. The wireless communication information is set to “unregistered” since pairing has not been done at this point in time. Following step S2, the process of step S3 is executed.
  • In step S3, the CPU 81 determines whether or not both a left controller and a right controller are attached to the main unit 2. That is, it is determined whether or not the attachment of the left controller 3 and the right controller 4 to the main unit 2 can be sensed at the present point in time. If the determination result of step S3 is affirmative, the process of step S4 is executed. On the other hand, if the determination result of step S3 is negative, the CPU 81 ends the registration process, skipping the process of step S4.
  • In step S4, the CPU 81 registers the left controller 3 and the right controller 4 attached to the main unit 2 as a pair. Now, in the present embodiment, the main unit 2 stores pairing information representing a pair of a left controller and a right controller in the storage section (e.g., the flash memory 84).
  • FIG. 22 is a diagram showing an example of pairing information. In the present embodiment, as shown in FIG. 22, the pairing information is information in which left identification information and right identification information are associated together. The left identification information is identification information of a left controller registered as a pair, from among registered controllers (in other words, controllers represented by identification information included in the registration information). The right identification information is identification information of a right controller registered as a pair, from among registered controllers (in other words, controllers represented by identification information included in the registration information). Left identification information and right identification information that are associated together in the pairing information represent a pair of a left controller and a right controller.
  • In step S4 described above, the CPU 81 first obtains identification information of the left controller and the right controller attached to the main unit 2. Note that for a controller whose identification information has been obtained in step S1 described above, the identification information does not need to be obtained again. Next, the CPU 81 adds the obtained identification information of the left controller and that of the right controller, associated together, to the pairing information. That is, the CPU 81 updates the pairing information so as to add the pair of the two obtained pieces of identification information to the (pre-update) pairing information. Then, the updated pairing information is stored in the storage section. Note that if information representing this pair of two pieces of identification information is already included in the pairing information, the CPU 81 does not need to update the pairing information. If information representing a pair that includes only one of the two pieces of identification information is already included in the pairing information, the CPU 81 deletes that pair from the pairing information. Thus, any existing pair that shares a controller with the pair to be registered is unregistered (i.e., deleted from the pairing information). After step S4, the CPU 81 ends the registration process.
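  • The following Python sketch illustrates, in simplified form, the registration of step S2 and the pair registration of step S4, including the removal of an existing pair that shares a controller with the newly registered pair; the identifiers, data layout and helper names are assumptions.

      # Hypothetical sketch; identifiers and data layout are assumptions.
      registration_info: dict[str, dict] = {}    # controller_id -> {"number": ..., "paired": ...}
      pairing_info: list[tuple[str, str]] = []   # (left controller id, right controller id) pairs

      def register_controller(controller_id: str) -> None:
          # Step S2: add the controller with an unused number; pairing for wireless
          # communication has not been done yet, so it starts as "unregistered".
          if controller_id in registration_info:
              return
          used = {entry["number"] for entry in registration_info.values()}
          number = next(n for n in range(1, len(used) + 2) if n not in used)
          registration_info[controller_id] = {"number": number, "paired": False}

      def register_pair(left_id: str, right_id: str) -> None:
          # Step S4: any existing pair that shares a controller with the new pair is
          # removed first, and the new pair is then recorded (re-recording an identical
          # pair leaves the stored pairs effectively unchanged).
          global pairing_info
          pairing_info = [p for p in pairing_info if left_id not in p and right_id not in p]
          pairing_info.append((left_id, right_id))

      if __name__ == "__main__":
          for cid in ("L-0001", "R-0001", "R-0002"):
              register_controller(cid)
          register_pair("L-0001", "R-0001")
          register_pair("L-0001", "R-0002")   # the old pair containing L-0001 is unregistered
          print(pairing_info)                 # [('L-0001', 'R-0002')]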
  • As described above, in the present embodiment, in response to a controller being attached to the main unit 2, the controller is registered with the main unit 2. Thus, a user can easily register controllers. In the present embodiment, since the registration process is executed when the main unit 2 and the controllers communicate with each other in wired communication, it is possible to reduce the possibility that the main unit 2 cannot obtain information (specifically, identification information) from controllers, and to reduce the possibility that registration fails.
  • In the present embodiment, in response to a left controller and a right controller being both attached to the main unit 2, these two controllers are registered as a pair. A user can attach any two controllers that the user wishes to use as a pair to the main unit 2, and the user can therefore register a pair through an easy and straightforward operation.
  • Note that in the present embodiment, an accessory (see FIG. 31) may be provided to which a left controller and a right controller can be attached at the same time, the details of which will be described later. Then, when a left controller and a right controller are both attached to the accessory, the main unit 2 registers these two controllers as a pair, as when a left controller and a right controller are both attached to the main unit 2. The details of such a pairing process will be described in “[5-2. Accessory for controllers]” below.
  • Note that the main unit 2 may register, as a pair, two controllers attached to the main unit 2 at the same time as in the embodiment described above, or may register, as a pair, two controllers that are not attached to the main unit 2 at the same time, unlike in the embodiment described above. For example, the main unit 2 may register, as a pair, a left controller last attached thereto and a right controller last attached thereto.
  • Note that in other embodiments, any other pairing method may be used for registering pairs of controllers. For example, the main unit 2 may register, as a pair, two controllers on which a predetermined operation is performed within a predetermined period of time. Specifically, the main unit 2 prompts a user to perform a predetermined operation in response to satisfaction of a predetermined condition (e.g., a pairing instruction has been received from a user). Then, the main unit 2 registers, as a pair, two controllers on which the predetermined operation is performed within a predetermined period of time from the prompting, from among two or more controllers being in communication with the main unit 2. Note that the “predetermined operation” may be, for example, an operation of pressing A button, or may be an operation of bumping two controllers against each other (this operation can be determined, for example, based on detection results of an acceleration sensor).
  • [4-2. Wireless Setting Process]
  • As described above, in the present embodiment, the main unit 2 and the controllers 3 and 4 may communicate with each other in wireless communication. Therefore, in the present embodiment, the information processing device 1 performs a registration (referred to also as pairing) for enabling wireless communication between the main unit 2 and the controllers. Note that those controllers offered (specifically, sold) together with the main unit 2 may be pre-registered with the main unit 2 for wireless communication at the time of offering.
  • In the present embodiment, the main unit 2 executes a wireless setting process for enabling wireless communication with controllers. FIG. 23 is a flow chart showing an example flow of a wireless setting process executed on the main unit 2. In the present embodiment, the wireless setting process is executed in response to a controller being removed from the main unit 2. That is, the main unit 2 senses the removal of the left controller 3 or the right controller 4, and starts executing the wireless setting process shown in FIG. 23 in response to the removal (referred to as “detachment”). Note that although there is no particular limitation on the method for sensing the detachment of a controller from the main unit 2, a similar method to the method for sensing the attachment of a controller to the main unit 2 may be used, for example.
  • In the wireless setting process, first, in step S11, the CPU 81 determines whether or not the controller sensed to have been detached from the main unit 2 has been paired for wireless communication. This determination can be done by using the registration information described above. That is, if wireless communication information included in the registration information indicates “registered” for the controller sensed to have been detached, the CPU 81 determines that the controller has been paired for wireless communication. On the other hand, if wireless communication information included in the registration information indicates “unregistered” for the controller sensed to have been detached, the CPU 81 determines that the controller has not been paired for wireless communication. If the determination result of step S11 is negative, the process of step S12 is executed. On the other hand, if the determination result of step S11 is affirmative, the CPU 81 ends the wireless setting process, skipping the process of step S12.
  • In step S12, the CPU 81 executes a pairing process for enabling wireless communication with a controller sensed to have been detached. The pairing process in the present embodiment may be similar to a pairing process of a conventional communication technique in accordance with the Bluetooth (registered trademark) standard. Note that in the present embodiment, a controller executes the pairing process in response to being removed from the main unit 2, the details of which will be described later. Thus, the main unit 2 and the controller are paired together, establishing a wireless connection therebetween. Following step S12, step S13 is executed.
  • In step S13, the CPU 81 determines whether or not the pairing with the controller sensed to have been detached has succeeded. Note that the process of step S13 is executed after lapse of a predetermined period of time from the execution of step S12. If the determination result of step S13 is affirmative, the process of step S14 is executed. On the other hand, if the determination result of step S13 is negative, the CPU 81 ends the wireless setting process, skipping the process of step S14.
  • In step S14, the CPU 81 updates the registration information. That is, the CPU 81 changes the wireless communication information included in the registration information to “registered” for the controller sensed to have been detached. Following step S14, the CPU 81 ends the wireless setting process.
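  • A simplified sketch of the main-unit-side wireless setting process (steps S11 to S14) is shown below in Python; the try_pairing function is a stand-in for an actual pairing exchange and, like the other names, is an assumption introduced for illustration.

      # Hypothetical sketch; try_pairing stands in for an actual pairing exchange.
      import random

      registration_info = {"L-0001": {"paired": False}, "R-0001": {"paired": True}}

      def try_pairing(controller_id: str) -> bool:
          # Placeholder for the pairing process of step S12; success or failure is
          # simulated at random here purely for illustration.
          return random.random() < 0.9

      def on_controller_detached(controller_id: str) -> None:
          entry = registration_info.get(controller_id)
          if entry is None:
              return
          if entry["paired"]:               # step S11: already paired, nothing to do
              return
          if try_pairing(controller_id):    # steps S12 and S13
              entry["paired"] = True        # step S14: update the registration information

      if __name__ == "__main__":
          on_controller_detached("L-0001")
          print(registration_info["L-0001"]["paired"])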
  • In the present embodiment, for the wireless setting process, the controllers 3 and 4 execute a process similar to that of the main unit 2 (referred to as a "controller-side wireless setting process"). Specifically, a controller stores registration information in a storage section (e.g., the memories 102 and/or 112), in which identification information of the main unit to which the controller is attached and wireless communication information indicating whether or not the controller has been paired with that main unit are associated together. A controller (specifically, the communication control section 101 or 111) determines whether or not it has been paired with the main unit 2 in response to the removal from the main unit 2. If it has not been paired, the controller executes the pairing process. Moreover, the controller determines whether or not the pairing has succeeded, and updates the registration information if the pairing has succeeded. Thus, in the present embodiment, when a controller is removed from the main unit 2, the main unit 2 and the controller each perform the pairing process, and a wireless connection is established upon successful pairing.
  • As described above, in the present embodiment, in response to the removal of a controller from the main unit 2, a setting process for wireless communication between the main unit 2 and the controller is executed to establish a wireless connection. In the present embodiment, a controller communicates in wired communication while it is attached to the main unit 2, and in wireless communication while it is removed from the main unit 2. Therefore, by executing the setting process when the controller is removed from the main unit 2, it is possible to establish a wireless connection with appropriate timing. Note that in the present embodiment, since a controller is registered when the controller is attached to the main unit 2 for the first time, the setting process is executed when the controller is removed from the main unit 2 for the first time. Then, it is possible to reduce the possibility that “the wireless connection is not established when a registered controller is removed from the main unit 2”, thus providing the information processing device 1 that is easy to use for a user. Moreover, according to the present embodiment, a user can easily cause the information processing device 1 to execute the setting process described above by removing a controller from the main unit 2.
  • Note that the condition on which the setting process for wireless communication is executed on the information processing device 1 may be any condition, and it is not limited to the removal of a controller from the main unit 2. For example, in other embodiments, the setting process may be executed in response to a controller being attached to the main unit 2. For example, in other embodiments, the setting process may be executed in response to an instruction from a user. Specifically, the main unit 2 may execute the wireless setting process (FIG. 23) in response to a predetermined instruction from a user. The predetermined instruction may be given by a user selecting a predetermined item displayed on a menu screen, or may be given by a user pressing a predetermined button provided on the main unit 2. Moreover, in response to a predetermined instruction from a user, a controller may execute the "controller-side wireless setting process" described above. This predetermined instruction may be given by pressing down a predetermined button provided on the controller.
  • In the present embodiment, when a controller is removed from the main unit 2, the main unit 2 and the controller each determine whether or not the controller and the main unit 2 have been paired together, so that the pairing process is executed if they have not been paired together. In other embodiments, the main unit 2 and the controller may execute the pairing process, upon removal of the controller from the main unit 2, without performing the determination process.
  • (Operation on Controller)
  • In the present embodiment, if a controller is attached to the main unit 2, the controller regards the main unit 2 as its communication partner. That is, the controller transmits operation data in wired communication to the main unit 2, to which the controller is attached. On the other hand, if a controller is not attached to the main unit 2 but the controller has been paired with another main unit 2, the controller regards the other main unit 2 as its communication partner. That is, the controller transmits operation data in wireless communication to the other main unit 2, with which the controller has been paired. Thus, in the present embodiment, the controller transmits operation data to the main unit 2, irrespective of whether it is attached to the main unit 2. Note that in the present embodiment, if a controller has been paired with a plurality of main units 2, the controller regards one main unit that satisfies a predetermined condition as its communication partner. Although there is no particular limitation on the predetermined condition, it may be, for example, “being the main unit with which the controller has last been paired”. Note that if a controller is not attached to the main unit 2 and there is no main unit 2 with which the controller has been paired, the controller does not transmit operation data.
  • If a controller is able to communicate with a main unit 2, the controller repeatedly transmits operation data to the main unit 2, as its communication partner, at a rate of once per a predetermined period of time. If there is no main unit 2 with which a controller is able to communicate (i.e., where the main unit 2 as its communication partner has been turned OFF or set in the sleep mode), the controller stops transmitting operation data to a main unit 2. If there is newly a main unit 2 with which a controller is able to communicate (i.e., where the main unit 2 as its communication partner has been turned ON or set in the ON mode), the controller resumes transmitting operation data to the main unit 2.
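  • The partner-selection and transmission rules described above can be summarized by the following Python sketch; the function names are assumptions, and the use of the last-paired main unit as the predetermined condition follows the example given above.

      # Hypothetical sketch; the names are assumptions introduced for illustration.
      from typing import Optional

      def choose_partner(attached_main_unit: Optional[str],
                         paired_main_units: list) -> Optional[str]:
          # If the controller is attached to a main unit, that main unit is the
          # communication partner (wired communication).
          if attached_main_unit is not None:
              return attached_main_unit
          # Otherwise, a paired main unit satisfying the predetermined condition is the
          # partner; here the condition is "the main unit with which the controller has
          # last been paired", assuming the list is kept in pairing order.
          if paired_main_units:
              return paired_main_units[-1]
          # Not attached and not paired with any main unit: no operation data is sent.
          return None

      def should_transmit(partner: Optional[str], partner_reachable: bool) -> bool:
          # Transmission stops while the partner is turned OFF or in the sleep mode,
          # and resumes once communication with it is possible again.
          return partner is not None and partner_reachable

      if __name__ == "__main__":
          print(choose_partner(None, ["main-A", "main-B"]))          # main-B
          print(should_transmit("main-B", partner_reachable=False))  # False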
  • [4-3. Mode Setting Process]
  • As described above, in the present embodiment, the information processing device 1 operates in the portable mode and in the console mode. Therefore, the information processing device 1 executes a process (referred to as the “mode setting process”) of switching the operation mode between the portable mode and the console mode.
  • FIG. 24 is a flow chart showing an example flow of a mode setting process executed on the main unit 2. In the present embodiment, while the power of the main unit 2 is ON, a series of processes of steps S21 to S29 shown in FIG. 24 is executed repeatedly. Note that in the present embodiment, it is assumed that at the start of the process (i.e., when the power is turned ON), the operation mode of the main unit 2 is set to the portable mode.
  • In the mode setting process, first, in step S21, the CPU 81 determines whether or not the main unit 2 is attached to the cradle 5. Note that there is no particular limitation on the method for sensing the attachment of the main unit 2 to the cradle 5. For example, the CPU 81 may perform the sensing based on a signal state (e.g., a voltage state) at a predetermined pin included in the terminal of the main unit 2 and/or the cradle 5. If the determination result of step S21 is affirmative, the process of step S22 is executed. On the other hand, if the determination result of step S21 is negative, the process of step S21 is executed again. That is, while the operation mode is the portable mode and the main unit 2 is not attached to the cradle 5, the CPU 81 repeatedly executes the process of step S21. The process of step S21 is executed at a rate of once per a predetermined period of time, for example.
  • In step S22, the CPU 81 turns off the display 12. Specifically, the CPU 81 stops supplying power to the display 12. The CPU 81 also stops outputting image data to the display 12. Following step S22, the process of step S23 is executed.
  • As described above, the main unit 2 turns OFF the display 12 at a point when the main unit 2 is attached to the cradle 5. Note that in other embodiments, the main unit 2 may turn OFF the display 12 at a point when the image is output from the main unit 2 to the TV 6, rather than turning OFF the display 12 (i.e., the display 12 is left ON) at the point when the main unit 2 is attached to the cradle 5. In other embodiments, the main unit 2 may not turn OFF the display 12 even when the image is output from the main unit 2 to the TV 6 so that the image is displayed both on the display 12 and on the TV 6. Then, the cradle 5 may be configured so that the display 12 can be viewed while the main unit 2 is attached to the cradle 5. The main unit 2 may generate two different images so that different images are displayed on the TV 6 and on the display 12.
  • In step S23, the CPU 81 determines whether or not the image output condition is satisfied. That is, the CPU 81 determines whether or not Conditions 1 to 3 described above are satisfied. Note that Conditions 1 to 3 can be determined based on information obtained from the cradle 5. Specifically, for “Condition 1: the cradle 5 is connected to the TV 6”, the CPU 81 may obtain information from the cradle 5 that indicates whether or not the cradle 5 is connected to the TV 6, and make the determination based on the obtained information. For “Condition 2: the cradle 5 is receiving power supply”, the CPU 81 may obtain information from the cradle 5 that indicates whether or not power is supplied from the power terminal 134 to the cradle 5, and make the determination based on the obtained information. For “Condition 3: the cradle 5 is an authorized product”, the CPU 81 may obtain predetermined information with which it is possible to identify an authorized product, and make the determination based on the obtained information. Note that the predetermined information may for example be identification information assigned to the cradle or information that shows predetermined content if the product is an authorized product (in other words, information that shows different content if the product is not an authorized product). The CPU 81 obtains the information at a predetermined timing from the cradle 5, and determines whether or not the image output condition is satisfied based on the obtained information. There is no particular limitation on the predetermined timing, but it may be, for example, when the main unit 2 is attached to the cradle 5 or when step S23 is executed.
  • If the determination result of step S23 is affirmative, the process of step S25 to be described later is executed. On the other hand, if the determination result of step S23 is negative, the process of step S24 is executed. Note that in this case, no image will be displayed on the TV 6 even though a user has attached the information processing device 1 to the cradle 5. Therefore, the information processing device 1 may indicate to the user that no image will be displayed on the TV 6. For example, the information processing device 1 may give this indication by outputting a sound from the speaker 88 or by displaying an image on the display 12. For example, if the cradle 5 includes an indicator light-emitting portion (e.g., an LED provided on the sleep button 74), the information processing device 1 may give this indication by controlling the light-emitting portion. If an indicator light-emitting portion is provided on the cradle 5, the information processing device 1 may also control the light-emitting portion so as to indicate whether the operation mode is the portable mode or the console mode.
  • In step S24, the CPU 81 determines whether or not the main unit 2 has been removed from the cradle 5. If the determination result of step S24 is affirmative, the process of step S29 to be described later is executed. On the other hand, if the determination result of step S24 is negative, the process of step S23 described above is executed again. That is, while the main unit 2 is attached to the cradle 5 and the image output condition is not satisfied, the CPU 81 repeatedly executes a series of processes of steps S23 and S24. The series of processes is repeatedly executed at a rate of once per a predetermined period of time.
  • In step S25, the CPU 81 outputs images to the TV 6. That is, the CPU 81 outputs data of “the images and the sound to be output” to the cradle 5 via the lower terminal 27. The cradle 5 transfers the data to the TV 6. That is, the format of the data is converted through the conversion section 131 of the cradle 5, and the data is output to the TV 6 via the monitor terminal 132. Thus, the images and the sound are output from the TV 6. After the process of step S25, the CPU 81 repeatedly executes the process of outputting the images to the TV 6. This process is continued until the information processing device 1 ends the image display (e.g., the power of the information processing device 1 is turned OFF, or the information processing device 1 is set in the sleep mode), or until the operation mode is changed to the portable mode. Following step S25, the process of step S26 is executed.
  • In step S26, the CPU 81 changes the operation mode to the console mode. That is, the CPU 81 changes the settings of the main unit 2 so as to lift the functional limitations of the main unit 2 in the portable mode as described in “[3-5. Changing operation mode]” above. Following step S26, the process of step S27 is executed.
  • In step S27, the CPU 81 determines whether or not the main unit 2 has been removed from the cradle 5. If the determination result of step S27 is affirmative, the process of step S28 is executed. On the other hand, if the determination result of step S27 is negative, the process of step S27 is executed again. That is, while the operation mode is the console mode, the CPU 81 repeatedly executes the process of step S27. The process of step S27 is executed at a rate of once per a predetermined period of time, for example.
  • In step S28, the CPU 81 changes the operation mode to the portable mode. That is, the CPU 81 changes the settings of the main unit 2 so as to impose functional limitations on the main unit 2 as described in “[3-5. Changing operation mode]” above. The CPU 81 also stops the image output to the TV 6. Following step S28, the process of step S29 is executed.
  • In step S29, the CPU 81 outputs images to the display 12. That is, the CPU 81 displays the “images to be output” on the display 12. The CPU 81 also outputs the “sound to be output” from the speaker 88 (or the sound input/output terminal 25). After the process of step S29, the CPU 81 repeatedly executes the process of outputting images to the display 12. Following step S29, the process of step S21 is executed again.
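  • The flow of steps S21 to S29 can be summarized, in simplified form, by the following Python sketch; the dev object is a placeholder for the main unit 2, and every method called on it is an assumed stand-in for the corresponding hardware query or output operation.

      # Hypothetical, simplified sketch of steps S21 to S29; `dev` and all methods
      # called on it are assumed stand-ins for the corresponding hardware operations.
      import time

      PERIOD = 0.1  # assumed "predetermined period of time" between checks, in seconds

      def run_mode_setting_process(dev) -> None:
          while dev.power_is_on():
              dev.output_to_display()                    # step S29 (portable-mode output)
              while not dev.attached_to_cradle():        # step S21
                  time.sleep(PERIOD)
              dev.turn_off_display()                     # step S22
              removed = False
              while not dev.image_output_condition():    # step S23
                  if not dev.attached_to_cradle():       # step S24: removed from the cradle
                      removed = True
                      break
                  time.sleep(PERIOD)
              if removed:
                  continue                               # back to step S29 and then step S21
              dev.output_to_tv()                         # step S25
              dev.set_console_mode()                     # step S26: lift the limitations
              while dev.attached_to_cradle():            # step S27
                  time.sleep(PERIOD)
              dev.set_portable_mode()                    # step S28: impose the limitations again
              dev.stop_output_to_tv()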
  • As described above, in the present embodiment, the information processing device 1 can switch between the portable mode, in which images are displayed on the display 12, and the console mode, in which images are displayed on the TV 6. Now, the condition for changing the operation mode to the console mode (in other words, the image output condition) may be any condition. For example, the condition may be as follows.
  • In other embodiments, the image output condition may include a condition that “there has been a user instruction to output images to the TV 6”. For example, the cradle 5 may include an output instruction button for giving an instruction to output the images obtained or produced by the information processing device 1 to the TV 6. Then, the information processing device 1 may use, as one of the image output conditions, a condition that “the output instruction button has been pressed while the information processing device 1 is attached to the cradle 5”.
  • In other embodiments, the image output condition may include a condition that “the TV 6 is capable of displaying the images from the cradle 5”. Specifically, this condition is that the power of the TV 6 is ON and the input select setting thereof is such that input images from the cradle 5 are displayed. By using image output conditions including the condition described above, it is possible to avoid a situation where no image is displayed on the TV 6 even though images are output from the information processing device 1 via the cradle 5.
  • In addition, the information processing device 1 may further perform a control, over the TV 6, to turn ON the power of the TV 6, and/or to switch the input select so as to display an input from the cradle 5. Then, a user does not need to operate the TV 6 so as to satisfy the condition described above (i.e., that the TV 6 is able to display the images from the cradle 5), and it is possible to easily display the image on the TV 6. Note that the information processing device 1 can perform a control as described above by causing the cradle 5 to transmit a control signal (e.g., a CEC command of the HDMI standard) to the TV 6.
  • The information processing device 1 may change the image output condition in response to an instruction from a user. For example, the information processing device 1 may present a plurality of different conditions, as the image output conditions, on a predetermined setting changing screen, and allow a user to select one of the conditions. This is convenient for a user because the user can choose a condition that suits the user's way of using the device.
  • For example, where the cradle 5 is provided with an output instruction button as described above, the information processing system may use a switching-type button as the output instruction button so as to allow a user to change the condition. That is, the output instruction button of the cradle 5 can be switched between a first state and a second state each time it is pressed. Then, if the output instruction button is in the first state, the information processing device 1 interprets this to mean that there is a user instruction to output the images on the TV 6, whereas if the output instruction button is in the second state, the information processing device 1 interprets this to mean that there is no user instruction to output the images on the TV 6. Thus, a user can cause the images to be output on the TV 6 (under a certain condition) in response to the attachment of the main unit 2 to the cradle 5, by leaving the output instruction button in the first state.
  • In the description above, it can be said that based on the presence/absence of the user instruction described above, the information processing device 1 determines whether a user has attached the information processing device 1 to the cradle 5 merely for charging, or has attached the information processing device 1 to the cradle 5 to display the images on the TV 6. Now, in other embodiments, the information processing device 1 may make the determination above by any other method.
  • For example, the information processing device 1 may make the determination described above based on the state of the information processing device 1 regarding the execution of an application. Now, "the state of the information processing device 1 regarding the execution of an application" is, for example, a state regarding whether an application is running on the information processing device 1 or a state regarding whether the images produced or obtained by a running application are displayed on the display 12. Specifically, if an application is running or if the images produced or obtained by a running application are displayed on the display 12, the information processing device 1 may determine that a user has attached the information processing device 1 to the cradle 5 in order to display the images on the TV 6. Therefore, in such a case, the information processing device 1 may output the images to the TV 6 on the condition that the image output condition is satisfied. That is, the image output condition may include a condition that "an application is running, or the images produced or obtained by a running application are displayed on the display 12".
  • In other embodiments, the information processing device 1 may store, in a storage section (e.g., the flash memory 84), the operation mode at the point in time when the power of the information processing device 1 is turned OFF. Then, when the power of the information processing device 1 is turned ON, if the operation mode stored in the storage section indicates the portable mode, the information processing device 1 may execute the mode setting process from step S21, whereas if the operation mode stored in the storage section indicates the console mode, the information processing device 1 may execute the mode setting process from step S27. Thus, the operation mode of the information processing device 1 at the time of power-off can be carried on when the power is next turned ON.
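  • A minimal sketch of storing and restoring the operation mode across a power cycle, as in the variant described above, might look as follows in Python; the file used here merely stands in for the flash memory 84 and is an assumption.

      # Hypothetical sketch; the JSON file merely stands in for the flash memory 84.
      import json
      import os

      MODE_FILE = "operation_mode.json"   # assumed storage location

      def save_operation_mode(mode: str) -> None:
          # Called when the power is turned OFF: remember whether the device was in
          # the portable mode or the console mode.
          with open(MODE_FILE, "w") as f:
              json.dump({"mode": mode}, f)

      def restore_operation_mode(default: str = "portable") -> str:
          # Called at power-on: resume from the stored mode, falling back to the
          # portable mode if nothing has been stored yet.
          if not os.path.exists(MODE_FILE):
              return default
          with open(MODE_FILE) as f:
              return json.load(f).get("mode", default)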
  • [4-4. Example of Executing Information Process Depending on Operation Mode]
  • As described above, in the present embodiment, the information processing device 1 operates in one of a plurality of operation modes including the portable mode and the console mode. Therefore, in the information processing device 1, information processes executed by various programs running on the main unit 2 (e.g., the OS program and application programs) may differ in content between the portable mode and the console mode. An example information process, in which a different process is executed depending on the operation mode, will now be described.
  • FIG. 25 is a flow chart showing an example flow of an information process executed on the main unit 2. A series of information processes shown in FIG. 25 is performed as the CPU 81 executes a program (e.g., an application program or an OS program) that is executable by the information processing device 1, for example. For example, if a program to be executed is an application program, the series of information processes shown in FIG. 25 is started in response to a user instruction to launch the application program. For example, if a program to be executed is an OS program, the series of information processes shown in FIG. 25 is started in response to the power of the main unit 2 being turned ON.
• In the series of information processes shown in FIG. 25, first, in step S31, the CPU 81 obtains operation data from each controller. For example, the CPU 81 may obtain operation data from any controller with which the CPU 81 can communicate, whether via wired communication or wireless communication. Note that the main unit 2 may obtain operation data with substantially the same frequency from controllers with which it is communicating via wired communication and from controllers with which it is communicating via wireless communication. Then, since controllers are handled in the same manner regardless of their communication method, it is possible to improve the controllability of the controllers. Following step S31 described above, the process of step S32 is executed.
• Depending on the program related to the series of information processes, the CPU 81 may obtain operation data from controllers that satisfy a predetermined condition and not obtain operation data from controllers that do not satisfy the condition. For example, the predetermined condition may be a communication-related condition. That is, the CPU 81 may obtain operation data from controllers that communicate with the main unit 2 via wired communication and not obtain operation data from controllers that communicate with the main unit 2 via wireless communication. For example, the predetermined condition may be a condition regarding the function of a controller. That is, the CPU 81 may obtain operation data from controllers having a predetermined function (e.g., controllers having an analog stick, controllers having the NFC communication function, controllers having an acceleration sensor, etc.) and not obtain operation data from controllers not having these predetermined functions. If there is a controller that is capable of communication but from which operation data is not obtained, the main unit 2 may indicate to the user that the controller is not available for use with the running program.
• Note that in other embodiments, instead of the CPU 81 not obtaining operation data from controllers that do not satisfy a predetermined condition, the CPU 81 may not use the operation data obtained from the controllers that do not satisfy the predetermined condition in the process of step S33 or S34 to be described later.
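• The condition-based filtering described in the two preceding paragraphs can be pictured with the sketch below. The data model (a controller record with a connection type and a feature set) and the helper names are assumptions made for illustration and do not reflect an actual implementation.

```python
# Sketch (assumed data model): obtain operation data only from controllers
# that satisfy a program-specific predetermined condition.

from dataclasses import dataclass, field

@dataclass
class Controller:
    name: str
    wired: bool                                    # True if communicating via a wired terminal
    features: set = field(default_factory=set)     # e.g. {"analog_stick", "nfc", "accelerometer"}

    def read_operation_data(self):
        return {"controller": self.name, "buttons": 0, "stick": (0.0, 0.0)}

def obtain_operation_data(controllers, condition):
    """Collect operation data from controllers satisfying the condition and
    report the others so the user can be notified they are unavailable."""
    data, unavailable = [], []
    for c in controllers:
        if condition(c):
            data.append(c.read_operation_data())
        else:
            unavailable.append(c.name)
    return data, unavailable

controllers = [
    Controller("left", wired=True, features={"analog_stick", "accelerometer"}),
    Controller("right", wired=False, features={"analog_stick", "nfc"}),
]

# Example condition: only controllers communicating via wired communication.
data, unavailable = obtain_operation_data(controllers, lambda c: c.wired)

# Example condition: only controllers having the NFC function.
data, unavailable = obtain_operation_data(controllers, lambda c: "nfc" in c.features)
print(unavailable)  # controllers that could be indicated as not available for use
```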
• In step S32, the CPU 81 determines whether or not the operation mode of the information processing device 1 is the portable mode (i.e., whether it is the portable mode or the console mode). The information processing device 1 has information indicating the current operation mode stored in a predetermined storage section (e.g., the flash memory 84). The determination of step S32 is made based on this information. If the determination result of step S32 is affirmative (i.e., if the operation mode is the portable mode), the process of step S33 is executed. On the other hand, if the determination result of step S32 is negative (i.e., if the operation mode is the console mode), the process of step S34 is executed.
  • In step S33, the CPU 81 executes an information process based on operation data in accordance with settings in the portable mode. As described above, in the portable mode, the processing power of the main unit 2 is limited. That is, the CPU 81 executes the information process under such a limitation. For example, the CPU 81 executes a predetermined process (e.g., a game process) using, as an input, the operation data obtained in step S31, and produces images representing the process results, which are displayed on the display 12. Following step S33, the process of step S35 to be described later is executed.
• On the other hand, in step S34, the CPU 81 executes an information process based on operation data in accordance with settings in the console mode. As described above, in the console mode, the limitation on the processing power of the main unit 2 is lifted. For example, the CPU 81 may execute a predetermined process (e.g., a game process) using, as an input, the operation data obtained in step S31, and produce images representing the process results, which are displayed on the TV 6. Following step S34, the process of step S35 to be described later is executed.
• Depending on the program related to the series of information processes, the content of the predetermined process to be executed in steps S33 and S34 described above may differ depending on the operation mode. For example, if the program is a game program (i.e., if a game application is executed), the information processing device 1 may use different game content depending on the operation mode. Specifically, levels that can be played may differ between the portable mode and the console mode, and there may be levels that can only be played in the portable mode or levels that can only be played in the console mode. Also, the game mode may differ between the portable mode and the console mode.
• For example, if the screen display settings can be changed, the information processing device 1 may use different screen display settings for each operation mode. Note that the "screen display settings" refer, for example, to settings regarding the size of icons on the menu screen and of various information to be displayed together with images representing the game space (e.g., images representing the status of the player character, the map image, etc.). Therefore, in the portable mode, for example, since the images are displayed on the display 12, whose screen size is assumed to be small, graphical symbols such as icons may be displayed to be larger with respect to the displayed images (in other words, the display screen), whereas in the console mode, since the images are displayed on the TV 6, whose screen size is assumed to be large, those graphical symbols may be displayed to be smaller with respect to the displayed images.
  • In step S35, the CPU 81 determines whether or not to end the series of information processes shown in FIG. 25. This determination is made based on whether or not there has been a user instruction to end, for example. If the determination result of step S35 is negative, the process of step S31 is executed again. Thereafter, the series of processes of steps S31 to S35 is repeatedly executed until it is determined in step S35 that the series of information processes is to be ended. On the other hand, if the determination result of step S35 is affirmative, the CPU 81 ends the series of information processes shown in FIG. 25.
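• The overall flow of FIG. 25 (steps S31 to S35) can be summarized with the following sketch. The helper names, the mode query, the performance settings, and the output targets are assumptions standing in for the platform-specific details described above, not an actual implementation.

```python
# Sketch of the FIG. 25 flow (steps S31-S35), with hypothetical helpers.

def run_mode_dependent_loop(get_mode, obtain_operation_data, should_end):
    while True:
        operation_data = obtain_operation_data()          # step S31
        if get_mode() == "portable":                      # step S32
            # Step S33: process under the limited processing power
            # and display the result on the built-in display 12.
            result = process(operation_data, performance="limited")
            display_on(result, target="display 12")
        else:
            # Step S34: process with the limitation lifted
            # and display the result on the TV 6.
            result = process(operation_data, performance="full")
            display_on(result, target="TV 6")
        if should_end():                                  # step S35
            break

def process(operation_data, performance):
    # Placeholder for a game process or other predetermined process.
    return {"frame": "...", "performance": performance, "input": operation_data}

def display_on(result, target):
    print(f"output to {target}: {result['performance']}")

# Minimal usage example: one console-mode iteration, then end.
run_mode_dependent_loop(
    get_mode=lambda: "console",
    obtain_operation_data=lambda: {"buttons": 0},
    should_end=lambda: True,
)
```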
• As described above, in the present embodiment, the information processing device 1 may be configured so that the OS or an application executes a different process depending on the operation mode. Then, for example, it is possible to give the user a different kind of entertainment (e.g., a different game level or a different game mode) for each operation mode. Also, for example, it is possible to save the user the trouble of changing settings by automatically changing settings (e.g., the screen display settings) depending on the operation mode.
  • [4-5. Example of Executing Information Process Depending on Attached State of Controller]
• As described above, in the present embodiment, the information processing device 1 can be used both in a state in which a controller is removed from the main unit 2 and in a state in which the controller is attached to the main unit 2. Therefore, the information processing device 1 may be configured so that the content of the information processes to be executed by various programs executed on the main unit 2 (e.g., the OS program and application programs) differs depending on the attached state of a controller to the main unit 2 (i.e., whether the controller is attached to or removed from the main unit 2). Examples of information processes depending on the attached state of a controller will now be described below.
  • FIG. 26 is a flow chart showing an example flow of an information process executed on the main unit 2. As is the series of processes shown in FIG. 25, the series of information processes shown in FIG. 26 is executed by an application program or an OS program executable on the information processing device 1, for example. As is the series of processes shown in FIG. 25, the series of information processes shown in FIG. 26 is also started in response to a user instruction to launch an application program or in response to the power of the main unit 2 being turned ON.
  • In the series of information processes shown in FIG. 26, first, in step S41, the CPU 81 determines whether or not a controller is attached to the main unit 2. In the present embodiment, the CPU 81 determines whether or not a controller is attached to the main unit 2 for each of the left and right controllers 3 and 4. If it is determined that at least one of the left and right controllers 3 and 4 is attached to the main unit 2, the determination result of step S41 is affirmative. If the left and right controllers 3 and 4 are both removed from the main unit 2, the determination result of step S41 is negative. If the determination result of step S41 is affirmative, the process of step S42 is executed. On the other hand, if the determination result of step S41 is negative, the process of step S43 to be described later is executed.
  • In step S42, the CPU 81 obtains operation data from a controller attached to the main unit 2 via a wired connection. That is, the CPU 81 obtains operation data received via the left-side terminal 17 and the right-side terminal 21. Note that where only one of the left and right controllers 3 and 4 is attached, the CPU 81 obtains operation data only from the terminal corresponding to that controller. Note that the obtained operation data is stored in a predetermined storage section (e.g., the DRAM 85). Following step S42, the process of step S43 is executed.
  • In step S43, the CPU 81 obtains operation data from a controller removed from the main unit 2 via a wireless connection. That is, the CPU 81 obtains operation data received via the controller communication section 83. Note that if there are no controllers that are capable of wireless communication (in other words, controllers that are transmitting operation data to the main unit 2) at the point of step S43, the CPU 81 may skip the process of step S43 since no operation data is received by the controller communication section 83. Following step S43, the process of step S44 is executed.
  • In step S44, the CPU 81 executes a predetermined information process based on the operation data obtained in steps S42 and S43. Note that there is no particular limitation on the content of the information process. For example, if the running program is an OS program, the CPU 81 may move a cursor across the menu screen based on the operation data, or identify an application selected by a user based on the operation data to launch the identified application. If the running program is an application program, the CPU 81 may execute a process according to the application by using the operation data as an input to the application. Following step S44, the process of step S45 is executed.
  • In step S45, the CPU 81 outputs an image representing the result of the information process of step S44 to a display device (the display 12 or the TV 6). For example, if the running program is an OS program, the menu screen described above is displayed on the display device. For example, if the running program is an application program, an image of the application (e.g., a game image in the game application) is displayed on the display device. Following step S45, the process of step S46 is executed.
• In step S46, the CPU 81 determines whether or not to end the series of information processes shown in FIG. 26. This determination is made based on whether or not there has been a user instruction to end, for example. If the determination result of step S46 is negative, the process of step S41 is executed again. Thereafter, the series of processes of steps S41 to S46 is repeatedly executed until it is determined in step S46 that the series of information processes is to be ended. On the other hand, if the determination result of step S46 is affirmative, the CPU 81 ends the series of information processes shown in FIG. 26.
  • As described above, in the present embodiment, the main unit 2 varies the content of the process to be executed depending on the attached state of the controller. That is, if no controller is attached to the main unit 2, the main unit 2 skips the process of obtaining operation data via wired communication. Thus, it is possible to simplify the process to be executed on the main unit 2.
• Note that in the series of processes shown in FIG. 26, the main unit 2 executes the process of obtaining operation data via wireless communication (step S43), irrespective of the attached state of the controller. Now, depending on the application executed on the main unit 2, if the determination result of step S41 is affirmative (i.e., if a controller is attached to the main unit 2), the process of step S43 does not need to be executed. That is, depending on the running application, if a controller is attached, the main unit 2 may receive operation data only from the controller attached thereto while not accepting operation data from the other, unattached controller (via wireless communication).
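• A minimal sketch of the data-acquisition portion of FIG. 26 (steps S41 to S43), including the variant just mentioned, is given below. The reader objects and parameter names are illustrative assumptions only.

```python
# Sketch of steps S41-S43 of FIG. 26: skip wired acquisition when no
# controller is attached, and optionally skip wireless acquisition otherwise.

def collect_operation_data(attached_controllers, wireless_controllers,
                           wireless_only_when_detached=False):
    data = []
    if attached_controllers:                      # step S41
        for c in attached_controllers:            # step S42: wired terminals
            data.append(c.read())
        if wireless_only_when_detached:
            return data                           # variant: ignore wireless input
    for c in wireless_controllers:                # step S43: controller communication section
        data.append(c.read())
    return data

class FakeController:
    """Stand-in for a controller returning one sample of operation data."""
    def __init__(self, name):
        self.name = name
    def read(self):
        return {"from": self.name}

print(collect_operation_data([FakeController("left (wired)")],
                             [FakeController("right (wireless)")]))
```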
  • FIG. 27 is a flow chart showing an example flow of an information process executed on the main unit 2. As is the series of processes shown in FIG. 25, the series of information processes shown in FIG. 27 is executed by an application program or an OS program executable on the information processing device 1, for example. As is the series of processes shown in FIG. 25, the series of information processes shown in FIG. 27 is also started in response to a user instruction to launch an application program or in response to the power of the main unit 2 being turned ON.
• In the series of information processes shown in FIG. 27, first, in step S51, the CPU 81 determines whether or not a controller is attached to the main unit 2. In the present embodiment, the CPU 81 determines whether the left and right controllers 3 and 4 are both attached to the main unit 2 or at least one of the left and right controllers 3 and 4 is removed from the main unit 2. Note however that in other embodiments, the CPU 81 may determine whether at least one of the left and right controllers 3 and 4 is attached to the main unit 2 or the left and right controllers 3 and 4 are both removed from the main unit 2. If the determination result of step S51 is affirmative, the process of step S52 is executed. On the other hand, if the determination result of step S51 is negative, the process of step S54 is executed.
  • In step S52, the CPU 81 sets the control mode of the information process to the single-player control mode. That is, if step S52 is executed, since a controller is attached to the main unit 2, it is assumed that a single user performs operations by using the attached controller. The single-player control mode is a mode in which there is one control object, for example. For example, in the game program, the CPU 81 starts a game in which there is one player character. In an application in which operations are performed by using a cursor displayed on the screen, the CPU 81 displays a cursor. Following step S52, the process of step S53 is executed.
  • In step S53, the CPU 81 executes a predetermined process in the single-player control mode. That is, the CPU 81 obtains operation data from the left and right controllers 3 and 4, and executes a predetermined process based on the operation data. Note that there is no particular limitation on the content of the predetermined process. Now, in the single-player control mode, one control object (e.g., one player character or one cursor) is controlled based on operation data obtained from two controllers 3 and 4. Note that in step S53, the predetermined process described above is repeatedly executed. When a condition for ending the predetermined process is satisfied (e.g., when there is a user instruction to end the application), the CPU 81 ends the process of step S53, thereby ending the series of information processes shown in FIG. 27.
  • On the other hand, in step S54, the CPU 81 sets the control mode of the information process to the two-player control mode. That is, if step S54 is executed, it means that (at least one) controller is removed from the main unit 2, and it is therefore assumed that a total of two users are performing operations, wherein one user uses the removed controller while the other user uses the other controller. The two-player control mode is a mode in which there are two control objects, for example. For example, in the game program, the CPU 81 starts a game in which there are two player characters. In an application in which operations are performed by using a cursor displayed on the screen, the CPU 81 displays two cursors. Following step S54, the process of step S55 is executed.
  • In step S55, the CPU 81 executes a predetermined process in the two-player control mode. That is, the CPU 81 obtains operation data from the left and right controllers 3 and 4, and executes a predetermined process based on the operation data. Note that there is no particular limitation on the content of the predetermined process. In the two-player control mode, a first control object (e.g., one player character or one cursor) is controlled based on operation data obtained from the left controller 3, and a second control object, different from the first control object, is controlled based on operation data obtained from the right controller 4. Note that in step S55, as in step S53, the predetermined process is repeatedly executed. When a condition for ending the predetermined process is satisfied (e.g., when there is a user instruction to end the application), the CPU 81 ends the process of step S55, thereby ending the series of information processes shown in FIG. 27.
• As described above, in the present embodiment, when a program is launched, the attached state of controllers is sensed, and a control mode is set depending on the attached state (steps S52, S54). Thus, an appropriate control mode is selected automatically without the user selecting the control mode, and it is therefore possible to save the user the trouble of doing so.
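• The control-mode selection of FIG. 27 (steps S51 to S55) could be sketched as follows. The reduction of a "control object" to a simple record and all names used here are assumptions made for illustration.

```python
# Sketch of steps S51-S55 of FIG. 27: choose the single-player or the
# two-player control mode from the attached state sensed at program start.

def select_control_mode(left_attached, right_attached):
    # In the described embodiment the single-player mode is chosen only
    # when both controllers are attached to the main unit (step S51).
    if left_attached and right_attached:
        return "single-player"                    # step S52
    return "two-player"                           # step S54

def control_objects(mode):
    if mode == "single-player":
        # One control object driven by both controllers (step S53).
        return [{"object": "player 1", "inputs": ["left controller", "right controller"]}]
    # Two control objects, one per controller (step S55).
    return [{"object": "player 1", "inputs": ["left controller"]},
            {"object": "player 2", "inputs": ["right controller"]}]

mode = select_control_mode(left_attached=True, right_attached=False)
print(mode, control_objects(mode))   # -> two-player, two control objects
```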
  • Note that in FIG. 27, the main unit 2 determines the attached state of controllers at the start of the execution of a program (step S51). Now, in other embodiments, the main unit 2 may determine the attached state of controllers at a predetermined timing during the execution of a program. Then, in response to the attached state of controllers being changed, the main unit 2 may change the control mode. Note that there is no particular limitation on the predetermined timing. For example, where a game application is executed, the main unit 2 may determine the attached state of controllers at the point in time when one play of the game is complete, or may determine the attached state of controllers at the point in time when a predetermined instruction is given by a user.
  • In FIG. 27, the main unit 2 changes the number of users to perform operations depending on the attached state of controllers. There is no particular limitation on the content of the process to be changed depending on the attached state of controllers. For example, in other embodiments, the interpretation (specifically, the interpretation by the main unit 2) of an operation on a controller may be changed depending on the attached state of controllers. Specifically, when the left controller 3 is attached to the main unit 2, the CPU 81 may interpret the operation of tilting the analog stick 32 in the upper direction (i.e., in the y-axis direction shown in FIG. 1) as being an input of the upper direction. On the other hand, when the left controller 3 is removed from the main unit 2, the CPU 81 may interpret the operation of tilting the analog stick 32 in the upper direction as being an input of the left direction (assuming that the controller is held in a landscape position as shown in FIG. 16).
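• The change of interpretation just described, in which an upward tilt of the analog stick 32 is read as a leftward input when the detached left controller 3 is held in a landscape position, can be illustrated by a 90-degree remapping of the stick axes. The axis convention below (+y is "up", -x is "left" in the reading frame) is an assumption made for the sketch.

```python
# Sketch: reinterpret a stick reading depending on whether the left controller
# is attached (held upright) or detached and held in a landscape position.
# Assumed convention: +y means "up" and -x means "left".

def interpret_stick(x, y, attached):
    if attached:
        # Attached to the main unit: use the reading as-is.
        return (x, y)
    # Held sideways: rotate the reading by 90 degrees so that a tilt toward
    # the controller's own upper direction is read as a leftward input.
    return (-y, x)

print(interpret_stick(0.0, 1.0, attached=True))   # -> (0.0, 1.0): upward input
print(interpret_stick(0.0, 1.0, attached=False))  # -> (-1.0, 0.0): leftward input
```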
  • Note that each of the three series of processes shown in FIG. 25 to FIG. 27 may be executed in any phase on the main unit 2. That is, these three series of processes may each be executed in a different phase (i.e., in the execution of a different program). Two or three of these series of processes may be executed simultaneously (e.g., within a single program). For example, the main unit 2 may execute a process that combines together processes of FIG. 26 and FIG. 27 when a certain application is executed. Specifically, in steps S53 and S55 shown in FIG. 27, the CPU 81 may execute the predetermined process described above based on operation data obtained from the processes of steps S41 to S43 shown in FIG. 26.
  • [5. Other Devices Included in Information Processing System]
  • [5-1. Other Types of Controllers]
• As described above, in the present embodiment, the controllers 3 and 4 can be attached to and detached from the main unit 2. Therefore, there may be provided a left controller different from the left controller 3 shown in FIG. 5, and the other left controller may be attached to the main unit 2. Moreover, there may be provided a right controller different from the right controller 4 shown in FIG. 6, and the other right controller may be attached to the main unit 2. That is, in the present embodiment, controllers attached to the main unit 2 may be detached therefrom and other controllers may be attached thereto.
• FIG. 28 and FIG. 29 are diagrams each showing another example left controller. A left controller 201 shown in FIG. 28, as compared with the left controller 3 shown in FIG. 5, includes a cross-shaped key 202, instead of the operation buttons 33 to 36. A left controller 203 shown in FIG. 29, as compared with the left controller 3 shown in FIG. 5, includes a cross-shaped key 204, instead of the analog stick 32. Thus, another controller may have a different function from the controllers 3 and 4 shown in FIG. 5 and FIG. 6. Therefore, the information processing system may include other controllers having different functions from the controllers 3 and 4, in addition to (or instead of) the controllers 3 and 4.
  • FIG. 30 is a diagram showing an example information processing device with a right controller that is different from that of FIG. 1. In FIG. 30, another right controller 205, instead of the right controller 4 shown in FIG. 6, is attached to the main unit 2. On the right controller 205, as compared with the right controller 4 shown in FIG. 6, the arrangement of the analog stick 52 and the four operation buttons 53 to 56 is reversed. Thus, for an information processing device 1′ shown in FIG. 30, the arrangement of the analog stick and the four operation buttons is symmetric between the left controller and the right controller. That is, in the present embodiment, the arrangement of the operation section on the information processing device 1 can be changed by changing the controller to be attached to the main unit 2. Thus, the information processing system may include other controllers having different arrangements from the controllers 3 and 4, in addition to (or instead of) the controllers 3 and 4.
  • As described above, in the present embodiment, there may be provided different kinds of controllers having different functions and/or arrangements. Then, a user can use any of the different kinds of controllers by attaching the controller to the main unit 2. Thus, it is possible to provide a portable device with which it is possible to change the controller device (i.e., the controller) in accordance with, for example, the user preference or the content of the application to be executed on the information processing device 1.
  • [5-2. Accessory for Controllers]
  • The information processing device 1 may include an accessory configured so that the controllers 3 and 4, having been removed from the main unit 2, can be attached to the accessory. FIG. 31 is a diagram showing an example accessory to which controllers can be attached. As shown in FIG. 31, an extension grip 210, which is an example accessory, is an accessory to be used by a user for performing operations. The extension grip 210 includes a housing 211. The left controller 3 can be attached to the housing 211, and the right controller 4 can be attached thereto. Therefore, the extension grip 210 allows a user to perform operations while holding two controllers 3 and 4, having been removed from the main unit 2, as an integral unit.
  • Note that as mechanisms for allowing controllers to be attached thereto, the extension grip 210 includes mechanisms similar to those of the main unit 2 (specifically, the left rail member 15, the engagement hole 16, the stopper 18, the right rail member 19, the engagement hole 20 and the stopper 22). Thus, the controllers 3 and 4 can be attached to the extension grip 210, as they are attached to the main unit 2.
  • As shown in FIG. 31, the extension grip 210 includes grip portions to be held by a user. Specifically, the extension grip 210 accommodates the left controller 3 attached thereto on the left side of the center of the extension grip 210 in the left-right direction (the x-axis direction shown in FIG. 31). The extension grip 210 includes a left grip portion 212 provided on the left side (in other words, on the outer side in the left-right direction) of the area where the left controller 3 is attached. The extension grip 210 accommodates the right controller 4 attached thereto on the right side of the extension grip 210 in the left-right direction. The extension grip 210 includes a right grip portion 213 provided on the right side (in other words, on the outer side in the left-right direction) of the area where the right controller 4 is attached. Therefore, by holding the grip portions 212 and 213, a user can easily operate the controllers 3 and 4 attached to the extension grip 210.
  • Although not shown in the figure, the extension grip 210 includes a left-side terminal similar to the left-side terminal 17 of the main unit 2 at such a location that allows the left-side terminal to be connected to the terminal 42 of the left controller 3 attached to the extension grip 210. The extension grip 210 also includes a right-side terminal similar to the right-side terminal 21 of the main unit 2 at such a location that allows the right-side terminal to be connected to the terminal 64 of the right controller 4 attached to the extension grip 210. Therefore, when the controllers 3 and 4 are attached to the extension grip 210, the extension grip 210 and the controllers 3 and 4 are electrically connected to each other. Therefore, when the controllers and the extension grip 210 are connected to each other, they can communicate with each other and can supply power to each other.
• Moreover, although not shown in the figure, the extension grip 210 includes a power terminal similar to the power terminal 134 of the cradle 5. Therefore, by connecting a charging device (not shown) (e.g., an AC adaptor) to the power terminal, the extension grip 210 can receive power supply from the charging device via the power terminal. Then, the power supplied from the power terminal is supplied by the extension grip 210 to the left controller 3 and the right controller 4 via the left-side terminal and the right-side terminal. Therefore, by connecting the extension grip 210 to a charging device, it is possible to charge the left controller 3 and the right controller 4 attached to the extension grip 210.
  • As described above, the extension grip 210 has the function of charging controllers attached thereto. That is, the extension grip 210 includes a charging control section (i.e., the power terminal, and the left-side terminal or the right-side terminal), which uses power supplied to the extension grip 210 to charge controllers attached to the extension grip 210. Then, controllers can be charged while being attached to the extension grip. This improves the usability of the controllers.
• Note however that in other embodiments, the extension grip 210 may not have the charging function. Then, the extension grip 210 does not need to include electronic members such as terminals. By using no electronic members, it is possible to simplify the configuration of the extension grip 210.
  • When the controllers 3 and 4 are attached to the extension grip 210, the controllers 3 and 4 and the main unit 2 communicate with each other in wireless communication (as when the controllers 3 and 4 are removed from the main unit 2). That is, even when attached to the extension grip 210, the controllers 3 and 4 can communicate with the main unit 2.
  • Specifically, in the present embodiment, a controller transmits, to the main unit 2, attachment information regarding its attachment to another device. For example, when a controller is attached to another device, the attachment information indicates the device (and/or the type of the device) to which the controller is attached, and when a controller is not attached to another device, the attachment information indicates that the controller is not attached to another device. The attachment information may be transmitted together with the operation data (or while being included in the operation data), for example.
  • Thus, when a controller senses the attachment of the controller to the extension grip 210, the controller transmits, to the main unit 2 via wireless communication, a notification indicating that the controller has been attached to the extension grip 210. Specifically, the controller transmits, to the main unit 2, the attachment information indicating that the controller is attached to the extension grip 210.
• Note that there is no particular limitation on the method for a controller to identify the device to which the controller is attached. In the present embodiment, in response to a controller being attached to another device, the controller obtains identification information identifying the other device from the other device (herein, the extension grip 210). The identification information is, for example, identification information assigned to each device (more specifically, identification information unique to the device). Note that in the present embodiment, the identification information includes information indicating the type of the device so that it is possible to identify the type of the device from the identification information. Based on the identification information, a controller determines (or senses) that the device to which the controller has been attached is the extension grip 210, i.e., that the controller has been attached to the extension grip 210. Note that when a controller is removed from the extension grip 210, the controller transmits, to the main unit 2 via wireless communication, a notification indicating that the controller has been removed from the extension grip 210. That is, the controller transmits, to the main unit 2, the attachment information indicating that the controller is not attached to another device.
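• One way to picture the attachment-information exchange described above is the following sketch, in which the identification information is reduced to a small record carrying a device identifier and a device type. The message layout and field names are purely illustrative assumptions.

```python
# Sketch (illustrative message layout): attachment information reported by a
# controller to the main unit together with its operation data.

def make_attachment_info(identification_info=None):
    """identification_info is the record the controller obtained from the
    device it is attached to, or None when it is not attached to anything."""
    if identification_info is None:
        return {"attached": False}
    return {"attached": True,
            "device_id": identification_info["device_id"],
            "device_type": identification_info["device_type"]}

# On attachment: the controller reads the other device's identification
# information and reports it to the main unit via wireless communication.
extension_grip_id = {"device_id": "0123-4567", "device_type": "extension grip"}
report = {"operation_data": {"buttons": 0},
          "attachment_info": make_attachment_info(extension_grip_id)}

# On removal: the controller reports that it is no longer attached.
report_after_removal = {"operation_data": {"buttons": 0},
                        "attachment_info": make_attachment_info(None)}

print(report["attachment_info"]["device_type"])      # -> extension grip
print(report_after_removal["attachment_info"])       # -> {'attached': False}
```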
  • Now, the main unit 2 may change the interpretation of operations performed on a controller depending on the device to which the controller is attached. That is, the main unit 2 may change the method of identifying the content of input (in other words, the content of a user instruction) based on the operation data from a controller depending on the device to which the controller is attached. For example, in the example shown in FIG. 31, the left and right controllers 3 and 4 are attached to the extension grip 210 while being slightly inclined with respect to the up-down direction. Therefore, when the analog stick is tilted in the directly upward direction with respect to the extension grip 210 (the y-axis positive direction shown in FIG. 31), the tilt is not in the directly upward direction with respect to the controller. Therefore, when a controller is attached to the extension grip 210, the main unit 2 preferably changes the interpretation of directional inputs on the analog stick, as compared with a case where the controller is not attached to the extension grip 210 (e.g., when the controller is attached to the main unit 2). For example, the main unit 2 may change the direction of tilt of the analog stick that is interpreted as being an input in the directly upward direction.
  • Thus, when a controller is attached to the extension grip 210, the main unit 2 changes the interpretation of a directional input on the analog stick, as compared with a case in which the controller is attached to the main unit 2. Specifically, when a controller is attached to the extension grip 210, the main unit 2 modifies the tilt direction of the analog stick represented by the operation data by rotating the tilt direction by a predetermined angle. Note that the predetermined angle is, for example, the angle of inclination of the controller attached to the extension grip 210 with respect to the attitude of the controller when attached to the main unit 2. The predetermined angle may be preset based on the structure of the extension grip 210. The main unit 2 uses the modified tilt direction as the direction input by a user. That is, the main unit 2 executes an information process using the modified tilt direction as an input. Thus, when an analog stick is tilted in the directly upward direction with respect to the extension grip 210, the input direction is regarded as being the directly upward direction, thus allowing an information process to be executed appropriately.
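• The modification of the tilt direction described above amounts to a two-dimensional rotation of the stick reading by the preset inclination angle. The sketch below assumes the angle is known in advance from the structure of the extension grip 210, that its sign depends on which controller is concerned and which way it is inclined, and that the reading is a unit-scaled (x, y) pair; the specific angle value is illustrative only.

```python
import math

# Sketch: rotate the analog stick reading by a predetermined angle so that a
# tilt that is directly upward with respect to the extension grip is used as
# a directly upward input.

GRIP_INCLINATION_DEG = 15.0   # assumed inclination of the controller on the grip

def correct_stick_for_grip(x, y, inclination_deg=GRIP_INCLINATION_DEG):
    theta = math.radians(inclination_deg)
    # Standard 2-D rotation of the (x, y) tilt vector by theta.
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

# A tilt that is straight up in the grip's frame appears, in the controller's
# own frame, as a tilt inclined by the grip angle; after correction it is
# treated as a straight-up input again.
raw = (math.sin(math.radians(GRIP_INCLINATION_DEG)),
       math.cos(math.radians(GRIP_INCLINATION_DEG)))
print(correct_stick_for_grip(*raw))   # approximately (0.0, 1.0)
```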
  • When a controller is attached to the extension grip 210, the main unit 2 modifies the values of the detection results of the acceleration sensor and the angular velocity sensor of the controller. That is, the main unit 2 modifies the detection result of the acceleration sensor by rotating the direction of acceleration (i.e., the detection result of the acceleration sensor) by the above predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller (the z-axis direction shown in FIG. 1). The main unit 2 modifies the detection result of the angular velocity sensor by rotating the axis of rotation of the angular velocity (i.e., the detection result of the angular velocity sensor) by the above predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller (the z-axis direction shown in FIG. 1). Then, the main unit 2 calculates the movement and/or the attitude of the controller (in other words, the extension grip 210) based on the modified values of the acceleration and/or the angular velocity.
  • Note that the main unit 2 may calculate the movement and/or the attitude of the controller without modifying the values of the detection results of the acceleration sensor and the angular velocity sensor, and modify the calculated movement and/or the calculated attitude. That is, when the main unit 2 calculates the movement based on the detection results of the acceleration sensor and/or the angular velocity sensor, the main unit 2 modifies the calculated movement by rotating the calculated movement direction by the above predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller. When the main unit 2 calculates the attitude based on the detection results of the acceleration sensor and/or the angular velocity sensor, the main unit 2 modifies the calculated attitude by rotating the calculated attitude by a predetermined angle in a rotational direction about an axis extending in the front-rear direction of the controller.
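• Similarly, the modification of the inertial-sensor results described in the two preceding paragraphs can be sketched as a rotation about the controller's front-rear (z) axis applied either to the raw readings or to the movement or attitude calculated from them. The vector layout, axis convention, and angle value below are assumptions made for illustration.

```python
import math

# Sketch: rotate a 3-D quantity (acceleration, angular-velocity axis, or a
# calculated movement direction) by a predetermined angle about the z axis,
# assumed here to be the controller's front-rear direction.

def rotate_about_z(vec, angle_deg):
    x, y, z = vec
    t = math.radians(angle_deg)
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t),
            z)

GRIP_INCLINATION_DEG = 15.0   # illustrative value; sign depends on the controller

acceleration = (0.0, -9.8, 0.0)            # raw accelerometer reading (m/s^2)
corrected_acceleration = rotate_about_z(acceleration, GRIP_INCLINATION_DEG)

angular_velocity_axis = (0.26, 0.97, 0.0)  # axis of rotation from the gyro
corrected_axis = rotate_about_z(angular_velocity_axis, GRIP_INCLINATION_DEG)

# Alternative mentioned in the text: leave the raw readings untouched and
# apply the same rotation to the movement/attitude calculated from them.
calculated_movement = (1.0, 0.0, 0.0)
corrected_movement = rotate_about_z(calculated_movement, GRIP_INCLINATION_DEG)

print(corrected_acceleration, corrected_movement)
```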
  • As described above, in the present embodiment, the left controller 3 is attached to the extension grip 210 while being inclined in the first direction from a predetermined reference attitude (i.e., the attitude in which the up-down direction of the left controller 3 coincides with the up-down direction of the extension grip 210; or the attitude when the left controller 3 is attached to the main unit 2) (see FIG. 31). The right controller 4 is attached to the extension grip 210 while being inclined in a direction opposite to the first direction from a predetermined reference attitude (i.e., the attitude where the up-down direction of the right controller 4 coincides with the up-down direction of the extension grip 210). Thus, controllers can be attached to the accessory with an easy-to-operate inclination.
• In the present embodiment, when at least one of the left controller 3 and the right controller 4 is attached to the extension grip 210, the main unit 2 executes a predetermined information process by modifying the operation data from the attached controller and/or information obtained from the operation data. For example, the main unit 2 performs a modifying process for compensating for the inclination of the controller with respect to the predetermined reference attitude. More specifically, the main unit 2 performs a modifying process so that the result of the information process when a predetermined operation is performed on a controller attached to the extension grip 210 is generally equal to that when the predetermined operation is performed on the controller being in the predetermined reference attitude. Thus, a user can operate a controller in a similar manner whether the controller is attached to an accessory while being inclined from the predetermined reference attitude or the controller is not attached to the accessory (or is attached to the main unit 2), thereby improving the controllability of a controller attached to an accessory.
  • Moreover, in the present embodiment, the main unit 2 modifies a part of the operation data representing an operation performed on a directional input section (i.e., an analog stick). Thus, the analog stick can be operated in a similar fashion whether the controller is attached to the accessory while being inclined from the predetermined reference attitude or the controller is not attached to the accessory (or attached to the main unit 2).
  • In the present embodiment, also when the left and right controllers 3 and 4 are both attached to the extension grip 210, as when the left and right controllers 3 and 4 are both attached to the main unit 2, these two controllers are registered as a pair. Specifically, if one of the left and right controllers has been attached to the extension grip 210, when an indication that the other controller is attached to the extension grip 210 is received, the main unit 2 registers these two controllers as a pair. Then, a user can register two controllers as a pair by attaching them to the extension grip 210, as well as by attaching them to the main unit 2.
  • Note that in other embodiments, the extension grip 210 may include a communication section capable of communicating with the main unit 2 in wireless communication. Then, the extension grip 210 may obtain operation data from the controllers 3 and 4 attached to the extension grip 210, and transmit the obtained operation data to the main unit 2 via the communication section.
  • Note that the main unit 2 may communicate with devices that cannot be attached to the main unit 2 (e.g., the extension grip 210) in wireless communication, as well as communicating with controllers that can be attached to the main unit 2 in wireless communication. For example, the main unit 2 may receive via wireless communication operation data from a controller device that cannot be attached to the main unit 2, and execute an information process based on the operation data. Then, operations can be performed on the main unit 2 by using a wider variety of controller devices.
• The extension grip 210 may include an operation section (e.g., buttons, an analog stick, etc.). Then, the extension grip 210 allows for a wider variety of operations using controllers. Note that information representing an operation performed on the operation section may be obtained by a controller attached to the extension grip 210 (if there are a plurality of controllers attached to the extension grip 210, one of the controllers), and transmitted from the controller to the main unit 2. Such information may be transmitted together with (or while being included in) the operation data of the controller. If the extension grip 210 includes a communication section capable of communicating with the main unit 2 in wireless communication, information representing an operation performed on the operation section may be transmitted directly from the extension grip 210 to the main unit 2 (i.e., without passing through a controller).
  • Note that an accessory controller device such as the extension grip described above may be configured without grip portions. Then, the accessory controller device may be configured so that the housings of the controllers are used as grip portions. FIG. 32 is a diagram showing another example accessory controller device. In FIG. 32, an attachment 220 includes a housing 221 to which the left controller 3 and the right controller 4 can be attached, as with the extension grip 210 shown in FIG. 31. Note that the attachment 220 has similar mechanisms to those of the extension grip 210 for allowing controllers to be attached thereto.
  • The attachment 220 shown in FIG. 32 includes no grip portion. Therefore, a user holds the left controller 3 and the right controller 4 attached to the attachment 220. This also enables the user to hold the two controllers 3 and 4 removed from the main unit 2 as an integral unit.
  • [5-3. Accessory for Main Unit]
  • The information processing system may include an accessory to which the main unit 2 can be attached. An HMD accessory to be described below as an example accessory can be used as a so-called HMD (head mounted display) with the main unit 2 attached thereto.
  • FIG. 33 is a diagram showing an example HMD accessory to which the main unit 2 can be attached. An HMD accessory 230 shown in FIG. 33 includes a housing 231 and belts 232 a and 232 b. One end of the belt 232 a is attached to one end of the housing 231, and one end of the belt 232 b is attached to the other end of the housing 231. Although not shown in the figure, the other end of the belt 232 a can be removably connected to the other end of the belt 232 b. Thus, the housing 231 can be mounted on the head of the user by connecting together the two belts 232 a and 232 b around the head of the user. Note that there is no particular limitation on the mechanism for allowing the HMD accessory 230 to be mounted on the head of the user.
  • As shown in FIG. 33, the housing 231 includes two openings 231 a. The openings 231 a are located so as to face the eyes of the user with the housing 231 mounted on the head of the user. Although not shown in the figure, the HMD accessory 230 includes a lens provided in each of the openings 231 a.
• Moreover, as shown in FIG. 33, the housing 231 includes an insertion slot 231 b for receiving the main unit 2 attached thereto (or inserted thereinto). That is, the main unit 2 can be attached to the HMD accessory 230 by inserting the main unit 2 into the insertion slot 231 b. If the main unit 2 is attached so that the display 12 is facing the openings 231 a, the display 12 can be viewed through the lenses provided in the openings 231 a. That is, the housing 231 supports the display 12 so that the user can view the display 12 of the main unit 2.
  • With such a configuration, a user can view images on the display 12 when the HMD accessory 230 with the main unit 2 attached thereto is mounted on the head of the user. That is, the HMD accessory 230 with the main unit 2 attached thereto functions as a so-called HMD. Note that the HMD accessory 230 may present images of a wide viewing angle to a user by enlarging the viewing angle of the images on the display 12 through the lenses. This can enhance the sense of immersion for a user looking at the images. Note that the main unit 2 may perform a predetermined conversion process on the images displayed on the display 12 so that appropriate images are viewed through the lenses.
• Note that in the present embodiment, the main unit 2 includes the acceleration sensor 89 and the angular velocity sensor 90, and can calculate the movement and/or the attitude of the main unit 2 based on the detection results of these sensors. Therefore, the main unit 2 can calculate the movement and/or the attitude of the HMD accessory 230 to which the main unit 2 is attached, and perform a predetermined process in accordance with the movement and/or the attitude. The predetermined process is, for example, a process of controlling a virtual camera used for producing images to be displayed on the display 12 based on the movement and/or the attitude of the HMD accessory 230, and more specifically a process of changing the line-of-sight direction of the virtual camera depending on the attitude.
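• The camera-control process mentioned above, changing the line-of-sight direction of a virtual camera with the attitude of the HMD accessory 230, could look roughly like the following sketch. The reduction of the attitude to yaw and pitch angles and the axis convention are assumptions made for brevity.

```python
import math

# Sketch: derive a virtual camera's line-of-sight direction from the device
# attitude, reduced here to yaw (about the vertical axis) and pitch.

def camera_direction(yaw_deg, pitch_deg):
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Unit direction vector; +z forward, +x right, +y up (assumed convention).
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

# As the user turns the head 30 degrees to the right and looks 10 degrees up,
# the virtual camera's line of sight follows accordingly.
print(camera_direction(yaw_deg=30.0, pitch_deg=10.0))
```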
  • In other embodiments, if the main unit 2 does not have sensors for sensing the movement and/or the attitude of the main unit 2 (e.g., an acceleration sensor and/or an angular velocity sensor), the HMD accessory 230 may include the sensors. Then, the HMD accessory 230 may transmit the detection results of the sensors (or information obtained by performing a predetermined process on the detection results) to the main unit 2. The main unit 2 may calculate the movement and/or the attitude of the HMD accessory 230 based on the information transmitted from the HMD accessory 230.
  • In the present embodiment, the controllers 3 and 4 may be used while they are removed from the main unit 2. Therefore, even with the main unit 2 attached to the HMD accessory 230, the controllers 3 and 4 can be used as controller devices. That is, a user can operate the controllers using the hands while the HMD accessory 230 with the main unit 2 attached thereto is mounted on the head of the user.
  • As described above, the information processing device 1 of the present embodiment, with the use of the HMD accessory 230 described above, can be used in a mode where it is used as an HMD. In the present embodiment, since the controllers can be removed from the main unit 2, it is possible to reduce the weight of the device or devices to be mounted on the head of the user. A user can perform operations using removed controllers.
  • [6. Functions/Effects and Variations of Present Embodiment]
  • In the present embodiment described above, the information processing device 1 includes the main unit 2, the left controller (referred to also as the first controller device) 3, and the right controller (referred to also as the second controller device) 4. Since the information processing device 1 includes a plurality of devices, it can be referred to also as an information processing system. The main unit includes a display (i.e., the display 12). The left controller 3 is configured to be removably attached to the main unit 2, and the right controller 4 is configured to be removably attached to the main unit 2 (FIG. 2). The left controller 3 transmits first operation data representing an operation performed on the left controller 3 to the main unit 2, whether it is attached to the main unit 2 or not. The right controller 4 transmits second operation data representing an operation performed on the right controller 4 to the main unit 2, whether it is attached to the main unit 2 or not. The main unit is configured to display, on the display, the execution result of an information process (step S44) based on the first operation data transmitted from the left controller 3 and the second operation data transmitted from the right controller 4 (step S45). Thus, with the left controller 3 and the right controller 4 attached to the main unit 2, the main unit 2 is capable of displaying, on the display, images based on operations performed on the left controller 3 and the right controller 4 (FIG. 14). The main unit 2 is also capable of displaying, on the display, images based on operations performed on the left controller 3 and the right controller 4 when the left controller 3 and the right controller 4 are removed from the main unit 2 (FIG. 15).
  • As described above, the information processing device 1 can be used both in the mode in which the controllers 3 and 4 are attached to the main unit 2 and in the mode in which the controllers 3 and 4 are removed from the main unit. Thus, since a plurality of modes of use are realized with a single information processing device 1, the information processing device 1 can be used in a wider variety of manners.
• The "images based on operations" may be images obtained by an information process that is performed based on an operation (e.g., images obtained by an operation performed on an application used for obtaining and viewing information from the Internet) or images produced by an information process that is performed based on an operation (e.g., game images produced in accordance with a game operation performed on a game application).
  • In the above description, the main unit 2 may be used in the mode in which the left controller 3 and the right controller 4 are attached to the main unit 2 and in the mode in which the left controller 3 and the right controller 4 are removed from the main unit 2, and it is not necessary that both of these modes be available under certain conditions. For example, only one of the two modes may be available in a predetermined application running on the main unit 2. That is, the main unit 2 may run an application that is available only in the mode in which the left controller 3 and the right controller 4 are attached to the main unit 2, and may run another application that is available only in the mode in which the left controller 3 and the right controller 4 are removed from the main unit 2.
  • In the above description, the main unit 2 may include an information processing unit (e.g., the CPU 81) instead of a display.
  • The left controller 3 includes a first input section (e.g., the analog stick 32) and a second input section (e.g., the operation buttons 33 to 36). The right controller 4 includes a third input section (e.g., the analog stick 52) of the same type as the first input section and a fourth input section (e.g., the operation buttons 53 to 56) of the same type as the second input section.
• Note that an "input section" is any means that outputs information representing a user input and/or information with which it is possible to calculate (or estimate) a user input. For example, an input section may be a button, a directional input section such as an analog stick, a touch panel, a microphone, a camera, a sensor capable of calculating the movement of the controller (e.g., an acceleration sensor or an angular velocity sensor), and the like.
  • As described above, the left controller 3 and the right controller 4 include two sets of input sections of the same type. Therefore, using the input sections described above, a user can use two controllers in a similar manner. For example, if a single user uses a controller, it is convenient because the user can perform operations in a similar manner using either one of two controllers. Also conveniently, two users can each use one controller, for example.
  • In the above description, “input sections being of the same type” is not limited to cases where the two input sections are the same input sections, but may also include cases where two input sections have functions and/or applications of the same type. For example, where the first input section is an analog stick that can be tilted up, down, left and right, the third input section may be a slide stick that can be slid up, down, left and right or a cross-shaped key capable of making an input of up, down, left and right.
  • In the embodiment described above, the input mechanism of the first input section and the input mechanism of the third input section (e.g., the operation button 33 and the operation button 53 or the analog stick 32 and the analog stick 52) are substantially the same. The input mechanism of the second input section and the input mechanism of the fourth input section are substantially the same. Thus, the two controllers will have two types of input sections that can be operated in a similar fashion. Therefore, a user can use two controllers in a similar fashion, thereby improving the controllability of the controllers.
  • In the embodiment described above, the first input section has substantially the same shape as the third input section. Also, the second input section has substantially the same shape as the fourth input section. Thus, the two controllers will have two types of input sections that can be operated in a similar fashion. Therefore, a user can use two controllers in a similar fashion, thereby improving the controllability of the controllers.
• When the left controller 3 and the right controller 4 are removed from the main unit 2, the positional relationship between the first input section and the second input section of the left controller 3 placed in a certain orientation is the same as the positional relationship between the third input section and the fourth input section of the right controller 4 placed in a certain orientation. For example, consider a case in which the left side surface of the left controller 3 is facing a user and the right side surface of the right controller 4 is facing another user, as shown in FIG. 16 and FIGS. 17A and 17B. Then, the positional relationship between the first input section (i.e., the analog stick 32) and the second input section (i.e., the operation buttons 33 to 36) is the same as the positional relationship between the third input section (i.e., the analog stick 52) and the fourth input section (i.e., the operation buttons 53 to 56).
  • Then, users can use the left controller 3 and the right controller 4 in a similar fashion. Thus, it is possible to improve the controllability of the controllers. For example, in the example shown in FIG. 16, with either one of the two controllers, a user can operate the analog stick with the left hand and the operation buttons with the right hand.
  • When the left controller 3 and the right controller 4 are attached to the main unit 2, the positional relationship between the first input section (i.e., the analog stick 32) and the second input section (i.e., the operation buttons 33 to 36) is opposite from the positional relationship between the third input section (i.e., the analog stick 52) and the fourth input section (i.e., the operation buttons 53 to 56) (see FIG. 1).
• Thus, when the controllers 3 and 4 are attached to the main unit 2 and a user holds the controllers 3 and 4 respectively with the left hand and the right hand, the user can easily operate different input sections with the left hand and with the right hand. Thus, it is possible to improve the controllability of the information processing device 1 with controllers attached thereto.
  • Now, assume a configuration in which the positional relationship between the first input section and the second input section is the same as the positional relationship between the third input section and the fourth input section. Even with such a configuration, a user may tend to operate the left and right controllers in a similar fashion (e.g., operating the analog stick with the left hand and the operation buttons with the right hand). Then, if the controllers have a similar shape to that of the present embodiment (i.e., one side surface is rounded), one controller will be held with the rounded side surface facing away from the user while the other controller will be held with the rounded side surface facing toward the user. That is, this configuration has a problem in that when users hold controllers removed from the main unit 2, the direction of the rounded side surface of one controller will be opposite to that of the other controller, which makes it less easy for a user to understand the appropriate orientation in which to hold a controller.
  • Moreover, with such a configuration, if sub-buttons (the second L button and/or the second R button in the embodiment described above) are provided on each controller, the sub-buttons will be provided on the rounded side surface (as in the present embodiment) for one controller, whereas the sub-buttons will be provided on the opposite side surface from the rounded side surface for the other controller. With such a configuration, however, the sub-buttons of one controller will be exposed even when the controller is attached to the main unit 2, and the sub-buttons may possibly be operated in error in the attached state.
  • In contrast, according to the present embodiment employing a configuration in which the positional relationship between the first input section and the second input section is opposite from the positional relationship between the third input section and the fourth input section, it is possible to prevent the two problems described above.
  • In the embodiment described above, the first input section and the third input section are each a directional input section for receiving a directional input. More specifically, the directional input section may include an operation member (e.g., a stick member) that can be tilted or slid in a predetermined direction. Then, a user can make directional inputs by using the left controller 3 or the right controller 4.
  • In the embodiment described above, the second input section and the fourth input section are each a button that can be pressed. Thus, a user can make button inputs by using the left controller 3 or the right controller 4.
  • In the embodiment described above, a controller is attached integrally to the main unit 2 with a predetermined surface of the housing of the controller (e.g., the right side surface for the left controller 3) facing a predetermined surface of the main unit 2 (see FIG. 2). Thus, with the controller attached to the main unit 2, a user can handle the controller and the main unit as an integral unit, thereby improving the controllability.
  • In the embodiment described above, the left controller 3 is attached to the main unit 2 so as to be facing one of a left side surface and a right side surface of the main unit 2 (specifically, the left side surface). The right controller 4 is attached to the main unit 2 so as to be facing the other one of the left side surface and the right side surface of the main unit 2 (i.e., the right side surface). Then, a user can operate the two controllers attached to the main unit 2 respectively with the left hand and the right hand, thereby providing the information processing device 1 with good controllability.
  • The controller includes a light-emitting portion (e.g., the indicator LED 45 for the left controller 3) provided on the predetermined surface for notifying a user of predetermined information. Then, predetermined information (e.g., the status of the information processing device 1) can be indicated to the user using the controller.
  • Note that the light-emitting portion may indicate the communication status between the main unit 2 and the controller. For example, in the embodiment described above, the indicator LED indicates a number that is assigned to the controller as a result of communication (more specifically, the number represented by the number information described above). For example, the indicator LED may indicate the status of wireless communication between the information processing device 1 and the controller. Specifically, the indicator LED may indicate whether or not wireless communication is available at that point in time or may indicate whether or not pairing has been done.
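  • As a purely illustrative aside, the sketch below shows one way such indicator logic could be expressed in software; the class names, states and LED patterns are assumptions made for this example and are not taken from the embodiment.
```python
# Minimal sketch of indicator-LED logic; all names and states here are
# illustrative assumptions, not part of the embodiment described above.
from enum import Enum, auto
from typing import Optional

class LinkState(Enum):
    UNPAIRED = auto()      # pairing with the main unit has not been done
    PAIRED_IDLE = auto()   # paired, but wireless communication not available
    CONNECTED = auto()     # wireless communication available right now

def led_pattern(state: LinkState, assigned_number: Optional[int]) -> str:
    """Return a human-readable LED pattern for the given link state.

    assigned_number is the number assigned to the controller as a result of
    communication with the main unit (None if no number has been assigned).
    """
    if state is LinkState.CONNECTED and assigned_number is not None:
        return f"solid LED #{assigned_number}"  # show the assigned number
    if state is LinkState.PAIRED_IDLE:
        return "slow blink"                     # paired, link currently idle
    return "fast blink"                         # pairing not yet done

if __name__ == "__main__":
    print(led_pattern(LinkState.CONNECTED, 2))    # -> solid LED #2
    print(led_pattern(LinkState.UNPAIRED, None))  # -> fast blink
```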
  • The controller includes operation sections provided on the predetermined surface (e.g., the second L button 43 and the second R button 44 for the left controller 3). Note that the "operation section" here means any input section operated by a user, such as a button or a stick. Thus, more operation sections are available when the controller is removed from the main unit 2 than when the controller is attached to the main unit 2. Therefore, a user can perform a variety of operations even when the controller is removed from the main unit 2, and it is possible to improve the controllability of the controller.
  • Note that in other embodiments, the main unit 2 may include operation sections (e.g., buttons) provided on the engaged surface (i.e., the left side surface or the right side surface of the main unit 2) to which the left controller 3 or the right controller 4 is attached. Note that the operation section may be a button having a particular function, which may specifically be a power button, a home button or a sleep button. For example, in the embodiment described above, the power button 28 may be provided on the left side surface or the right side surface of the main unit 2. Then, the power button 28 cannot be operated when the left controller 3 or the right controller 4 is attached, thereby preventing an erroneous operation by a user. Note that when an operation section is provided on a side surface of the main unit 2, the operation section may be provided on the rail member or on another portion of the housing other than the rail member.
  • Note that an operation section having the same function as the operation section described above may be provided on the left controller 3 and/or the right controller 4. For example, in other embodiments, a power button having the same function as the power button 28 may be provided on the left controller 3 and/or the right controller 4. Then, the function of that operation section is available to the user even when the left controller 3 and the right controller 4 are in the attached state.
  • In other embodiments, the main unit 2 may include, on the engaged surface (i.e., the left side surface or the right side surface of the main unit 2) to which the left controller 3 or the right controller 4 is attached, a terminal for connecting other devices. For example, in the embodiment described above, the first slot 23 and/or the second slot 24 (in other words, terminals provided in the slots) may be provided on the left side surface or the right side surface of the main unit 2. Then, with the controller attached to the main unit 2, the terminals can be protected by the controller. If a slot is provided on the left side surface or the right side surface of the main unit 2, it is possible to prevent a device inserted in the slot (e.g., a card storage medium) from coming off of the main unit 2 when the controller is attached to the main unit 2.
  • In the embodiment described above, connecting portions (i.e., the upper left portion and the lower left portion) between the first side surface (i.e., the left side surface) of the four side surfaces of the left controller 3 and side surfaces adjacent thereto (i.e., the upper side surface and the lower side surface) have a more rounded shape than connecting portions (i.e., the upper right portion and the lower right portion) between the second side surface (i.e., the right side surface) opposite from the first side surface and side surfaces adjacent thereto (i.e., the upper side surface and the lower side surface) (see FIG. 5). Moreover, connecting portions (i.e., the upper right portion and the lower right portion) between the third side surface (i.e., the right side surface) of the four side surfaces of the right controller 4 and side surfaces adjacent thereto (i.e., the upper side surface and the lower side surface) have a more rounded shape than connecting portions (i.e., the upper left portion and the lower left portion) between the fourth side surface (i.e., the left side surface) opposite from the third side surface and side surfaces adjacent thereto (see FIG. 6). Thus, the controller has a rounded shape on one side thereof, and it is therefore easy for a user to understand the orientation in which to hold the controller removed from the main unit 2. This also makes it easier for a user to hold the controller removed from the main unit 2.
  • Moreover, in the embodiment described above, the left controller 3 is attached to the main unit 2 with the second side surface of the left controller 3 facing the fifth side surface (i.e., the left side surface) of the four side surfaces of the main unit 2 (see FIG. 2). Moreover, the right controller 4 is attached to the main unit 2 with the fourth side surface of the right controller 4 facing the sixth side surface (i.e., the right side surface) opposite from the fifth side surface of the main unit 2 (see FIG. 2).
  • Then, with the controllers 3 and 4 attached to the main unit 2 (see FIG. 1), the information processing device 1 will have an overall shape such that the left side and the right side thereof are rounded, making it easier for a user to hold. The left side of the left controller 3 is more rounded than the right side thereof, whereas the right side of the right controller 4 is more rounded than the left side thereof (FIG. 5 and FIG. 6). Thus, since the left controller 3 has a different overall shape than that of the right controller 4, it is possible to reduce the possibility that a user mistakes the left and right controllers for each other when attaching them to the main unit 2.
  • In the embodiment described above, the right controller 4 includes an input section having a first function that the left controller 3 does not have (in the embodiment described above, the plus button 57, the home button 58 and the infrared image-capturing section 123). In other words, in the embodiment described above, the left controller 3 includes one or more input sections having a predetermined number of types of functions (13 in the embodiment described above, including nine buttons, the analog stick 32, the acceleration sensor 104 and the angular velocity sensor 105). In contrast, the right controller 4 includes one or more input sections having a number (different from the predetermined number) of types of functions (15 in the embodiment described above, including 11 buttons, the analog stick 52, the acceleration sensor 114, the angular velocity sensor 115 and the infrared image-capturing section 123). Thus, when each controller has some functions that the other controller does not have (as compared with a case where the controllers both have the same functions), it is possible to simplify the configuration of the controllers.
  • Note that the “input section having the first function” may be an image-capturing device (e.g., the infrared image-capturing section 123) or a button. The button may be a button having a particular function (e.g., a power button or a home button), for example.
  • In the embodiment described above, the left controller 3 includes input sections having the second function different from the first function (e.g., the analog stick 32 and the buttons 33 to 38 in the embodiment described above), and the right controller 4 includes input sections having the second function (e.g., the analog stick 52 and the buttons 53 to 56, 60 and 61 in the embodiment described above). Then, a user can use the second function on either controller, thereby improving the controllability, whereas the first function is omitted for one of the controllers, thereby simplifying the configuration of the controller.
  • In the embodiment described above, communication between the main unit 2 and a controller when the controller is attached to the main unit 2 uses a first communication scheme (specifically, wired communication), and communication between the main unit 2 and a controller when the controller is removed from the main unit 2 uses a second communication scheme (specifically, wireless communication) different from the first communication scheme. Then, by changing the communication scheme between when the controller is attached to the main unit 2 and when the controller is removed from the main unit 2, the controller can communicate in either case. The controller can easily communicate with the main unit 2 in wired communication when it is attached to the main unit 2, and the controller can communicate with the main unit 2 in wireless communication when it is removed from the main unit 2.
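  • To make the scheme switching concrete, the following minimal sketch selects wired or wireless communication from the attached state; the Controller class and the scheme labels are assumptions made for this example only.
```python
# Minimal sketch of choosing the communication scheme from the attached state;
# the Controller class and the string labels are illustrative assumptions.
class Controller:
    def __init__(self, attached_to_main_unit: bool):
        self.attached = attached_to_main_unit

    def communication_scheme(self) -> str:
        # Wired communication via the connected terminals when attached,
        # wireless communication when removed from the main unit.
        return "wired" if self.attached else "wireless"

left = Controller(attached_to_main_unit=True)
right = Controller(attached_to_main_unit=False)
assert left.communication_scheme() == "wired"
assert right.communication_scheme() == "wireless"
```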
  • In the embodiment described above, the wired communication between the main unit and the controller is a communication connection via a wired communication channel formed by an electrical connection between the first terminal of the main unit 2 (specifically, the left-side terminal 17 or the right-side terminal 21) and the second terminal of the controller (specifically, the terminal 42 or 64). The wired communication as used herein means communication via a cable connection between devices, and also means communication via a connection between a terminal (e.g., a connector) of one device and a terminal (e.g., a connector) of the other device.
  • In the embodiment described above, when the controller is attached to the main unit 2, the first terminal of the main unit 2 (i.e., the left-side terminal 17 or the right-side terminal 21) and the second terminal of the controller (i.e., the terminal 42 or 64) are electrically connected to each other by being in contact with each other. Then, when the controller is attached to the main unit 2, wired communication and/or power supply are implemented via the terminals connected together.
  • In the embodiment described above, when the controller is attached to the main unit 2, the first terminal of the main unit 2 and the second terminal of the controller are electrically connected together, and the communication between the main unit 2 and the controller and the power supply from the main unit 2 to the controller are implemented via the first terminal and the second terminal. Then, it is possible to increase the opportunities to charge the controller and thus to reduce the possibility of the controller running out of battery. When a user attaches the controllers to the main unit 2 so as to use the information processing device 1 as an integral portable device, the controllers can be charged without the user having to be aware of it. Therefore, the user does not need to perform a separate operation for charging the controller, saving the user that trouble.
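  • As a rough sketch of the charging behavior described above, the example below only advances the controller battery level while the controller is attached (i.e., while the terminals are connected); the battery representation, charge rate and polling loop are illustrative assumptions.
```python
# Minimal sketch of charging through the connected terminals; the numeric
# battery level, rate and loop are illustrative assumptions for this example.
def charge_step(battery_level: float, attached: bool, rate: float = 0.05) -> float:
    """Advance the battery level by one polling step.

    The controller is charged from the main unit only while it is attached,
    i.e. while the first and second terminals are electrically connected.
    """
    if attached:
        return min(1.0, battery_level + rate)
    return battery_level

level = 0.40
for _ in range(5):                  # five polling steps while attached
    level = charge_step(level, attached=True)
print(f"battery at {level:.0%}")    # -> battery at 65%
```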
  • In the embodiment described above, the information processing device 1 includes a first sensing section (e.g., the CPU 81 executing step S3) for sensing the attachment of the left controller 3 to the main unit 2, and a second sensing section (e.g., the CPU 81 executing step S3) for sensing the attachment of the right controller 4 to the main unit 2. The main unit 2 registers a pair of a left controller and a right controller based on the sensing results from the first sensing section and the second sensing section (see step S4). Therefore, a user can register a pair through a straightforward, easy operation of attaching two controllers, to be used as a pair, to the main unit 2.
  • In the embodiment described above, when the left controller 3 and the right controller 4 are attached to the main unit 2, the main unit 2 registers the left controller 3 and the right controller 4 attached thereto as a pair (see step S3, S4). Thus, a user can register a pair of controllers through a straightforward operation.
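  • The registration flow can be pictured with the small sketch below, in which attaching a left controller and a right controller records them as a pair; the MainUnit class, its fields and the controller identifiers are assumptions for illustration (cf. steps S3 and S4 referenced above).
```python
# Minimal sketch of registering an attached left/right controller pair;
# class, field and identifier names are illustrative assumptions.
class MainUnit:
    def __init__(self):
        self.attached_left = None     # id of the last attached left controller
        self.attached_right = None    # id of the last attached right controller
        self.registered_pairs = []    # list of (left_id, right_id) tuples

    def sense_attachment(self, side: str, controller_id: str) -> None:
        # Corresponds roughly to the first/second sensing sections (step S3).
        if side == "left":
            self.attached_left = controller_id
        else:
            self.attached_right = controller_id
        self._maybe_register_pair()

    def _maybe_register_pair(self) -> None:
        # Register the attached controllers as a pair (step S4); the two
        # attachments need not happen at the same time.
        if self.attached_left and self.attached_right:
            pair = (self.attached_left, self.attached_right)
            if pair not in self.registered_pairs:
                self.registered_pairs.append(pair)

unit = MainUnit()
unit.sense_attachment("left", "L-01")
unit.sense_attachment("right", "R-07")
print(unit.registered_pairs)   # -> [('L-01', 'R-07')]
```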
  • Note that the first sensing section and the second sensing section may or may not simultaneously sense the attachment of the controllers to the main unit 2. That is, two controllers that are not at the same time attached to the main unit 2 may be registered as a pair.
  • The main unit 2 may execute a predetermined information process based on operation data received from two pairs of controllers. For example, the main unit 2 receives operation data from each of a left controller and a right controller that are registered as a first pair, and receives operation data from each of a left controller and a right controller that are registered as a second pair. The main unit 2 may execute a predetermined information process using, as a set of data, operation data received from controllers that are registered as a first pair, and using, as another set of data, operation data received from controllers that are registered as a second pair. Then, the information processing device can use operation data from a plurality of controllers while distinguishing between operation data from different registered pairs.
  • In the embodiment described above, when receiving operation data from each of the left controller 3 and the right controller 4 that are removed from the main unit 2 and registered as a pair, the main unit 2 executes a predetermined information process using the two pieces of operation data received as a set of data (see FIG. 17B). Thus, a user can perform operations using controllers registered as a pair. For example, a user can control one object by using a pair of controllers.
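  • The following sketch illustrates how operation data might be grouped per registered pair so that each pair's data is processed as one set; the data layout and controller identifiers are assumptions made for this example.
```python
# Minimal sketch of grouping operation data by registered pair; the dict
# layout and controller identifiers are illustrative assumptions.
registered_pairs = [("L-01", "R-07"), ("L-02", "R-03")]

# Operation data most recently received from each removed controller.
operation_data = {
    "L-01": {"stick": (0.0, 1.0)},
    "R-07": {"buttons": {"A"}},
    "L-02": {"stick": (-1.0, 0.0)},
    "R-03": {"buttons": set()},
}

def data_for_pair(pair):
    """Combine the left and right operation data of one registered pair."""
    left_id, right_id = pair
    return {"left": operation_data[left_id], "right": operation_data[right_id]}

# Each pair's data is used as one set, e.g. to control one object per pair.
for pair in registered_pairs:
    print(pair, data_for_pair(pair))
```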
  • In the embodiment described above, the main unit 2 includes the housing 11 provided with a first engagement portion (which can also be referred to as the attachment and detachment mechanism; specifically, the left rail member 15) to be engaged with the housing (specifically, the slider 40 provided on the housing 31) of the left controller 3, and a second engagement portion (specifically, the right rail member 19) to be engaged with the housing (specifically, the slider 62 provided on the housing 51) of the right controller 4.
  • In the above description, the first engagement portion and the second engagement portion are to be engaged with controller housings (including members provided on the housings), and are not to be engaged with connectors of the controllers. That is, in the present embodiment, the main unit 2 is configured so that the controllers are removably attached to the main unit 2 by engaging the engagement portions with the controllers, which is different from the method of connecting the connectors of the main unit 2 with those of the controllers (the engagement method and the method of connecting the connectors may both be used at the same time). Then, the main unit 2 and the controllers can be firmly connected together.
  • Note that in other embodiments, the main unit 2 may include only one engagement portion to be engaged with the housing of the controller or may include three or more engagement portions.
  • In the embodiment described above, the left controller 3 includes the housing 31 provided with a third engagement portion (specifically, the slider 40) to be engaged with the first engagement portion of the main unit 2. The right controller 4 includes the housing 51 provided with a fourth engagement portion (specifically, the slider 62) to be engaged with the second engagement portion of the main unit 2. Thus, members to be engaged with the engagement portions on the main unit 2 side are provided also on the controller side, and it is therefore possible to more firmly connect the main unit 2 and the controllers together.
  • In the embodiment described above, the main unit 2 selectively outputs the execution result of the information process to either the display (the display 12) or a display device (the TV 6) separate from the main unit 2 (see FIG. 24). This enables two different modes of use, including a mode in which images are displayed on the display of the main unit 2, and another mode in which images are displayed on a display device separate from the main unit 2.
  • In the embodiment described above, the engagement portion of the main unit 2 is a rail member (referred to also as the first slide member) provided on a surface of the housing 11 of the main unit 2. The controller includes a slider (referred to also as the second slide member) which slidably and detachably engages with the rail member. In the embodiment described above, the rail member and the slider together form a slide mechanism (see FIG. 7). Thus, the slide mechanism allows for firm locking between the main unit 2 and the controllers and allows for easy attachment and detachment of the controllers.
  • In the embodiment described above, the rail member is formed so that the slider can slidably engage with the rail member in a predetermined direction (specifically, the y-axis direction shown in FIG. 1), and so that the slider can be inserted into and removed from the rail member via one end thereof in the predetermined direction (see FIG. 2). Thus, it is possible to easily attach and detach controllers to and from the main unit 2 via the end.
  • Moreover, in the embodiment described above, the rail member is provided so as to extend in the up-down direction of the main unit 2 so that the slider can be inserted into and removed from the rail member via the upper end thereof (see FIG. 2). Thus, controllers can be conveniently attached to and detached from the main unit 2 while the main unit 2 is placed upright. For example, in the present embodiment, controllers can be attached to and detached from the main unit 2 while the main unit 2 is attached to the cradle 5.
  • In the embodiment described above, the slide member of the main unit 2 is provided so as to extend generally over the entirety (e.g., so that the length of the slide member is 80% or more of the length of the housing 11 of the main unit 2) of a surface of the housing 11 of the main unit 2 in a predetermined direction (specifically, the up-down direction) (see FIG. 3). Thus, when a controller is attached to the main unit 2, the controller is connected generally over the entirety of the aforementioned surface of the main unit 2, thereby allowing the controller to be firmly connected to the main unit 2.
  • In the embodiment described above, the first slide member of the main unit 2 (i.e., the rail member) has a C-shaped cross section, and the second slide member of the controller (i.e., the slider) has a T-shaped cross section. Note that in other embodiments, the first slide member of the main unit 2 may have a T-shaped cross section, and the second slide member of the controller may have a C-shaped cross section.
  • In the embodiment described above, the controller includes a terminal (e.g., the terminal 42 or 64) for communication with the main unit 2. The main unit 2 includes a terminal (i.e., the left-side terminal 17 or the right-side terminal 21) provided on the housing 11 at such a location that allows the terminal to be connected to a terminal of a controller when the controller is attached to the main unit 2. Thus, when the controller is attached to the main unit 2, the terminals are connected together, enabling wired communication.
  • In the embodiment described above, the information processing device 1 can be said to be a game system capable of executing game applications. When at least the left controller 3 and the right controller 4 are removed from the main unit 2 (in other words, when the two controllers are both removed from the main unit 2), the left controller 3 and the right controller 4 each transmit operation data representing operations performed on the left controller 3 and the right controller 4 to the main unit 2 via wireless communication. Thus, in the embodiment described above, it is possible to provide a novel game system that can be used with two controllers removed.
  • In the embodiment described above, the information processing device 1 can be said to be a hand-held information processing device including a main section (i.e., the main unit 2) having a display (i.e., the display 12), a first controller section (i.e., the left controller 3) and a second controller section (i.e., the right controller 4) for performing a predetermined information process in response to an operation performed on either the first controller section or the second controller section. When the first controller section and the second controller section are removed from the main section, the main unit 2 performs a predetermined information process based on operation data representing an operation performed on either the first controller section or the second controller section, and displays the results of the information process on the display. Thus, in the embodiment described above, it is possible to provide a novel information processing device that can be used in a mode in which two controller sections are removed.
  • In the above description, the first controller section and the second controller section are arranged so that when the first controller section and the second controller section are attached to the main section, a user (i.e., the user holding the information processing device) can operate the first controller section with one hand and operate the second controller section with the other hand (see FIG. 14). Thus, it is possible to provide a novel information processing device that can be used both in a mode in which the controller sections are attached to the main section and in a mode in which the controller sections are removed from the main section.
  • The information processing system of the present embodiment includes the main unit 2, and controllers (specifically, the left controller 3 and the right controller 4; referred to also as controller devices) that are configured to be removably attached to the main unit 2. The main unit 2 includes the display 12 (referred to also as the display). When controllers are attached to the main unit 2, the main unit 2 can display, on the display 12, images that are obtained based on operations performed on the controllers. When the controllers are removed from the main unit 2, the main unit 2 can display, on an external display device (specifically, the TV 6) separate from the main unit 2, images that are obtained based on operations performed on the controllers. Note that in the above description, there may be one controller that can be attached to the main unit or there may be a plurality of controllers that can be attached to the main unit.
  • As described above, the information processing device 1 can be used both in a mode in which controllers are attached to the main unit 2 and in a mode in which the controllers are removed from the main unit. Thus, since a plurality of modes of use are realized with a single information processing device 1, the information processing device 1 can be used in a wider variety of manners. As described above, when the controllers are removed from the main unit 2, an external display device can be used as the display device. Thus, a user can provide, as the external display device, a display device having a larger screen size than the display 12, so that it is possible to display the images on a larger screen.
  • In the embodiment described above, the information processing system can communicate with the TV 6, and further includes the cradle 5 (referred to also as an add-on device) to and from which the main unit 2 can be attached and detached. The main unit 2 detects the attachment of the main unit 2 to the cradle 5, and determines, based at least on the detection results, whether the images obtained based on operations performed on the controllers are displayed on the display 12 or on the TV 6 (steps S21, S25, S29). Thus, the main unit 2 can determine the image display output based on whether or not the main unit 2 is attached to the cradle 5.
  • Note that in the above description, there is no particular limitation on the method for determining the image display output. As in the embodiment described above, the main unit 2 may select the TV 6 as the image display output at least on the condition that the main unit 2 is attached to the cradle 5 (i.e., the attachment may be one of several conditions). In other embodiments, the main unit 2 may select the TV 6 as the image display output simply in response to the main unit 2 being attached to the cradle 5. That is, the main unit 2 may output the images to the TV 6 via the cradle 5 in response to the attachment of the main unit 2 to the cradle 5.
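  • One possible reading of this decision logic is sketched below; the function name, arguments and conditions are assumptions for illustration only (cf. steps S21, S25 and S29 referenced above).
```python
# Minimal sketch of choosing the image display output from the cradle state;
# the names and the extra tv_available condition are illustrative assumptions.
def select_display_output(attached_to_cradle: bool, tv_available: bool) -> str:
    """Return "TV" or "built-in display" as the output for rendered images."""
    # The TV is selected at least on the condition that the main unit is
    # attached to the cradle; otherwise the built-in display 12 is used.
    if attached_to_cradle and tv_available:
        return "TV"
    return "built-in display"

assert select_display_output(attached_to_cradle=True, tv_available=True) == "TV"
assert select_display_output(attached_to_cradle=False, tv_available=True) == "built-in display"
```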
  • In other embodiments, the main unit 2 may be capable of communicating directly with the TV 6. For example, the main unit 2 and the TV 6 may be capable of communicating with each other in wireless communication. Also in such a case, as in the embodiment described above, the main unit 2 may determine the image display output based on the attached state of the main unit 2 to the cradle 5.
  • The add-on device (e.g., the cradle) may be any add-on device to and from which the main unit 2 can be attached and detached. The add-on device may or may not have the function of charging the main unit 2, as in the present embodiment.
  • In the embodiment described above, the cradle 5 is enabled to communicate with the main unit 2 at least on the condition that the main unit 2 is attached thereto. If the main unit 2 has determined to display the images based on operations performed on the controllers on the TV 6, the main unit 2 outputs the images to the TV 6 via the cradle 5. Thus, the main unit 2 can display the images on the TV 6 by outputting the images to the TV 6 via the cradle 5. Therefore, the main unit 2 does not need to communicate with the TV 6, thereby simplifying the configuration of the main unit 2.
  • The information processing system of the embodiment described above includes the main unit 2, the left controller 3 (referred to also as the first controller device), the right controller 4 (referred to also as the second controller device), and an accessory (e.g., the extension grip 210 or the attachment 220). The left controller 3 is configured to be removably attached to the main unit 2 or the accessory. The right controller 4 is configured to be removably attached to the main unit 2 or the accessory. The main unit 2 includes the display 12 (referred to also as the display), and displays, on the display 12, the execution result of a predetermined information process based on operations performed on the left controller 3 and the right controller 4. The left controller 3 and the right controller 4 can be attached to the accessory at the same time (see FIG. 31). Thus, by using the accessory, a user can hold the two controllers 3 and 4, removed from the main unit 2, as an integral unit. That is, it is possible to improve the controllability of the controllers when removed from the main unit 2.
  • In the embodiment described above, the left controller 3 is attached to the accessory on the left side of the center of the accessory, and the right controller 4 is attached to the accessory on the right side of the center of the accessory (see FIG. 31, FIG. 32). Thus, a user can operate the left controller 3 attached to the accessory with the left hand, and operate the right controller 4 attached to the accessory with the right hand. That is, a user can operate the controllers in a similar fashion to that when the controllers are not attached to the accessory, thereby providing an accessory having good controllability.
  • In the embodiment described above, the accessory includes a first grip portion (i.e., the left grip portion 212) provided on the left side and a second grip portion (i.e., the right grip portion 213) provided on the right side. Then, a user can operate the controllers while holding the grip portions respectively with the left hand and the right hand, thereby providing an accessory having good controllability.
  • In the embodiment described above, the first grip portion is provided on the left side of the area where the left controller 3 is attached. The second grip portion is provided on the right side of the area where the right controller 4 is attached (see FIG. 31). Therefore, by holding the grip portions, a user can easily operate the controllers 3 and 4 attached to the accessory.
  • According to the embodiment described above and variations thereof, there is provided an accessory to which a first controller device and a second controller device are removably attachable, the accessory including:
  • a first slide member configured for engaging with a slide member of the first controller device;
  • a second slide member configured for engaging with a slide member of the second controller device;
  • a first grip portion on a first side of a main part of the accessory with respect to a predetermined direction; and
  • a second grip portion on a second side of the main part of the accessory with respect to the predetermined direction.
  • The first slide member may include a first engagement hole configured for engaging with a projection of the first controller device. The second slide member may include a second engagement hole configured for engaging with a projection of the second controller device.
  • The first slide member may be configured to allow the first controller device to be attached to the accessory on the first side of a center of the accessory with respect to the predetermined direction. The second slide member may be configured to allow the second controller device to be attached to the accessory on the second side of the center of the accessory with respect to the predetermined direction.
  • The first grip portion may be on the first side, with respect to the predetermined direction, of an area where the first controller device is attached to the accessory. The second grip portion may be on the second side, with respect to the predetermined direction, of an area where the second controller device is attached to the accessory.
  • The accessory may further include:
  • a first connection terminal at a position allowing the first connection terminal to be connected to a terminal of the first controller device attached to the accessory; and
  • a second connection terminal at a position allowing the second connection terminal to be connected to a terminal of the second controller device attached to the accessory.
  • The accessory may further include:
  • a power supply terminal; and
  • a charging unit configured to use power supplied to the accessory through the power supply terminal to charge the first controller device through the first connection terminal and charge the second controller device through the second connection terminal.
  • According to the embodiment described above and variations thereof, there is provided an accessory to which at least one controller device is removably attachable, the accessory including:
  • a slide member configured for engaging with a slide member of the controller device;
  • a connection terminal at a position allowing the connection terminal to be connected to a terminal of the controller device attached to the accessory;
  • a power supply terminal; and
  • a charging unit configured to use power supplied to the accessory through the power supply terminal to charge the controller device through the connection terminal.
  • According to the embodiment described above and variations thereof, there is provided a game controller which is removably attachable to a main unit of a game device, the game controller including:
  • an operation section;
  • a slide member configured for slidably and detachably engaging with a rail member of the main unit; and
  • a projection on the slide member, the projection being configured to engage with the rail member of the main unit when the game controller is attached to the main unit, thereby locking the game controller to the main unit.
  • The game controller may further include a release mechanism configured to retract the projection into the slide member in response to an operation by a user, thereby releasing the locking of the game controller to the main unit.
  • The release mechanism may include a user-actuable button.
  • The projection may be configured to be biased from an inside toward an outside of the slide member. When the game controller is locked to the main unit by virtue of the projection, application of a force to slide the game controller releases the lock between the game controller and the main unit.
  • The slide member may be configured for insertion into the rail member of the main unit downwardly from an upper end of the rail member. The game controller may further include a terminal below the projection on the slide member, the terminal being configured for electrically connecting to a terminal of the main unit.
  • According to the embodiment described above and variations thereof, there is provided a game controller which is removably attachable to a main unit of a game device, the game controller including:
  • a slide member configured for slidably and removably engaging with a rail member of the main unit; and
  • a first operation section and/or a light-emitting portion on a surface of the slide member, the surface facing the main unit when the game controller is attached to the main unit.
  • The first operation section may be at a position recessed relative to the surface.
  • The first operation section may be configured so as not to protrude past the surface.
  • A plurality of the first operation sections may be on the surface.
  • The first operation section may include a first button on a first side of a center of the slide member with respect to a slide direction and a second button on a second side of the center of the slide member with respect to the slide direction.
  • A plurality of light-emitting portions may be on the surface.
  • The game controller may further include:
  • a housing having a primary surface and a side surface, the slide member being on the side surface; and
  • a second operation section on the primary surface of the housing.
  • According to the embodiment described above and variations thereof, there is provided a game controller including:
  • a housing having a shape elongated in a predetermined direction, the housing having a primary surface, at least one first side surface extending in the predetermined direction, and at least one second side surface different from the at least one first side surface;
  • a directional input section on the primary surface of the housing;
  • a first operation section and a second operation section on one of the at least one first side surface, wherein the first operation section is on one side of a center of the first side surface with respect to the predetermined direction and the second operation section is on the other side of the center of the first side surface with respect to the predetermined direction; and
  • a third operation section and a fourth operation section on a corner portion at one end of one of the at least one second side surface away from the first side surface on which the first operation section and the second operation section are.
  • The housing may have a shape elongated in an up-down direction. The first operation section and the second operation section may be on a right side surface of the housing, the first operation section being on an upper side of a center of the right side surface with respect to the up-down direction and the second operation section being on a lower side of the center of the right side surface with respect to the up-down direction. The third operation section and the fourth operation section may be on a corner portion between a left side surface and an upper side surface of the housing.
  • The housing may have a shape elongated in an up-down direction. The first operation section and the second operation section may be on a left side surface of the housing, the first operation section being on an upper side of a center of the left side surface with respect to the up-down direction and the second operation section being on a lower side of the center of the left side surface with respect to the up-down direction. The third operation section and the fourth operation section may be on a corner portion between a right side surface and an upper side surface of the housing.
  • The housing may have a rounded shape in the corner portion on which the third operation section and the fourth operation section are. The third operation section and the fourth operation section may each have a rounded shape in conformity with the rounded shape of the housing.
  • The fourth operation section may be on a rear side of the third operation section with respect to a front-rear direction defined perpendicular to the primary surface of the housing.
  • The fourth operation section may be configured so as to project past a reverse surface of the housing.
  • According to the embodiment described above and variations thereof, there is provided a game controller including:
  • a housing having a shape elongated in a predetermined direction;
  • a directional input section on a primary surface of the housing;
  • a first operation section on a first side surface of the housing at such a position allowing operation by an index finger of a user, when the housing is held in a portrait position in one hand of the user so as to allow operation of the directional input section by a thumb; and
  • a second operation section and a third operation section on a second side surface of the housing at such positions allowing operation of the second operation section and the third operation section respectively by a left and a right index finger of the user, when the housing is held in a landscape position by both hands of the user so as to allow operation of the directional input section by a thumb.
  • According to the embodiment described above and variations thereof, there is provided a game controller capable of wireless communication, the game controller including:
  • a housing having a shape elongated in a predetermined direction;
  • a first operation section on a primary surface of the housing;
  • a second operation section and a third operation section on a side surface of the housing extending along the predetermined direction; and
  • a pairing button between the second operation section and the third operation section for performing a wireless communication-related setting process.
  • The second operation section, the third operation section and the pairing button may be configured so as not to protrude past the side surface.
  • According to the embodiment described above and variations thereof, there is provided a game controller comprising:
  • a housing;
  • a rail along a side surface of the housing, the rail having a T-shaped cross-section including a stem extending perpendicular to the side surface and a crossbar extending perpendicular to the stem; and
  • one or more game control buttons on the rail.
  • The one or more game control buttons may be on a surface of the crossbar.
  • The game controller may further comprise one or more lights on the rail.
  • The one or more lights may be on a surface of the crossbar.
  • The one or more game control buttons may include a first button and a second button. The one or more lights may be between the first button and the second button.
  • The game controller may further comprise an electrical connector on the rail.
  • According to the embodiment described above and variations thereof, there is provided an information processing apparatus comprising:
  • a housing;
  • a display on a front surface of the housing;
  • processing circuitry configured to generate images for the display;
  • a first channel formed in and extending in a lengthwise direction along a first side surface of the housing;
  • a second channel formed in and extending in a lengthwise direction along a second side surface of the housing opposite to the first side surface of the housing;
  • a first electrical connector in the first channel; and
  • a second electrical connector in the second channel.
  • According to the embodiment described above and variations thereof, there is provided a game controller comprising:
  • a housing;
  • a rail on a side surface of the housing, the rail having a T-shaped cross-section including a stem extending perpendicular to the side surface and a crossbar extending perpendicular to the stem; and
  • an electrical connector on the rail.
  • The electrical connector may be on a surface of the crossbar.
  • According to the embodiment described above and variations thereof, there is provided a game controller comprising:
  • a housing;
  • a rail located on a side surface of the housing, the rail having a T-shaped cross-section including a stem extending perpendicular to the side surface and a crossbar extending perpendicular to the stem; and
  • a retractable projection on the rail.
  • The retractable projection may be on a surface of the crossbar.
  • The embodiment described above is applicable to, for example, an information processing device such as a game device and an information processing system such as a game system, with the aim of providing an information processing device that can be used in different modes.
  • While certain example systems, methods, devices and apparatuses have been described herein, it is to be understood that the appended claims are not to be limited to the systems, methods, devices and apparatuses disclosed, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (14)

What is claimed is:
1. An information processing system comprising:
a main unit including a display;
a first controller configured to be removably attached to the main unit, the first controller being configured to transmit first operation data representing an operation performed on the first controller to the main unit, irrespective of whether the first controller is attached to the main unit;
a second controller configured to be removably attached to the main unit, the second controller being configured to transmit second operation data representing an operation performed on the second controller to the main unit, irrespective of whether the second controller is attached to the main unit, wherein
the main unit is configured to display, on the display, an execution result of an information process based on the first operation data transmitted from the first controller and the second operation data transmitted from the second controller,
the first controller includes a first input and a second input on a first surface which faces in the same direction as a surface of the main unit when the first controller is attached to the main unit,
the second controller includes a third input of the same type as the first input and a fourth input of the same type as the second input on a second surface which faces in the same direction as the surface of the main unit when the second controller is attached to the main unit, and
when the first controller and the second controller are attached to the main unit, the first input is above the second input on the first surface, and the third input is below the fourth input on the second surface.
2. An information processing system comprising:
a main unit including a display;
a first controller configured to be removably attached to the main unit, the first controller being configured to transmit first operation data representing an operation performed on the first controller to the main unit, irrespective of whether the first controller is attached to the main unit;
a second controller configured to be removably attached to the main unit, the second controller being configured to transmit second operation data representing an operation performed on the second controller to the main unit, irrespective of whether the second controller is attached to the main unit, wherein
the main unit is configured to display, on the display, an execution result of an information process based on the first operation data transmitted from the first controller and the second operation data transmitted from the second controller,
the first controller includes a first input and a second input on a first surface which faces in the same direction as a surface of the main unit when the first controller is attached to the main unit,
the second controller includes a third input of the same type as the first input and a fourth input of the same type as the second input on a second surface which faces in the same direction as the surface of the main unit when the second controller is attached to the main unit, and
when the first controller and the second controller are attached to the main unit, a direction from the first input to the second input on the first surface is opposite to a direction from the third input to the fourth input on the second surface.
3. An information processing system comprising:
a main unit including a display;
a first controller configured to be removably attached to the main unit, the first controller being configured to transmit first operation data representing an operation performed on the first controller to the main unit, irrespective of whether the first controller is attached to the main unit;
a second controller configured to be removably attached to the main unit, the second controller being configured to transmit second operation data representing an operation performed on the second controller to the main unit, irrespective of whether the second controller is attached to the main unit, wherein
the main unit is configured to display, on the display, an execution result of an information process based on the first operation data transmitted from the first controller and the second operation data transmitted from the second controller,
the first controller includes a first input and a second input on a first surface,
the second controller includes a third input of the same type as the first input and a fourth input of the same type as the second input on a second surface, and
when the first controller and the second controller are attached to the main unit, a direction from the first input to the second input on the first surface is opposite to a direction from the third input to the fourth input on the second surface.
4. The information processing system according to claim 1, wherein, when the first controller and the second controller are detached from the main unit, a positional relationship between the first input and the second input of the first controller placed in a predetermined orientation is the same as a positional relationship between the third input and the fourth input of the second controller placed in a predetermined orientation.
5. The information processing system according to claim 1, wherein:
the first input has substantially the same input mechanism as the third input; and
the second input has substantially the same input mechanism as the fourth input.
6. The information processing system according to claim 1, wherein:
the first input has substantially the same shape as the third input; and
the second input has substantially the same shape as the fourth input.
7. The information processing system according to claim 1, wherein the first input and the third input are each a directional input configured to accept a directional input.
8. The information processing system according to claim 7, wherein the directional input includes an operation member configured to be tilted or slid in a predetermined direction.
9. The information processing system according to claim 1, wherein the second input and the fourth input each comprise a pressable button.
10. The information processing system according to claim 1, wherein:
the first controller is configured to be attached to the main unit so as to be facing one of a left side surface and a right side surface of the main unit; and
the second controller is configured to be attached to the main unit so as to be facing the other one of the left side surface and the right side surface of the main unit.
11. The information processing system according to claim 1, wherein:
a connecting portion between a first side surface of four side surfaces of the first controller and a side surface adjacent to the first side surface has a more rounded shape than a connecting portion between a second side surface of the four side surfaces, which is opposite from the first side surface, and a side surface adjacent to the second side surface; and
a connecting portion between a third side surface of four side surfaces of the second controller and a side surface adjacent to the third side surface has a more rounded shape than a connecting portion between a fourth side surface of the four side surfaces, which is opposite from the third side surface, and a side surface adjacent to the fourth side surface.
12. The information processing system according to claim 11, wherein:
the first controller is attached to the main unit with the second side surface of the first controller facing a fifth side surface of four side surfaces of the main unit; and
the second controller is attached to the main unit with the fourth side surface of the second controller facing a sixth side surface, which is opposite from the fifth side surface, of the main unit.
13. The information processing system according to claim 1, wherein the main unit includes a housing including a first engagement portion configured for engagement with a housing of the first controller when the first controller is attached to the main unit, and a second engagement portion configured for engagement with a housing of the second controller when the second controller is attached to the main unit.
14. The information processing system according to claim 13, wherein:
the first controller includes a housing including a third engagement portion configured for engagement with the first engagement portion of the main unit; and
the second controller includes a housing including a fourth engagement portion configured for engagement with the second engagement portion of the main unit.
US17/369,672 2015-06-12 2021-07-07 Information processing system, information processing device, controller device and accessory Abandoned US20210402288A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/369,672 US20210402288A1 (en) 2015-06-12 2021-07-07 Information processing system, information processing device, controller device and accessory

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2015-119707 2015-06-12
JP2015119707 2015-06-12
US15/179,022 US10583356B2 (en) 2015-06-12 2016-06-10 Information processing system, information processing device, controller device and accessory
US16/529,532 US11110344B2 (en) 2015-06-12 2019-08-01 Information processing system, information processing device, controller device and accessory
US17/369,672 US20210402288A1 (en) 2015-06-12 2021-07-07 Information processing system, information processing device, controller device and accessory

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/529,532 Continuation US11110344B2 (en) 2015-06-12 2019-08-01 Information processing system, information processing device, controller device and accessory

Publications (1)

Publication Number Publication Date
US20210402288A1 true US20210402288A1 (en) 2021-12-30

Family

ID=56117608

Family Applications (9)

Application Number Title Priority Date Filing Date
US15/179,022 Active 2038-07-19 US10583356B2 (en) 2015-06-12 2016-06-10 Information processing system, information processing device, controller device and accessory
US15/344,276 Active US10010789B2 (en) 2015-06-12 2016-11-04 Information processing system, information processing device, controller device and accessory
US15/344,208 Active US10543423B2 (en) 2015-06-12 2016-11-04 Information processing system, information processing device, controller device and accessory
US15/418,426 Active US10118093B2 (en) 2015-06-12 2017-01-27 Information processing system, information processing device, controller device and accessory
US16/529,532 Active US11110344B2 (en) 2015-06-12 2019-08-01 Information processing system, information processing device, controller device and accessory
US16/735,838 Abandoned US20200139232A1 (en) 2015-06-12 2020-01-07 Information processing system, information processing device, controller device and accessory
US17/369,672 Abandoned US20210402288A1 (en) 2015-06-12 2021-07-07 Information processing system, information processing device, controller device and accessory
US17/665,503 Active US11951386B2 (en) 2015-06-12 2022-02-05 Information processing system, information processing device, controller device and accessory
US18/597,140 Pending US20240278112A1 (en) 2015-06-12 2024-03-06 Information processing system, information processing device, controller device and accessory


Country Status (6)

Country Link
US (9) US10583356B2 (en)
EP (17) EP3733254B1 (en)
JP (27) JP6612183B2 (en)
CN (10) CN114602161A (en)
ES (11) ES2712467T3 (en)
WO (1) WO2016199887A1 (en)

Families Citing this family (105)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102689629B1 (en) * 2009-10-16 2024-07-31 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Semiconductor device
US9539507B2 (en) * 2013-07-11 2017-01-10 Incipio, Llc Apparatus and method for controller accessory
US11045723B1 (en) 2014-11-18 2021-06-29 Razer (Asia-Pacific) Pte. Ltd. Gaming controller for mobile device and method of operating a gaming controller
CN107206280B (en) 2014-11-18 2019-07-09 雷蛇(亚太)私人有限公司 The method of game console and operation game console for mobile device
US9724601B2 (en) 2015-06-12 2017-08-08 Nintendo Co., Ltd. Game controller
JP6612183B2 (en) 2015-06-12 2019-11-27 任天堂株式会社 Game controller
JP6693795B2 (en) * 2016-04-11 2020-05-13 ホシデン株式会社 Waterproof electronic components
US20170340957A1 (en) * 2016-05-24 2017-11-30 Jesse Cochenour Cross platform controller
USD819696S1 (en) * 2016-08-30 2018-06-05 Nintendo Co., Ltd. Computer
US10512837B2 (en) * 2016-09-16 2019-12-24 Franklin Maldonado Portable entertainment system
JP6300384B1 (en) * 2016-12-21 2018-03-28 任天堂株式会社 Attachment and operation system
EP3482805B1 (en) 2016-10-06 2020-01-29 Nintendo Co., Ltd. Attachment
JP6877995B2 (en) 2016-12-27 2021-05-26 任天堂株式会社 Vibration control system, vibration control device, vibration control program and vibration control method
JP6383815B2 (en) 2017-01-06 2018-08-29 任天堂株式会社 GAME SYSTEM, GAME DEVICE, AND GAME CONTROLLER
JP6854132B2 (en) * 2017-01-10 2021-04-07 任天堂株式会社 Information processing system, information processing device, information processing program, and information processing method
JP1585832S (en) * 2017-01-10 2017-09-11
JP6931285B2 (en) * 2017-01-11 2021-09-01 任天堂株式会社 Attachment for game controller
JP6905341B2 (en) * 2017-01-12 2021-07-21 任天堂株式会社 Operation system, game system and game controller
JP6912889B2 (en) * 2017-01-12 2021-08-04 任天堂株式会社 Game system and game controller
IT201700020955A1 (en) * 2017-02-24 2018-08-24 Xplored S R L METHOD AND APPARATUS FOR CONTROLLING VIDEO GAMES
WO2018158847A1 (en) * 2017-02-28 2018-09-07 株式会社ソニー・インタラクティブエンタテインメント Information processing device and software processing method
JP6594920B2 (en) * 2017-03-01 2019-10-23 任天堂株式会社 GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
US10806997B2 (en) * 2017-03-02 2020-10-20 Tablet Console Games Inc. Universal game controller/console
US20180250588A1 (en) * 2017-03-02 2018-09-06 Jayme Winick Universal game controller/console
JP6151468B1 (en) * 2017-03-31 2017-06-21 任天堂株式会社 Attached equipment
EP3610350A1 (en) * 2017-04-12 2020-02-19 Nike Innovate C.V. Wearable article with removable module
US11494986B2 (en) * 2017-04-20 2022-11-08 Samsung Electronics Co., Ltd. System and method for two dimensional application usage in three dimensional virtual reality environment
US10195529B2 (en) 2017-06-12 2019-02-05 Nintendo Co. Ltd. Systems and methods of providing emergent interactions in virtual worlds
US10297090B2 (en) 2017-06-12 2019-05-21 Nintendo Co., Ltd. Information processing system, apparatus and method for generating an object in a virtual space
WO2018232616A1 (en) 2017-06-21 2018-12-27 SZ DJI Technology Co., Ltd. Methods and apparatuses related to transformable remote controllers
JP6544869B2 (en) * 2017-07-27 2019-07-17 任天堂株式会社 Game system, accessory, game program, game device, game processing method, and cardboard member
CN207149060U (en) * 2017-08-17 2018-03-27 深圳市大疆创新科技有限公司 Remote control for controlling an unmanned aerial vehicle
JP6755224B2 (en) * 2017-08-28 2020-09-16 任天堂株式会社 Game systems, game programs, game devices, and game processing methods
JP2019046156A (en) * 2017-09-01 2019-03-22 エレコム株式会社 pointing device
JP6230747B1 (en) * 2017-09-22 2017-11-15 任天堂株式会社 Attachment for game controller
GB2567634B (en) * 2017-10-17 2021-08-25 Sony Interactive Entertainment Inc Input apparatus and method
JP6682079B2 (en) * 2017-11-01 2020-04-15 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus and signal conversion method
JP6736606B2 (en) * 2017-11-01 2020-08-05 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus and operation signal providing method
JP7042061B2 (en) 2017-11-10 2022-03-25 株式会社バンダイナムコエンターテインメント Operation input system, operation input device and game system
CN109935996B (en) * 2017-12-15 2022-01-25 富士康(昆山)电脑接插件有限公司 Sliding rail connector
JP7081922B2 (en) * 2017-12-28 2022-06-07 株式会社バンダイナムコエンターテインメント Programs, game consoles and methods for running games
JP7081921B2 (en) 2017-12-28 2022-06-07 株式会社バンダイナムコエンターテインメント Programs and game equipment
CN108176047A (en) * 2017-12-28 2018-06-19 努比亚技术有限公司 Game control method, terminal, and computer-readable storage medium
JP6703015B2 (en) * 2018-01-16 2020-06-03 任天堂株式会社 Case
JP6353994B1 (en) 2018-01-17 2018-07-04 任天堂株式会社 Information processing system, information processing program, information processing method, and information processing apparatus
JP7071134B2 (en) * 2018-01-22 2022-05-18 株式会社コーエーテクモゲームス Information processing device, operation control program and operation control method
CN108417963A (en) * 2018-02-05 2018-08-17 南昌黑鲨科技有限公司 Input device for an intelligent terminal
JP7212450B2 (en) * 2018-02-07 2023-01-25 任天堂株式会社 Auxiliary Equipment, Stereoscopic Image Display System, and Stereoscopic Image Display Device
JP7060910B2 (en) * 2018-02-27 2022-04-27 任天堂株式会社 Game controller
USD997911S1 (en) * 2018-03-01 2023-09-05 Festool Gmbh Remote control
JP6511178B1 (en) 2018-03-02 2019-05-15 任天堂株式会社 Power-on device
JP2019170025A (en) * 2018-03-22 2019-10-03 株式会社椿本チエイン Charge and discharge device and server
CN108445970B (en) * 2018-03-30 2021-02-19 联想(北京)有限公司 Assembled electronic device and information processing method
JP2019185484A (en) * 2018-04-12 2019-10-24 日本電信電話株式会社 Pseudo force sensation presenting device
CN108594938B (en) * 2018-04-24 2021-06-25 徐文娟 Expansion device for a mobile terminal and mobile combination device
CN108854057B (en) * 2018-05-31 2021-04-23 杭州流形科技有限公司 Game machine structure
USD870113S1 (en) * 2018-10-30 2019-12-17 Stoneage, Inc. Remote controller
TWI694354B (en) * 2019-01-15 2020-05-21 和碩聯合科技股份有限公司 Input device and notebook computer
CN109568945A (en) * 2019-01-23 2019-04-05 深圳市利达成科技有限公司 Synchronous configuration method, synchronization device, storage medium, and processor
CN209772175U (en) * 2019-02-15 2019-12-13 孔令阔 SLAB super laboratory and SLAB experiment host computer subassembly
JP6975191B2 (en) * 2019-02-27 2021-12-01 任天堂株式会社 Game programs, game devices, game systems, and game methods
CN110124304B (en) * 2019-03-29 2020-09-08 华为技术有限公司 Control method and device of gamepad
JP7317559B2 (en) * 2019-04-18 2023-07-31 キヤノン株式会社 Electronics
US11559742B2 (en) 2019-05-23 2023-01-24 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having game program stored therein, game system, game apparatus control method, and game apparatus
JP6666504B1 (en) 2019-06-20 2020-03-13 任天堂株式会社 Device, system, game system, and device set
JP6666505B1 (en) 2019-06-20 2020-03-13 任天堂株式会社 Input device
CN110545640B (en) * 2019-08-19 2020-09-15 武汉市圣祥电子科技有限公司 Hand-held terminal for camera shooting
GB2586789A (en) * 2019-08-30 2021-03-10 Sony Interactive Entertainment Inc Controlling data processing
JP6670030B1 (en) 2019-08-30 2020-03-18 任天堂株式会社 Peripheral device, game controller, information processing system, and information processing method
US10868436B1 (en) * 2019-10-02 2020-12-15 Kingston Technology Corporation Mobile device gaming accessory with swappable battery packs and wireless charging
JP7534587B2 (en) * 2019-12-12 2024-08-15 株式会社デンソーウェーブ Air conditioning controller and air conditioning device
US11458386B2 (en) * 2020-02-14 2022-10-04 Valve Corporation Controller with adjustable features
CN115279468A (en) 2020-03-31 2022-11-01 索尼互动娱乐股份有限公司 Input device
JP7203785B2 (en) 2020-03-31 2023-01-13 株式会社ソニー・インタラクティブエンタテインメント input device
TWI784277B (en) * 2020-05-29 2022-11-21 寶德科技股份有限公司 Game controller
JP7457905B2 (en) * 2020-06-04 2024-03-29 株式会社リコー Operation expansion device, image forming device
US11705762B2 (en) * 2020-06-25 2023-07-18 Sony Interactive Entertainment LLC Method for game console operation based on detection of change in controller state
JP7300425B2 (en) * 2020-07-08 2023-06-29 株式会社バンダイ Toy sets and game systems
AU2020101534A4 (en) 2020-07-28 2020-09-10 Razer (Asia-Pacific) Pte. Ltd. Extendable cover for a portable electronic device
TWI777230B (en) 2020-08-28 2022-09-11 寶德科技股份有限公司 Mobile dock
TWI751688B (en) * 2020-09-11 2022-01-01 寶德科技股份有限公司 Game controller
US12028474B2 (en) * 2020-09-25 2024-07-02 Apple Inc. User interfaces associated with remote input devices
TWI768538B (en) * 2020-11-11 2022-06-21 寶德科技股份有限公司 Gaming controller
EP4227770A4 (en) 2020-11-18 2024-06-12 Nintendo Co., Ltd. Information processing system, controller, information processing method, and information processing program
WO2022107258A1 (en) 2020-11-18 2022-05-27 任天堂株式会社 Information processing system, controller, information processing method, and information processing program
CN116438504A (en) 2020-12-10 2023-07-14 任天堂株式会社 Information processing system, controller, information processing method, and information processing program
KR20220101485A (en) * 2021-01-11 2022-07-19 삼성전자주식회사 Display device and operating method for the same
US12117643B2 (en) 2021-03-02 2024-10-15 Sony Interactive Entertainment Inc. Input device and control system
WO2022222044A1 (en) * 2021-04-20 2022-10-27 深圳市金拓达科技有限公司 Game stand
JP7031777B1 (en) 2021-04-23 2022-03-08 住友電装株式会社 Electric unit and mounting method
KR20220165566A (en) * 2021-06-08 2022-12-15 엘지전자 주식회사 A display device
CN113377319B (en) * 2021-07-01 2021-12-24 深圳市宏铨煋科技有限公司 Double-source liquid crystal display device with same screen display
CN113471738B (en) * 2021-07-16 2023-11-03 北京晶品特装科技股份有限公司 Quick-change device and assembly method thereof
CN113771760B (en) * 2021-09-10 2023-06-30 科莱德(惠州)电子有限公司 Automatic positioning voice recognition device
JPWO2023047472A1 (en) * 2021-09-21 2023-03-30
WO2023047471A1 (en) * 2021-09-21 2023-03-30 株式会社ソニー・インタラクティブエンタテインメント Input device equipped with variable grip
JP2024537710A (en) * 2021-09-24 2024-10-16 ベルキン・インターナショナル・インコーポレイテッド GAME CONTROLLER SYSTEMS AND RELATED METHODS
JP7072111B1 (en) 2021-10-21 2022-05-19 PayPay株式会社 Service providers, service delivery methods, and programs
CN114247129A (en) * 2021-12-13 2022-03-29 深圳市致尚科技股份有限公司 Game machine
CN114546047A (en) * 2022-02-22 2022-05-27 广州市唯图新电子科技有限公司 Paperless conference terminal
CN219398928U (en) * 2022-04-18 2023-07-25 广州小鸡快跑网络科技有限公司 Game paddle with movable interface
WO2023228383A1 (en) * 2022-05-26 2023-11-30 任天堂株式会社 Input device and electronic equipment set
WO2023203803A1 (en) * 2022-11-11 2023-10-26 任天堂株式会社 Cover and system comprising cover
WO2023195197A1 (en) * 2022-11-11 2023-10-12 任天堂株式会社 Portable game device
WO2024158074A1 (en) * 2023-01-27 2024-08-02 엘지전자 주식회사 Portable controller

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080167740A1 (en) * 2007-01-05 2008-07-10 David Merrill Interactive Audio Recording and Manipulation System
US20180345130A1 (en) * 2014-11-18 2018-12-06 Razer (Asia-Pacific) Pte. Ltd. Gaming controller for mobile device and method of operating a gaming controller

Family Cites Families (243)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3850271A (en) * 1973-09-29 1974-11-26 Vahle Kg P Third rail for current consumers with large current requirement and high speed
JPS54112123A (en) 1978-02-23 1979-09-01 Sony Corp Display unit
JPS562185A (en) 1979-06-22 1981-01-10 Toshiba Corp Apparatus for stamping and delivering bundle of paper sheet
JPS6121034U (en) 1984-07-11 1986-02-06 三洋電機株式会社 Cassette type tape recorder switching device
JPS6353873A (en) 1986-08-22 1988-03-08 キヤノン株式会社 Contact terminal device
JPS6437986A (en) * 1987-08-05 1989-02-08 Takara Co Ltd Pocket type game computer
US5276733A (en) 1989-06-16 1994-01-04 Canon Kabushiki Kaisha System utilizing portable device
US5052937A (en) 1990-03-27 1991-10-01 Glen Bryan D Baseboard with movable electrical outlet
US5046739A (en) 1990-10-31 1991-09-10 Dynasound Organizer, Inc. Ergonomic handle for game controller
JPH04255510A (en) 1991-02-01 1992-09-10 Honda Motor Co Ltd Working method for surrounding of valve seat of internal combustion engine
JPH04126443U (en) * 1991-05-09 1992-11-18 三菱電機株式会社 In-vehicle radio equipment
EP0522330B1 (en) 1991-06-18 1998-04-15 Canon Kabushiki Kaisha Camera-integrated video recorder apparatus
JPH0677387A (en) 1992-08-27 1994-03-18 Babcock Hitachi Kk Nc programming apparatus for printed board
EP0587162B1 (en) * 1992-09-11 2002-02-06 Canon Kabushiki Kaisha Information processing apparatus
JPH0677387U (en) * 1993-03-31 1994-10-28 ミツミ電機株式会社 Remote control device
US5477415A (en) 1993-11-12 1995-12-19 Texas Instruments Incorporated Automatic computer docking station having a motorized tray, cammed side connectors, motorized side connectors, and locking and unlocking guide pins
JP2604748Y2 (en) 1993-11-16 2000-06-05 株式会社セガ・エンタープライゼス Operation panel for game consoles
JPH07155465A (en) 1993-12-03 1995-06-20 Sega Enterp Ltd Video game operating unit
JP3054306B2 (en) 1994-02-04 2000-06-19 キヤノン株式会社 Notebook PC and battery case
JP3628358B2 (en) * 1994-05-09 2005-03-09 株式会社ソニー・コンピュータエンタテインメント Game console controller
JP4036246B2 (en) 1994-08-02 2008-01-23 任天堂株式会社 Game console operating device
CN2211084Y (en) 1994-11-15 1995-10-25 郑秋豪 Split-type keyboard of television game-machine
JP2003236246A (en) * 1995-05-10 2003-08-26 Nintendo Co Ltd Operating device for game machine
US5702305A (en) 1996-02-15 1997-12-30 Motorola Electronic game system
GB2310481B (en) * 1996-02-22 1999-05-05 Top Game & Company Ltd Controller for a video game console
CN1177634C (en) * 1996-03-05 2004-12-01 世嘉企业股份有限公司 Controller and extension unit for controller
JP3028834U (en) * 1996-03-08 1996-09-13 科鋒実業有限公司 Video game playstation controller
JP3994112B2 (en) * 1996-12-27 2007-10-17 株式会社バンダイ cartridge
US6720949B1 (en) * 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
JPH1176617A (en) * 1997-09-04 1999-03-23 Sega Enterp Ltd Remote controller
US6512511B2 (en) 1998-07-20 2003-01-28 Alphagrip, Inc. Hand grippable combined keyboard and game controller system
US6257984B1 (en) * 1999-09-27 2001-07-10 Howard Gershon Video game cap
US20010003708A1 (en) * 1999-12-10 2001-06-14 Takuya Aizu Entertainment system, entertainment apparatus, recording medium, and program
US6580420B1 (en) 2000-03-15 2003-06-17 Yanqing Wang Convertible computer input device
JP3356757B2 (en) * 2000-07-07 2002-12-16 コナミ株式会社 Game console controller
US6946988B2 (en) 2000-11-10 2005-09-20 Simple Devices Detachable remote controller for an electronic entertainment device and a method for using the same
JP2002182856A (en) 2000-12-15 2002-06-28 Mitsumi Electric Co Ltd Controller having holding mechanism
US20020119819A1 (en) 2001-02-28 2002-08-29 Kunzle Adrian E. Systems and methods wherein a portion of landscape information is provided to a remote player device
US20030100340A1 (en) 2001-03-16 2003-05-29 Cupps Bryan T. Novel personal electronics device with thermal management
US6788285B2 (en) * 2001-04-10 2004-09-07 Palmone, Inc. Portable computer with removable input mechanism
US6530838B2 (en) 2001-04-18 2003-03-11 Mobilink Telecom Co., Ltd. Game pad connectable to personal portable terminal
US6795304B1 (en) 2001-04-27 2004-09-21 Palmone, Inc. Keyboard sled with rotating screen
US20030171146A1 (en) * 2001-05-15 2003-09-11 Umrao Mayer Quick passing feature for sports video games
TW587901U (en) 2001-06-01 2004-05-11 Inventec Corp Heat dissipation structure for electronic component
US6530570B2 (en) * 2001-06-14 2003-03-11 Man To Ku Handheld case gripper
JP2003018275A (en) 2001-07-04 2003-01-17 Hori Co Ltd Extended controller to be used in combination with mobile telephone set
KR100436429B1 (en) * 2001-09-06 2004-06-16 주식회사 이오리스 Joystick for mobile telephone terminal
JP3442758B2 (en) 2001-10-26 2003-09-02 コナミ株式会社 GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
JP3857566B2 (en) * 2001-10-30 2006-12-13 インターナショナル・ビジネス・マシーンズ・コーポレーション Notebook computer
US20030109314A1 (en) 2001-12-06 2003-06-12 Man To Ku Handheld case gripper
US7095442B2 (en) 2002-01-31 2006-08-22 Hewlett-Packard Development Company, L.P. Method and apparatus for capturing an image
JP3089139U (en) 2002-04-08 2002-10-11 盈彬 呉 Wireless transmission light gun for home video game console
JP2004038332A (en) 2002-06-28 2004-02-05 Toshiba Corp Information processor
JP3866163B2 (en) 2002-07-01 2007-01-10 アルプス電気株式会社 Game console controller
US20040082361A1 (en) 2002-10-18 2004-04-29 Bala Rajagopalan Handheld, portable electronic computing and communication device and methods for using the same
JP2004146986A (en) 2002-10-23 2004-05-20 Toya Kensetsu Kogyo:Kk Portable game machine
US6989818B2 (en) * 2002-11-07 2006-01-24 Electro Source, Inc. Input controller for a game system having a combination move feature
JP2004201728A (en) * 2002-12-24 2004-07-22 Mitsumi Electric Co Ltd Controller
JP3665063B2 (en) 2003-01-09 2005-06-29 シャープ株式会社 Thin display device and method for removing display unit
TWM242791U (en) 2003-01-17 2004-09-01 Asustek Comp Inc Wireless remote controller accommodated in the PCMCIA slot of notebook computer
JP3619517B2 (en) * 2003-04-17 2005-02-09 株式会社ホリ Video game console controller
JP2004336723A (en) * 2003-04-17 2004-11-25 Sharp Corp Terminal apparatus and center apparatus, control method and control program thereof, communication system, and recording medium recording these programs
US8493326B2 (en) * 2003-05-09 2013-07-23 Microsoft Corporation Controller with removably attachable text input device
US7402100B2 (en) 2003-10-21 2008-07-22 Mattel, Inc. Hand held electrical device and method for using the same
DE10330773A1 (en) * 2003-07-08 2005-04-28 Glenn Rolus Borgward Input device for portable digital computers and portable digital computers with variable mouse
US7107018B2 (en) 2003-09-12 2006-09-12 Motorola, Inc. Communication device having multiple keypads
GB2411452C (en) 2004-08-06 2008-03-17 Simon Richard Daniel Flat and collapsible mouse
JP4457885B2 (en) * 2004-12-22 2010-04-28 パナソニック電工株式会社 Switch with dimming function
JP4203476B2 (en) 2005-01-24 2009-01-07 シャープ株式会社 Portable information device
US20060279039A1 (en) 2005-03-10 2006-12-14 Vector Products, Inc. Grip accessory device for a portable video game
US20080214305A1 (en) * 2005-05-13 2008-09-04 David Ralph Addington System and method for interfacing a simulation device with a gaming device
JP4849829B2 (en) * 2005-05-15 2012-01-11 株式会社ソニー・コンピュータエンタテインメント Center device
CN100434131C (en) * 2005-07-07 2008-11-19 技嘉科技股份有限公司 Game controller
JP2007025060A (en) 2005-07-13 2007-02-01 Toshiba Corp Information processor and video signal output control method
JP2007021065A (en) * 2005-07-21 2007-02-01 Aruze Corp Controller for portable game machine
JP4805633B2 (en) 2005-08-22 2011-11-02 任天堂株式会社 Game operation device
JP4262726B2 (en) 2005-08-24 2009-05-13 任天堂株式会社 Game controller and game system
US20070045392A1 (en) 2005-08-29 2007-03-01 Youens John E Modular keyboard system
US8308563B2 (en) * 2005-08-30 2012-11-13 Nintendo Co., Ltd. Game system and storage medium having game program stored thereon
JP4907128B2 (en) * 2005-08-30 2012-03-28 任天堂株式会社 Game system and game program
US20070178966A1 (en) * 2005-11-03 2007-08-02 Kip Pohlman Video game controller with expansion panel
JP2007127935A (en) * 2005-11-07 2007-05-24 Funai Electric Co Ltd Light guide and display of electronic device state
US20070111801A1 (en) 2005-11-11 2007-05-17 Adimos Systems Ltd. Method, apparatus and system for transmitting and receiving media data
US20070131727A1 (en) 2005-12-14 2007-06-14 Martuccio Michael C Gaming device harness
US7833097B1 (en) 2005-12-19 2010-11-16 Kyocera Corporation Game pad controller and system for portable communication device
JP5204381B2 (en) * 2006-05-01 2013-06-05 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP2007213121A (en) 2006-02-07 2007-08-23 Sony Corp Mobile type electronic equipment and cradle unit
WO2007095314A2 (en) 2006-02-14 2007-08-23 Jakks Pacific, Inc. Portable interactive game device with separate console for enhanced features and alternate display capability
JP5089060B2 (en) * 2006-03-14 2012-12-05 株式会社ソニー・コンピュータエンタテインメント Entertainment system and game controller
US20070293318A1 (en) 2006-03-27 2007-12-20 Splitfish Gameware Inc. Video game controller
JP4906391B2 (en) * 2006-04-26 2012-03-28 株式会社ソニー・コンピュータエンタテインメント Game controller
JP4989105B2 (en) * 2006-05-09 2012-08-01 任天堂株式会社 Game controller
JP4749245B2 (en) * 2006-06-13 2011-08-17 株式会社オリンピア Game machine
US7684185B2 (en) 2006-07-03 2010-03-23 Apple Inc. Integrated monitor and docking station
JP5134224B2 (en) * 2006-09-13 2013-01-30 株式会社バンダイナムコゲームス GAME CONTROLLER AND GAME DEVICE
KR20080024679A (en) * 2006-09-14 2008-03-19 삼성전자주식회사 Portable composition apparatus and power controlling method
US8143848B2 (en) * 2006-10-13 2012-03-27 Nyko Technologies, Inc. Video game controller charging system having a docking structure
JP5153122B2 (en) * 2006-11-15 2013-02-27 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JP2007103380A (en) 2006-11-29 2007-04-19 Nintendo Co Ltd Operating device for game machine and hand-held game machine
CN201015711Y (en) * 2007-02-05 2008-02-06 周建正 WII game machine tennis racket handle
CN101332363B (en) * 2007-06-29 2012-06-20 鸿富锦精密工业(深圳)有限公司 Game machine controller
EP2018030A1 (en) * 2007-07-18 2009-01-21 Blue Bee Limited A docking station and a kit for a personal electronic device
TWI341482B (en) 2007-07-30 2011-05-01 Asustek Comp Inc Electronic apparatus with game keypad
JP5265159B2 (en) 2007-09-11 2013-08-14 株式会社バンダイナムコゲームス Program and game device
US20090069096A1 (en) 2007-09-12 2009-03-12 Namco Bandai Games Inc. Program, information storage medium, game system, and input instruction device
JP5421266B2 (en) 2007-09-14 2014-02-19 パナソニック・アビオニクス・コーポレイション Portable user operating device and operating method for vehicle information system
US8143850B2 (en) 2007-09-17 2012-03-27 Teknocreations, Inc. Inductive charger battery replacement system and device
JP2009075861A (en) * 2007-09-20 2009-04-09 Panasonic Corp Input device and system
US20090093307A1 (en) * 2007-10-08 2009-04-09 Sony Computer Entertainment America Inc. Enhanced game controller
US20090111508A1 (en) 2007-10-29 2009-04-30 Wang-Hung Yeh Handheld communication device having dual operating modes
JP5202013B2 (en) 2008-02-06 2013-06-05 三洋電機株式会社 Charging stand
US8087115B2 (en) 2008-04-25 2012-01-03 Elemental Tools, Llc Apparatus having one or more removable tools
US20090291760A1 (en) 2008-05-22 2009-11-26 Bennett Hepburn Video Gaming Controller Bay for Mobile Devices
US20100064883A1 (en) 2008-06-10 2010-03-18 Deshko Gynes Compact modular wireless control devices
CN101601931B (en) * 2008-06-11 2012-12-19 鸿富锦精密工业(深圳)有限公司 Handheld game device
JP2010000257A (en) 2008-06-20 2010-01-07 Namco Bandai Games Inc Game controller case, game controller case set, program, and information storage medium
US8384565B2 (en) 2008-07-11 2013-02-26 Nintendo Co., Ltd. Expanding operating device and operating system
JP5520457B2 (en) 2008-07-11 2014-06-11 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
JP5501582B2 (en) * 2008-08-04 2014-05-21 株式会社タイトー Mounting unit
JP5325492B2 (en) * 2008-08-04 2013-10-23 株式会社タイトー Video game device, video posting device
US8939838B2 (en) 2008-09-30 2015-01-27 Apple Inc. Accessory for playing games with a portable electronic device
US8957835B2 (en) 2008-09-30 2015-02-17 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
JP4255510B1 (en) 2008-10-24 2009-04-15 任天堂株式会社 Extended operation unit
KR20100054032A (en) 2008-11-13 2010-05-24 삼성전자주식회사 Remote controller and operating method thereof, image display apparatus controlled by remote controller
US8057309B1 (en) 2008-12-18 2011-11-15 Hasbro, Inc. Versatile toy capable of activating electronics and launching components thereof
US20100178981A1 (en) 2009-01-13 2010-07-15 Holcomb Paul L Weighted accessory for game controller
US8153881B2 (en) * 2009-02-20 2012-04-10 Activision Publishing, Inc. Disc jockey video game and controller
US8608564B2 (en) 2009-04-21 2013-12-17 Nyko Technologies, Inc. Connector for video game controller, and video game controller including the same
US20100304873A1 (en) 2009-05-28 2010-12-02 Lipa Markowitz Bowling Ball and Football Game Controller
JP2010277508A (en) 2009-06-01 2010-12-09 Sony Computer Entertainment Inc Operation device
JP5227910B2 (en) 2009-07-21 2013-07-03 株式会社コナミデジタルエンタテインメント Video game apparatus, game image display method, and game image display program
US8777741B2 (en) * 2009-09-10 2014-07-15 Nintendo Co., Ltd. Illumination device
JP5294085B2 (en) * 2009-11-06 2013-09-18 日本電気株式会社 Information processing apparatus, accessory apparatus thereof, information processing system, control method thereof, and control program
AU2010331756A1 (en) 2009-12-20 2012-07-05 Keyless Systems Ltd. Features of a data entry system
JP4798809B1 (en) * 2010-11-01 2011-10-19 任天堂株式会社 Display device, game system, and game processing method
EP2392389A4 (en) 2010-02-03 2014-10-15 Nintendo Co Ltd Game system, operating device, and game processing method
US20110216495A1 (en) 2010-03-04 2011-09-08 Keith Bernard Marx Docking system for electronic devices
US8858329B2 (en) 2010-03-22 2014-10-14 Christine Hana Kim Apparatus and method for using a dedicated game interface on a wireless communication device with projector capability
US8497659B2 (en) 2010-03-23 2013-07-30 Nyko Technologies, Inc. Video game controller charging system
US20110260969A1 (en) * 2010-04-21 2011-10-27 Brandon Workman Removably attachable controller for portable electronic devices
JP5550433B2 (en) 2010-04-22 2014-07-16 任天堂株式会社 Operating device and information processing system
JP5662696B2 (en) 2010-04-30 2015-02-04 株式会社ソニー・コンピュータエンタテインメント Controller support device and controller support system
JP5733308B2 (en) * 2010-05-20 2015-06-10 日本電気株式会社 Portable device
RU2010131174A (en) 2010-07-26 2012-02-10 Алексей Владимирович Лысенко (RU) DEVICE FOR TYPING AND ENTERING SYMBOLS IN PORTABLE COMMUNICATION MEANS
US20130225288A1 (en) * 2010-08-09 2013-08-29 Roman Levin Mobile gaming platform system and method
US8469815B2 (en) * 2010-08-10 2013-06-25 Imation Corp. Video game accessory
US20120058821A1 (en) * 2010-09-04 2012-03-08 Cheng Uei Precision Industry Co., Ltd. Projecting handheld game console and operating method thereof
TW201216123A (en) * 2010-10-08 2012-04-16 Primax Electronics Ltd Separatable game controller
TWI440496B (en) * 2010-11-01 2014-06-11 Nintendo Co Ltd Controller device and controller system
KR101364826B1 (en) * 2010-11-01 2014-02-20 닌텐도가부시키가이샤 Operating apparatus and operating system
TWI434717B (en) * 2010-11-01 2014-04-21 Nintendo Co Ltd Display device, game system, and game process method
JP5837325B2 (en) * 2010-11-01 2015-12-24 任天堂株式会社 Operating device and operating system
US8810536B2 (en) 2010-11-05 2014-08-19 William James McDermid Tablet computer input device
US10909564B2 (en) * 2010-11-17 2021-02-02 PicScore, Inc. Fast and versatile graphical scoring device and method
JP5122659B2 (en) 2011-01-07 2013-01-16 任天堂株式会社 Information processing program, information processing method, information processing apparatus, and information processing system
CN102606855B (en) 2011-01-21 2015-11-25 富泰华工业(深圳)有限公司 Base and electronic device assembly using the base
JP5379176B2 (en) * 2011-01-25 2013-12-25 株式会社ソニー・コンピュータエンタテインメント Portable electronic devices
JP5840844B2 (en) * 2011-02-01 2016-01-06 株式会社ソニー・コンピュータエンタテインメント Controller support device and controller charging system
US20120196680A1 (en) 2011-02-02 2012-08-02 Joshua Provitt Compact game controller
US8298084B2 (en) * 2011-02-03 2012-10-30 Edmond Yee Multi-player game controller system with combinable hand-held game controllers
JP5792971B2 (en) * 2011-03-08 2015-10-14 任天堂株式会社 Information processing system, information processing program, and information processing method
US8972617B2 (en) * 2011-04-20 2015-03-03 Pieco Multi-configuration computer
JP5161338B2 (en) * 2011-05-09 2013-03-13 株式会社ソニー・コンピュータエンタテインメント keyboard
US8882597B2 (en) * 2011-05-25 2014-11-11 Sony Computer Entertainment America Llc Hybrid separable motion controller
US8740708B2 (en) 2011-05-27 2014-06-03 Performance Designed Products Llc Gun handle attachment for game controller
JP5755943B2 (en) 2011-06-03 2015-07-29 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
JP2012249923A (en) * 2011-06-06 2012-12-20 Sony Computer Entertainment Inc Controller device
US9914060B1 (en) 2011-07-15 2018-03-13 Bryan Corigliano Controller docking device for portable electronics and method of use
JP5392863B2 (en) * 2011-09-05 2014-01-22 任天堂株式会社 Peripheral device, information processing system, and connection method of peripheral device
CN202342834U (en) * 2011-09-19 2012-07-25 深圳市迪瑞德科技有限公司 Game handle of mobile device applicable to touch screen operation
CN202259500U (en) * 2011-09-28 2012-05-30 深圳市安保科技有限公司 Two-in-one device of pulse oximeter oxygen saturation (SPO2) detection device and mobile battery
CN103049041A (en) * 2011-10-12 2013-04-17 鸿富锦精密工业(深圳)有限公司 Gamepad and component of tablet computer
CN103999012B (en) 2011-10-14 2018-01-26 爱格升有限公司 Tablet personal computer storage device
WO2013059759A1 (en) * 2011-10-21 2013-04-25 Gamestop, Inc. Wireless controller
US20130106687A1 (en) * 2011-10-26 2013-05-02 Bladepad, Llc Electronic communication between a gamepad and an electronic device
US20130120258A1 (en) 2011-11-16 2013-05-16 Daryl D. Maus Multi-touch input device
US9116555B2 (en) 2011-11-23 2015-08-25 Sony Computer Entertainment America Llc Gaming controller
US9459457B2 (en) 2011-12-01 2016-10-04 Seebright Inc. Head mounted display with remote control
WO2013095703A1 (en) 2011-12-20 2013-06-27 Wikipad, Inc. Game controller
US9126119B2 (en) 2012-06-12 2015-09-08 Wikipad, Inc. Combination computing device and game controller with flexible bridge section
US8529357B2 (en) 2011-12-20 2013-09-10 Wikipad, Inc. Combination tablet computer and game controller with flexible bridge section
US9407100B2 (en) * 2011-12-20 2016-08-02 Wikipad, Inc. Mobile device controller
CN104012193A (en) 2012-01-06 2014-08-27 索尼公司 External structure and electronic apparatus
US20130221923A1 (en) * 2012-02-23 2013-08-29 Whalen Furniture Manufacturing Inc. Universal charging system and battery pack for video game controllers
US9216349B2 (en) * 2012-04-05 2015-12-22 Michael Paul South Gripping controllers for portable computing devices
EP2700436A4 (en) * 2012-04-20 2015-06-17 Nintendo Co Ltd Game system and game controller
CA2775338C (en) * 2012-04-25 2014-04-08 Fulcrum Environmental Solutions Inc. Processing unit and method for separating hydrocarbons from feedstock material
US8842423B2 (en) 2012-06-15 2014-09-23 Blackberry Limited Cover for a portable electronic device
US20140179437A1 (en) * 2012-06-20 2014-06-26 Zagg Intellectual Property Holding Co., Inc. Gaming accessory for a portable electronic device, gaming systems and methods
CN202817305U (en) 2012-07-25 2013-03-20 名硕电脑(苏州)有限公司 An angle-adjustable connecting module
CN202860081U (en) * 2012-08-21 2013-04-10 高博 Mobile phone game controller
JP2014041528A (en) * 2012-08-23 2014-03-06 Sharp Corp Terminal device and electric apparatus
CN202817828U (en) * 2012-09-13 2013-03-20 名硕电脑(苏州)有限公司 Wireless charging device
JP6051016B2 (en) * 2012-10-30 2016-12-21 任天堂株式会社 Information processing system, game system, and information processing apparatus
US20140247246A1 (en) * 2012-11-15 2014-09-04 Daryl D Maus Tactile to touch input device
HK1184322A2 (en) 2012-11-23 2014-01-17 Phonejoy Solutions Ltd A handheld apparatus for communicating control signals to an information processing device
US20140374457A1 (en) 2012-12-19 2014-12-25 Brian Piccolo Adjustable game controller neck supporter
US20140200085A1 (en) 2013-01-14 2014-07-17 William Bares Game controller hand grip for coupling with associated support apparatus
US9529447B2 (en) * 2013-01-18 2016-12-27 Microsoft Technology Licensing, Llc Removable input module
US8858335B2 (en) * 2013-01-18 2014-10-14 Microsoft Corporation Reconfigurable clip-on modules for mobile computing devices
US9176538B2 (en) * 2013-02-05 2015-11-03 Microsoft Technology Licensing, Llc Input device configurations
WO2014133558A1 (en) * 2013-02-28 2014-09-04 Wikipad, Inc. Combination tablet computer and game controller with flexible bridge section
US8905845B2 (en) * 2013-03-13 2014-12-09 Bby Solutions, Inc. Gaming controller system
TWI594791B (en) * 2013-03-15 2017-08-11 新力電腦娛樂(美國)責任有限公司 Gaming controller
JP6132613B2 (en) * 2013-03-25 2017-05-24 ツインバード工業株式会社 Fan
JP5937998B2 (en) * 2013-04-10 2016-06-22 株式会社ユニバーサルエンターテインメント Game machine
KR20140127171A (en) 2013-04-22 2014-11-03 임머숀 코퍼레이션 Gaming device having a haptic-enabled trigger
GB201310373D0 (en) 2013-06-11 2013-07-24 Sony Comp Entertainment Europe Head-Mountable Apparatus and systems
US9539507B2 (en) * 2013-07-11 2017-01-10 Incipio, Llc Apparatus and method for controller accessory
US20150031452A1 (en) * 2013-07-25 2015-01-29 Immersa Labs, Inc. Game Controller for a Portable Computing Device
SE540658C2 (en) 2013-07-31 2018-10-09 Vaederstad Holding Ab Holder for a tablet
JP2015041465A (en) * 2013-08-21 2015-03-02 三菱化学株式会社 Connector, and lighting system
TWM470688U (en) * 2013-08-22 2014-01-21 Alcor Micro Corp Control handle combined with electronic device
JP5668114B2 (en) 2013-09-04 2015-02-12 株式会社東芝 Electronics
TWM474553U (en) * 2013-10-25 2014-03-21 Acer Inc Game controller
US9529194B2 (en) 2013-11-21 2016-12-27 Samsung Electronics Co., Ltd. Head-mounted display apparatus
CN103736275B (en) * 2013-12-13 2016-10-26 青岛歌尔声学科技有限公司 Electronic device stand and multifunctional gamepad provided with the stand
GB2522008B (en) * 2013-12-20 2017-06-28 Sony Interactive Entertainment Inc Videogame controller, system and entertainment device
CN203816200U (en) * 2014-01-08 2014-09-10 余招某 Gamepad with supporting structure
CN104645613A (en) * 2015-02-05 2015-05-27 陈强 Hidden game pad and intelligent mobile phone with hidden game pad
JP5926773B2 (en) 2014-06-26 2016-05-25 任天堂株式会社 Peripheral device, information processing system, and connection method of peripheral device
TWM494020U (en) 2014-07-04 2015-01-21 Dexin Corp Control handle capable of being combined with portable electronic device
CN204147522U (en) * 2014-07-14 2015-02-11 艾芮克股份有限公司 The stretching structure of control device
CN204246818U (en) 2014-08-11 2015-04-08 宝德科技股份有限公司 Control handle capable of being combined with a portable electronic device
CN104307172B (en) * 2014-10-16 2017-07-28 英业达科技有限公司 Game controller kit
CN104258565B (en) * 2014-10-16 2017-05-17 英业达科技有限公司 Game controller suite
CN104436646A (en) * 2014-10-22 2015-03-25 英业达科技有限公司 Combined type signal input module and combined type signal input handle
CN204182121U (en) * 2014-11-12 2015-03-04 青岛歌尔声学科技有限公司 Multifunctional gamepad having a removable electronic device support
CN104353239A (en) * 2014-11-12 2015-02-18 青岛歌尔声学科技有限公司 Multifunctional gamepad with detachable electronic equipment bracket
US9711980B2 (en) * 2014-11-25 2017-07-18 Microsoft Technology Licensing, Llc Charging device for removable input modules
CN204208280U (en) * 2014-11-25 2015-03-18 上海星融实业股份有限公司 Bluetooth gamepad for use on mobile terminals
CN104383683A (en) * 2014-11-27 2015-03-04 英业达科技有限公司 Electronic device and signal input handle thereof
CN104368149A (en) * 2014-11-28 2015-02-25 英业达科技有限公司 Electronic device and signal input handle thereof
CN204380232U (en) * 2015-01-12 2015-06-10 深圳市申江科技开发有限公司 Game paddle
JP6327669B2 (en) 2015-02-10 2018-05-23 任天堂株式会社 Attached equipment, information processing system, information processing apparatus, information processing program, operation determination method, and processing execution method
US20160360644A1 (en) 2015-06-08 2016-12-08 Arctec Mobile Inc. Heat Dissipation in a Mobile Device to Enable High-Performance Desktop Functionality
US9724601B2 (en) 2015-06-12 2017-08-08 Nintendo Co., Ltd. Game controller
JP6612183B2 (en) * 2015-06-12 2019-11-27 任天堂株式会社 Game controller
CN205375273U (en) 2015-11-24 2016-07-06 深圳市爱坤电子科技有限公司 Tablet computer based on a detachable wireless 2.4G gamepad
US10198074B2 (en) 2016-02-18 2019-02-05 Immersion Corporation Haptically-enabled modular peripheral device assembly
EP3482805B1 (en) 2016-10-06 2020-01-29 Nintendo Co., Ltd. Attachment
JP6153238B1 (en) 2016-10-06 2017-06-28 任天堂株式会社 Attachment and operation system
US20180250588A1 (en) 2017-03-02 2018-09-06 Jayme Winick Universal game controller/console

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080167740A1 (en) * 2007-01-05 2008-07-10 David Merrill Interactive Audio Recording and Manipulation System
US20180345130A1 (en) * 2014-11-18 2018-12-06 Razer (Asia-Pacific) Pte. Ltd. Gaming controller for mobile device and method of operating a gaming controller

Also Published As

Publication number Publication date
ES2693896T3 (en) 2018-12-14
ES2847649T3 (en) 2021-08-03
EP3103530A1 (en) 2016-12-14
EP3103531A1 (en) 2016-12-14
EP3103535A2 (en) 2016-12-14
US20170072304A1 (en) 2017-03-16
EP3103532A3 (en) 2017-03-29
US20170136353A1 (en) 2017-05-18
CN110559650B (en) 2020-05-12
EP3106213A1 (en) 2016-12-21
CN111330264A (en) 2020-06-26
EP3106213B1 (en) 2018-08-01
JP2017000783A (en) 2017-01-05
JP2024042014A (en) 2024-03-27
JP2017000760A (en) 2017-01-05
JP2023040129A (en) 2023-03-22
JP7482208B2 (en) 2024-05-13
EP3103532B1 (en) 2020-07-22
JP2017148526A (en) 2017-08-31
US20170052750A1 (en) 2017-02-23
ES2683255T3 (en) 2018-09-25
CN110280015A (en) 2019-09-27
EP3103535A3 (en) 2017-05-10
US10543423B2 (en) 2020-01-28
EP3103533A3 (en) 2016-12-21
US10583356B2 (en) 2020-03-10
EP3733254A1 (en) 2020-11-04
EP3461541A1 (en) 2019-04-03
JP6601914B2 (en) 2019-11-06
ES2823024T3 (en) 2021-05-05
JP2021037325A (en) 2021-03-11
EP3103531B1 (en) 2020-07-08
EP3269434A1 (en) 2018-01-17
JP2017000784A (en) 2017-01-05
CN110280015B (en) 2020-04-10
JP6121034B2 (en) 2017-04-26
EP3103534A3 (en) 2016-12-28
JP2019197585A (en) 2019-11-14
US20240278112A1 (en) 2024-08-22
JP6635597B2 (en) 2020-01-29
JP2017000759A (en) 2017-01-05
EP3103528A2 (en) 2016-12-14
ES2826923T3 (en) 2021-05-19
ES2712467T3 (en) 2019-05-13
EP3103532A2 (en) 2016-12-14
JP2020205060A (en) 2020-12-24
CN111330264B (en) 2021-03-09
ES2864576T3 (en) 2021-10-14
CN111135555A (en) 2020-05-12
JP2021142328A (en) 2021-09-24
EP3666351B1 (en) 2021-01-06
EP3103534B1 (en) 2023-04-26
EP3858451B1 (en) 2022-04-13
JP6700114B2 (en) 2020-05-27
ES2797127T3 (en) 2020-12-01
US20200139232A1 (en) 2020-05-07
US20160361641A1 (en) 2016-12-15
EP3275516B1 (en) 2020-03-11
EP3103529B1 (en) 2018-05-09
EP3103533B1 (en) 2020-11-18
EP3103530B1 (en) 2018-05-09
JP6482038B2 (en) 2019-03-13
US20190351320A1 (en) 2019-11-21
JP2018083088A (en) 2018-05-31
JP2019195743A (en) 2019-11-14
CN114602160A (en) 2022-06-10
JP2017000781A (en) 2017-01-05
ES2682986T3 (en) 2018-09-24
JP6289724B1 (en) 2018-03-07
US10118093B2 (en) 2018-11-06
JP2020072963A (en) 2020-05-14
CN113041606A (en) 2021-06-29
CN106537295A (en) 2017-03-22
JP2017000758A (en) 2017-01-05
EP3103528A3 (en) 2016-12-28
US20220233949A1 (en) 2022-07-28
JP6712668B2 (en) 2020-06-24
CN106537295B (en) 2022-03-04
EP4212221A1 (en) 2023-07-19
EP3824981A1 (en) 2021-05-26
EP3275516A1 (en) 2018-01-31
EP3733254B1 (en) 2021-07-28
EP3666351A1 (en) 2020-06-17
JP2019010534A (en) 2019-01-24
JP6869401B2 (en) 2021-05-12
JP2019171100A (en) 2019-10-10
JP2017000757A (en) 2017-01-05
WO2016199887A1 (en) 2016-12-15
JP6580023B2 (en) 2019-09-25
US11110344B2 (en) 2021-09-07
EP3103533A2 (en) 2016-12-14
ES2683569T3 (en) 2018-09-26
JP2017086922A (en) 2017-05-25
JP6683549B2 (en) 2020-04-22
JP6799120B2 (en) 2020-12-09
CN111309148B (en) 2021-02-05
CN113244633A (en) 2021-08-13
CN110559650A (en) 2019-12-13
US11951386B2 (en) 2024-04-09
JP2017060859A (en) 2017-03-30
CN111309148A (en) 2020-06-19
ES2919675T3 (en) 2022-07-27
JP7427762B2 (en) 2024-02-05
JP2017209503A (en) 2017-11-30
JP2017000782A (en) 2017-01-05
US10010789B2 (en) 2018-07-03
EP3103528B1 (en) 2018-11-21
CN113041606B (en) 2022-03-04
JP2020166875A (en) 2020-10-08
JP2023052163A (en) 2023-04-11
EP3103534A2 (en) 2016-12-14
EP3269434B1 (en) 2020-10-07
EP3103529A3 (en) 2016-12-28
EP3461541B1 (en) 2021-01-13
JP2018023796A (en) 2018-02-15
JP6886778B2 (en) 2021-06-16
JP6074534B2 (en) 2017-02-01
JP6647358B2 (en) 2020-02-14
JP6628460B2 (en) 2020-01-08
EP3103535B1 (en) 2018-05-09
JP6839309B2 (en) 2021-03-03
EP3103529A2 (en) 2016-12-14
JP6612183B2 (en) 2019-11-27
JP6233943B1 (en) 2017-11-22
CN111135555B (en) 2023-05-26
JP6083884B2 (en) 2017-02-22
JP6857770B2 (en) 2021-04-14
JP2017004523A (en) 2017-01-05
EP3858451A1 (en) 2021-08-04
CN113244633B (en) 2022-03-04
JP6172778B2 (en) 2017-08-02
CN114602161A (en) 2022-06-10

Similar Documents

Publication Publication Date Title
US11951386B2 (en) Information processing system, information processing device, controller device and accessory
US11724178B2 (en) Game controller
US20240359089A1 (en) Game controller

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION