GB2467951A - Detecting orientation of a controller from an image of the controller captured with a camera - Google Patents


Info

Publication number
GB2467951A
GB2467951A (application GB0902939A)
Authority
GB
Grant status
Application
Patent type
Prior art keywords
device
control
camera
image
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0902939A
Other versions
GB0902939D0 (en)
Inventor
Colin Jonathan Hughes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Interactive Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/06 Accessories using player-operated means for controlling the position of a specific area display
    • G01B11/26 Measuring arrangements characterised by the use of optical means for measuring angles or tapers; for testing the alignment of axes
    • G02B27/2214 Optical systems or apparatus for producing stereoscopic or other three dimensional effects involving lenticular arrays or parallax barriers
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F3/0325 Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by opto-electronic transducing means
    • A63F2300/1018 Calibration; Key and button assignment
    • A63F2300/1087 Input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093 Input arrangements comprising photodetecting means using visible light

Abstract

Data processing apparatus, such as a games console 1120, comprises a camera 1110 and a control device 1000. The control device comprises a group of features arranged so that when the camera captures an image of the control device, only a subset of the group of features is visible to the camera; the particular subset of features present in any captured image depends on the orientation of the control device with respect to the camera. The orientation of the control device with respect to the camera is detected by analysing an image of the control device captured by the camera to detect which subset of the group of features is present in the captured image. The control device further comprises one or more light-directing formations positioned between the group of features and the camera, the light-directing formations allowing light from the subset of features appropriate to the current orientation of the control device to reach the camera, and inhibiting light from features not in that subset from reaching the camera. The light-directing formations may comprise one or more apertures in a masking arrangement spaced apart from the group of features, or one or more lenses, such as an array of spherical convex lenses, for example a lenticular array, a "fly's eye" lens array or an integral imaging array. The group of features may be printed, or may be dynamically or actively generated on an LCD display. In the case of actively generated features, a controller trigger function may be indicated by an abrupt change in the displayed features (e.g. the pattern).

Description

ORIENTATION DETECTION

This invention relates to methods and apparatus for orientation detection.

In a video game or computer application, it is useful to know the orientation in space of an item that the person playing the game or using the application is holding. An example of such an item is a so-called "light gun" (See http://en.wikipedia.org/wiki/Light_gun), which is a device which the user points at a target on a video display device (such as a television screen) in order to "shoot" the target.

Typically a light gun comprises a gun "barrel", which in its simplest form is just an elongate tube, and a light sensor within the barrel. The barrel's function is to prevent light hitting the light sensor unless the light is arriving at the barrel in a direction generally parallel to the long axis of the barrel. A light gun may also incorporate a "trigger", which is an electrical switch or push button operable by the user.

Light guns are particularly relevant to so-called "first person shooter" games, in which a user tries to shoot targets or enemies displayed on the display device. In operation, the video game or application displays a bright area as a "target" on the display device. To increase the level of challenge, the target may well be moving. The user aims the light gun at the target and operates the trigger. As the trigger is pressed, the amount of light hitting the light sensor within the barrel is detected and compared to a threshold level. If the detected light exceeds the threshold level, the attempt is classified as a hit. Otherwise the attempt is classified as a miss.
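In software terms, the hit test above is a single threshold comparison made at the moment the trigger is pressed. A minimal sketch (the function name and threshold value are illustrative, not taken from the patent):

```python
def register_shot(sensor_reading, threshold=200):
    """Classify a trigger pull as a hit or a miss.

    sensor_reading: light level measured by the sensor in the barrel.
    threshold: minimum brightness taken to indicate the displayed target
    (both values are illustrative, not taken from the patent).
    """
    return "hit" if sensor_reading > threshold else "miss"
```

A real system would sample the sensor only within a short window after the trigger switch closes.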

So in general terms, the light gun used in this manner provides a way of detecting the orientation of an object (the light gun) with respect to a required orientation (pointing at the displayed target on the display device) at the time that the user carries out a certain action (pressing the trigger button).

This simple type of light gun has the disadvantages that the enemies or targets have to be displayed as the brightest parts of the image, and that a successful "hit" could be registered simply by directing the light gun at another source of light such as the room lighting.

More advanced versions of this type of light gun make use of the scanning process that occurs when an image is displayed on a cathode ray tube (CRT) display device. The image is produced by an electron beam travelling across and down the screen in a raster pattern, once in each display period (generally a field period). So, the position of the electron beam varies with time in a predetermined manner. This allows the light gun system to detect the small change in light output when the electron beam excites the particular area of the screen at which the gun is pointed. The system can derive the orientation of the light gun with respect to the screen (i.e. which part of the screen the light gun is pointing towards) from the exact time of that change in light output with respect to the known timing of the scanning pattern of the screen.
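Because the beam position is a pure function of time within the field, recovering the aimed-at point reduces to timing arithmetic. A simplified sketch, assuming an idealised PAL-like raster with a fixed line period and no blanking intervals (figures illustrative only):

```python
def beam_position(t_since_vsync_us, line_period_us=64.0, active_lines=576):
    """Estimate the raster position of the electron beam at a given time
    after vertical sync.

    Assumes an idealised display with a constant line period and no
    blanking intervals; all figures are illustrative.
    Returns (scan line index, horizontal fraction 0..1 across the line).
    """
    line = int(t_since_vsync_us // line_period_us)           # vertical position
    frac = (t_since_vsync_us % line_period_us) / line_period_us  # horizontal position
    return min(line, active_lines - 1), frac
```

The light gun system runs this mapping in reverse: the timestamp of the detected brightness spike gives `t_since_vsync_us`, and hence the screen position being aimed at.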

Although this arrangement avoids the need for the target to be displayed as a bright block on the screen, a main disadvantage of this technique is that it relies on a raster-scanned display screen. The technique will not work with modern display technology such as liquid crystal or plasma displays.

More recently, video game controllers have used acceleration sensors to detect changes in their orientation in space. Examples of such arrangements include the Sony® PlayStation 3® Six-axis® controller and the Nintendo® Wii® remote controller. The Wii remote controller also allows absolute orientation to be estimated by referencing an optical sensor to two sets of infra-red LEDs spaced about 20 cm apart on a so-called "Sensor Bar" which is positioned in line with the front of the user's television set. The LEDs are arranged in two longitudinal groups, one at each end of the Sensor Bar, and are angled with respect to the long axis of the Sensor Bar, so that the outermost LEDs (those nearest each end of the Sensor Bar) are directed slightly outwards, those in the middle of each group point forwards, and the innermost LEDs point slightly inwards.

The Wii remote controller carries an image sensor which is used to locate the LEDs of the Sensor Bar in the remote controller's field of view. The distance and angle between the two clusters of lights as detected by the image sensor allows the Wii remote controller to estimate its orientation by a triangulation process.
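Under a simple pinhole-camera assumption, distance follows from the apparent pixel separation of the two clusters (their true spacing of roughly 20 cm is known), and roll follows from the slope of the line joining them. A sketch; the focal-length value and function name are illustrative, not from any actual device specification:

```python
import math

def estimate_pose(p1, p2, real_separation_m=0.2, focal_px=1300.0):
    """Estimate distance to the Sensor Bar and the controller's roll from
    the pixel positions of the two detected LED clusters.

    p1, p2: (x, y) image coordinates of the clusters, in pixels.
    real_separation_m: physical spacing of the clusters (about 20 cm).
    focal_px: assumed sensor focal length in pixels (illustrative value).
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    apparent_px = math.hypot(dx, dy)
    distance_m = focal_px * real_separation_m / apparent_px  # pinhole model
    roll_deg = math.degrees(math.atan2(dy, dx))  # tilt of the bar in the image
    return distance_m, roll_deg
```

The closer the controller is held to the bar, the larger the apparent separation, and hence the smaller the estimated distance.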

The arrangements described above have several disadvantages. The use of accelerometers alone does not allow an absolute orientation to be estimated; only changes in orientation can be detected. The use of infra-red sensors to derive estimates of absolute orientation can lead to problems if there are other sources of infra-red light in the vicinity. Also, controllers requiring accelerometers, optical sensors and a Sensor Bar are complicated and relatively expensive devices.

It is desirable to provide a cheaper and simpler technique for object orientation detection.

This invention provides data processing apparatus comprising a camera; a control device comprising a group of features arranged so that when the camera captures an image of the control device, only a subset of the group of features is visible to the camera, the particular subset in any captured image depending on the orientation of the control device with respect to the camera; and an image processor operable to detect the orientation of the control device with respect to the camera by analysing an image of the control device captured by the camera to detect which subset of the group of features is present in the captured image; in which the control device comprises one or more light-directing formations positioned between the group of features and the camera, the light directing formations allowing light from the subset of features appropriate to the current orientation of the control device to reach the camera, and inhibiting light from features not in the subset of features appropriate to the current orientation of the control device from reaching the camera.

The invention recognises that some modern video games or computer systems have associated video cameras. The invention makes use of such a video camera to observe special features of an object by means of light-directing formations in order to detect the orientation of the object.

One example type of special feature is a so-called Integral Imaging array (See http://en.wikipedia.org/wiki/Integral_imaging). Arrays of this type appear as one of a number of different images depending on the angle at which the viewer is looking at the array. So, the camera views an integral imaging array on the front of the object (or elsewhere on the object but within its view), and applies known image recognition techniques to detect which of the number of different images is currently shown. From this detection, analysis of the camera images can determine the angular orientation of the array (and hence the object to which the array is mounted) with respect to the camera.
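In software, the detection step reduces to recognising which of the array's pre-printed sub-images is visible and looking up the viewing angle at which the array presents that image. A hypothetical sketch, assuming the sub-images are spread evenly across the array's viewing cone (all parameter values are illustrative, not taken from the patent):

```python
def orientation_from_pattern(pattern_id, num_patterns=9, fov_deg=40.0):
    """Map a recognised sub-image index to the viewing angle (degrees) at
    which the integral imaging array shows that sub-image.

    Assumes num_patterns distinct images spread evenly across fov_deg of
    viewing angle, with pattern 0 presented at the most negative angle.
    """
    if not 0 <= pattern_id < num_patterns:
        raise ValueError("unrecognised pattern")
    step = fov_deg / (num_patterns - 1)
    return -fov_deg / 2.0 + pattern_id * step
```

More patterns across the same field of view give finer angular resolution, which is why multiple arrays (discussed below) can sharpen the estimate.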

Multiple integral imaging arrays can be used so as to increase the angular resolution of the system. Pairs of one-dimensional arrays could be used so as to give separate detections of angular position in two orthogonal directions. The arrays could be mounted on a convex or concave substrate so as to increase their sensitivity to angular movement.

A second example type of special feature involves arranging a set of (preferably illuminated) images behind a pinhole or window, so that the particular image(s) seen by the camera depend on the angle of the object with respect to the camera.
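The geometry of this pinhole variant is straightforward: if the images sit a distance d behind the aperture, the camera at viewing angle θ sees the image at offset x from the optical axis, where x = d·tan(θ). A sketch with illustrative dimensions:

```python
import math

def visible_image_index(view_angle_deg, pinhole_depth_mm=5.0,
                        image_pitch_mm=1.0, num_images=11):
    """Return which of a row of images behind a pinhole is visible from a
    given viewing angle.

    The visible image sits at offset depth * tan(angle) from the optical
    axis; dividing by the image spacing gives an index, with the centre
    image seen head-on. All dimensions are illustrative.
    """
    offset_mm = pinhole_depth_mm * math.tan(math.radians(view_angle_deg))
    index = round(offset_mm / image_pitch_mm) + num_images // 2
    return min(max(index, 0), num_images - 1)  # clamp to the printed strip
```

Deepening the pinhole or tightening the image pitch both increase the angular resolution, at the cost of a narrower usable viewing range.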

Embodiments of the invention are particularly suited for use in a game processing apparatus, in which a detection by the image processor of the position and orientation of the control device (e.g. relative to a video display screen associated with the game processing apparatus) is used by the game processing apparatus to control game playing operation of the game processing apparatus.

Further aspects and features of the invention are defined in the appended claims.

Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which: Figure 1 is a schematic diagram of an entertainment device; Figure 2 is a schematic diagram of a processor of the entertainment device; Figure 3 is a schematic diagram of a video processor of the entertainment device; Figure 4 is a schematic diagram illustrating the interaction of a control device with an entertainment device; Figure 5 schematically illustrates a control device; Figures 6 and 7 schematically illustrate possible image patterns in the control device of Figure 5; Figure 8 schematically illustrates an active image generator; Figure 9 schematically illustrates another embodiment of a control device; Figures 10a and 10b schematically illustrate a calibration process; Figure 11 is a schematic cutaway perspective view of an integral imaging array; Figure 12 is a schematic cross sectional view of a part of an integral imaging array; Figure 13 is a schematic representation of a part of an array of images; Figure 14 schematically illustrates a so-called lenticular array; Figures 15 and 16 schematically illustrate alternative patterns of lenticular arrays; Figure 17 schematically illustrates a possible set of image patterns corresponding to the array of Figure 15; Figure 18 schematically illustrates a possible set of image patterns corresponding to the array of Figure 16; and Figures 19 and 20 schematically illustrate further embodiments of control devices incorporating lenticular arrays.

Figure 1 schematically illustrates the overall system architecture of the Sony® PlayStation 3® entertainment device. A system unit 10 is provided, with various peripheral devices connectable to the system unit.

The system unit 10 comprises: a Cell processor 100; a Rambus® dynamic random access memory (XDRAM) unit 500; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250; and an I/O bridge 700.

The system unit 10 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 430 for reading from a disk 440 and a removable slot-in hard disk drive (HDD) 400, accessible through the I/O bridge 700. Optionally the system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 700.

The I/O bridge 700 also connects to four Universal Serial Bus (USB) 2.0 ports 710; a gigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi) port 730; and a Bluetooth® wireless link port 740 capable of supporting up to seven Bluetooth connections.

In operation the I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751. For example when a user is playing a game, the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100, which updates the current state of the game accordingly.

The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751, such as: a remote control 752; a keyboard 753; a mouse 754; a portable entertainment device 755 such as a Sony PlayStation Portable® entertainment device; a video camera such as an EyeToy® video camera 756; and a microphone headset 757. Such peripheral devices may therefore in principle be connected to the system unit wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.

The provision of these interfaces means that the PlayStation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.

In addition, a legacy memory card reader 410 may be connected to the system unit via a USB port 710, enabling the reading of memory cards 420 of the kind used by the PlayStation® or PlayStation 2® devices.

In the present embodiment, the game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link. However, the game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751. In addition to one or more analogue joysticks and conventional control buttons, the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the PlayStation Portable device may be used as a controller. In the case of the PlayStation Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).

The remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link. The remote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content.

The Blu Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the PlayStation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 430 is also operable to read DVD-ROMs compatible with the PlayStation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 430 is further operable to read BD-ROMs compatible with the PlayStation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.

The system unit 10 is operable to supply audio and video, either generated or decoded by the PlayStation 3 device via the Reality Synthesiser graphics unit 200, through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310. The audio connectors 210 may include conventional analogue and digital outputs whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.

Audio processing (generation, decoding and so on) is performed by the Cell processor 100. The PlayStation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.

In the present embodiment, the video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 10. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10, for example to signify adverse lighting conditions. Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.

In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10, an appropriate piece of software such as a device driver should be provided. Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment.

Referring now to Figure 2, the Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170A,B; a main processor referred to as the Power Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 180. The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPS of the PlayStation 2 device's Emotion Engine.

The Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPS at 3.2 GHz. The primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload. In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and synchronise with the PPE 150.

Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown). Each SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.

The Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
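The peak bandwidth figure above can be reproduced arithmetically; a small sketch of the calculation:

```python
PARTICIPANTS = 12                # PPE, memory controller, dual bus interface, 8 SPEs
BYTES_PER_SLOT_PER_CLOCK = 8     # simultaneous read/write rate per participant
CLOCK_HZ = 3.2e9

# 12 slots x 8 bytes = 96 bytes per clock under full utilisation
peak_bytes_per_clock = PARTICIPANTS * BYTES_PER_SLOT_PER_CLOCK

# 96 bytes/clock at 3.2 GHz = 307.2 GB/s theoretical peak
peak_gb_per_s = peak_bytes_per_clock * CLOCK_HZ / 1e9
```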

The memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.

The dual bus interface 170A,B comprises a Rambus FlexIO® system interface 172A,B.

The interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A and the Reality Simulator graphics unit 200 via controller 170B.

Data sent by the Cell processor 100 to the Reality Simulator graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.

Referring now to Figure 3, the Reality Simulator graphics (RSX) unit 200 is a video accelerator based upon the NVidia® G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100. The RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output.

The RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s. In operation, the VRAM 250 maintains a frame buffer 214 and a texture buffer 216.

The texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines. The RSX can also access the main memory 500 via the EIB 180, for example to load textures into the VRAM 250.

The vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.

The pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel. Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).

The render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image. Optionally, if the intervening pixel process will not affect depth values (for example in the absence of transparency or displacement mapping) then the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency. In addition, the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.

Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second. The total floating point performance of the RSX 200 is 1.8 TFLOPS.

Typically, the RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene. In this case, the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles. Meanwhile, the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180, the memory controller 160 and a bus interface controller 170B.

The or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B. Thus in effect the assigned SPEs become part of the video processing pipeline for the duration of the task.

In general, the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled. The disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process. Alternatively if all eight SPEs are functional, then the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor.

The PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE.

Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above.

Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.

The software supplied at manufacture comprises system firmware and the PlayStation 3 device's operating system (OS). In operation, the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video. The interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally. The user navigates by moving through the function icons (representing the functions) horizontally using the game controller 751, remote control 752 or other suitable control device so as to highlight a desired function icon, at which point options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in an analogous fashion. However, if a game, audio or movie disk 440 is inserted into the BD-ROM optical disk reader 430, the PlayStation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400).

In addition, the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability, providing on-line communication with other PlayStation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available. The on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the PlayStation 3 device itself. It will be appreciated that the term "on-line" does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.

Figure 4 is a schematic diagram illustrating the interaction of a control device 1000 with an entertainment device 1100. Note that the entertainment device 1100 is just one example of a generic game and/or data processing apparatus.

The entertainment device comprises a camera 1110 (for example the camera 756 of Figure 1), a system unit 1120 which may be embodied as the PS3 10 of Figure 1 operating under appropriate software control, and a video display device 1130 (for example, the display device 300 of Figure 1). The system unit provides various functionality relating to game playing, including the functions of an image detector (image processor) 1140, a command generator 1150 and a game engine 1160. Suitable software which, when executed by the PS3 10, causes the PS3 to provide these functions is envisaged as an embodiment of the invention and may be provided as a computer program product comprising a storage medium (such as an optical disc) carrying such software. The software may also be provided via a network connection such as an internet connection.

The camera 1110 is arranged so that, in use, it can view the control device 1000. For example the camera might be placed just above, below or to the side of the display device 1130.

The control device comprises a group of special features which can be recognised by the image detector (as described below) to detect the orientation of the control device with respect to the camera. Optionally it may comprise a user trigger 1010. The function of the user trigger will be described below. Note that a trigger is an optional feature; the embodiments will still provide the features needed for orientation detection without a trigger.

In operation, the camera 1110 views the control device. The image detector 1140 detects the orientation (and optionally the image position) of the control device with respect to the camera 1110 by analysing an image of the control device captured by the camera to detect which subset of the group of features is present in the captured image. The detection by the image processor of the orientation of the control device is used by the game processing apparatus to control game playing operation of the game processing apparatus; in particular, the detection by the image detector is passed to the command generator 1150 which generates a command to be supplied to the game engine 1160.

The user trigger associated with the control device can be activated by the user to cause the initiation of a game process by the game engine. For example, activation of the user trigger could initiate a "missile firing" operation within some types of game, with the target of the missile being dependent on the current orientation of the control device at the time that the trigger was activated. This type of arrangement particularly lends itself to a control device shaped generally like a gun or pistol, with a convenient finger-operated trigger, but the present techniques are not limited to any particular physical configuration of the control device.

In general terms, two main sets of embodiments of the control device will now be described. A significant feature of all of the embodiments is the use of one or more light-directing formations positioned between the group of features and the camera, the light-directing formations allowing light from the subset of features appropriate to the current orientation of the control device to reach the camera, and inhibiting light from features not in the subset of features appropriate to the current orientation of the control device from reaching the camera.

In one set of embodiments, the light-directing formations take the form of a perforated mask spaced apart from various features to be viewed by the camera. Here, the mask has one or more holes through which a subset of the features may be seen by the camera, with the particular subset being dependent upon the orientation of the control device relative to the camera. These embodiments will be described with reference to Figures 5 to 9. In another set of embodiments, the light-directing formations take the form of an array of lenses to direct light onto a subset of a group of features, making use of the way in which lenses map a direction of incident light onto a lateral position in the focal plane of the lens. These embodiments will be described with reference to Figures 11 to 18. Figures 10a, 10b, 20 and 21 relate to features applicable to either set of embodiments.

Figure 5 schematically illustrates a control device formed as a generally box-shaped housing 1200 having a front aperture 1210, a handle 1220, a switch (for use as the trigger activator) 1230 and a light 1240 which illuminates the inside of the housing when the switch 1230 is activated. The light can provide a single flash (a short duration burst of light, for example less than 0.1 second in duration) in response to the switch first being pressed by the user, or the light could provide continuous illumination for as long as the switch is depressed. In an alternative embodiment, the light could provide continuous illumination except during times when the switch is depressed, or except for short durations at times when the switch is initially activated.

A rear inside surface 1250 of the housing 1200 carries printed images. The selection of images and/or the pattern or arrangement of the images with respect to one another, is such that a view, by the camera, of a small section of the surface 1250 through the aperture 1210 provides an indication of the orientation of the control device with respect to the camera, because the particular subset of images or image parts that can be seen at a particular angle are unique with respect to other possible viewing angles of the surface 1250 through the aperture 1210.

This feature is schematically illustrated in Figure 6, which shows the rear inside surface 1250 and a sub-area 1260 of that surface which may be seen by the camera at an arbitrarily chosen orientation of the control device relative to the camera. In this example, the surface 1250 is arranged as an array of sub-images 1270. These sub-images could all be different to one another. Alternatively, they could be formed from a set of images arranged with respect to one another so that (for example) any particular square group of four or nine sub-images is different to any other possible square group of four or nine sub-images. This would provide the required unique relationship between the device-camera orientation and the view as seen by the camera through the aperture 1210 without the need for every sub-image to be different.
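The uniqueness condition described above, that any particular square group of sub-images differs from every other possible square group, can be checked mechanically when designing the pattern. A hypothetical sketch (the grid of tile identifiers and the window size are illustrative only, not taken from the specification):

```python
from itertools import product

def windows_unique(grid, k=2):
    """Return True if every k-by-k window of tile identifiers in the
    grid occurs at most once, i.e. any viewed k-by-k sub-area uniquely
    identifies its position (and hence the viewing angle)."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    for r, c in product(range(rows - k + 1), range(cols - k + 1)):
        key = tuple(grid[r + i][c + j] for i in range(k) for j in range(k))
        if key in seen:
            return False  # duplicate window: position would be ambiguous
        seen.add(key)
    return True
```

A pattern passing this test allows the image detector to recover the viewed sub-area, and hence the device-camera orientation, from any single window, without every sub-image needing to be different.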

The operation of the device depends to an extent on the nature of the light arrangement 1230, 1240.

In a system where the light is normally off, the camera 1110 can track the general position of the control device (by known object recognition and tracking algorithms) but the image detector cannot detect its orientation until the light is illuminated. The illumination of the light by the user pressing the trigger therefore serves two purposes: it allows the orientation of the control device to be detected, and the fact that such a detection is now possible can also serve to trigger a data processing operation (such as a "missile fire" command) by the entertainment device.

Similarly, in a system where the light is normally on, the image detector could continuously detect the orientation of the control device. When the light is turned off this can act as a trigger for a data processing command, with that command being referenced to the last-detected orientation of the control device before the light went off.

In a third possibility, the control device may not even have a light. For example, the walls of the housing 1200 could be formed of a translucent material which allows ambient light to illuminate the surface 1250 but which does not allow features on the surface 1250 to be resolved. So the only way that the image detector can resolve features on the surface 1250 is through the aperture 1210. In embodiments without a light, an optional trigger function could be provided by a movable mechanical shutter or blanking plate, which can be arranged under the control of a user-operable lever to obscure or reveal the features used by the image detector to assess orientation. As described above, the instant of obscuring or revealing the features could be taken as the trigger instant. If the trigger has a function of obscuring the features, the last-detected orientation would be used in connection with the triggered function.

Note that the features on the surface 1250 need not be printed features, though this is a conveniently straightforward solution. Three-dimensional features could be employed, for example.

If more than one control device is used in respect of a single camera (for example, two game players could each have a control device to share in operation of a common game), then different patterns could be used on the respective surfaces 1250 of the control devices.

Alternatively other identifying markings or features which are always visible (or at least, which are visible some of the time and are not used for the current orientation detection operation) could be provided, for example on the exterior surface of the control device.

Figure 7 schematically illustrates another possible arrangement of the surface 1250, in which a colour gradient is provided. In this example the surface has a linearly increasing green (G) component from left to right (as viewed), a linearly increasing red (R) component from bottom to top (as held by the handle 1220) and a constant blue (B) component.

The constant blue component is provided for calibration of the detection arrangement. In other words, detection (by the image detector) of the blue component of the portion of the surface 1250 seen through the aperture 1210 provides an indication of the overall illumination level within the housing 1200. Once this illumination level has been detected, the image detector can detect the horizontal and vertical position of the currently-observed portion by detecting the mean green and red components of that portion.
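The two-step read-out described above might be sketched as follows, assuming (as an illustration only) that the surface encodes horizontal position in the green channel and vertical position in the red channel on a linear 0-255 scale, with an assumed constant printed blue reference level:

```python
B_REF = 128.0  # constant blue level printed on the surface (assumed value)

def decode_position(mean_r, mean_g, mean_b):
    """Estimate the viewed patch position (x, y), each in [0, 1].

    Assumes the surface is printed with G = 255*x, R = 255*y and B = B_REF;
    the constant blue channel calibrates the unknown illumination level."""
    illum = mean_b / B_REF            # overall illumination gain in the housing
    x = mean_g / (illum * 255.0)      # green rises left to right
    y = mean_r / (illum * 255.0)      # red rises bottom to top
    return x, y
```

For example, at half illumination a patch at (0.25, 0.75) would be measured as roughly (R, G, B) = (95.6, 31.9, 64.0), and the decode recovers the same position as under full illumination.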

Note that the continuous variations described with respect to Figure 7 still class as a group of features, even though there is not necessarily a distinction between one feature and the next. Only a subset of such a group is visible by the camera at any one orientation of the control device.

The arrangements of Figures 6 and 7 have the advantage of being passive (apart from the optional light 1240) and cheap to produce. Figure 8 schematically illustrates a further possibility in which the surface 1250 is provided as a liquid crystal display (LCD) screen under the control of an image generator 1280, which may be hidden in the handle 1220 of the control device. The LCD screen displays the image patterns required to allow orientation detection and/or differentiation of multiple control devices. In this instance, a trigger function for detection by the image detector could be initiated by an abrupt change in the displayed pattern, for example in response to the user pressing the switch 1230. The LCD arrangement of Figure 8 can therefore be used with or without the light 1240.

Figure 9 schematically illustrates another configuration of a control device, comprising a housing 1300, which in this case is generally cylindrical (though a square or other section tube or similar shape could be used), coupled to a handle 1320 carrying a user trigger switch 1330.

The switch operates a light 1340 as described above, except that in the present example the light is positioned at the back of the housing 1300.

An aperture 1310 on the opaque front surface 1305 of the housing carries a lens 1315 such as a convex lens. Patterns, such as those described with reference to Figures 6 to 8, are distributed around the interior surface of the cylindrical wall of the housing 1300.

In operation, the patterns are illuminated by the light 1340. The lens 1315 provides a mapping between light from a region 1350 of the interior of the cylindrical housing and an exit direction 1360. In this way, when the control device is viewed by the camera 1110, the camera is able to observe the patterns or markings on a region of the interior of the cylindrical housing, with the particular region being dependent on the orientation of the housing (and hence the control device) with respect to the camera.

The switch 1330 and light 1340 can act in the same various ways as those described with reference to Figure 5. Similarly, as an alternative to the light, the cylindrical housing can be formed of translucent material so that ambient light can be used to illuminate the patterns.

So far, the description has been in respect of detecting the orientation of the control device with respect to the camera. However, in many applications it could be more useful to be able to detect the orientation of the control device with respect to the display device 1130, in other words the angle 1400 as illustrated schematically in Figure 10a and/or the position 1405 on the display device at which the control device is pointing. [For clarity, Figure 10a represents the situation in only two dimensions; in reality, the arrangement would of course be three-dimensional and the angle 1400 would be considered with reference to the three-dimensional relationship between the camera, display device and control device.] In order to derive a relationship between the image position and orientation of the control device relative to the camera and the orientation of the control device relative to target positions on the display device, a calibration process is carried out. An aspect of the calibration process is illustrated schematically in Figure 10b.

In basic terms, the calibration process involves the entertainment device 1100 instructing the display device to display one or more targets. An example of a displayed target is shown schematically in Figure 10b. Preferably multiple targets are used, one after another, which collectively cover substantially the whole viewable area of the display device. For example, successive targets in the four corners and the centre of the display device could be used. The user is asked (for example by an on-screen message) to point the control device at a currently displayed target and make an indication to the entertainment device that he is doing so. Such an indication could be by pressing the trigger, on a control device equipped with a trigger, or by pressing a button on a remote control or game controller, or even by making a loud noise if the system is equipped with a microphone. When such an indication is made, the entertainment device stores data indicative of the image position of the control device 1000 within the image captured by the camera 1110, and the angle of orientation of the control device 1000 relative to the camera.

The calibration process is carried out by the image detector 1140 and allows the entertainment device to derive a mapping between the image position and the orientation of the control device relative to the camera and the orientation of the control device relative to the display device. Assuming a limited number of targets are displayed (for example the five targets mentioned above), this mapping will be expressed as the camera-control device orientation at a number of display device sample points. The entertainment device can use known techniques to triangulate newly required mapping values and/or to interpolate between or (as the case may be) extrapolate beyond the sample points to derive the relevant mapping relationship at other display device screen positions. Any lateral movements of the control device (i.e. movements up, down or side-to-side with respect to the camera and/or display device) are detected by a change in the image position of the control device in the image captured by the camera. Angular changes of the control device are detected by the orientation detection techniques described in the present application. The calculations assume that the control device remains at approximately the same distance from the display device while in use, and that most changes are angular rather than lateral. This is a reasonable assumption in the context of a video game playing arrangement.
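Per axis, the interpolation (or extrapolation) between calibration samples amounts to a simple linear mapping; a minimal sketch, with hypothetical angle and pixel values (a full implementation would interpolate in two dimensions across all displayed targets):

```python
def screen_coordinate(measured, cal_a, cal_b, screen_a, screen_b):
    """Map a measured camera-relative angle to a display coordinate by
    linear interpolation between two calibration samples; angles outside
    the sampled range are extrapolated along the same line."""
    t = (measured - cal_a) / (cal_b - cal_a)
    return screen_a + t * (screen_b - screen_a)
```

For example, if -10 degrees was recorded while pointing at the left edge (pixel 0) and +10 degrees at the right edge (pixel 1920), a measured angle of 0 degrees maps to pixel 960.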

However, changes in the separation of the camera and control device can in fact be detected by the entertainment device storing data indicative of the image size of the control device, in the image captured by the camera, at the time of calibration. This gives an indication of the separation of the camera and control device (which is used to calibrate for the effect of lateral movements of the control device by the user). In addition, any subsequent changes in the detected image size of the control device (i.e. in use) are used to derive a change in camera-control device separation, which is then used to adjust the mapping information discussed above.
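Under a pinhole-camera assumption, apparent image size varies inversely with distance, so the separation can be re-estimated from the calibrated size; a sketch (function name and values are hypothetical):

```python
def estimate_separation(cal_distance, cal_image_size, current_image_size):
    """Re-estimate camera-device separation from a change in apparent
    image size: under a pinhole camera model, apparent size scales as
    1/distance, so distance scales with the inverse size ratio."""
    return cal_distance * (cal_image_size / current_image_size)
```

For example, a device calibrated at 2 m spanning 100 pixels that later spans only 50 pixels has moved to roughly 4 m from the camera.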

The calibration process can be carried out when the control device is first used, or more preferably it can be carried out whenever the entertainment device is initiated (switched on, rebooted etc). As an optional refinement of this process in respect of games or other applications where the user is required to direct the control device at particular display device screen positions, the game engine can provide an error detection function. Here, the game engine detects the positional error between the required target and the display device screen position at which the control device is pointing. If the error is generally consistent (e.g. consistently approximately 50-100 pixels to the left of the required position) then the error detector can either initiate a recalibration process with the user, or simply correct the calibration data stored by the entertainment system on the assumption that, for example, the camera 1110 has been accidentally moved from the position it occupied when the initial calibration was carried out, or the user has moved the control device significantly closer to or further away from the display device. Similarly, if the system responds to changes in detected orientation substantially correctly, i.e. the user is generally successful in hitting targets on the display screen, but the system responds incorrectly to changes in lateral movement (i.e. the user misses display screen targets), the calibration with respect to lateral movements can be corrected.

Figure 11 is a schematic cutaway perspective view of a so-called integral imaging array.

An integral image generally consists of a very large number of closely-packed, distinct micro-images on a substrate 1410, which are viewed by an observer through an array of spherical convex lenses 1420, one lens for every micro-image. The term "integral" comes from the integration of all the micro-images into a complete three dimensional image through the lens array. This special type of lens array is also sometimes known as a fly's-eye or integral lens array.

In this type of array, the thickness of the lens array sheet is chosen so that parallel incoming light rays 1430 generally focus on the opposing side of the array (i.e. the substrate 1410), which is typically flat; see Figure 12 which is a schematic cross section of a part of the array of Figure 11. It is at this focal plane that the micro-images 1440 are placed, one for every lens, side by side. Since each lens focuses to a point onto a micro-image below, an observer can never view two spots within a micro-image simultaneously. By using identical micro-images spaced at the same separation as the lenses 1420, the array can provide a mapping between the angle of incident (or reflected) light and the position within the micro-image that is viewed at that angle. It is this mapping which allows the image detector in the present arrangement of Figure 4 to detect the angle or orientation between the camera and the control device.
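The angle-to-position mapping performed by each lens can be approximated with the thin-lens relation x = f·tan(θ); a sketch of the forward mapping and its inverse (the focal length value used in the example is hypothetical):

```python
import math

def focal_plane_offset(theta_deg, focal_length_mm):
    """Map an incident-light angle to a lateral offset in the focal
    plane (thin-lens approximation: x = f * tan(theta))."""
    return focal_length_mm * math.tan(math.radians(theta_deg))

def incident_angle(offset_mm, focal_length_mm):
    """Inverse mapping: recover the viewing angle from the lateral
    position within the micro-image that is seen."""
    return math.degrees(math.atan2(offset_mm, focal_length_mm))
```

It is this invertibility that lets the image detector infer the camera-device angle from which part of each micro-image is visible.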

Figure 13 schematically illustrates a part of an array of images (or micro-images) 1440, such that one image 1440 lies behind each lens 1420. Each image 1440 carries a pattern of markings which are in many ways similar to the markings carried on the surface 1250 in Figures 6 or 7. In other words the markings are chosen and/or arranged relative to one another so that when a region of the image 1440 is viewed by the camera, image analysis of the region can give a unique positional reference within the image, which in turn maps to a unique angle between the camera and the plane of the array.

Figure 14 schematically illustrates a so-called lenticular array, in which a set of elongate lenses 1500 are arranged in a side-by-side array, that is to say with the long axes of the lenses parallel or substantially parallel to one another. Note that the lenses do not exhibit curvature along their long axis; the curvature (providing the lens effect) is perpendicular to the long axis.

An array of this type provides angular detection along one direction (the direction perpendicular to the long axes of the lenses) while not providing angular discrimination along the long axes of the lenses.

In fact Figure 14 shows two such arrays 1510, 1520, with the long axes of the lenses in one array being at a non-zero angle (in this example, perpendicular) to the long axes of the lenses in the other array. In this way, the array 1510 provides angular discrimination in a horizontal direction (as drawn) and the array 1520 provides angular discrimination in a vertical direction (as drawn). The angular detection by each array is substantially independent of the angle in the other dimension.

Figures 15 and 16 schematically illustrate further possible patterns of lenticular arrays; an array of the same type as the array 1520 is shown as a series of adjacent horizontal lines (such as the array 1530), whereas an array of the same type as the array 1510 is shown as a series of vertical lines (such as the array 1540).

Behind each lens is an elongate image. The images are arranged behind the lenses and are spaced apart at the same spacing as the lenses. Figures 17 and 18 schematically illustrate possible configurations of images for use with the arrays of Figure 15 and 16 respectively.

Referring to Figure 17, five schematic images are shown in order from the top to bottom of the page, a respective one for each of the five arrays of Figure 15. Looking at the three horizontally aligned arrays of Figure 15, the top array and bottom array have images arranged to be lighter at the top and darker at the bottom. The middle array has images the other way up. So as the vertical component of the camera-array angle varies, the arrays lighten or darken according to the arrangement of the sub-images. This provides a more reliable detection of angle than a single array, as the ratio of brightness between the differently-arranged images can be detected as well as the absolute brightnesses. A similar arrangement is provided for the two vertically oriented arrays.
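The benefit of the oppositely graded pairs is that the ratio of their brightnesses cancels the unknown overall illumination level; a sketch of the recovery, assuming (for illustration) that brightness varies linearly from 0 to 1 across each graded image:

```python
def view_fraction(bright_rising, bright_falling):
    """Recover the fractional view position (0 to 1) along the graded
    axis from a pair of oppositely graded arrays; the unknown common
    illumination gain multiplies both readings and cancels in the ratio."""
    return bright_rising / (bright_rising + bright_falling)
```

With an illumination gain of 0.7 and a true fraction of 0.3, the measured pair (0.21, 0.49) still recovers 0.3, which is why the ratio is more reliable than either absolute brightness alone.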

Figure 18 illustrates a similar arrangement with the orientations of the four sets of images varying so as to give simultaneous positive and negative brightness variations for a change of angle in either axis.

Figures 19 and 20 schematically illustrate further embodiments of control devices incorporating lenticular arrays of the type described above.

Figure 19 shows a control device having a handle 1600, a lenticular array 1610 on a convex substrate, a retroreflective strip 1620 and a trigger 1630. Viewed from the front, the array 1610 may be circular in section, with the retroreflective strip 1620 being an annulus around the array 1610.

By placing the array 1610 on a convex substrate, the angular range over which the array may be viewed is extended and the sensitivity of the system to angular changes is increased. The retroreflective strip allows the camera and image detector to identify the control device more easily.
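The role of the retroreflective strip can be illustrated in code: with an illuminant near the camera, the marker returns far more light than the rest of the scene, so a simple threshold plus centroid locates the device. The grid representation and threshold value are illustrative; a real system would operate on camera frames.

```python
# Sketch of locating a retroreflective marker such as the strip 1620:
# threshold the greyscale frame and return the centroid of bright pixels.

def locate_marker(gray_image, threshold=0.9):
    """Return the centroid (row, col) of pixels at or above the threshold,
    or None if no sufficiently bright region is found."""
    rows, cols, count = 0.0, 0.0, 0
    for r, row in enumerate(gray_image):
        for c, value in enumerate(row):
            if value >= threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return rows / count, cols / count
```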

The trigger could operate a light such as an LED in the centre of the array 1610, so as to indicate a trigger event to the game engine. Alternatively (or in addition), the array 1610 could be formed on a transparent or translucent substrate, with the trigger being coupled to a backlight arrangement.
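On the processing side, such trigger-to-LED signalling could be picked up by watching the brightness at the centre of the array between frames. The threshold model and names below are invented for illustration.

```python
# Hedged sketch of detecting a trigger event from an LED in the centre of
# the array 1610: a brightness transition across a threshold between
# consecutive frames indicates the trigger state changed.

def detect_trigger(prev_brightness, curr_brightness, threshold=0.5):
    """Report 'pressed' or 'released' when the centre brightness crosses
    the threshold between frames, or None if the state is unchanged."""
    was_on = prev_brightness > threshold
    is_on = curr_brightness > threshold
    if is_on and not was_on:
        return "pressed"
    if was_on and not is_on:
        return "released"
    return None
```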

Figure 20 schematically illustrates an arrangement having a flat substrate on which the array 1710 is mounted, the control device having a handle 1700 and a trigger 1630 similar to those shown in Figure 19.

The retroreflective strip 1620 may be used with any of the embodiments described above, to assist in identification of the control device by the image detector. Similarly, the backlight arrangement and/or the indicator LED or similar light of Figure 19 may be applied to the other embodiments.

Summary

In summary, embodiments of the invention provide apparatus for detecting the angular orientation of an object relative to a camera, in which the object comprises special features which, when viewed by the camera, appear differently in dependence on the orientation of the object relative to the camera, and in which the apparatus comprises image recognition means for deriving, from an image of the object picked up by the camera, the orientation of the object. In some embodiments the special features comprise one or more integral image arrays having different images visible at different angles of view. In other embodiments the special features comprise a plurality of different images and a viewing window, arranged so that only a subset of the images may be seen by the camera through the viewing window at any one time, in which the particular subset depends on the angular orientation of the object relative to the camera.
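As a rough illustration of the summarised principle, detection can reduce to a lookup from the visible subset of features to a calibrated orientation, since each orientation exposes a unique subset through the light-directing formations. The feature labels and angles below are invented for illustration.

```python
# Sketch of the subset-to-orientation mapping: the detected subset of
# features indexes into a calibration table. Table contents are
# illustrative, not from the patent.

SUBSET_TO_ORIENTATION = {
    frozenset({"A"}):      -20.0,  # degrees, leftmost viewing angle
    frozenset({"A", "B"}): -10.0,
    frozenset({"B"}):        0.0,
    frozenset({"B", "C"}):  10.0,
    frozenset({"C"}):       20.0,
}

def orientation_from_subset(visible_features):
    """Map the set of features seen in the captured image to the
    calibrated orientation that exposes exactly that subset."""
    key = frozenset(visible_features)
    if key not in SUBSET_TO_ORIENTATION:
        raise ValueError("subset does not match any calibrated orientation")
    return SUBSET_TO_ORIENTATION[key]
```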

Embodiments of the invention provide a video game having a light gun or similar device which makes use of apparatus as defined above.

Claims (27)

  1. Data processing apparatus comprising: a camera; a control device comprising a group of features arranged so that when the camera captures an image of the control device, only a subset of the group of features is visible to the camera, the particular subset in any captured image depending on the orientation of the control device with respect to the camera; and an image processor operable to detect the orientation of the control device with respect to the camera by analysing an image of the control device captured by the camera to detect which subset of the group of features is present in the captured image; in which the control device comprises one or more light-directing formations positioned between the group of features and the camera, the light-directing formations allowing light from the subset of features appropriate to the current orientation of the control device to reach the camera, and inhibiting light from features not in the subset of features appropriate to the current orientation of the control device from reaching the camera.
  2. Apparatus according to claim 1, in which the one or more light-directing formations comprise one or more apertures in a masking arrangement spaced apart from the group of features.
  3. Apparatus according to claim 2, in which the group of features comprises a configuration of two or more objects arranged with respect to the one or more apertures so that, when in use with a camera spaced apart from the control device, for any orientation of the control device with respect to the camera not all of the objects are visible to the camera through the one or more apertures.
  4. Apparatus according to claim 3, in which the objects and/or the configuration are such that, for any orientation of the control device with respect to the camera at which an image of one or more objects may be captured, a respective unique pattern of objects is visible to the camera through the one or more apertures.
  5. Apparatus according to any one of claims 2 to 4, in which: the control device comprises a light arranged under user control to illuminate the group of features; the image processor is operable to detect illumination of the light by analysing an image of the control device captured by the camera, and to initiate a data processing operation of the apparatus in response to a detection of a change in illumination provided by the light, the data processing operation being dependent upon the detected orientation of the control device with respect to the camera.
  6. Apparatus according to claim 5, in which the data processing apparatus is a game processing apparatus, and the data processing operation is a game playing operation.
  7. Apparatus according to claim 1, in which the one or more light-directing formations comprise one or more lenses positioned with respect to the group of features so as to direct light from different ones of the features towards the camera in dependence on the orientation of the control device with respect to the camera.
  8. Apparatus according to claim 7, in which the light-directing formations comprise an array of substantially identical lenses.
  9. Apparatus according to claim 8, in which the lenses are spherical convex lenses.
  10. Apparatus according to claim 8, in which the lenses are elongate lenses arranged side-by-side, substantially parallel to one another.
  11. Apparatus according to claim 10, comprising two or more arrays of elongate lenses, the arrays being arranged with respect to one another so that the long axis of the lenses in one array is not parallel to the long axis of the lenses in an adjacent array.
  12. Apparatus according to any one of claims 8 to 11, in which the group of features comprises a repeating pattern of printed features such that each repetition of the pattern represents the group of features and each repetition of the pattern is disposed so as to be viewed, by the camera, through a respective one of the array of lenses.
  13. Apparatus according to claim 12, in which the array of lenses abuts the pattern of printed features.
  14. Apparatus according to any one of claims 7 to 13, in which the lenses are provided on a convex substrate.
  15. Apparatus according to any one of claims 1 to 5 and 7 to 14, in which: the apparatus is a game processing apparatus; the image processor is arranged to detect the image position of the control device in an image captured by the camera; and the detection by the image processor of the image position and orientation of the control device is used by the game processing apparatus to control game playing operation of the game processing apparatus.
  16. Apparatus according to claim 15, comprising a video display device, in which the game processing apparatus is responsive to the detected image position and orientation of the control device to detect a position on the video display device pointed to by the control device.
  17. Apparatus according to claim 16, in which the image processor is arranged to detect the orientation of the control device with respect to the video display device by carrying out an initial calibration operation in which the orientation of the control device relative to the camera is detected while the control device is being pointed at a known position relative to the video display device.
  18. Apparatus according to claim 17, in which, during the calibration operation, the apparatus is operable to display a succession of targets on the video display device and to instruct the user to direct the control device to a currently displayed target.
  19. Apparatus according to claim 17 or claim 18, in which: in use, a user is required to direct the control device at target areas on the video display device; the apparatus comprises an error detector operable to detect an orientation error when the user directs the control device at a target area on the video display device, the error detector being operable to initiate a calibration operation if a substantially consistent error is detected in respect of a threshold number of displayed targets.
  20. Apparatus according to any one of the preceding claims, in which the control device comprises a retroreflective marker arranged so as to be viewed, during use of the control device, by the camera.
  21. A control device for use with a data processing apparatus having a camera and an image processor, the control device comprising a group of features arranged so that when the camera captures an image of the control device, only a subset of the group of features is visible to the camera, the particular subset in any captured image depending on the orientation of the control device with respect to the camera, the control device having one or more light-directing formations positioned between the group of features and the camera, the light-directing formations allowing light from the subset of features appropriate to the current orientation of the control device to reach the camera, and inhibiting light from features not in the subset of features appropriate to the current orientation of the control device from reaching the camera; the arrangement being such that the image processor can detect the orientation of the control device with respect to the camera by analysing an image of the control device captured by the camera to detect which subset of the group of features is present in the captured image.
  22. A method of operation of a data processing apparatus comprising a camera, an image processor and a control device; in which the control device comprises: a group of features arranged so that when the camera captures an image of the control device, only a subset of the group of features is visible to the camera, the particular subset in any captured image depending on the orientation of the control device with respect to the camera; and one or more light-directing formations positioned between the group of features and the camera, the light-directing formations allowing light from the subset of features appropriate to the current orientation of the control device to reach the camera, and inhibiting light from features not in the subset of features appropriate to the current orientation of the control device from reaching the camera; the method comprising the step of: the image processor detecting the orientation of the control device with respect to the camera by analysing an image of the control device captured by the camera to detect which subset of the group of features is present in the captured image.
  23. Computer software for carrying out a method according to claim 22.
  24. A computer program product comprising a storage medium on which software according to claim 23 is stored.
  25. Data processing apparatus substantially as hereinbefore described with reference to the accompanying drawings.
  26. A control device substantially as hereinbefore described with reference to the accompanying drawings.
  27. A data processing method substantially as hereinbefore described with reference to the accompanying drawings.
GB0902939A 2009-02-20 2009-02-20 Orientation detection Withdrawn GB0902939D0 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0902939A GB0902939D0 (en) 2009-02-20 2009-02-20 Orientation detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0902939A GB0902939D0 (en) 2009-02-20 2009-02-20 Orientation detection

Publications (2)

Publication Number Publication Date
GB0902939D0 GB0902939D0 (en) 2009-04-08
GB2467951A true GB2467951A (en) 2010-08-25

Family

ID=40565489

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0902939A Withdrawn GB0902939D0 (en) 2009-02-20 2009-02-20 Orientation detection

Country Status (1)

Country Link
GB (1) GB0902939D0 (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1055479A (en) * 1962-12-31 1967-01-18 Ibm Optical apparatus for indicating the angular orientation of a body
US3633212A (en) * 1970-10-15 1972-01-04 Guy F Cooper System for determining the orientation of an object by employing plane-polarized light
US5936722A (en) * 1996-08-15 1999-08-10 Armstrong; Brian S. R. Apparatus and method for determining the angular orientation of an object
US5936723A (en) * 1996-08-15 1999-08-10 Go Golf Orientation dependent reflector
EP1074934A2 (en) * 1999-08-02 2001-02-07 Lucent Technologies Inc. Computer input device having six degrees of freedom for controlling movement of a three-dimensional object
WO2001035054A1 (en) * 1999-11-12 2001-05-17 Armstrong Brian S Methods and apparatus for measuring orientation and distance
US6384908B1 (en) * 1996-08-15 2002-05-07 Go Sensors, Llc Orientation dependent radiation source
GB2379493A (en) * 2001-09-06 2003-03-12 4D Technology Systems Ltd Controlling an electronic device by detecting a handheld member with a camera
US20050211885A1 (en) * 2004-03-25 2005-09-29 Tobiason Joseph D Optical path array and angular filter for translation and orientation sensing
US7102616B1 (en) * 1999-03-05 2006-09-05 Microsoft Corporation Remote control device with pointing capacity
WO2007050885A2 (en) * 2005-10-26 2007-05-03 Sony Computer Entertainment America Inc. System and method for interfacing with a computer program
WO2007130999A2 (en) * 2006-01-10 2007-11-15 Sony Computer Entertainment Inc. Detectable and trackable hand-held controller
US20080080789A1 (en) * 2006-09-28 2008-04-03 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US20080122786A1 (en) * 1997-08-22 2008-05-29 Pryor Timothy R Advanced video gaming methods for education and play using camera based inputs

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2533134A (en) * 2014-12-11 2016-06-15 Sony Computer Entertainment Inc Exercise mat, entertainment device and method of interaction between them
US9814925B2 (en) 2014-12-11 2017-11-14 Sony Interactive Entertainment Inc. Exercise mat, entertainment device and method of interaction between them

Also Published As

Publication number Publication date Type
GB0902939D0 (en) 2009-04-08 grant

Similar Documents

Publication Publication Date Title
US20120026166A1 (en) Spatially-correlated multi-display human-machine interface
US20100201812A1 (en) Active display feedback in interactive input systems
US20060139314A1 (en) Interactive video display system
US20110285704A1 (en) Spatially-correlated multi-display human-machine interface
US20080024523A1 (en) Generating images combining real and virtual images
US20070257884A1 (en) Game program and game system
US20130194259A1 (en) Virtual environment generating system
US20130141434A1 (en) Virtual light in augmented reality
US20140176591A1 (en) Low-latency fusing of color image data
US20100060632A1 (en) Method and devices for the real time embeding of virtual objects in an image stream using data from a real scene represented by said images
US20100105475A1 (en) Determining location and movement of ball-attached controller
US20120086630A1 (en) Using a portable gaming device to record or modify a game or application in real-time running on a home gaming system
US20060192782A1 (en) Motion-based tracking
US20130141419A1 (en) Augmented reality with realistic occlusion
US20110304714A1 (en) Storage medium storing display control program for providing stereoscopic display desired by user with simpler operation, display control device, display control method, and display control system
US20110157017A1 (en) Portable data processing appartatus
US8287373B2 (en) Control device for communicating visual information
US20100302378A1 (en) Tracking system calibration using object position and orientation
US20120229508A1 (en) Theme-based augmentation of photorepresentative view
US8432476B2 (en) Media viewing
US20110304611A1 (en) Storage medium having stored thereon image processing program, image processing apparatus, image processing system, and image processing method
US20100033479A1 (en) Apparatus, method, and computer program product for displaying stereoscopic images
US20090091553A1 (en) Detecting touch on a surface via a scanning laser
US20140049559A1 (en) Mixed reality holographic object development
US20160091720A1 (en) Realtime lens aberration correction from eye tracking

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)