GB2441975A - Video game - Google Patents

Video game

Info

Publication number
GB2441975A
GB2441975A (Application GB0618406A; also published as GB0618406D0)
Authority
GB
United Kingdom
Prior art keywords
game
sequence
primary game
further representation
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0618406A
Other versions
GB0618406D0 (en)
Inventor
Matthew Christian Townsen Hart
Tameem Nadi Antoniades
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Priority to GB0618406A priority Critical patent/GB2441975A/en
Publication of GB0618406D0 publication Critical patent/GB0618406D0/en
Priority to PCT/GB2007/002744 priority patent/WO2008035027A1/en
Publication of GB2441975A publication Critical patent/GB2441975A/en
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/10
    • A63F13/45 Controlling the progress of the video game
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device

Abstract

Video game apparatus, in which a primary game object 840 is displayed within a game environment of a video game, comprises a user controller; and a game program defining a sequence of one or more target tasks 810,820,830 required to be performed with respect to the primary game object by operating the user controller; in which the apparatus displays a further representation 850 of the primary game object displaced, in the game environment, from the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence. Optionally the game program is arranged to select an immediately following scene in dependence upon a degree to which the target tasks in the associated sequence are successfully completed.

Description

VIDEO GAME
This invention relates to video games.
Most current video games are played by means of a user operating a user control such as a so-called "joypad", which is a handheld unit providing several user-operable switches, buttons, joysticks and the like. For example, the joypad normally used with the Sony PlayStation 2 video game machine provides two joysticks, four directional buttons (up, down, left and right), four multi-function buttons (triangle, circle, cross and square) and four shoulder buttons (L1, L2, R1, R2).
As part of the game design process, the available user controls are mapped to required game functions. This might be relatively straightforward in the case of defining motion of a game object (such as a game character, a vehicle or the like), where an obvious mapping is that a joystick and/or the directional control buttons are used to control required movements. As regards the other buttons, while some conventions have developed within the game industry, the allocation of functions to controls is still arbitrary and must be learned by the player of the game. For example, a function such as punching or kicking an opponent has no intrinsically obvious control button to carry out this function.
It is known, especially in role-playing, action-adventure video games, for a game object to pass through various game scenes, such that a sequence of game tasks must be completed in a current scene for the game object either to enter the next scene at all, or to have a chance of completing the next scene successfully. In many cases these scenes have no predetermined outcome, and the user can cause the game object to move extensively through the game environment and carry out tasks in a flexible order. This requires that the game environment be generated in real time. However, in other instances, for example in so-called "interactive cut scenes", the game designer may wish to provide a more detailed or complicated video sequence through which the game object must pass. In such cases, rather than derive the game environment in real time, it can be set up in advance, almost as a video clip. The game object must perform various tasks to pass through such an interactive cut scene; if a task is not carried out successfully and at the required time within the scene, the game object passes to a "failure" scene. Such scenes are often fast-moving and require a complicated sequence of actions by the user in order to pass successfully through the scene.
It is always important to establish a balance between a game being challenging and yet being achievable (eventually) by the user. For this reason, it has been proposed that the user is given some assistance during a fast-moving interactive cut scene.
An example of this is to display, at the time that a user needs to take a particular action, a visual indication of which button needs to be pressed. For example, in the game Tomb Raider, at the time that a user needs to press (say) the X button, a large X is displayed on the video game's display screen. However, there are at least two disadvantages with such an arrangement. One drawback is that because an icon representing a particular button is alien to the rest of the game environment, it appears to be in the foreground and can be distracting from the action going on behind it. This is especially a problem in the case of a carefully prepared interactive cut scene using particularly exciting video scenes; it is not desirable for the user's attention to be taken away from the underlying action. A second disadvantage is that the process of displaying a button identification, and the user pressing that button, could tend to disconnect (in the user's mind) the hard-learned association between that button and a particular game action.
This invention provides video game apparatus in which a primary game object is displayed within a game environment of a video game, the apparatus having: a user controller; and a game program defining a sequence of one or more target tasks required to be performed with respect to the primary game object by operating the user controller; in which the apparatus displays a further representation of the primary game object displaced, in the game environment, from the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence.
The invention provides an elegantly simple yet innovative approach to providing user instruction or guidance as to the next required game task by a game object (e.g. a game character, vehicle or the like). A further representation of the game object is displayed, which is displaced from the primary game object (i.e. the one which the user is controlling).
A display property such as the position, colour or both of the further representation is used to indicate what the user should do next.
In a specific example, the further representation might be displayed ahead of the primary game object (in game time) so that the further representation indicates a movement path that the primary game object must follow, for example to scale a cliff face or to cross a derelict bridge.
Further respective aspects and features of the invention are defined in the appended claims.
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
Figure 1 schematically illustrates the overall system architecture of the PlayStation2;
Figure 2 schematically illustrates the architecture of an Emotion Engine;
Figure 3 schematically illustrates the configuration of a Graphics Synthesiser;
Figure 4 schematically illustrates a game involving successive scenes;
Figure 5 schematically illustrates an interactive cut scene;
Figure 6 schematically illustrates a branch scene;
Figure 7 schematically illustrates a scene definition table;
Figures 8a to 8c schematically illustrate a ghost game character;
Figures 9a to 9d schematically illustrate a ghost game object; and
Figure 10 schematically illustrates another possible scene definition table.
Figure 1 schematically illustrates the overall system architecture of the PlayStation2.
A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises: an Emotion Engine 100; a Graphics Synthesiser 200; a sound processor unit 300 having dynamic random access memory (DRAM); a read only memory (ROM) 400; a compact disc (CD) and digital versatile disc (DVD) reader 450; a Rambus Dynamic Random Access Memory (RDRAM) unit 500; and an input/output processor (IOP) 700 with dedicated RAM 750. An (optional) external hard disk drive (HDD) 390 may be connected.
The input/output processor 700 has two Universal Serial Bus (USB) ports 715 and an iLink or IEEE 1394 port (iLink is the Sony Corporation implementation of the IEEE 1394 standard). The IOP 700 handles all USB, iLink and game controller data traffic. For example, when a user is playing a game, the IOP 700 receives data from the game controller and directs it to the Emotion Engine 100, which updates the current state of the game accordingly. The IOP 700 has a Direct Memory Access (DMA) architecture to facilitate rapid data transfer rates. DMA involves transfer of data from main memory to a device without passing it through the CPU. The USB interface is compatible with Open Host Controller Interface (OHCI) and can handle data transfer rates of between 1.5 Mbps and 12 Mbps. Provision of these interfaces means that the PlayStation2 is potentially compatible with peripheral devices such as video cassette recorders (VCRs), digital cameras, microphones, set-top boxes, printers, keyboard, mouse and joystick.
Generally, in order for successful data communication to occur with a peripheral device connected to a USB port 715, an appropriate piece of software such as a device driver should be provided. Device driver technology is very well known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the embodiment described here.
In the present embodiment, a USB microphone 730 is connected to the USB port. It will be appreciated that the USB microphone 730 may be a hand-held microphone or may form part of a head-set that is worn by the human operator. The advantage of wearing a head-set is that the human operator's hands are free to perform other actions. The microphone includes an analogue-to-digital converter (ADC) and a basic hardware-based real-time data compression and encoding arrangement, so that audio data are transmitted by the microphone 730 to the USB port 715 in an appropriate format, such as 16-bit mono PCM (an uncompressed format), for decoding at the PlayStation 2 system unit 10.
Apart from the USB ports, two other ports 705, 710 are proprietary sockets allowing the connection of a proprietary non-volatile RAM memory card 720 for storing game-related information, a hand-held game controller 725 or a device (not shown) mimicking a hand-held controller, such as a dance mat.
The system unit 10 may be connected to a network adapter 805 that provides an interface (such as an Ethernet interface) to a network. This network may be, for example, a LAN, a WAN or the Internet. The network may be a general network or one that is dedicated to game related communication. The network adapter 805 allows data to be transmitted to and received from other system units 10 that are connected to the same network (the other system units 10 also having corresponding network adapters 805).
The Emotion Engine 100 is a 128-bit Central Processing Unit (CPU) that has been specifically designed for efficient simulation of three-dimensional (3D) graphics for games applications. The Emotion Engine components include a data bus, cache memory and registers, all of which are 128-bit. This facilitates fast processing of large volumes of multimedia data. Conventional PCs, by way of comparison, have a basic 64-bit data structure.
The floating point calculation performance of the PlayStation2 is 6.2 GFLOPs. The Emotion Engine also comprises MPEG2 decoder circuitry which allows for simultaneous processing of 3D graphics data and DVD data. The Emotion Engine performs geometrical calculations including mathematical transforms and translations and also performs calculations associated with the physics of simulation objects, for example, calculation of friction between two objects. It produces sequences of image rendering commands which are subsequently utilised by the Graphics Synthesiser 200. The image rendering commands are output in the form of display lists. A display list is a sequence of drawing commands that specifies to the Graphics Synthesiser which primitive graphic objects (e.g. points, lines, triangles, sprites) to draw on the screen and at which co-ordinates. Thus a typical display list will comprise commands to draw vertices, commands to shade the faces of polygons, render bitmaps and so on. The Emotion Engine 100 can asynchronously generate multiple display lists.
The Graphics Synthesiser 200 is a video accelerator that performs rendering of the display lists produced by the Emotion Engine 100. The Graphics Synthesiser 200 includes a graphics interface unit (GIF) which handles, tracks and manages the multiple display lists.
The rendering function of the Graphics Synthesiser 200 can generate image data that supports several alternative standard output image formats, i.e. NTSC/PAL, High Definition Digital TV and VESA. In general, the rendering capability of graphics systems is defined by the memory bandwidth between a pixel engine and a video memory, each of which is located within the graphics processor. Conventional graphics systems use external Video Random Access Memory (VRAM) connected to the pixel logic via an off-chip bus which tends to restrict available bandwidth. However, the Graphics Synthesiser 200 of the PlayStation2 provides the pixel logic and the video memory on a single high-performance chip which allows for a comparatively large 38.4 Gigabyte per second memory access bandwidth. The Graphics Synthesiser is theoretically capable of achieving a peak drawing capacity of 75 million polygons per second. Even with a full range of effects such as textures, lighting and transparency, a sustained rate of 20 million polygons per second can be drawn continuously. Accordingly, the Graphics Synthesiser 200 is capable of rendering a film-quality image.
The Sound Processor Unit (SPU) 300 is effectively the soundcard of the system and is capable of recognising 3D digital sound such as Digital Theater Surround (DTS) sound and AC-3 (also known as Dolby Digital), which is the sound format used for DVDs.
A display and sound output device 305, such as a video monitor or television set with an associated loudspeaker arrangement 310, is connected to receive video and audio signals from the graphics synthesiser 200 and the sound processing unit 300.
The main memory supporting the Emotion Engine 100 is the RDRAM (Rambus Dynamic Random Access Memory) module 500 produced by Rambus Incorporated. This RDRAM memory subsystem comprises RAM, a RAM controller and a bus connecting the RAM to the Emotion Engine 100.
Figure 2 schematically illustrates the architecture of the Emotion Engine 100 of Figure 1. The Emotion Engine 100 comprises: a floating point unit (FPU) 104; a central processing unit (CPU) core 102; vector unit zero (VU0) 106; vector unit one (VU1) 108; a graphics interface unit (GIF) 110; an interrupt controller (INTC) 112; a timer unit 114; a direct memory access controller 116; an image data processor unit (IPU) 118; a dynamic random access memory controller (DRAMC) 120; and a sub-bus interface (SIF) 122. All of these components are connected via a 128-bit main bus 124.
The CPU core 102 is a 128-bit processor clocked at 300 MHz. The CPU core has access to 32 MB of main memory via the DRAMC 120. The CPU core 102 instruction set is based on MIPS III RISC with some MIPS IV RISC instructions together with additional multimedia instructions. MIPS III and IV are Reduced Instruction Set Computer (RISC) instruction set architectures proprietary to MIPS Technologies, Inc. Standard instructions are 64-bit, two-way superscalar, which means that two instructions can be executed simultaneously. Multimedia instructions, on the other hand, use 128-bit instructions via two pipelines. The CPU core 102 comprises a 16KB instruction cache, an 8KB data cache and a 16KB scratchpad RAM which is a portion of cache reserved for direct private usage by the CPU.
The FPU 104 serves as a first co-processor for the CPU core 102. The vector unit 106 acts as a second co-processor. The FPU 104 comprises a floating point product sum arithmetic logic unit (FMAC) and a floating point division calculator (FDIV). Both the FMAC and FDIV operate on 32-bit values, so when an operation is carried out on a 128-bit value (composed of four 32-bit values) an operation can be carried out on all four parts concurrently. For example, adding two vectors together can be done at the same time.
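The four-lane concurrency described above can be sketched as follows. This is a minimal illustration in plain Python, not PS2 code; the function name and values are purely illustrative. A 128-bit register is modelled as four independent 32-bit floating point lanes, and one "operation" applies to all four lanes at once, which is how two 4-component vectors are added in a single step.

```python
# Illustrative sketch: a 128-bit FMAC register modelled as four 32-bit
# floating point lanes, operated on concurrently by a single lane-wise add.

def fmac_add(a, b):
    """Lane-wise add of two 4-lane (128-bit) registers."""
    assert len(a) == len(b) == 4
    return [x + y for x, y in zip(a, b)]

v1 = [1.0, 2.0, 3.0, 4.0]   # e.g. an (x, y, z, w) position
v2 = [0.5, 0.5, 0.5, 0.5]   # e.g. a displacement
print(fmac_add(v1, v2))     # [1.5, 2.5, 3.5, 4.5]
```

All four component additions happen in the one call, mirroring the single concurrent operation on a 128-bit value.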
The vector units 106 and 108 perform mathematical operations and are essentially specialised FPUs that are extremely fast at evaluating the multiplication and addition of vector equations. They use Floating-Point Multiply-Adder Calculators (FMACs) for addition and multiplication operations and Floating-Point Dividers (FDIVs) for division and square root operations. They have built-in memory for storing micro-programs and interface with the rest of the system via Vector Interface Units (VIFs). Vector unit zero 106 can work as a coprocessor to the CPU core 102 via a dedicated 128-bit bus, so it is essentially a second specialised FPU. Vector unit one 108, on the other hand, has a dedicated bus to the Graphics Synthesiser 200 and thus can be considered as a completely separate processor. The inclusion of two vector units allows the software developer to split up the work between different parts of the CPU, and the vector units can be used in either serial or parallel connection.
Vector unit zero 106 comprises 4 FMACs and 1 FDIV. It is connected to the CPU core 102 via a coprocessor connection. It has 4 Kb of vector unit memory for data and 4 Kb of micro-memory for instructions. Vector unit zero 106 is useful for performing physics calculations associated with the images for display. It primarily executes non-patterned geometric processing together with the CPU core 102.
Vector unit one 108 comprises 5 FMACs and 2 FDIVs. It has no direct path to the CPU core 102, although it does have a direct path to the GIF unit 110. It has 16 Kb of vector unit memory for data and 16 Kb of micro-memory for instructions. Vector unit one 108 is useful for performing transformations. It primarily executes patterned geometric processing and directly outputs a generated display list to the GIF 110.
The GIF 110 is an interface unit to the Graphics Synthesiser 200. It converts data according to a tag specification at the beginning of a display list packet and transfers drawing commands to the Graphics Synthesiser 200 whilst mutually arbitrating multiple transfers.
The interrupt controller (INTC) 112 serves to arbitrate interrupts from peripheral devices, except the DMAC 116.
The timer unit 114 comprises four independent timers with 16-bit counters. The timers are driven either by the bus clock (at 1/16 or 1/256 intervals) or via an external clock.
The DMAC 116 handles data transfers between main memory and peripheral processors or main memory and the scratch pad memory. It arbitrates the main bus 124 at the same time.
Performance optimisation of the DMAC 116 is a key way by which to improve Emotion Engine performance. The image processing unit (IPU) 118 is an image data processor that is used to expand compressed animations and texture images. It performs I-PICTURE macro-block decoding, colour space conversion and vector quantisation. Finally, the sub-bus interface (SIF) 122 is an interface unit to the IOP 700. It has its own memory and bus to control I/O devices such as sound chips and storage devices.
Figure 3 schematically illustrates the configuration of the Graphics Synthesiser 200.
The Graphics Synthesiser comprises: a host interface 202; a set-up / rasterizing unit; a pixel pipeline 206; a memory interface 208; a local memory 212 including a frame page buffer 214 and a texture page buffer 216; and a video converter 210.
The host interface 202 transfers data with the host (in this case the CPU core 102 of the Emotion Engine 100). Both drawing data and buffer data from the host pass through this interface. The output from the host interface 202 is supplied to the graphics synthesiser 200, which develops the graphics to draw pixels based on vertex information received from the Emotion Engine 100, and calculates information such as RGBA value, depth value (i.e. Z-value), texture value and fog value for each pixel. The RGBA value specifies the red, green, blue (RGB) colour components and the A (Alpha) component represents opacity of an image object. The Alpha value can range from completely transparent to totally opaque. The pixel data is supplied to the pixel pipeline 206 which performs processes such as texture mapping, fogging and Alpha-blending and determines the final drawing colour based on the calculated pixel information.
The pixel pipeline 206 comprises 16 pixel engines PE1, PE2, ..., PE16 so that it can process a maximum of 16 pixels concurrently. The pixel pipeline 206 runs at 150 MHz with 32-bit colour and a 32-bit Z-buffer. The memory interface 208 reads data from and writes data to the local Graphics Synthesiser memory 212. It writes the drawing pixel values (RGBA and Z) to memory at the end of a pixel operation and reads the pixel values of the frame buffer 214 from memory. These pixel values read from the frame buffer 214 are used for pixel test or Alpha-blending. The memory interface 208 also reads from local memory 212 the RGBA values for the current contents of the frame buffer. The local memory 212 is a 32 Mbit (4 MB) memory that is built in to the Graphics Synthesiser 200. It can be organised as a frame buffer 214, texture buffer 216 and a 32-bit Z-buffer 215. The frame buffer 214 is the portion of video memory where pixel data such as colour information is stored.
The Graphics Synthesiser uses a 2D to 3D texture mapping process to add visual detail to 3D geometry. Each texture may be wrapped around a 3D image object and is stretched and skewed to give a 3D graphical effect. The texture buffer is used to store the texture information for image objects. The Z-buffer 215 (also known as depth buffer) is the memory available to store the depth information for a pixel. Images are constructed from basic building blocks known as graphics primitives or polygons. When a polygon is rendered with Z-buffering, the depth value of each of its pixels is compared with the corresponding value stored in the Z-buffer. If the value stored in the Z-buffer is greater than or equal to the depth of the new pixel value then this pixel is determined visible, so that it should be rendered and the Z-buffer will be updated with the new pixel depth. If, however, the Z-buffer depth value is less than the new pixel depth value, the new pixel value is behind what has already been drawn and will not be rendered.
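The depth test just described can be expressed as a short sketch. This is purely illustrative Python, not the Graphics Synthesiser's actual pixel logic; the function and buffer names are assumptions. A pixel is drawn only if its depth value is nearer than (or equal to) the value the Z-buffer already holds for that position.

```python
# Minimal sketch of the Z-buffer test: smaller depth means nearer to the
# viewer, and the buffer starts out "infinitely far" away.

def draw_pixel(frame, zbuf, x, y, colour, depth):
    if zbuf[y][x] >= depth:      # new pixel is in front: visible
        frame[y][x] = colour     # render it
        zbuf[y][x] = depth       # and record its depth
    # else: behind existing geometry, so it is not rendered

W, H = 4, 3
frame = [[None] * W for _ in range(H)]
zbuf = [[float("inf")] * W for _ in range(H)]

draw_pixel(frame, zbuf, 1, 1, "red", 10.0)    # drawn (nothing there yet)
draw_pixel(frame, zbuf, 1, 1, "blue", 25.0)   # discarded (behind red)
draw_pixel(frame, zbuf, 1, 1, "green", 5.0)   # drawn (in front of red)
print(frame[1][1], zbuf[1][1])                # green 5.0
```

The final pixel colour is that of the nearest geometry drawn so far, exactly as the comparison rule in the text prescribes.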
The local memory 212 has a 1024-bit read port and a 1024-bit write port for accessing the frame buffer and Z-buffer, and a 512-bit port for texture reading. The video converter 210 is operable to display the contents of the frame memory in a specified output format.
Figure 4 schematically illustrates the operation of a game involving successive scenes. In particular, the functionality of the PlayStation 2 described above has been simplified to a single unit, the "game engine", which receives user commands from the hand-held controller 725 and generates video signals for display and sound signals for audio output.
The game engine responds to game information stored on discs and read by the disc reader 450. Game design is a well established art and, although complicated and lengthy, the type of information to be stored as part of a game program and its supporting data is well understood. Only the differences relevant to the present invention will be described here.
In an example game, a set of game scenes is defined: scene 1, scene 2 etc. The general intention is that the game object (character, vehicle etc) moves from scene to scene in a generally predetermined order, although branching points may be provided to allow different routes through the game. In the case of a role playing, action adventure type game, a game character may typically move through the scenes: killing or removing opponents, collecting treasure or point-scoring objects, acquiring clues and the like.
In some cases the scenes may be defined by a bounded game environment within which the character may roam. Within the scene there may be enemies wandering around waiting for a fight and clues or treasure dispersed around the environment for the character to find.
In other cases a scene may be entirely choreographed in advance. There may still be enemies to fight and items to find, but these are laid out in a predetermined order. A sequence of tasks for the character to complete will have been pre-defined. Successful completion of each task allows the user to attempt the next one; failure at a task generally means that the user has failed and either the game terminates or the user is demoted (e.g. the user has to return to an earlier stage to try again). This type of scene is sometimes referred to as an interactive cut scene.
A technique of using a further representation of a game object (to be called a ghost character) will now be described. Such a technique is particularly relevant to the type of pre-choreographed scenes described above, but could also be used in other types of scene.
Figure 5 schematically illustrates an interactive cut scene in which a sequence of required user actions (actions 1.1 to 1.5), each with an associated background video clip, enemies, treasure and the like, is arranged in a linear order. As a user successfully completes each task, the user is allowed to move on to the next task, through the whole scene. If any task is not successfully completed, the user moves to a failure clip. An example of a failure clip, in the case of an action requiring the user's character to climb a cliff face, is a video clip showing the character falling off the cliff face. The failure clips might be shared between actions and can be arranged in a sequence to add more dramatic effect, if required. After the failure clip the user's character can be removed from the game or can be transferred to a previous point in the game to try again.
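The linear flow of such an interactive cut scene might be sketched as follows. This is an illustrative Python sketch, not code from the specification; the function names, task identifiers and clip names are assumptions. Each action must succeed for the next to be attempted, and any failure diverts the user to a failure clip.

```python
# Sketch of a linear interactive cut scene: tasks are attempted in order,
# and the first failure plays a failure clip and ends the scene.

def play_cut_scene(actions, attempt, play_clip):
    """actions: ordered task ids; attempt(task) -> bool; play_clip(name)."""
    for task in actions:
        if not attempt(task):
            play_clip("failure")   # e.g. the character falls off the cliff
            return False           # game over, or retry from an earlier point
    return True                    # whole scene completed successfully

clips = []
result = play_cut_scene(
    ["action 1.1", "action 1.2", "action 1.3"],
    attempt=lambda task: task != "action 1.3",   # user fails the third task
    play_clip=clips.append,
)
print(result, clips)   # False ['failure']
```

Sharing failure clips between actions, as the text notes, simply means several tasks map to the same clip name.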
Figure 6 schematically illustrates a branching scene. Failure clips are not shown but may apply to any of the actions shown in Figure 6.
If the user completes action 2.1, the user then attempts action 2.2. The action 2.2 has two successful outcomes: depending on the outcome, the user's character passes either to a sequence of actions 2.3, 2.4 and 2.5 or to a sequence of actions 2.6, 2.7 and 2.8.
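The branch at action 2.2 can be sketched as follows. This is an illustrative Python sketch; the outcome labels and sequence contents are assumptions, and the structure simply shows the outcome of one action selecting the immediately following sequence, as the abstract describes.

```python
# Sketch of the branch in Figure 6: the outcome of action 2.2 selects
# which follow-on sequence of actions the user's character attempts next.

BRANCHES = {
    "outcome A": ["action 2.3", "action 2.4", "action 2.5"],
    "outcome B": ["action 2.6", "action 2.7", "action 2.8"],
}

def next_sequence(outcome):
    """Choose the immediately following sequence from the branch outcome."""
    return BRANCHES[outcome]

print(next_sequence("outcome B"))   # ['action 2.6', 'action 2.7', 'action 2.8']
```

A game program could equally key the branch on the degree to which the preceding tasks were completed, per the optional feature in the abstract.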
The use of a ghost character will now be described.
Figure 7 schematically illustrates a scene definition table. This represents a part of the data associated with a scene. Each action is listed, along with a starting position of the game object (e.g. the character) relevant to that action. The position that the game object will have reached at the end of a successful completion of that action is also stored. Finally, a ghost position is stored. This could represent either a static position of a ghost image, to act as a target for movement of the game character, or an offset with respect to the game character, or a more complicated movement specification. When the user's character reaches the ending position for an action, the ghost ceases to be displayed.
Of course, the skilled person will appreciate that much more information than this is required to define an action or a scene. However, the rest of the information is routine to one skilled in the art.
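One possible representation of such a scene definition table is sketched below. The field names, positions and action labels are illustrative assumptions, not values from the specification: each action records the character's start and end positions plus where the ghost should appear, and once the character completes an action the ghost for the next action (if any) is shown instead.

```python
# Sketch of a Figure 7-style scene definition table with static ghost
# positions. Positions are simple (x, y) tuples for illustration.

from dataclasses import dataclass

@dataclass
class ActionEntry:
    action: str
    start_pos: tuple    # game-object position at the start of the action
    end_pos: tuple      # position reached on successful completion
    ghost_pos: tuple    # static ghost target (could also be an offset)

SCENE = [
    ActionEntry("leap to boulder 820", (0, 0), (4, 2), (4, 2)),
    ActionEntry("leap to boulder 830", (4, 2), (8, 1), (8, 1)),
]

def ghost_to_display(current_action_index):
    """Ghost position for the current action; None when all are done."""
    if current_action_index < len(SCENE):
        return SCENE[current_action_index].ghost_pos
    return None   # last action completed: the ghost is no longer displayed

print(ghost_to_display(0))   # (4, 2) - target for the first leap
print(ghost_to_display(2))   # None   - scene finished, ghost withdrawn
```

This matches the behaviour described for Figures 8a to 8c: the ghost marks the next target and disappears once the final action is complete.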
To illustrate this technique, Figures 8a to 8c schematically show a ghost game character in use. Here, in a short scene a user's character 840 has to leap across three boulders 810, 820, 830.
The character 840 starts on the boulder 810. A ghost representation 850 of the character is shown on the boulder 820, indicating that the user's character has to jump forwards and upwards onto the boulder 820. Assuming that the user does this successfully, the ghost representation 850 disappears (or moves; see below) either when the character 840 arrives on the second boulder 820 or just before then, as defined by the data in the scene definition table of Figure 7.
A similar process applies for the user's character 840 to leap from the boulder 820 to the third boulder 830. After that successful leap, the ghost character is no longer displayed.
It may be that a leap from one of these boulders to the next is beyond the normal "jumping" capabilities of the game character. However, in a pre-choreographed scene such as this, such a limitation does not matter. The scene can be arranged so that if the user operates a "jump forward and upward" control at the appropriate time, the character 840 will move from one boulder to the next, following the ghost character. In this way, the ghost character can encourage the user to operate a certain control even where the user may not believe that the character 840 can fulfil that action. Note that the ghost character is a representation of the user's character, but this does not necessarily mean that the ghost character is in the same orientation as the user's character at any time.
Figures 9a to 9d schematically illustrate another example of a ghost game object: in this case a ghost automobile 860 which indicates to a user's automobile 870 the correct path through a (highly simplified) maze.
Figure 10 schematically illustrates another possible scene definition table, in this case applicable to a situation where the ghost character, as well as being displaced within the game environment from the user's character, is displayed with different display properties such as a different colour, texture, transparency or a combination of these, to indicate a required user action. At each action within a scene, an image "effect" is defined for the ghost character, over a particular time range within the scene. A required user action (such as pressing a certain button) is also defined. Successful completion of the task requires the user to press that button within the defined time range. Of course, the display appearance variation can be used in addition to the technique described earlier where the ghost character's position indicates a required action.
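A check of the kind just described, where successful completion requires the defined button to be pressed within the defined time range, might look like this (a hypothetical sketch; the function name, button names and times are all illustrative assumptions):

```python
def task_succeeded(required_button, time_range, presses):
    """presses is a list of (button, time) input events.
    The task succeeds only if the required button was pressed
    at some time within the defined time range (inclusive)."""
    start, end = time_range
    return any(button == required_button and start <= t <= end
               for button, t in presses)
```

For example, a press of the required button at time 1.5 within a range of (1.0, 2.0) succeeds, while a press outside that range, or of a different button, does not.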

Claims (9)

  1. Video game apparatus in which a primary game object is displayed within a game environment of a video game, the apparatus having: a user controller; and a game program defining a sequence of one or more target tasks required to be performed with respect to the primary game object by operating the user controller; in which the apparatus displays a further representation of the primary game object displaced, in the game environment, from the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence.
  2. Apparatus according to claim 1, in which the display property is at least a display position of the further representation with respect to the display position of the primary game object.
  3. Apparatus according to claim 2, in which the direction within the game environment from the displayed primary game character to the further representation is indicative of a required direction of movement by the primary game object in order to fulfil the next target task in the sequence.
  4. Apparatus according to any one of claims 1 to 3, in which the display property is at least a display colour of the further representation, the display colour being selected from a group of at least two display colours indicating different respective required user actions.
  5. Apparatus according to any one of the preceding claims, in which: the video game has a series of scenes; and the further representation is provided in respect of at least a subset of the scenes; in which: each scene in the subset has an associated sequence of target tasks; and the game program is arranged to select an immediately following scene in dependence upon a degree to which the target tasks in the associated sequence are successfully completed.
  6. Apparatus according to any one of the preceding claims, in which the primary game object is a game character.
  7. Video game apparatus substantially as hereinbefore described with reference to the accompanying drawings.
  8. A method of operation of a video game in which a primary game object is displayed within a game environment, the video game defining a sequence of one or more target tasks required to be performed with respect to the primary game object by a user operating a user controller, the method comprising the step of: displaying a further representation of the primary game object displaced, in the game environment, from the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence.
  9. A method of operation of a video game, the method being substantially as hereinbefore described with reference to the accompanying drawings.
    10. Computer software having program code which, when run on a computer, causes the computer to carry out a method according to claim 8 or claim 9.
    11. A providing medium by which computer software according to claim 10 is provided.
    12. A medium according to claim 11, the medium being a storage medium.
    13. A medium according to claim 11, the medium being a transmission medium.
GB0618406A 2006-09-19 2006-09-19 Video game Withdrawn GB2441975A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0618406A GB2441975A (en) 2006-09-19 2006-09-19 Video game
PCT/GB2007/002744 WO2008035027A1 (en) 2006-09-19 2007-07-19 Video game

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0618406A GB2441975A (en) 2006-09-19 2006-09-19 Video game

Publications (2)

Publication Number Publication Date
GB0618406D0 GB0618406D0 (en) 2006-11-01
GB2441975A true GB2441975A (en) 2008-03-26

Family

ID=37421221

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0618406A Withdrawn GB2441975A (en) 2006-09-19 2006-09-19 Video game

Country Status (2)

Country Link
GB (1) GB2441975A (en)
WO (1) WO2008035027A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001005475A1 (en) * 1999-07-15 2001-01-25 Midway Games West Inc. System and method of vehicle competition with enhanced ghosting features
EP1661608A1 (en) * 2004-11-26 2006-05-31 Kabushiki Kaisha Sega doing business as Sega Corporation Image processing device, image processing method and storage medium for storing programs for executing image process cycles

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7286930B2 (en) * 2004-01-30 2007-10-23 Microsoft Corporation Ghost following
JP4822553B2 (en) * 2004-05-10 2011-11-24 任天堂株式会社 GAME DEVICE, GAME PROGRAM, COMPUTER-READABLE INFORMATION STORAGE MEDIUM, GAME SYSTEM, AND GAME PROCESSING METHOD
JP3785176B2 (en) * 2004-05-11 2006-06-14 コナミ株式会社 GAME DEVICE, GAME CONTROL METHOD, AND PROGRAM

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001005475A1 (en) * 1999-07-15 2001-01-25 Midway Games West Inc. System and method of vehicle competition with enhanced ghosting features
EP1661608A1 (en) * 2004-11-26 2006-05-31 Kabushiki Kaisha Sega doing business as Sega Corporation Image processing device, image processing method and storage medium for storing programs for executing image process cycles

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Adventure game *
Ghost (Video game) *
Guitar Hero review *
Mario Kart *
Prince of persia: The sands of time review *
Table Tennis Review *

Also Published As

Publication number Publication date
WO2008035027A1 (en) 2008-03-27
GB0618406D0 (en) 2006-11-01

Similar Documents

Publication Publication Date Title
US8035613B2 (en) Control of data processing
US7586502B2 (en) Control of data processing
EP1880576B1 (en) Audio processing
EP1768759B1 (en) Control of data processing
US20090247249A1 (en) Data processing
WO2006000786A1 (en) Real-time voice-chat system for an networked multiplayer game
WO2006024873A2 (en) Image rendering
US20100035678A1 (en) Video game
EP1700273B1 (en) Image rendering
EP1786532A1 (en) Data processing
EP1072298A1 (en) Display method for a confrontation type video game for displaying different information to players, storage medium and video game system
GB2441975A (en) Video game
EP1889645B1 (en) Data processing
JP2005275798A (en) Program, information storage medium, and image generation system

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)