WO2013024771A1 - Computer game device, control method and game program for controlling a computer game device, and recording medium storing the game program - Google Patents
Computer game device, control method and game program for controlling a computer game device, and recording medium storing the game program
- Publication number
- WO2013024771A1 (PCT/JP2012/070235)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- game
- player character
- behavior
- touch
- target position
- Prior art date
Links
Images
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/5258—Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens
- A63F13/426—Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/573—Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
- A63F13/812—Ball games, e.g. soccer or baseball
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/10—3D [Three Dimensional] image rendering; Geometric effects
- A63F2300/1075—Features characterized by input arrangements specially adapted to detect the point of contact of the player on a surface, using a touch screen
- A63F2300/204—Features characterised by details of the game platform, the platform being a handheld device
- A63F2300/308—Details of the user interface
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
Definitions
- The present invention relates to a computer game device, a control method and program for controlling the computer game device, and a recording medium storing the program, and more particularly to a technique for controlling the user interface of a game device having a touch screen function.
- Computer game devices and intelligent devices having a touch screen (touch panel) function (hereinafter simply "game devices") are known. Such a game device allows the user (player) to perform direct, interactive operations by touching the game screen (game image) displayed on the touch screen.
- This provides the user with an intuitive user interface environment different from conventional input devices such as operation buttons.
- When the user touches an object on the screen, the game device can determine from the object's display position and the touched position that the object has been selected. Further, when the user moves the touching finger at a predetermined speed (usually slowly) without releasing it from the touch screen, the game device determines that a drag operation has been performed
- and can control the display of the object so that it moves along the stroke.
- The game device can also respond to an action of flipping the touch screen with a finger or quickly wiping it with a finger (that is, an action of moving the touching finger at a predetermined speed or higher without releasing it)
- by controlling the display of the game screen itself or of a predetermined object.
- Such flipping and wiping operations are generally called flick and swipe operations, respectively.
- Patent Document 1 discloses a technique in which the user selects all characters within a range by an operation surrounding that range on the game screen on a touch panel, then indicates a position on the game screen
- to set that position as the movement destination of all the selected characters.
- Patent Document 2 discloses a technique in which, when input is given by a touch operation to a player character displayed on a touch-panel second display screen linked to a first display screen,
- the game image displayed on the first display screen is changed according to the input.
- The touch screen function described above offers high operability in that it accepts intuitive, interactive operations from the user, and is widely adopted as a user interface for game devices.
- However, the position that the user should touch may lie outside the game screen.
- For example, suppose the user designates a movement target position by a touch operation to move the player character forward (that is, downward on the game screen) in the virtual three-dimensional space viewed from the virtual camera.
- If the player character is displayed near the lower edge of the game screen, the movement target position falls outside the game screen,
- and the user cannot give the intended operation to the game device via the touch screen.
- FIG. 9 shows a game screen of a play scene in a game in progress.
- The area A on the right side of the player character's court is not displayed in the game screen, so the user cannot directly designate a point in that area as things stand.
- To do so, the game screen expressing the virtual three-dimensional space would have to be reconfigured so that a sufficient touch operation area is secured in the player character's moving direction.
- This, however, forces the user to perform an operation for changing the configuration of the game screen, preventing concentration on the original operation and reducing the fun of the game.
- Accordingly, an object of the present invention is to provide a game that always offers intuitive operability regardless of the configuration of the game screen displayed on the touch screen; that is, a game in which the movement target position can be set by an intuitive touch operation even when that position lies outside the game screen as the player character moves.
- The player character may also perform a series of actions to enhance the fun of the game. For example, in a tennis game, a series of actions in which the player character follows a ball object on a court in a virtual three-dimensional space and hits a shot is displayed as a game image.
- Another object of the present invention is therefore to provide a game in which the player character can be made to perform such a series of actions by a combination of intuitive touch operations.
- The present invention, which solves the problems described above, includes the following technical features.
- In one aspect, the present invention is a program that is executed on a computing device of a game device having a touch screen and realizes a predetermined game on the game device. The program causes the game device to function as:
- image generation means for generating a game image representing a virtual three-dimensional space including a player character operated by the user;
- display control means for performing control to display the generated game image on the touch screen;
- operation action determination means for detecting touch operations on the touch screen by time sampling and determining an operation action based on the detected touch operation; and
- behavior control means for controlling the behavior of the player character in the virtual three-dimensional space based on the determined operation action.
- The operation action determination means determines the detected touch operation to be a flick operation, specified by a predetermined movement vector, when it is an operation of moving a predetermined stroke distance at a predetermined speed or more from touchdown.
- The behavior control means calculates a movement target position to which the player character should move based on the determined flick operation, and controls the behavior of the player character so that the player character moves toward the movement target position.
- When the movement target position is calculated, the image generation means generates the game image so that a target mark moves from the player character's current position to the movement target position.
- The behavior control means controls the behavior of the player character so that the player character follows the movement of the target mark,
- and the image generation means displays the behavior of the player character controlled by the behavior control means in the game image.
- The behavior control means may also determine whether or not to move the player character based on the progress of the game.
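The flick-to-movement behavior claimed above can be sketched as follows: the flick's movement vector is mapped to a movement target position (which may lie outside the visible game screen), and the character is advanced toward it each frame. This is a hedged illustration, not the patented implementation; the scale factor, the character speed, and the per-frame stepping scheme are assumptions not fixed by the document.

```python
# Hedged sketch of the claimed flick-to-movement behavior.
# FLICK_TO_COURT_SCALE and CHARACTER_SPEED are illustrative assumptions.

FLICK_TO_COURT_SCALE = 0.05   # assumed screen-pixels -> court-units factor
CHARACTER_SPEED = 4.0         # assumed court units per second

def movement_target(current, flick_vector):
    """Derive the movement target position from the flick's movement vector."""
    return (current[0] + flick_vector[0] * FLICK_TO_COURT_SCALE,
            current[1] + flick_vector[1] * FLICK_TO_COURT_SCALE)

def step_toward(position, target, dt):
    """Advance one frame toward `target`, stopping exactly on arrival."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = (dx * dx + dy * dy) ** 0.5
    max_step = CHARACTER_SPEED * dt
    if dist <= max_step:
        return target
    return (position[0] + dx / dist * max_step,
            position[1] + dy / dist * max_step)
```

In the claimed scheme, the image generation means would additionally draw a target mark travelling along this same path, with the character rendered following it.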
- In another aspect, the present invention can be understood as a computer-readable storage medium storing the program.
- In yet another aspect, the present invention is a game device provided with a touch screen, comprising: image generation means for generating a game image representing a virtual three-dimensional space including a player character operated by the user;
- display control means for performing control to display the generated game image on the touch screen; operation action determination means for detecting touch operations on the touch screen by time sampling and determining an operation action based on the detected touch operation;
- and behavior control means for controlling the behavior of the player character in the virtual three-dimensional space based on the determined operation action.
- The operation action determination means determines the detected touch operation to be a flick operation, specified by a predetermined movement vector, when it is an operation of moving a predetermined stroke distance at a predetermined speed or more from touchdown.
- The behavior control means calculates a movement target position to which the player character should move based on the determined flick operation,
- and controls the behavior of the player character so that the player character moves toward the movement target position.
- In yet another aspect, the present invention is a control method for controlling, by its computing device, a game device that includes a touch screen and the computing device.
- The control method includes: generating a game image representing a virtual three-dimensional space including a player character operated by the user; performing control to display the generated game image on the touch screen; detecting touch operations on the touch screen by time sampling and determining an operation action based on the detected touch operation; and controlling the behavior of the player character in the virtual three-dimensional space based on the determined operation action.
- In the step of determining the operation action, the detected touch operation is determined to be a flick operation, specified by a predetermined movement vector, when it is an operation of moving a predetermined stroke distance at a predetermined speed or more from touchdown.
- In the step of controlling the behavior, a movement target position to which the player character should move is calculated based on the determined flick operation, and the behavior of the player character is controlled so that the player character moves toward the movement target position.
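One iteration of the claimed control method (determine the operation action from time-sampled touch input, control the player character's behavior, then regenerate the game image for display) can be outlined as a loop body. All names here (`classify_touch`, `World`, `render`, `run_frame`) are hypothetical stand-ins for the claimed "means"; none of them come from the patent.

```python
# Minimal outline of one frame of the claimed control method.
# Every name below is a hypothetical stand-in, not a name from the patent.

def classify_touch(samples):
    """Stand-in for operation action determination over time-sampled input."""
    return "flick" if samples else None

class World:
    """Stand-in for the virtual three-dimensional space and its character."""
    def __init__(self):
        self.character_pos = (0.0, 0.0)
        self.last_action = None

    def apply(self, action):
        # Behavior control: react to the determined operation action.
        self.last_action = action

def render(world):
    """Stand-in image generation: produce a textual 'game image'."""
    return f"character at {world.character_pos}, action={world.last_action}"

def run_frame(world, touch_samples):
    action = classify_touch(touch_samples)  # determine the operation action
    if action is not None:
        world.apply(action)                 # control the character's behavior
    return render(world)                    # generate the game image to display
```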
- According to the present invention, a game that always provides intuitive operability is realized regardless of the configuration of the game screen displayed on the touch screen.
- FIG. 1 is a plan view showing an appearance of a game device according to an embodiment of the present invention.
- The appearance of the game apparatus 100 is substantially defined by a housing 110.
- In this example, a superellipse-shaped housing is adopted, designed to a size that can easily be held with both hands.
- the housing 110 houses therein a computing device (control circuit) 210 that controls the game apparatus 100 in an integrated manner.
- the game apparatus 100 includes a touch screen 220 (touch panel) disposed at a substantially central portion of the upper surface of the housing 110.
- the touch screen 220 is typically a device having a display and input function in which a transparent touch sensor 222 is arranged on a display (screen) 221, and an existing one can be used.
- the touch screen 220 of this example is an organic EL display that supports multi-touch, but is not limited thereto.
- A plurality of operation buttons 260 are arranged at the left and right ends of the housing 110, that is, at the positions where the thumbs rest when the user holds the housing with both hands.
- In this example, a cross button and round buttons are arranged on the left and right sides of the housing 110, respectively.
- audio output units 230 are arranged in the vicinity of the left and right operation button sets, respectively.
- the game apparatus 100 includes an external media unit 240 for accessing an external storage medium.
- the external storage medium is a memory card, for example, and records a game program and various data.
- As the memory card, memory cards of various standards can be adopted.
- the housing of the game apparatus 100 is provided with a memory card slot.
- The game apparatus 100 may include a communication unit 250 that realizes network communication.
- For the network communication, an existing wireless communication standard such as 3G or WiFi can be adopted, for example, but the communication is not limited thereto.
- FIG. 2 is a block diagram showing an example of a schematic hardware configuration of the game device according to the embodiment of the present invention.
- As shown in FIG. 2, the game apparatus 100 includes the computing device 210 for overall control of the game apparatus 100, the touch screen 220 for presenting images (for example, still images, moving images, video, etc.) that express or stage the game, and the audio output unit 230 for providing the user with sound (for example, music, voice, sound effects, etc.) that expresses or stages the game.
- It further includes the external media unit 240 for accessing an external storage medium and the communication unit 250 for performing wireless network communication with a server computer and other game apparatuses 100.
- the game apparatus 100 also includes an operation button 260 for replacing or supplementing the user input by the touch screen 220.
- the computing device 210 is a computer circuit element including, for example, a chip set, which includes various processors and memories.
- the computing device 210 of this embodiment includes, for example, a processor core 211, a memory module 212, a graphics engine 213, a sound engine 214, an I / O controller 215, and a system bus 216 that connects these components. Consists of.
- the computing device 210 may include a media engine for processing moving images at high speed.
- the processor core 211 is a chip that functions as a main processor of a computing device.
- the term “processor core” can be treated as synonymous with a processor, CPU, MPU or the like that means a main processor.
- the processor core 211 executes a game program developed on the memory module 212 and causes the computing device 210 to realize various functions.
- the computing device 210 executes a game program under the control of the processor core 211 to realize a game on the game apparatus 100 in cooperation with other hardware units / components.
- the game program may be executed under the control of an operating system (OS) that is realized under the control of the processor core 211.
- the processor core 211 can adopt either a single core type or a multi-core type.
- The processor core 211 includes a plurality of data caches (not shown).
- the processor core 211 may be configured to connect a floating-point processor (FPU), a vector floating-point processor (VFPU), or the like (not shown) and execute a game program in cooperation with them.
- the memory module 212 stores various programs and data necessary for game execution.
- FIG. 3 is a diagram showing an example of the contents of the memory module 212 of the game apparatus 100 according to an embodiment of the present invention.
- The memory module 212 is composed of, for example, one of, or a combination of, volatile memory typified by DRAM, non-rewritable nonvolatile memory typified by mask ROM, and rewritable nonvolatile memory typified by flash memory.
- The volatile memory typically serves as the main memory of the processor core 211 and stores, for example, part or all of the game program and game data as necessary.
- The nonvolatile memory stores, for example, the BIOS, OS programs, device drivers, system data necessary for controlling the game apparatus 100, user data, graphic data, play result data, and the like. All of the nonvolatile memory may be configured as flash memory without using a mask ROM; further, the entire memory module 212 may be composed of rewritable nonvolatile memory such as flash memory. An external storage medium attached to the memory card slot of the game apparatus 100 may also serve as part of the memory module 212.
- the graphics engine 213 performs various graphics processing under the control of the processor core 211.
- the graphics engine 213 typically includes a graphic core 2131 and a graphic memory 2132.
- the graphic core 2131 may be referred to as a video display processor (VDP) or a graphic processing unit (GPU).
- the graphic core 2131 reads out graphic data stored in the graphic memory 2132 and performs various graphic processing (for example, geometry processing, rendering processing, texture mapping processing, etc.) to generate image data.
- the image data generated by the graphic core 2131 is converted into a predetermined video signal by a video interface circuit (not shown) and output to the touch screen 220.
- the sound engine 214 performs various kinds of sound processing (for example, effect processing, mixing processing, etc.) as appropriate on sound data stored in a sound memory (not shown) under the control of the processor core 211.
- the sound data processed by the sound engine 214 is converted into a predetermined audio signal by a sound interface circuit (not shown) and output to the audio output unit 230.
- the I / O controller 215 is an interface circuit for controlling input / output with various external units.
- a touch sensor 222, operation buttons 260, an external media unit 240, and a communication unit 250 are connected to the I / O controller 215.
- the I / O controller 215 converts external signals supplied from various connected units into internal data used in the computing device 210, and converts the internal data into external signals suitable for the various units.
- the touch screen 220 is a user interface device including a display (screen) 221 and a transmissive touch sensor 222 formed to substantially coincide with the display 221.
- the touch screen 220 displays a game image under the control of the computing device 210 and can accept an interactive touch operation by the user.
- The display 221 displays images (for example, still images, moving images, video, etc.) based on the video signal from the graphics engine 213.
- The display 221 may also support 3D display.
- the touch sensor 222 outputs an operation signal corresponding to the touch operation by the user.
- the user's touch operation may be performed by the user's finger or may be performed by a stylus or the like.
- the touch sensor 222 may be, for example, a capacitance type, but is not limited thereto.
- the operation signal output from the touch sensor 222 is input to the computing device 210.
- When the computing device 210 detects an operation signal from the touch sensor 222, it interprets the signal as an operation on the game, and processing corresponding to the operation is executed.
- Specifically, the computing device 210 calculates position information on the display 221 based on the operation signal from the touch sensor 222; this information is used, in combination with the game image and time information, to determine the operation action.
- the game apparatus 100 according to the present embodiment is configured to accept a multi-touch operation with a plurality of fingers (or styluses).
- the audio output unit 230 is an audio output device for providing game sound to the user in accordance with the progress of the game.
- the audio output unit 230 generates sound (for example, sound, music, sound effect, etc.) based on the audio signal from the sound engine 214.
- The external media unit 240 is a media interface device for accessing an external storage medium such as a memory card.
- the external storage medium is not limited to a memory card, and may be a CD-ROM, a CD-R, a DVD-ROM, a DVD-R, or the like.
- the external storage medium is used, for example, for supplying the game program of the present embodiment to the game apparatus 100 and storing user data.
- The external media unit 240 accesses the attached external storage medium to read data and, if writing is permitted, to write data to it.
- The communication unit 250 is a communication board with which the game apparatus 100 communicates with other game apparatuses and computers.
- It mounts a circuit that implements a communication standard such as wireless LAN, Bluetooth (registered trademark), or 3G.
- The communication form is premised on wireless communication, but wired communication such as wired LAN is not excluded.
- the operation button 260 is typically composed of a plurality of buttons (keys), but is not limited to this.
- left and right operation button sets including a cross button and a round button are provided.
- Such a game is realized by the game device 100 executing a game program under the control of the computing device 210.
- the game provided by the game device 100 is a tennis game.
- In a tennis game, typically, a user (player) operates a player character and hits a ball object back and forth with an opponent player character operated by the computer (that is, the game apparatus 100) or by another player, virtually playing real tennis in game form.
- In this example, the match is singles, played one-on-one.
- The game screen displayed on the display 221 is typically a game image representing a virtual three-dimensional space. During play, the line-of-sight direction and field of view of the virtual camera are controlled, and the game image representing the virtual three-dimensional space changes accordingly. As in real tennis, the player character needs to reach an appropriate hit point position in order to hit the ball object; for this reason, the user is required to perform operations that move the player character accurately.
- The user operates the player character through touch operations on the touch screen 220 (operations with a finger, stylus, or the like; the same applies hereinafter), delivering serve shots, moving the player character to a desired position, and striking back the ball object.
- Thus, a touch operation for a shot is required following a touch operation for movement.
- The direction, strength, type, and the like of the shot are determined by computation in the computing device 210 based on factors such as the timing, position, direction, and speed of the touch operation.
- the operation action by the user's touch operation includes, for example, a tap operation, a drag operation, and a flick operation.
- the computing device 210 determines an operation action from a given touch operation as necessary.
- A tap is an operation in which the user's finger simply touches the touch screen 220 (touchdown) and is typically lifted up (released) after the touchdown.
- the computing device 210 can determine a single tap operation or a double tap operation based on the number of tap operations performed within a predetermined time.
- A drag operation is an operation of dragging a finger across the touch screen 220.
- Specifically, it forms a stroke by moving the touched-down finger across the touch screen 220 at a predetermined speed or less without releasing it. That is, a tap operation and a drag operation can be distinguished by whether the touched-down finger moves two-dimensionally.
- A flick operation is an operation of flipping with a finger or quickly wiping with a finger; specifically, the touched-down finger is lifted up while moving at a predetermined speed or more. A drag operation and a flick operation can be distinguished by whether the finger moves at the predetermined speed or higher and is lifted up within a predetermined time.
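The tap/drag/flick distinction described above can be sketched as a small classifier over time-sampled touch points. This is an illustrative sketch, not the patented implementation: the `TouchSample` type, the threshold values, and the use of average stroke speed are all assumptions, since the document only speaks of "predetermined" speeds and distances.

```python
import math
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float  # position on the touch screen, in pixels
    y: float
    t: float  # sample time in seconds

# Illustrative thresholds; the document only says "predetermined".
MOVE_EPSILON = 5.0   # px: below this, the finger did not move two-dimensionally
FLICK_SPEED = 300.0  # px/s: a moving release at or above this speed is a flick

def classify(samples):
    """Classify a completed touch (touchdown through lift-up) as
    'tap', 'drag', or 'flick' from its time-sampled points."""
    first, last = samples[0], samples[-1]
    distance = math.hypot(last.x - first.x, last.y - first.y)
    duration = max(last.t - first.t, 1e-6)
    speed = distance / duration  # average stroke speed (a simplification)
    if distance < MOVE_EPSILON:
        return "tap"    # touchdown followed by lift-up, no real movement
    if speed >= FLICK_SPEED:
        return "flick"  # moved and lifted at the predetermined speed or more
    return "drag"       # moved, but slower than a flick
```

A production classifier would more likely use the velocity near lift-up rather than the whole-stroke average, but the average keeps the distinction legible here.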
- the user taps a predetermined point on the play area including the court in order to move the player character on the court in the virtual three-dimensional space.
- the movement target position of the player character is set, and the player character moves to the movement target position.
- the line-of-sight direction and/or visual field area of the virtual camera is also controlled in accordance with the movement of the player character.
- a game image including a player character moving on the court is displayed on the touch screen 220. This also eliminates the need for the user to continue the touch operation when moving the player character.
- the user can change the movement target position by tapping another new point while the player character is moving. Subsequently, the user performs a flick operation in the direction in which the user wants to hit the ball object, in order to cause the player character that has moved to the desired movement target position to make a shot. The player character can thereby strike back the flying ball object.
- the accuracy of the shot depends on, for example, the timing of starting the flick operation, the direction, the speed, the stroke distance, and the like.
- the user can select a shot type (for example, top spin, slice, volley, lob, smash, etc.) according to the situation based on the form of the flick operation. As a result, the user can quickly perform the touch operation for moving the player character, and can estimate the start timing of the flick operation for the shot while the player character is moving to the desired movement target position.
- an operation time for the flick operation corresponding to the shot type can thus be secured.
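As one way to picture how the form of the flick could select a shot type, the sketch below maps an assumed flick angle and speed to a type. The thresholds and the mapping itself are purely illustrative assumptions; the specification states only that the type is selected "based on the form of the flick operation".

```python
def choose_shot_type(direction_deg, speed):
    """Pick a shot type from the flick's form (angle above horizontal in degrees,
    speed in px/s). Purely illustrative mapping, not taken from the patent."""
    if speed > 1500:
        return "smash"          # very fast flick
    if direction_deg > 45:
        return "lob"            # steep upward flick
    if direction_deg < -45:
        return "slice"          # steep downward flick
    return "top spin"           # default ground stroke
```

A mapping of this shape lets a single flick carry both the shot direction and the shot type, which is what allows the quick one-gesture operation described above.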
- the user may move the player character by a drag operation.
- with a drag operation, the user can move the player character so that it follows the movement of the moving finger.
- the user can give a shot motion to the player character that has moved to the desired movement target position.
- the user can move the player character by a flick operation.
- with a flick operation, the user can set an arbitrary point as the movement target position and move the player character according to the form of the flick operation.
- when the user moves the player character by a flick operation, it is not necessary to directly specify the movement target point in the virtual three-dimensional space.
- the user can set the movement target position using a flick operation. Accordingly, the game image representing the virtual three-dimensional space changes in accordance with the movement of the player character that moves to the movement target position. This also eliminates the need for the user to continue the touch operation when moving the player character.
- FIG. 4 is a block diagram for explaining the functional configuration of the game apparatus 100 according to the embodiment of the present invention. That is, the figure shows the functional means realized in the game apparatus 100, in cooperation with other hardware and software, when the processor (i.e., the processor core 211) of the computing device 210 executes the game program; in particular, the functional means related to the game realized by the game program of the present embodiment are illustrated.
- game control unit 401 is a functional means for comprehensively controlling the progress of a game realized by a game program. For example, in the case of a tennis game, the game control unit 401 performs computer processing so that a virtual reality tennis game between the player character and the opponent player character is developed. Data necessary for the progress of the game is stored in the game data storage unit 402.
- the game control unit 401 includes a player character behavior control unit 4011 that controls movement of the player character and other actions (for example, shots).
- the touch operation unit 403 receives a touch operation by the user, and outputs an operation signal corresponding to the touch operation to the touch operation processing unit 406.
- the display unit 404 displays a game image based on a signal output from the image generation unit 408 under the control of the game control unit 401.
- the touch operation unit 403 and the display unit 404 can be associated with the touch screen 220 described above.
- the audio output unit 405 outputs sound based on the signal output from the audio generation unit 409 under the control of the game control unit 401.
- the audio output unit 405 can be associated with the audio output unit 230 shown in FIG.
- the touch operation processing unit 406 determines the type of touch operation (operation action) given to the game device 100 based on an operation signal corresponding to the touch operation on the touch operation unit 403.
- the touch operation processing unit 406 of this example includes a position information acquisition unit 4061 and an operation action determination unit 4062.
- the position information acquisition unit 4061 acquires position information on the display unit 404 (display 221) related to the touch-operated position based on the operation signal sent from the touch operation unit 403.
- the position information acquisition unit 4061 outputs the acquired position information to the operation action determination unit 4062.
- the operation action determination unit 4062 determines the type of touch operation (operation action) by the user based on the position information sent from the position information acquisition unit 4061 and the time information obtained from the timer 407.
- the operation action includes, for example, a tap operation, a drag operation, and a flick operation as described above.
- the image generation unit 408 is a functional unit that generates a game image based on the image data output from the game control unit 401 and outputs the game image to the display unit 404.
- the audio generation unit 409 generates a game sound based on the audio data output from the game control unit 401 and outputs the game sound to the audio output unit 405.
- FIG. 5 is a flowchart for explaining the behavior control processing of the player character in the game device 100 according to the embodiment of the present invention. In the following, a behavior control process in a rally between a player character and an opponent player character will be described.
- the touch operation processing unit 406 of the game device 100 monitors at a predetermined time interval whether or not there is a touch operation by the user on the touch operation unit 403 during the progress of the game (STEP 501).
- the touch operation processing unit 406 determines that there is a touch operation (YES in STEP 501)
- the touch operation processing unit 406 determines a type of the touch operation, that is, an operation action based on the touch operation (STEP 502).
- the touch operation is detected by time sampling, and an operation signal corresponding to the touch operation is output.
- the operation action determination process will be described with reference to FIG.
- the game control unit 401 determines whether or not the player character can perform a swing motion to shot the ball object according to the determined operation action and the status of the game in progress (STEP 503).
- the game control unit 401 determines that the swing operation is not possible for the touch operation (NO in STEP 503)
- the game control unit 401 performs a player character movement process (STEP 504).
- the player character movement process will be described with reference to FIG.
- the game control unit 401 determines that a swing motion is possible for the touch operation (YES in STEP 503)
- the game control unit 401 performs a swing process for controlling the swing motion of the player character (STEP 505).
- the player character swing process will be described with reference to FIG.
- in this example, the game control unit 401 is configured to assign a touch operation by the user during a rally to either the player character movement process or the swing process, but the present invention is not limited thereto; any one of a plurality of operation processes, including other operation processes, may be selected instead.
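The branch in STEPs 501-505 can be sketched as a small dispatch routine. The `game` interface below (`determine_action`, `swing_possible`, `swing_process`, `move_process`) is a hypothetical stand-in for the functional units of FIG. 4, not an API defined by the patent.

```python
def handle_rally_touch(game, touch_points):
    """Route one detected touch operation during a rally (sketch of STEPs 502-505)."""
    action = game.determine_action(touch_points)   # STEP 502: determine the operation action
    if game.swing_possible(action):                # STEP 503: is a swing motion possible now?
        return game.swing_process(action)          # STEP 505: swing process
    return game.move_process(action)               # STEP 504: player character movement process
```

The point of the two-way branch is that the same gesture vocabulary (tap/drag/flick) drives either movement or a shot, depending on the game situation at the moment of the touch.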
- FIG. 6 is a flowchart for explaining the operation action determination process in the game apparatus 100 according to the embodiment of the present invention.
- when the game control unit 401 of the game apparatus 100 receives an operation signal from the touch operation unit 403, it acquires position information on the touch screen 220 based on the operation signal (STEP 601). Subsequently, the game control unit 401 determines whether or not the touch operation is moving (STEP 602); it can determine that the touch operation is moving when there is a difference in the series of position information acquired at predetermined time intervals.
- if the game control unit 401 determines that the touch operation has not moved (NO in STEP 602), it outputs the current (latest) position information (STEP 603) and then determines whether or not the finger has been released within a predetermined time (STEP 604). If the game control unit 401 determines that the finger has not been released within the predetermined time (NO in STEP 604), it returns to STEP 601. On the other hand, when the game control unit 401 determines that the finger has been released within the predetermined time (YES in STEP 604), the operation action determination process ends.
- the game control unit 401 determines that the touch operation is moving (YES in STEP 602)
- the game control unit 401 calculates the moving speed of the touch operation (STEP 605).
- the game control unit 401 can calculate the moving speed from a series of position information acquired at predetermined time intervals. Subsequently, the game control unit 401 determines whether or not the calculated moving speed is greater than a predetermined speed v (STEP 606).
- the game control unit 401 determines that the calculated moving speed is equal to or less than the predetermined speed v (No in STEP 606)
- the game control unit 401 outputs the current position information (STEP 603).
- that is, the game control unit 401 processes a drag operation as a series of continuous tap operations, or, equivalently, a tap operation as a drag operation with a short stroke distance.
- if the game control unit 401 determines that the calculated moving speed is higher than the predetermined speed v (YES in STEP 606), it determines the operation action to be a flick operation (STEP 607), outputs the moving direction and moving speed (movement vector) (STEP 608), and ends the process.
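The loop of FIG. 6 can be sketched as follows: it consumes time-sampled touch positions and either keeps emitting position information (STEP 603) or terminates with a flick and its movement vector (STEPs 607-608). The threshold `v` and the pixel/second units are illustrative assumptions.

```python
import math

def determine_action(samples, v=300.0):
    """Process time-sampled touch positions [(t, x, y), ...] per FIG. 6.

    Returns a list of ('position', (x, y)) outputs; if the sampled moving
    speed ever exceeds v, the list ends with ('flick', (dx, dy, speed)).
    """
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dx, dy = x1 - x0, y1 - y0
        speed = math.hypot(dx, dy) / max(t1 - t0, 1e-6)   # STEP 605: moving speed
        if speed > v:                                      # STEP 606: above threshold?
            out.append(("flick", (dx, dy, speed)))         # STEPs 607-608: movement vector
            return out
        out.append(("position", (x1, y1)))                 # STEP 603: output position
    return out
```

Note how this realizes the remark above: slow movement simply keeps producing position outputs (a drag treated as continuous taps), while a fast movement short-circuits into a flick.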
- FIG. 7 is a flowchart for explaining the player character movement process in the game apparatus 100 according to the embodiment of the present invention. As shown in the figure, the game control unit 401 determines whether or not the operation action is a flick operation (STEP 701). When the game control unit 401 determines that the operation action is not a flick operation (NO in STEP 701), it regards the operation as a tap operation or a drag operation, sets the output position information as the movement target position information (STEP 702), and controls the display of the game image so that the player character moves toward the movement target position (STEP 703).
- the game control unit 401 determines that the operation action is a flick operation (YES in STEP 701)
- the game control unit 401 calculates a movement target position based on the output movement direction and movement speed (STEP 704).
- the game control unit 401 controls display of the game image so that the player character moves toward the movement target position (STEP 703).
- the operation parameters of the flick operation, such as the moving direction and the moving speed, are used to calculate the movement target position of the player character; they are not applied directly to the position data of the player character to calculate its behavior. In the present disclosure, the behavior calculation of the player character is performed based on the calculated movement target position.
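This indirection (flick parameters drive only the target position, never the character directly) can be sketched as below: a target is derived from the flick's direction and speed (STEP 704), and only that target is fed into the behavior calculation. The linear `gain` scaling is an illustrative assumption.

```python
import math

def movement_target(current_pos, direction, speed, gain=0.4):
    """Compute a movement target from a flick's direction and speed (STEP 704).

    Only the returned target position is used for the character's behavior
    calculation; the flick vector itself is never applied to the character.
    The speed-to-distance gain is an assumed, illustrative scaling.
    """
    dx, dy = direction
    norm = math.hypot(dx, dy) or 1.0          # normalize the flick direction
    dist = speed * gain                        # assumed linear speed-to-distance mapping
    x, y = current_pos
    return (x + dx / norm * dist, y + dy / norm * dist)
```

Because the target, not the raw gesture, drives the behavior, a faster flick simply yields a farther target while the character's own movement logic stays unchanged.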
- FIGS. 10 to 14 are diagrams showing game screens for explaining the movement of the player character by the flick operation.
- the player character P is required to take up a hitting position behind the court in a certain play scene.
- a part of the area behind the court is located outside the game screen (FIG. 10).
- the game control unit 401 detects this, and first displays a motion cursor (displayed as a double circle) C at the touched position (FIG. 10).
- the game control unit 401 determines that the operation action is a flick operation.
- the game control unit 401 moves the motion cursor C to the position where the finger is released and displays it there.
- the game control unit 401 determines that the shot is not possible based on the position information of the ball object, and enters a process for moving the player character P to the movement target position calculated based on the flick operation.
- a target mark M is displayed so that the movement of the player character P to the movement target position is intuitively transmitted to the user.
- the target mark M is, for example, a display with a virtual glow, but is not limited thereto. That is, the game control unit 401 highlights the current position of the player character with a glow (FIG. 11).
- the game control unit 401 starts the movement of the target mark M so that the glow moves to the movement target position calculated by the flick operation (FIG. 12). Accordingly, the game control unit 401 starts moving so that the player character P follows the target mark M (FIG. 13). Depending on the configuration of the game screen according to the angle of the virtual camera, the target mark M may temporarily disappear outside the game screen.
- the game control unit 401 changes the configuration of the game screen as necessary as the player character P moves, and finally controls the display so that the player character P catches up with the target mark M at the movement target position (FIG. 14). The user can then make a shot at the ball object with the next touch operation.
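The "character follows the target mark" display of FIGS. 11 to 14 can be pictured as a per-frame step that advances the character toward the mark, clamped to a maximum per-frame distance; the step size and function name are illustrative assumptions.

```python
import math

def step_toward(pos, target, max_step):
    """Advance pos one frame toward target, moving at most max_step units."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return target                          # the character catches up with the mark
    return (pos[0] + dx / dist * max_step, pos[1] + dy / dist * max_step)
```

Iterating this each frame reproduces the sequence in the figures: the mark jumps ahead to the movement target immediately, and the character closes the gap over several frames until it arrives.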
- FIG. 8 is a flowchart for explaining the swing process of the player character in the game apparatus 100 according to the embodiment of the present invention.
- the swing process is executed when it is determined in STEP 503 in FIG. 5 that the touch operation can be swung. In this example, it is assumed that the swing process is executed based on a flick operation.
- the game control unit 401 sets a shot target position according to the moving direction and moving speed obtained by the flick operation (STEP 801).
- the shot target position is a target position where the player character tries to strike back the ball object by a shot.
- the game control unit 401 extracts a swing motion candidate that can be applied in a game situation in progress from preset swing motions (STEP 802).
- the swing motions include, for example, a forehand and a backhand, and top spin, slice, volley, lob, smash, and the like derived from these.
- the game control unit 401 extracts applicable swing motion candidates in consideration of the position information and speed information on the ball object, the position information of the player character, the time (number of frames) until the shot, and the like. Typically, the earlier the player character can reach the hit position, the more swing motion candidates can be extracted. Subsequently, the game apparatus 100 determines an optimum swing motion from the extracted swing motion candidates based on the timing, moving direction, and moving speed of the flick operation (STEP 803). Then, the game control unit 401 controls the display of the game image so that the player character swings in accordance with the determined swing motion and, in conjunction with this, controls the display of the ball object while calculating its trajectory (STEP 804).
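The relation "the earlier the character reaches the hit position, the more swing motion candidates qualify" (STEP 802) can be sketched with an assumed preparation-frame table. The motion names come from the text; the frame counts are invented purely for illustration.

```python
# Assumed minimum preparation frames per swing motion (illustrative values only).
SWING_MOTIONS = {
    "smash": 30,
    "lob": 20,
    "top spin": 12,
    "slice": 12,
    "volley": 6,
}

def extract_candidates(frames_until_shot):
    """Return the swing motions the character still has time to prepare (STEP 802)."""
    return [m for m, need in SWING_MOTIONS.items() if frames_until_shot >= need]
```

With a table of this shape, a last-moment arrival leaves only a volley available, while an early arrival opens up the full repertoire for the selection in STEP 803.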
- the game apparatus 100 may also control adjustment of the shot target position by, for example, accepting an operation button input from the user during the swing of the player character and moving the shot cursor.
- the user can give an arbitrary touch operation to the player character regardless of the configuration of the game screen displayed on the touch screen.
- the movement target position of the player character can be set by an intuitive touch operation.
- the processes or operations performed in the method may be executed in different orders as long as no contradiction arises in the result of executing the method.
- the processes or operations described are merely examples; some processes or operations are optional and may be omitted, combined with each other, or expanded with additional processes or operations. For example, changing the order of processes or operations, or executing processes or operations in parallel, is included in the gist of the present invention.
- the present invention can be widely applied to a computer game device having a touch screen function and a game program executed by an intelligent device represented by a smartphone or a tablet computer.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-178062 | 2011-08-16 | ||
JP2011178062A JP5958787B2 (ja) | 2011-08-16 | 2011-08-16 | コンピュータゲーム装置、コンピュータゲーム装置を制御するための制御方法及びゲームプログラム、並びにゲームプログラムを記録した記録媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013024771A1 true WO2013024771A1 (ja) | 2013-02-21 |
Family
ID=47715091
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/070235 WO2013024771A1 (ja) | 2011-08-16 | 2012-08-08 | コンピュータゲーム装置、コンピュータゲーム装置を制御するための制御方法及びゲームプログラム、並びにゲームプログラムを記録した記録媒体 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP5958787B2 |
WO (1) | WO2013024771A1 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104645616A (zh) * | 2015-03-16 | 2015-05-27 | 成都优聚软件有限责任公司 | 一种塔防游戏中游戏对象的移动路径设置方法和系统 |
CN105392539A (zh) * | 2013-09-06 | 2016-03-09 | 科乐美数码娱乐株式会社 | 游戏程序、游戏方法、游戏系统 |
JP2016049111A (ja) * | 2014-08-28 | 2016-04-11 | 株式会社セガゲームス | プログラムおよびゲーム装置 |
JP6084719B1 (ja) * | 2016-02-26 | 2017-02-22 | 株式会社コロプラ | 画像処理方法、及び画像処理プログラム |
JP2017170242A (ja) * | 2017-07-06 | 2017-09-28 | 株式会社コナミデジタルエンタテインメント | ゲームプログラム、ゲームシステム |
JP2024018071A (ja) * | 2022-07-29 | 2024-02-08 | 株式会社コロプラ | プログラムおよび情報処理システム |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5918285B2 (ja) * | 2014-02-14 | 2016-05-18 | 株式会社コナミデジタルエンタテインメント | 移動制御装置及びプログラム |
WO2016114591A1 (ko) * | 2015-01-16 | 2016-07-21 | 주식회사 웨이브쓰리스튜디오 | 게임 캐릭터의 위치 이동 방법 및 장치와, 이를 이용한 타겟 지정 방법 |
US10293253B2 (en) | 2015-11-27 | 2019-05-21 | Gree, Inc. | Program, game control method, and information processing apparatus |
JP5942031B1 (ja) * | 2015-11-27 | 2016-06-29 | グリー株式会社 | プログラム、ゲームの制御方法、及び情報処理装置 |
JP5977878B1 (ja) * | 2015-11-27 | 2016-08-24 | グリー株式会社 | プログラム、ゲームの制御方法、及び情報処理装置 |
JP6030258B1 (ja) * | 2016-05-23 | 2016-11-24 | グリー株式会社 | プログラム、ゲームの制御方法、及び情報処理装置 |
JP6043448B1 (ja) * | 2016-02-25 | 2016-12-14 | 株式会社コロプラ | ゲームプログラム |
JP6277474B2 (ja) * | 2016-03-11 | 2018-02-14 | 株式会社コナミデジタルエンタテインメント | ゲーム制御装置、ゲームシステム、及びプログラム |
JP6002345B1 (ja) * | 2016-03-31 | 2016-10-05 | 株式会社コロプラ | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 |
JP6924564B2 (ja) * | 2016-03-31 | 2021-08-25 | 株式会社コロプラ | ゲームプログラム |
JP6002344B1 (ja) * | 2016-03-31 | 2016-10-05 | 株式会社コロプラ | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 |
JP6209243B1 (ja) * | 2016-04-28 | 2017-10-04 | 株式会社コロプラ | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 |
JP2017205311A (ja) * | 2016-05-19 | 2017-11-24 | 株式会社コロプラ | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 |
JP6069571B1 (ja) * | 2016-10-19 | 2017-02-01 | グリー株式会社 | プログラム、ゲームの制御方法、及び情報処理装置 |
KR102062398B1 (ko) * | 2017-02-15 | 2020-02-11 | 서진택 | 터치 액션 기반의 단순한 규칙 연속에 의한 창발성 향상을 위한 모바일 퍼즐 게임 시스템, 스마트 디바이스, 이를 기록한 기록 매체 |
JP6255525B1 (ja) | 2017-06-07 | 2017-12-27 | 株式会社 ディー・エヌ・エー | 電子ゲーム装置及び電子ゲームプログラム |
JP2017209573A (ja) * | 2017-09-08 | 2017-11-30 | 株式会社コロプラ | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 |
JP7340054B2 (ja) * | 2020-05-01 | 2023-09-06 | 株式会社コロプラ | ゲームプログラム |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0644001A (ja) * | 1992-07-27 | 1994-02-18 | Toshiba Corp | 表示制御装置及び表示制御方法 |
JP2001092578A (ja) * | 1999-09-20 | 2001-04-06 | Casio Comput Co Ltd | オブジェクト移動処理装置およびそのプログラム記録媒体 |
JP2005218779A (ja) * | 2004-02-09 | 2005-08-18 | Nintendo Co Ltd | ゲーム装置およびゲームプログラム |
JP2011053748A (ja) * | 2009-08-31 | 2011-03-17 | Konami Digital Entertainment Co Ltd | ゲーム装置、ゲーム処理方法、ならびに、プログラム |
JP2011107781A (ja) * | 2009-11-12 | 2011-06-02 | Canon Inc | 表示制御装置、及びその制御方法 |
JP2011110222A (ja) * | 2009-11-27 | 2011-06-09 | Konami Digital Entertainment Co Ltd | ゲーム装置、ゲーム制御プログラム、及びゲーム制御方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3470118B2 (ja) * | 2002-01-16 | 2003-11-25 | コナミ株式会社 | キャラクタ操作プログラム、キャラクタ操作方法及びビデオゲーム装置 |
JP4545545B2 (ja) * | 2004-05-10 | 2010-09-15 | 任天堂株式会社 | ゲームプログラムおよびゲーム装置 |
JP4792878B2 (ja) * | 2005-04-11 | 2011-10-12 | 株式会社セガ | 対戦ビデオゲーム制御プログラム |
JP2010233752A (ja) * | 2009-03-30 | 2010-10-21 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
- 2011-08-16: JP application JP2011178062A filed; granted as patent JP5958787B2 (active)
- 2012-08-08: WO application PCT/JP2012/070235 filed as WO2013024771A1 (application filing)
Non-Patent Citations (1)
Title |
---|
"Vita-ban 'Power Smash 4', Touch Screen deno Sosa Hoho no Ichibu ya Senshu Type o Shokai", 4Gamer.net (Aetas, Inc.), 28 October 2011. Retrieved from the Internet: <http://www.4gamer.net/games/134/G013437/20111019040/> [retrieved on 2012-10-15] * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105392539A (zh) * | 2013-09-06 | 2016-03-09 | 科乐美数码娱乐株式会社 | 游戏程序、游戏方法、游戏系统 |
US9975047B2 (en) | 2013-09-06 | 2018-05-22 | Konami Digital Entertainment Co., Ltd. | Game program, game method, and game system |
CN105392539B (zh) * | 2013-09-06 | 2019-05-14 | 科乐美数码娱乐株式会社 | 游戏程序、游戏方法、游戏系统 |
JP2016049111A (ja) * | 2014-08-28 | 2016-04-11 | 株式会社セガゲームス | プログラムおよびゲーム装置 |
CN104645616A (zh) * | 2015-03-16 | 2015-05-27 | 成都优聚软件有限责任公司 | 一种塔防游戏中游戏对象的移动路径设置方法和系统 |
JP6084719B1 (ja) * | 2016-02-26 | 2017-02-22 | 株式会社コロプラ | 画像処理方法、及び画像処理プログラム |
WO2017145684A1 (ja) * | 2016-02-26 | 2017-08-31 | 株式会社コロプラ | 画像処理方法、及び画像処理プログラム |
JP2017170242A (ja) * | 2017-07-06 | 2017-09-28 | 株式会社コナミデジタルエンタテインメント | ゲームプログラム、ゲームシステム |
JP2024018071A (ja) * | 2022-07-29 | 2024-02-08 | 株式会社コロプラ | プログラムおよび情報処理システム |
Also Published As
Publication number | Publication date |
---|---|
JP5958787B2 (ja) | 2016-08-02 |
JP2013039232A (ja) | 2013-02-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5958787B2 (ja) | コンピュータゲーム装置、コンピュータゲーム装置を制御するための制御方法及びゲームプログラム、並びにゲームプログラムを記録した記録媒体 | |
JP7695961B2 (ja) | 仮想コンソールゲーム用コントローラ | |
US8690674B2 (en) | Storage medium having game program stored thereon and game apparatus | |
US8337284B2 (en) | Game apparatus and storage medium having game program stored therein | |
US7762893B2 (en) | Storage medium having game program stored thereon and game apparatus | |
US20060238498A1 (en) | Storage medium having stored therein game program and game device | |
US20050159217A1 (en) | Game apparatus and game program | |
JP5127805B2 (ja) | ゲームプログラム、ゲーム装置、およびゲーム制御方法 | |
JP2016093360A (ja) | ゲームプログラム | |
JP6626613B2 (ja) | ゲームプログラム | |
JP5759571B2 (ja) | ゲームプログラムおよびゲーム装置 | |
JP4688473B2 (ja) | プログラム、情報記憶媒体及びゲーム装置 | |
JP2016093361A (ja) | ゲームプログラム | |
JP7672474B2 (ja) | プログラム | |
JP6738604B2 (ja) | プログラム及びゲーム装置 | |
JP2007325709A (ja) | ゲーム装置、プログラム及び情報記録媒体 | |
JP5759570B2 (ja) | ゲームプログラムおよびゲーム装置 | |
JP6260942B2 (ja) | コンピュータゲーム装置、コンピュータゲーム装置を制御するための制御方法及びゲームプログラム、並びにゲームプログラムを記録した記録媒体 | |
JP2016120039A (ja) | ゲームプログラム | |
JP2016195864A (ja) | 画像処理装置およびその方法 | |
JP2015186571A (ja) | 画像処理装置およびその方法 | |
JP5501426B2 (ja) | ゲームプログラム、ゲーム装置、およびゲーム制御方法 | |
JP6404077B2 (ja) | ゲームプログラム | |
JP6919050B1 (ja) | ゲームシステム、プログラム及び情報処理方法 | |
JP6307651B1 (ja) | ゲームプログラム、方法、および、情報処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12823441 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 12823441 Country of ref document: EP Kind code of ref document: A1 |