WO2017169748A1 - Game program, method, and information processing device comprising a touch screen - Google Patents
Game program, method, and information processing device comprising a touch screen
- Publication number
- WO2017169748A1 (PCT/JP2017/010258)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- game
- character
- touch screen
- game character
- virtual camera
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/812—Ball games, e.g. soccer or baseball
Definitions
- the present invention relates to a technique of a game program that accepts an input operation by a touch operation and advances a game, and particularly relates to a technique that advances a game by displaying a screen according to the arrangement of virtual cameras in a virtual space on a display.
- a variety of businesses provide game programs that display game characters in a virtual space and advance a game. For example, sports games such as tennis, table tennis, hockey, and soccer are provided.
- a game character and an opponent character are arranged in the virtual space
- a virtual camera is arranged in the virtual space
- an image of the shooting range of the virtual camera is displayed on the display.
- the user can advance the game by operating the game character in the virtual space, moving it, and performing operations such as hitting an object such as a ball toward the opponent character.
- Patent Document 1 describes a technique for changing game processing in a portable game device in consideration of the posture in which the user holds the device. Specifically, Patent Document 1 describes that a sensor acquires attitude data of the game device operated by the user, and that a viewpoint mode, which determines the placement of the virtual camera, is switched according to that attitude. For example, in a tennis game, Patent Document 1 describes two viewpoint modes: a bird's-eye viewpoint mode, in which the virtual camera is placed at a position from which the entire tennis court object can be seen from above, and a user viewpoint mode, in which the virtual camera is placed behind the player object; the mode is switched according to the attitude of the game device.
- with this configuration, Patent Document 1 allows the user to set a preferred viewpoint mode simply by changing the attitude of the game device, without any button operation. The technique of Patent Document 1 thereby makes it easier for the user to switch viewpoints.
- Patent Document 1 describes a game device that receives a user's input operation using a physical cross-shaped key.
- smartphones, tablets, and other terminals equipped with touch screens are widespread.
- the terminal detects an operation in which the user touches the touch screen, and receives an input operation for the user to move the game character in the virtual space from the user.
- the touch screen accepts a user's input operation when the user brings a finger or the like close to it, and also displays the screen as the game progresses. Consequently, while the user performs an input operation such as a touch on the touch screen, the operating finger covers the display, and the viewable area of the touch screen may be reduced by the user's own touch operation. There is therefore a need for a technique that makes it easier to operate game characters displayed on a touch screen.
- an object of the present invention is to provide a technique for further enhancing the interest in games.
- a game program that is executed on a computer including a touch screen that accepts an input operation by a touch operation, a processor, and a memory.
- the game program displays a screen corresponding to the placement of the virtual camera in the virtual space on the touch screen.
- when causing the game character to perform a first action, the game program causes the processor to arrange the virtual camera so that the game character is displayed on the touch screen and to perform a screen display in which the game character performs the first action
- the game program then moves the virtual camera so that the game character is displayed with a certain area opened from at least one side of the touch screen toward the center, and accepts an input operation for causing the game character to perform a second action subsequent to the first action.
- a method in which a computer including a touch screen that accepts an input operation by a touch operation, a processor, and a memory displays a screen according to the arrangement of a virtual camera in a virtual space on the touch screen.
- when causing the game character to perform the first action, the processor arranges the virtual camera so that the game character is displayed on the touch screen and performs a screen display in which the game character performs the first action
- the processor then moves the virtual camera so that the game character is displayed with a certain area opened from at least one side of the touch screen toward the center, and accepts an input operation for causing the game character to perform the second action following the first action.
- an information processing apparatus including a touch screen that receives an input operation by a touch operation.
- the information processing apparatus includes a storage unit and a control unit.
- the storage unit is configured to store a game program.
- the control unit is configured to control the operation of the information processing apparatus, and is configured to display a screen corresponding to the arrangement of the virtual camera in the virtual space on the touch screen by executing the game program.
- when causing the game character to perform the first action based on an input operation on the touch screen, and upon detecting the operation content of an input operation for moving the game character, the control unit arranges the virtual camera so that the game character is displayed on the touch screen and performs a screen display in which the game character performs the first action
- the control unit then functions as a camera arrangement control unit that moves the virtual camera so that a certain area is opened from at least one side of the touch screen toward the center, accepts an input operation for causing the game character to perform the second action subsequent to the first action, and moves the virtual camera in accordance with the accepted input operation while keeping the game character displayed on the touch screen.
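The camera behavior described above, keeping the character visible while opening a band of screen from one side toward the center for the operating finger, can be sketched as follows. This is a minimal 2-D illustration; `place_camera`, its parameters, and the geometry are assumptions, not the patent's implementation.

```python
def place_camera(character_x, character_y, screen_width, open_fraction):
    """Return a 2-D camera center such that the game character appears
    shifted to one side of the screen, leaving `open_fraction` of the
    screen width open from the opposite edge toward the center for the
    user's touch input.  Illustrative sketch only."""
    # Moving the camera center toward one side shifts the character the
    # other way on screen, opening a band for the operating finger.
    offset = screen_width * open_fraction / 2.0
    return (character_x + offset, character_y)
```

For a 200-pixel-wide view with 40% of the width opened, `place_camera(100, 50, 200, 0.4)` shifts the camera center to `(140.0, 50)`, pushing the character toward the opposite edge.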
- this makes it easier to operate a game character displayed on the touch screen and can improve the enjoyment of the game.
- FIG. 2 is a block diagram showing a configuration of a mobile terminal 10.
- FIG. 3 is a block diagram showing a functional configuration of a server 20.
- A diagram showing the process in which the movement operation detection unit 193 detects the direction in which the game character moves according to the user's input operation.
- FIG. 6 is a flowchart showing the process of advancing a sports game according to the camera work determined by the camera arrangement control unit 194 when the control unit 190 of Embodiment 1 executes the game program 151.
- A diagram showing the process in which the camera arrangement control unit 194 of Embodiment 2 determines camera work according to the position of the game character. FIG. 10 is a flowchart showing the process of advancing a sports game according to the camera work determined by the camera arrangement control unit 194 when the control unit 190 of Embodiment 2 executes the game program 151.
- FIG. 18 is a diagram illustrating the process in which the control unit 190 of Embodiment 4 displays an object on the display 132 according to the direction in which the game character is moved, based on the detection result of the movement operation detection unit 193.
- a game program is executed by an information processing apparatus including a touch screen such as a smartphone
- a game screen is displayed on the touch screen
- an input operation by a user is accepted on the screen.
- as the game progresses, the user touches the screen with a finger, and the displayed content may be covered.
- a technique for controlling camera work of a virtual camera arranged in a virtual space so that an input operation performed by a user on the touch screen becomes easier as the game progresses will be described.
- a user operates an information processing apparatus equipped with a touch screen such as a smartphone, for example, and advances the game while transmitting and receiving data related to the game between the game server and the smartphone.
- the game corresponds to single play where the user plays alone and multi-play where the user plays with other users.
- FIG. 1 is a diagram illustrating a configuration of a game distribution system 1 according to the embodiment.
- the game distribution system 1 includes an information processing device used by a user and a server 20, and these devices are connected to each other via a network 80 so as to communicate with each other.
- portable terminals 10A, 10B, and 10C are collectively referred to as "mobile terminal 10".
- a plurality of portable terminals are shown.
- the mobile terminal 10A and the mobile terminal 10B are connected to the network 80 by communicating with the radio base station 81.
- the mobile terminal 10C is connected to the network 80 by communicating with a wireless router 82 installed in a facility such as a house.
- the mobile terminal 10 is a terminal provided with a touch screen, such as a smartphone, phablet, or tablet.
- the portable terminal 10 provides a user with an environment for playing a game according to the game program by executing the game program.
- the mobile terminal 10 installs a game program via a platform that distributes applications and the like.
- the mobile terminal 10 enables the user to play a game by executing a game program installed on it or preinstalled at shipment.
- the mobile terminal 10 reads and executes the game program, establishes a communication connection with the server 20, and sends and receives game-related data to and from the server 20 as the game progresses.
- the server 20 advances the game play on the portable terminal 10 by appropriately transmitting data necessary for playing the game to the portable terminal 10.
- the server 20 manages various data related to the game of each user who plays the game.
- the server 20 communicates with the mobile terminal 10 and transmits images, sounds, text data, and other data to the mobile terminal 10 in accordance with the progress of each user's game.
- the server 20 manages, for each user, the progress of the game story, information on which of the game characters appearing in the game the user can use, parameters indicating each game character's abilities, parameters indicating the performance of the tools used, and other various data.
- the server 20 performs processing by which the game operator notifies users of campaigns, the occurrence and resolution of problems in the operation of the game, and other information related to game management.
- the game program corresponds to a mode in which a user plays a game when a single user plays (single play) and when a plurality of users play in cooperation (multiplay).
- the server 20 provides each user with an environment for playing a game in multi-play by specifying users participating in multi-play and communicating with each mobile terminal 10 of each user.
- by supporting multiplay, the game distribution system 1 enables, for example, users to form a team and play a doubles match in a tennis game. In a soccer game, it enables users to become members of the same soccer team and play a match. In an action game, it enables users to form a team and play game modes, such as a quest mode, in which several users battle a comparatively powerful character.
- the server 20 includes a communication IF (Interface) 22, an input / output IF 23, a memory 25, a storage 26, and a processor 29, which are connected to each other via a communication bus.
- the communication IF 22 corresponds to various communication standards such as a LAN (Local Area Network) standard, and functions as an interface for transmitting and receiving data to and from an external communication device such as the mobile terminal 10.
- the input / output IF 23 functions as an interface for receiving information input to the server 20 and outputting information to the outside of the server 20.
- the input / output IF 23 includes an input receiving unit that accepts connection of an information input device such as a mouse and a keyboard, and an output unit that accepts connection of an information output device such as a display for displaying an image or the like.
- the memory 25 is a storage device for storing data used for processing.
- the memory 25 provides the processor 29 with a work area for temporary use when the processor 29 performs processing.
- the memory 25 includes a storage device such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
- the storage 26 is a storage device for storing various programs and data for the processor 29 to read and execute.
- the information stored in the storage 26 includes a game program, information related to the game program, information on a user who plays the game program, and other information.
- the storage 26 includes a storage device such as an HDD (Hard Disk Drive) or a flash memory.
- the processor 29 controls the operation of the server 20 by reading and executing a program or the like stored in the storage 26.
- the processor 29 includes, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and the like.
- FIG. 2 is a block diagram showing the configuration of the mobile terminal 10.
- the mobile terminal 10 includes an antenna 110, a wireless communication IF 120, a touch screen 130, an input / output IF 140, a storage unit 150, a sound processing unit 160, a microphone 170, a speaker 180, And a control unit 190.
- the antenna 110 radiates a signal emitted from the mobile terminal 10 to space as a radio wave. Further, the antenna 110 receives radio waves from the space and gives a received signal to the wireless communication IF 120.
- the wireless communication IF 120 performs modulation / demodulation processing for transmitting and receiving signals via the antenna 110 and the like so that the mobile terminal 10 communicates with other communication devices.
- the wireless communication IF 120 is a communication module for wireless communication including a tuner, a high frequency circuit, and the like, performs modulation / demodulation and frequency conversion of a wireless signal transmitted / received by the mobile terminal 10, and provides a received signal to the control unit 190.
- the touch screen 130 receives an input from the user and outputs information to the display 132 to the user.
- Touch screen 130 includes a member (touch panel 131) for accepting a user's input operation.
- the touch screen 130 includes a menu screen and a member (display 132) for displaying the progress of the game on the screen.
- the touch panel 131 detects that the user's finger or the like has approached by using, for example, a capacitive type.
- the display 132 is realized by, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence), or other display device.
- the input / output IF 140 functions as an interface for receiving information input to the mobile terminal 10 and outputting information to the outside of the mobile terminal 10.
- the storage unit 150 includes a flash memory, a RAM (Random Access Memory), and the like, and stores programs used by the mobile terminal 10 and various data received by the mobile terminal 10 from the server 20.
- the audio processing unit 160 performs modulation / demodulation of the audio signal.
- the audio processing unit 160 modulates the signal given from the microphone 170 and gives the modulated signal to the control unit 190.
- the audio processing unit 160 provides an audio signal to the speaker 180.
- the sound processing unit 160 is realized by, for example, a sound processing processor.
- the microphone 170 functions as an audio input unit for receiving an input of an audio signal and outputting it to the control unit 190.
- the speaker 180 functions as an audio output unit for outputting an audio signal to the outside of the mobile terminal 10.
- the control unit 190 controls the operation of the mobile terminal 10 by reading and executing a program stored in the storage unit 150.
- the control unit 190 is realized by an application processor, for example.
- the storage unit 150 stores a game program 151, game information 152, and user information 153.
- the mobile terminal 10 downloads a game program from the server 20 and stores the game program in the storage unit 150.
- the portable terminal 10 transmits / receives various data, such as game information 152 and user information 153, with the server 20 by communicating with the server 20 with progress of a game.
- the game program 151 is a program for causing the mobile terminal 10 to advance the game.
- the game information 152 includes various data referred to by the game program 151.
- the game information 152 includes, for example, information on objects to be arranged in the virtual space in the game, information on effects associated with the objects (including information on skills set for game characters), and the like.
- the user information 153 includes information about the user who plays the game.
- the user information 153 includes, for example, information that identifies a user of the mobile terminal 10 that plays a game, information that identifies other users who play a game in cooperation with each other, and other information.
- by reading and executing the game program 151, the control unit 190 exhibits the functions of an input operation reception unit 191, a game progress processing unit 192, a movement operation detection unit 193, a camera arrangement control unit 194, an object control unit 195, and a display control unit 196.
- the input operation reception unit 191 receives a user input operation based on the output of the touch screen 130. Specifically, the input operation accepting unit 191 detects that a user's finger or the like has approached the touch panel 131 as coordinates in a coordinate system including a horizontal axis and a vertical axis of a surface constituting the touch screen 130.
- the input operation reception unit 191 determines a user operation on the touch screen 130.
- the input operation accepting unit 191 determines, for example, (1) an "approach operation", (2) a "release operation", (3) a "tap operation", (4) a "double tap operation", (5) a "long press operation" (long touch operation), (6) a "drag operation" (swipe operation), (7) a "move operation", (8) a "flick operation", and other user operations.
- the user operation determined by the input operation receiving unit 191 is not limited to the above. For example, when the touch panel 131 has a mechanism that can detect the magnitude of the pressure that the user presses against the touch panel 131, the input operation reception unit 191 determines the magnitude of the pressure that the user has pressed.
- (1) “approach operation” is an operation in which the user brings a finger or the like closer to the touch screen 130.
- the touch panel 131 of the touch screen 130 detects that the user's finger or the like has approached (including touching the touch screen 130) and outputs a signal corresponding to the detected coordinates to the control unit 190.
- the control unit 190 determines that the state is the “touch-on state” when the approach is detected from the state in which the approach of the user's finger or the like to the touch screen 130 is not detected.
- a "release operation" is an operation that ends the state in which the user's finger or the like approaches the touch screen 130. For example, when the user lifts a finger that was touching the touch screen 130, the input operation receiving unit 191 determines that the user's operation is a "release operation". The control unit 190 determines that the state changes from the "touch-on state" to the "touch-off state" when the approach of the user's finger or the like to the touch screen 130 is no longer detected.
- a "tap operation" is an approach operation in which the user brings a finger or the like close to the touch screen 130, followed by a release operation at the position of the approach operation.
- when the input operation reception unit 191 detects, from the output of the touch screen 130, that the user's finger or the like has approached from a state in which no approach was detected (the user's finger or the like is away from the touch panel 131 and the touch panel 131 detects no approach), it holds the detected coordinates as the "initial touch position".
- if the release operation then occurs at the initial touch position, the user's operation is determined to be a "tap operation".
- the “double tap operation” is an operation in which the user performs the tap operation twice within a predetermined time.
- the input operation reception unit 191 determines that a user operation is a "double tap operation" when, after determining a tap operation, it determines another tap operation at coordinates close to those of the first within a certain time.
- “Long press operation” is an operation in which the user continues to press the touch screen 130.
- the touch screen 130 detects the user's operation and identifies an approach operation; when the approach operation continues at the detected coordinates for longer than a certain time, the user's operation is determined to be a "long press operation" (the "long press operation" may also be referred to as a "long touch operation").
- the “drag operation” is an operation in which the user slides the finger while maintaining the approaching state in which the user brings the finger or the like closer to the touch screen 130.
- “Move operation” refers to a series of operations in which a user performs a release operation by moving a position where a finger or the like is approaching the touch screen 130 while maintaining the approach operation on the touch screen 130.
- “Flick operation” refers to an operation in which a user performs a move operation in a time shorter than a predetermined time.
- the flick operation is an operation in which the user flips a finger on the touch screen 130.
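The distinctions among the operations above can be sketched as a small classifier over a single press/release pair. A minimal illustration only: the thresholds, names, and event model are assumptions, and approach/double-tap/drag handling is omitted.

```python
def classify_operation(press_pos, release_pos, duration_s,
                       long_press_s=0.5, flick_s=0.2, move_px=10.0):
    """Classify one approach/release pair into a tap, long press, move,
    or flick operation.  Thresholds are illustrative assumptions."""
    dx = release_pos[0] - press_pos[0]
    dy = release_pos[1] - press_pos[1]
    moved = (dx * dx + dy * dy) ** 0.5 >= move_px
    if not moved:
        # Released at the initial touch position: tap or long press.
        return "long press" if duration_s >= long_press_s else "tap"
    # The finger slid before release: a move done quickly is a flick.
    return "flick" if duration_s < flick_s else "move"
```

For example, a 100-pixel slide completed in 0.1 s classifies as a "flick", while the same slide over 0.5 s classifies as a "move".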
- the game progress processing unit 192 performs a process of progressing the game by calling various programs in accordance with user operations. For example, the game progress processing unit 192 communicates with the server 20 to transmit data to the server 20 according to the progress of the game, to receive data related to the game from the server 20, and to the user according to the progress of the game. A process for giving a reward, a process for measuring the passage of time, and other processes are performed.
- the movement operation detection unit 193 detects, based on a user's input operation on the touch screen 130, the operation content of an input operation for moving a character appearing in the game (hereinafter also referred to as a "game character"). For example, when the game program 151 is a sports game, the movement operation detection unit 193 detects the direction in which the game character moves based on the user's input operation (that is, it accepts an input operation by which the user specifies the direction in which to move the game character).
- when the user brings a finger close to the touch screen 130 from a state in which the finger was separated from it, and the input operation reception unit 191 detects the approach via the touch panel 131, the movement operation detection unit 193 detects the moving direction of the game character based on the coordinates of the initial touch position and the subsequent detection results of the touch screen 130. Detailed processing of the movement operation detection unit 193 will be described later.
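The direction detection described above can be sketched as a vector from the held initial touch position to the current finger position. The function name and tuple representation are illustrative assumptions, not the patent's implementation.

```python
import math

def move_direction(initial_touch, current_touch):
    """Return the unit vector from the held initial touch position
    toward the current finger position; the game character would be
    moved in this direction.  Illustrative sketch only."""
    dx = current_touch[0] - initial_touch[0]
    dy = current_touch[1] - initial_touch[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)  # finger has not moved: no direction yet
    return (dx / dist, dy / dist)
```

For example, a finger that lands at (0, 0) and slides to (3, 4) yields the unit direction (0.6, 0.8).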
- the camera placement control unit 194 determines how to show each object placed in the virtual space to the user. Specifically, the camera arrangement control unit 194 controls the arrangement (camera work) of the virtual camera in the virtual space generated when the control unit 190 reads and executes the game program 151. The control unit 190 provides the game play environment to the user by displaying the captured image of the virtual camera in the virtual space on the display 132.
- the object control unit 195 controls processing such as the generation, deformation, and movement of the various objects that appear in the game advanced by the mobile terminal 10 executing the game program 151, as well as objects (for example, GUI (Graphical User Interface) screens) generated based on the user operation content received by the input operation receiving unit 191.
- the object control unit 195 generates an object indicating the moving direction of the game character based on, for example, an input operation on the touch screen 130 for moving the game character by the user, and deforms the generated object.
- the display control unit 196 outputs an image according to the camera work of the virtual camera to the display 132.
- the display control unit 196 determines the display content of the display 132 according to the placement of the virtual camera in the virtual space, and outputs various types of information such as images and text according to the determined display content to the display 132.
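Determining display content from the virtual camera's placement ultimately reduces to projecting world coordinates into screen pixels. The following is a deliberately simplified top-down 2-D sketch; the patent's virtual space would be rendered by a full 3-D camera, and the names and math here are assumptions.

```python
def world_to_screen(point, cam_center, zoom, screen_size):
    """Project a 2-D world point into screen pixels for a top-down
    virtual camera centered at `cam_center`.  Illustrative only."""
    sx = (point[0] - cam_center[0]) * zoom + screen_size[0] / 2.0
    sy = (point[1] - cam_center[1]) * zoom + screen_size[1] / 2.0
    return (sx, sy)
```

A point at the camera center lands at the middle of the screen; moving the camera (as the camera arrangement control unit 194 does) shifts where every object appears without moving anything in the virtual space.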
- FIG. 3 is a block diagram showing a functional configuration of the server 20.
- the detailed configuration of the server 20 will be described with reference to FIG.
- the server 20 functions as the communication unit 220, the storage unit 250, and the control unit 290 by operating according to the program.
- the communication unit 220 functions as an interface for the server 20 to communicate with an external communication device such as the mobile terminal 10.
- the storage unit 250 stores various programs and data for the user to advance the game in the mobile terminal 10.
- the storage unit 250 stores a game program 251, game information 252, and user information 253.
- the game program 251 is a program for allowing the server 20 to communicate with the mobile terminal 10 and advance the game on the mobile terminal 10.
- the game program 251 refers to the game information 252 and user information 253 that are various data for proceeding with the game, and causes the game to proceed according to the user's input operation.
- the game program 251 is executed by the control unit 290, thereby causing the server 20 to perform processing for transmitting and receiving data to and from the mobile terminal 10, processing for advancing the game in accordance with the operation content performed by the user of the mobile terminal 10, processing for updating information on the user playing the game, and other processing.
- the game information 252 includes various data that the game program 251 refers to.
- the game information 252 includes an object management table 252A, a passive skill management table 252B, and an active skill management table 252C.
- the object management table 252A shows the setting of objects arranged in the virtual space of the game.
- the mobile terminal 10 causes the display 132 to display an image obtained by photographing an object placed in the virtual space with a virtual camera placed in the virtual space, thereby causing the game to proceed.
- These objects include, for example, an object indicating the game character operated by the user and an object indicating an item worn by the game character.
- These objects are subjected to associated processing when the user performs a predetermined input operation on the touch screen 130, when certain conditions are met as the game progresses, or when other various events occur.
- For example, an object may be selected by the user.
- Processing such as moving an object selected by the user according to the user's input operation may be performed.
- Processing such as giving the user a reward that favors the progress of the game may be performed.
- In the passive skill management table 252B, information for identifying an object is associated with information on the passive skill associated with that object.
- the passive skill is activated when, for example, a predetermined condition is satisfied in the game, and the user can advantageously advance the game.
- For example, the user can advance the game advantageously, such as by an improvement in the moving speed of the game character.
- In the active skill management table 252C, information for identifying an object is associated with information on the active skill associated with that object.
- The active skill, for example, becomes ready to be activated when a predetermined condition is satisfied in the game, and by receiving an input operation for activating the skill from the user, the game can be advanced advantageously for the user.
- User information 253 is information about a user who plays a game.
- the user information 253 includes a user management table 253A.
- the user management table 253A holds information such as information for identifying each user, information indicating the degree of progress of the game by the user, items held by the user in the game, game characters, and items worn by each game character.
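The user management table described above can be pictured as a set of per-user records. The following is a minimal illustrative sketch only; the field names (`user_id`, `progress`, `items`, `characters`) are assumptions for illustration and are not the patent's actual schema.

```python
def make_user_record(user_id, progress=0, items=None, characters=None):
    """Build one record of the kind held in the user management table 253A.
    Field names are hypothetical; the table associates a user identifier with
    game progress, held items, and game characters with their worn items."""
    return {
        "user_id": user_id,                    # information identifying the user
        "progress": progress,                  # degree of progress of the game
        "items": list(items or []),            # items held by the user in the game
        "characters": list(characters or []),  # game characters / worn items
    }

record = make_user_record("u001", progress=3, items=["racket"])
print(record["progress"])  # 3
```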
- the control unit 290 executes functions of the transmission / reception unit 291, the server processing unit 292, the data management unit 293, the matching unit 294, and the measurement unit 295 by executing the game program 251 stored in the storage unit 250.
- the transmission / reception unit 291 receives various types of information from the mobile terminal 10 that executes the game program 151, and transmits various types of information to the mobile terminal 10.
- Between the mobile terminal 10 and the server 20, information such as a request to place an object associated with the user in the virtual space, a request to delete the object, a request to move the object, a request to update various parameters according to a reward acquired by the user, images, audio, and other data for advancing the game, and notifications transmitted from the server 20 to the mobile terminal 10 are transmitted and received.
- the server processing unit 292 controls the operation of the entire server 20 and performs processing necessary for the progress of the game by calling various programs. For example, the server processing unit 292 updates data such as the game information 252 and the user information 253 based on information received from the mobile terminal 10, and transmits various data to the mobile terminal 10 to advance the game.
- the data management unit 293 performs a process of updating various data stored in the storage unit 250, a process of adding / updating / deleting a record to the database, and the like according to the processing result of the server processing unit 292.
- the matching unit 294 performs a series of processes for associating a plurality of users. For example, when the user performs an input operation for performing multi-play, the matching unit 294 performs processing for associating a user who plays a game in cooperation.
- the measuring unit 295 performs processing for measuring time. For example, the measurement unit 295 measures the passage of time for each object arranged in the virtual space. The measurement unit 295 also measures the time for which the game has been progressing.
- the server 20 receives, from the mobile terminal 10, information on various measurement results obtained by executing the game program 151 on the mobile terminal 10, and collates the received information with the measurement results of the measurement unit 295, so that information on various times is synchronized between the mobile terminal 10 and the server 20.
- the game program 151 is, for example, a sports game, and the game is advanced by causing the touch screen 130 to display a screen corresponding to the placement of the virtual camera in the virtual space.
- the game progress processing unit 192 advances the story in accordance with the user's operation, determines the data to be displayed on the display 132 such as images and text, and performs basic processing such as processing for accepting the user's selection of a sports opponent and processing for advancing the sports game in accordance with the user's operation.
- the camera placement control unit 194 sequentially determines, according to the progress of the sports game, the placement location of the virtual camera and the direction of the shooting axis in the virtual space in which the sports game is played.
- the camera arrangement control unit 194 controls camera work of the virtual camera. Detailed processing of the camera arrangement control unit 194 will be described later.
- the display control unit 196 accepts an input operation for causing the game character to perform a first action (for example, a serve action) in a state where the virtual camera is arranged so that the game character is displayed in a certain area of the screen.
- FIG. 4 is a diagram illustrating a process in which the movement operation detection unit 193 detects a direction in which the game character is moved in accordance with a user input operation.
- the movement operation detection unit 193 sets, as a starting point, the position (initial touch position) at which the user presses the touch screen 130 by bringing a finger or the like close to the touch panel 131 from a state of not pressing the touch screen 130.
- When the input operation reception unit 191 determines that the user's operation is a drag operation, the direction in which to move the game character is detected based on the coordinates serving as the starting point and the coordinates at which the touch screen 130 detects the approach of the user's finger or the like.
- the state (A) in FIG. 4 shows a state in which the user brings the finger closer to the touch screen 130 from a state in which the user's finger is separated from the touch screen 130.
- the input operation reception unit 191 detects that the user's finger has approached the touch panel 131 and stores the detected coordinates in the memory as an initial touch position.
- the coordinates of the initial touch position held in the memory are shown as the initial touch position coordinates 155.
- the input operation reception unit 191 stores the detection result of the touch screen 130 (the coordinates at which the user's finger or the like approaches the touch screen 130, or the detection result "null" when the touch screen 130 does not detect the approach of the user's finger or the like) in the buffer memory 153 for a certain number of frames.
- the buffer memory 153 can store the detection result of the touch screen 130 for each frame (in the example of FIG. 4, 11 frames from the memory area fp [0] to the memory area fp [10]).
- the buffer memory 153 can be realized as a ring buffer, for example.
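The per-frame buffering described above can be sketched as follows. This is an illustrative ring buffer only; the class name and sizes mirror the fp[0]..fp[10] example, and the stored values are either coordinate pairs or None (the "null" detection result).

```python
class TouchFrameBuffer:
    """Illustrative ring buffer for per-frame touch screen detection results,
    modeled on the buffer memory 153: each slot holds the detected coordinates,
    or None ('null') when no approach of a finger is detected."""
    def __init__(self, frames=11):            # fp[0] .. fp[10] in the example
        self.fp = [None] * frames
        self.write_pos = 0                    # next memory area to write into

    def store(self, result):
        """Write this frame's detection result and advance (wrapping around)."""
        self.fp[self.write_pos] = result
        self.write_pos = (self.write_pos + 1) % len(self.fp)

buf = TouchFrameBuffer()
for coord in [(0, 0), (1, 1), None]:          # two touched frames, one 'null'
    buf.store(coord)
print(buf.fp[:3])  # [(0, 0), (1, 1), None]
```

After 11 writes the write position wraps back to fp[0], so the buffer always holds the most recent 11 frames, which is what lets the unit judge a drag operation from recent history.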
- the position where the user pressed the touch screen 130 is shown as a pressed position 30A (coordinates (x0, y0) of the touch screen 130).
- the user performs a drag operation on the touch screen 130, and the pressing position on the touch screen 130 is changed from the pressing position 30A to the pressing position 30B (coordinates (x9, y9) of the touch screen 130).
- the input operation reception unit 191 stores the detection result of the touch screen 130 in the buffer memory 153 and refers to the value held in the buffer memory 153 to determine that the user's operation on the touch screen 130 is a drag operation.
- the position at which the user presses the touch screen 130 moves from the pressed position 30B to the pressed position 30C (coordinates (x14, y14) of the touch screen 130) over five frames (memory areas fp [10], fp [0], fp [1], fp [2], and fp [3]).
- the movement operation detection unit 193 specifies the direction in which the user moves the game character in each of the state (B) and the state (C).
- the movement operation detection unit 193 manages information (the writing position of the buffer memory 153) indicating which memory area in the buffer memory 153 is the target for writing the coordinates of the pressed position detected by the touch screen 130.
- In the state (B), the movement operation detection unit 193 detects, based on the determination result of the input operation reception unit 191, that the user has performed a drag operation from the coordinates 31A (coordinates (x0, y0)) indicating the initial touch position to the coordinates 31B (coordinates (x9, y9)).
- the movement operation detection unit 193 detects, as the direction in which to move the game character, the vector 32B ((y9 - y0) / (x9 - x0)) defined by the coordinates 31A and 31B, with the coordinates 31A of the initial touch position as the starting point.
- In the state (C), the movement operation detection unit 193 detects, based on the determination result of the input operation reception unit 191, that a drag operation has been performed from the coordinates 31B (coordinates (x9, y9)) to the coordinates 31C (coordinates (x14, y14)).
- the movement operation detection unit 193 detects, as the direction in which to move the game character, the vector 32C ((y14 - y0) / (x14 - x0)) defined by the coordinates 31A and 31C, with the coordinates 31A of the initial touch position as the starting point.
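The direction detection above always anchors on the initial touch position, not the previous frame. A minimal sketch of that computation (function name is illustrative; the patent expresses the vector as a slope, while this sketch returns the component form):

```python
def movement_direction(initial, current):
    """Direction in which to move the game character: the vector from the
    initial touch position (the starting point, e.g. coordinates 31A) to the
    latest detected position (e.g. coordinates 31B or 31C)."""
    (x0, y0), (xn, yn) = initial, current
    return (xn - x0, yn - y0)

# State (C): initial touch (x0, y0) = (0, 0), latest position (x14, y14) = (14, 7).
print(movement_direction((0, 0), (14, 7)))  # (14, 7)
```

Because the starting point stays fixed at the initial touch position, the detected direction changes smoothly as the drag continues, rather than jittering with each frame-to-frame delta.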
- the process in which the camera placement control unit 194 determines the camera work of the virtual camera in the virtual space will be described in detail.
- the camera arrangement control unit 194 arranges the virtual camera so that the game character is displayed on the touch screen 130 when causing the game character to perform a predetermined action (first action) for advancing the game.
- the game program 151, upon accepting input operations, advances a game in which the game character operated by the user who is the game player and an opponent character competing against the game character alternately strike back an object such as a ball.
- the first action to be performed by the game character includes a serve action in a sports game.
- the second action to be performed by the game character includes at least an action of moving the game character so as to alternately return the opponent character and the object (ball) after the serve action.
- the game program 151 is a sports game.
- a virtual camera is arranged in a predetermined range in the virtual space when the game character is to perform a serve action. It is preferable to arrange the virtual camera at a position that provides a powerful screen to the user while facilitating the input operation for causing the game character to perform the serve motion. Therefore, for example, the camera placement control unit 194 determines the placement location and shooting direction of the virtual camera so that the game character performing the serve motion is displayed large on the near side of the touch screen 130 and the opponent's game character fits on the screen.
- FIG. 5 is a diagram showing the transition of the game screen that accepts an input operation for moving the game character in order to strike the ball back alternately with the opponent character as the second action, after the game character performs the serve action as the first action.
- the opponent character 136 is arranged in the depth direction (vertical direction) of the screen with respect to the position of the own character 134 across the net.
- the screen example (A), the screen example (B), and the screen example (C) in FIG. 5 are screens that receive an input operation for causing the user character 134 operated by the user to perform a serve motion as a first motion.
- the game progress processing unit 192 displays, on the touch screen 130, the own character 134 that is the game character operated by the user and the opponent character 136 that is the game character serving as the opponent in the tennis game, and receives an input operation for causing the own character 134 to perform the serve motion.
- the game progress processing unit 192 accepts an input operation for designating a position at which the own character 134 starts the serve action in order for the user to cause the own character 134 to perform the serve action.
- the position at which the serve motion is started is defined in the direction along the side of the court (the left-right direction on the screen), and the user performs an input operation to move the player character 134 in the left-right direction to determine the position at which the serve motion is started.
- the game progress processing unit 192 receives an input operation for causing the player character 134 to perform a serve action from the user.
- When the input operation receiving unit 191 detects that the user has performed a tap operation, the game progress processing unit 192 causes the player character 134 to perform an action of tossing up the ball.
- the game progress processing unit 192 moves the hitting target object 138 as if the player character 134 tossed up the tennis ball. That is, the game progress processing unit 192 moves the hitting target object 138 so that the tennis ball moves up and down when the player character 134 throws the tennis ball.
- the game progress processing unit 192 receives an input operation from the user for causing the player character 134 to hit a tossed tennis ball.
- the user visually recognizes that the tossed hitting target object 138 descends after rising, and operates the own character 134 to hit the hitting target object 138 by performing a tap operation on the touch screen 130 at the right time.
- the game progress processing unit 192 receives a tap operation from the user in a predetermined number of frames out of the total number of frames for displaying the hitting target object 138 ascending and descending, for example.
- When the game progress processing unit 192 detects a tap operation within that fixed number of frames, the game progress processing unit 192 causes the player character 134 to perform an action of hitting the hitting target object 138, and moves the hitting target object 138 so as to fly toward the opponent character.
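The frame-window check above can be sketched as a small predicate. This is an assumed framing only: the function name, the explicit `(start, end)` window, and the example numbers (a 30-frame toss with taps accepted in frames 18 to 22) are illustrative, not values from the patent.

```python
def serve_hit_accepted(tap_frame, toss_frames, window):
    """Return True when a tap falls within the accepted window of frames,
    i.e. the predetermined number of frames out of the total frames in which
    the tossed hitting target object is displayed rising and falling.
    `toss_frames` is the total frame count of the toss animation; `window`
    is the (start, end) inclusive frame range that counts as a good hit."""
    start, end = window
    return 0 <= tap_frame < toss_frames and start <= tap_frame <= end

# Hypothetical numbers: a 30-frame toss, taps accepted in frames 18..22.
print(serve_hit_accepted(20, 30, (18, 22)))  # True  -> serve action is performed
print(serve_hit_accepted(5, 30, (18, 22)))   # False -> tap was too early
```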
- the user causes the player character 134 to perform the serve motion by visually recognizing the tossed hitting target object 138 and performing a tap operation at an appropriate timing. Therefore, when the player character 134 and the hitting target object 138 are arranged on the near side of the touch screen 130 and displayed relatively large, it is easier for the user to input the serve operation than when they are displayed relatively small.
- After the serve, the user advances the game by performing input operations for moving the player character 134 and for alternately hitting the hitting target object 138 back and forth between the player character 134 and the opponent character 136.
- the game progress processing unit 192 receives an input operation for designating a direction in which the player character 134 is moved by receiving a drag operation from the user.
- the user mainly performs an input operation for moving the player character 134 using the area on the near side of the touch screen 130. Therefore, the operability of the game can be improved by making it easier for the user to visually recognize the own character 134 so that the user's finger and the own character 134 do not overlap as much as possible.
- the camera arrangement control unit 194 moves the virtual camera from the screen for causing the game character to perform the first action so that the game character is displayed with a certain area left open from at least one side of the touch screen 130 toward the center. For example, the camera arrangement control unit 194 displays the game character with a certain area of about one quarter to about one third left open from one side of the touch screen 130 toward the center.
- When the camera placement control unit 194 transitions from the screen for causing the game character to perform the serve motion as the first motion (screen example (A)), the own character 134, which is the game character, is displayed so that a certain area 139 is left open from the side of the touch screen 130 closer to the user toward the center of the screen.
- the placement position of the virtual camera is moved in parallel while maintaining the shooting direction of the virtual camera.
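The parallel move described above changes only the camera's placement position while the shooting direction stays fixed. A minimal sketch, assuming simple (x, y, z) tuples and illustrative names; the numeric values are hypothetical:

```python
def translate_camera(position, shooting_axis, offset):
    """Move the virtual camera's placement position in parallel while keeping
    the direction of its shooting axis unchanged, as in the first embodiment.
    `position` and `offset` are (x, y, z) placement vectors; `shooting_axis`
    is the fixed direction the camera points in."""
    new_position = tuple(p + d for p, d in zip(position, offset))
    return new_position, shooting_axis  # the axis is returned untouched

# Hypothetical values: slide the camera 4 units along y, keeping its direction.
pos, axis = translate_camera((0.0, -10.0, 5.0), (0.0, 1.0, -0.3), (0.0, 4.0, 0.0))
print(pos)   # (0.0, -6.0, 5.0)
print(axis)  # (0.0, 1.0, -0.3)
```

Because only the position changes, the perspective on the court stays constant and the on-screen character simply shifts toward the area left open for the user's finger.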
- FIG. 6 is a flowchart showing a process for advancing a game of a sports game according to the camera work determined by the camera arrangement control unit 194 when the control unit 190 of the first embodiment executes the game program 151.
- In step S601, the control unit 190 displays the captured image of the virtual camera on the touch screen 130 in accordance with the initial settings of the shooting position and shooting axis of the virtual camera, so that the own character 134 operated by the user is displayed along the near side of the touch screen 130 (corresponding to the screen example (A) in FIG. 5).
- In step S603, the control unit 190 accepts an input operation for moving the position of the player character 134 within a movable range until the user performs an input operation (tap operation) for causing the player character 134 to perform the serve action.
- the control unit 190 receives an input operation for moving the position of the player character 134 by, for example, a drag operation.
- the control unit 190 receives an input operation for moving the player character 134 in the left-right direction from the user in order to determine a position where the player character 134 performs a serve action.
- the control unit 190 moves the virtual camera in parallel in accordance with an input operation for moving the player character 134 in order to determine the position where the serve action is performed, and displays a captured image of the virtual camera on the touch screen 130.
- In step S605, the control unit 190 detects an input operation (tap operation) for causing the own character 134 to perform the serve action, and displays on the touch screen 130 a screen in which the own character 134 performs the serve action in response to the input operation (corresponding to the screen example (B) and the screen example (C) in FIG. 5).
- the control unit 190 causes the own character 134 to perform the serve operation, and displays the own character 134 on the touch screen 130 with the fixed region 139, about one third of the screen, left open from the front side of the touch screen 130 toward the center.
- the virtual camera is moved in parallel while maintaining the direction of the shooting axis, and the captured image of the virtual camera is displayed on the touch screen 130 (corresponding to the screen example (D) in FIG. 5).
- In step S607, as the game progresses, the control unit 190 moves the virtual camera in parallel while maintaining the direction of the shooting axis so that the own character 134, the opponent character 136, and the hitting target object 138 that is the target of the rally between them fit on one screen, and displays the captured image of the virtual camera on the touch screen 130.
- the control unit 190 performs the process shown in FIG. 6, for example, each time the tennis game progresses to the point where the player character 134 performs a serve action. Further, when the opponent character 136 performs a serve action, the control unit 190 displays the own character 134 on the touch screen 130 in the state shown in the screen example (D) of FIG. 5.
- FIG. 7 is a diagram showing processing in which the camera placement control unit 194 moves the virtual camera in parallel as the game progresses.
- the virtual space is configured in three dimensions (xyz axes).
- the object control unit 195 arranges an object indicating a tennis court along the xy plane in the virtual space.
- the object control unit 195 moves the player character 134 and the opponent character 136 in parallel to the x axis and the y axis in accordance with a user input operation.
- FIG. 7A is a diagram illustrating movement of the virtual camera in a plane (yz plane) perpendicular to the plane on which the tennis court is arranged in the virtual space.
- the camera placement control unit 194 places the virtual camera at the position 143 until the user causes the user character 134 to perform a serve action by a tap operation (state (C) in FIG. 5).
- FIG. 7B is a diagram showing the movement of the virtual camera in a plane (xy plane) parallel to the plane on which the tennis court is arranged in the virtual space.
- the game progress processing unit 192 accepts from the user an input operation for designating the moving direction of the player character 134 so that a rally is performed between the player character 134 and the opponent character 136.
- the camera placement control unit 194 moves the virtual camera in parallel from the position 143 to the position 145 so as to display the own character 134 on the touch screen 130 while leaving the certain area 139 of the touch screen 130 open (the state (D) of FIG. 5).
- the virtual camera may be arranged so that the game character is displayed in a certain area from the at least one side of the touch screen 130 toward the center (displayed in the certain area 139 shown in the screen example (D) in FIG. 5).
- When the game character is to perform the serve action as the first action, the camera placement control unit 194 displays the game character with a certain area left open from the side of the touch screen 130 in front of the user toward the center of the screen, and receives from the user an input operation for causing the game character to perform the serve action.
- the game character is displayed relatively large along the front side of the touch screen 130, so that the user can easily perform the serve operation.
- the game character to be operated by the user is displayed relatively large on the near side of the touch screen 130, so that a powerful screen can be provided to the user.
- the camera arrangement control unit 194 receives, via the touch screen 130, an input operation for causing the game character to perform the second action subsequent to the first action, and moves the virtual camera so as to display the game character on the touch screen 130 in accordance with the received input operation.
- As the input operation for causing the game character to perform the second action following the serve action (first action), an operation for moving the game character is received.
- the control unit 190 displays the own character 134 on the display 132 with the certain area 139 left open from the near side of the touch screen 130 toward the center, so that the user can easily perform an input operation on the touch screen 130 while visually recognizing the player character 134.
- the game program is a sports game.
- the game genre is not limited to a tennis game.
- the camera arrangement control unit 194 places the virtual camera in a predetermined range when the game character performs, as the first action, an action that triggers the start or restart of the game, such as a kick-off or a throw-in.
- the camera arrangement control unit 194 may display the player character 134 on the touch screen 130 with a certain area 139 open when an input operation for moving the game character is accepted.
- the camera placement control unit 194 moves the virtual camera in parallel so that the shooting axis of the virtual camera is kept at a certain angle with respect to the plane on which the tennis court is placed (the plane in which the game character can move).
- Since the camera angle for displaying the game character is kept constant, the game character moves in the virtual space displayed by the virtual camera in accordance with the moving direction indicated by the drag operation for moving the game character, which makes the operation easier for the user.
- FIG. 8 is a diagram illustrating processing in which the camera placement control unit 194 according to the second embodiment determines camera work in accordance with the position of the game character.
- the state (A) in FIG. 8 shows a state in which the camera placement control unit 194 places the virtual camera so that the own character 134 is displayed with the fixed area 139 on the near side of the touch screen 130 left open.
- The state (B) shows a state in which the camera arrangement control unit 194 accepts from the user an input operation to advance the player character 134 into the court, and moves the virtual camera while maintaining the display of the player character 134 with the certain area 139 left open.
- the camera arrangement control unit 194 moves the imaging axis of the virtual camera in parallel from state (A) to state (B).
- Also shown is a state in which the camera placement control unit 194 accepts from the user an input operation that moves the player character 134 far away from the court, and displays the player character 134 within the fixed area 139.
- When the own character 134 moves into the certain area 139, the own character 134 moves into the area where the user performs input operations on the touch screen 130; therefore, it becomes harder for the user to visually recognize the own character 134 than in the state (A).
- When the own character 134 moves far from the court, it is assumed to be difficult for the own character 134 to strike back the hitting target object 138 returned by the opponent character 136. That is, if the player character 134 is located far from the court, the possibility that the player character 134 can strike back the hitting target object 138 decreases, which is disadvantageous for the user.
- Therefore, the camera arrangement control unit 194 arranges the virtual camera so as to display the own character 134 in the fixed area 139, thereby suggesting to the user that the moving direction of the own character 134 will put the user at a disadvantage.
- FIG. 10 is a diagram showing a movement range 141 for determining whether to move the own character 134 regardless of the fixed area 139.
- When the position of the own character 134 is within the movement range 141, the camera arrangement control unit 194 determines the placement of the virtual camera so that the own character 134 is displayed on the touch screen 130 with the fixed area 139 left open, as shown in the state (B) of FIG. 8.
- When the position of the own character 134 is outside the movement range 141, the camera arrangement control unit 194 determines the arrangement of the virtual camera so that the own character 134 enters the fixed area 139.
- FIG. 9 is a flowchart showing a process for advancing a game of a sports game according to the camera work determined by the camera placement control unit 194 when the control unit 190 of the second embodiment executes the game program 151.
- In step S901, the control unit 190 acquires the coordinates at which the own character 134 operated by the user is located in the virtual space.
- the control unit 190 acquires the coordinates of the player character 134 in the xy plane.
- In step S903, the control unit 190 compares the coordinates of the player character 134 acquired in step S901 with the predetermined movement range 141, and determines whether the position of the player character 134 is within the movement range 141.
- If so (YES in step S903), the control unit 190 performs the process of step S905; otherwise (NO in step S903), the control unit 190 performs the process of step S907.
- In step S905, as the game progresses, the control unit 190 determines the position of the virtual camera so that the virtual camera is moved in parallel while maintaining the direction of the shooting axis and the own character 134, the opponent character 136, and the hitting target object 138 that is the target of the rally between them fit on one screen.
- In step S907, the control unit 190 determines the position of the virtual camera so that, as the game progresses, the player character 134, the opponent character 136, and the hitting target object 138 fit on a single screen and the player character 134 is also displayed in the fixed area 139 of the touch screen 130. For example, when the position of the player character 134 is outside the movement range 141, the virtual camera may not be moved in the direction in which the hit ball is rallied between the player character 134 and the opponent character 136 (the vertical direction of the touch screen in FIG. 8).
- In step S909, the control unit 190 displays the captured image of the virtual camera on the touch screen 130 according to the determined position of the virtual camera.
- the control unit 190 performs the processing shown in FIG. 9 every fixed number of frames (for example, every frame), for example, after the own character 134 performs the serve action. Further, when the opponent character 136 performs a serve action, the control unit 190 likewise performs the process shown in FIG. 9 every fixed number of frames.
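The per-frame decision of steps S901 to S909 can be sketched in simplified form. This is an assumed one-dimensional framing (only the rally/depth direction), with illustrative names and numbers; the patent's actual coordinates are three-dimensional.

```python
def decide_camera_y(char_y, camera_y, follow_offset, movement_range):
    """Per-frame camera decision sketched from the FIG. 9 flow.
    While the own character stays inside movement range 141 (step S903: YES),
    the camera is translated in parallel so the character stays framed
    (step S905). Outside that range (step S903: NO), the camera stops
    following in the rally (depth) direction, so the character drifts into
    the fixed area 139 of the screen (step S907). The 1-D model and the
    `follow_offset` parameter are simplifications for illustration."""
    y_min, y_max = movement_range
    if y_min <= char_y <= y_max:        # step S903: inside movement range 141?
        return char_y + follow_offset   # step S905: translate camera with the character
    return camera_y                     # step S907: hold the camera position

# Hypothetical numbers: range 141 spans y in [-5, 5]; camera trails by 10 units.
cam = decide_camera_y(char_y=2.0, camera_y=-8.0, follow_offset=-10.0, movement_range=(-5.0, 5.0))
print(cam)  # -8.0  (inside the range: camera follows to 2.0 - 10.0)
cam = decide_camera_y(char_y=9.0, camera_y=cam, follow_offset=-10.0, movement_range=(-5.0, 5.0))
print(cam)  # -8.0  (outside the range: camera holds position)
```

Holding the camera when the character leaves the range is what makes the character visibly slide into the fixed area, signaling the disadvantageous position to the user.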
- the camera arrangement control unit 194 moves the virtual camera so that, when the game character moves within a predetermined range of the range in which the game character can move in the virtual space generated when the control unit 190 executes the game program 151, the game character is displayed with a certain area left open from at least one side of the touch screen 130 toward the center, and controls the movement of the virtual camera so that, when the game character moves outside the predetermined range, the game character is still displayed on the touch screen 130 even within the certain area extending from at least one side of the touch screen 130 toward the center.
- for example, the camera arrangement control unit 194 moves the virtual camera so that the game character is displayed while a certain area is left open from the near side of the touch screen 130 toward the center.
- when the game character is considered to be in a position unsuitable for hitting again a ball returned from the opponent, the camera placement control unit 194 moves the virtual camera so that the game character is displayed inside the fixed area, without the fixed area being left open from the near side toward the center; the position of the virtual camera is determined so as to prompt the user to move the game character forward, out of that area.
- FIG. 10 is a diagram illustrating a process in which the camera placement control unit 194 according to the third embodiment places the virtual camera according to the direction in which the player character 134 moves and the hitting action of the player character 134.
- the camera arrangement control unit 194 sets a display area 137, in which the player character 134 can be displayed, in a direction (the left-right direction of the touch screen 130) different from the direction in which the hitting target object 138 is rallied (the up-down direction of the touch screen 130).
- the camera arrangement control unit 194 controls the arrangement of the virtual camera so that the camera work in the left-right direction is fixed while the user moves the player character 134 in the left-right direction within the display area 137; when the player character 134 moves beyond the display area 137 in the horizontal direction, the virtual camera is moved in parallel along with that movement.
- the display area 137 in which the own character 134 can be displayed is set larger than the own character 134.
- when the hitting target object 138 is near the boundary of the range in which the own character 134 can hit it back to the opponent character 136, the player character 134 hits the hit target object 138 in a state disadvantageous to the user (for example, the player character 134 moves in the right or left direction while hitting the ball, and during this time the user cannot move the player character 134); parameters such as the physical strength value set for the player character 134 may be relatively reduced.
- while the player character 134 performs the action of hitting the ball with its posture broken, the camera placement control unit 194 moves the virtual camera in accordance with the action of the player character 134. As described above, after receiving from the user the input operation for hitting the hit target object 138, the camera placement control unit 194 determines whether or not the hit target object 138 is within the range in which the player character 134 can hit it.
- when the hit target object 138 is well within the range, the camera arrangement control unit 194 causes the player character 134 to perform the hitting action without moving the virtual camera; when the hit target object 138 is near the boundary of the range, the player character 134 performs the hitting motion while moving with its posture broken, and the virtual camera is moved in parallel with the movement of the player character 134.
- FIG. 11 is a diagram showing the setting of the range in which the player character 134 can hit the hitting target object 138.
- state (A) of FIG. 11 is an example in which the control unit 190 causes the touch screen 130 to display a range image 147 that allows the user to visually recognize the range in which the player character 134 can hit the hit target object 138.
- state (B) of FIG. 11 is a diagram showing the hittable range 149, which indicates the range in which the player character 134 can hit the hit target object 138. As shown in state (B), a hitting range 148R and a hitting range 148L suitable for hitting the hitting target object 138 are set within the range image 147 drawn around the player character 134.
- the control unit 190 determines whether or not the hitting target object 138 is within the hittable range 149 when the user performs an input operation for hitting the hitting target object 138.
- when the hitting target object 138 is within the hittable range 149 but outside the range image 147, the player character 134 performs the hitting motion while moving with its posture broken (state (D) in FIG. 10).
- when the control unit 190 determines that the hitting target object 138 is within the hitting range 148R or the hitting range 148L, the control unit 190 causes the touch screen 130 to display the player character 134 hitting a ball stronger than a normal hit.
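The range check of FIG. 11 can be illustrated with a minimal sketch. The radii, the band-shaped modeling of the hitting ranges 148R/148L, and the label names are assumptions, since the patent describes the ranges only qualitatively:

```python
from math import hypot

def classify_hit(char_pos, ball_pos, inner_r=1.0, outer_r=2.0,
                 sweet_half_width=0.5):
    """Classify the ball position relative to the player character,
    mirroring the ranges of FIG. 11 (all radii are hypothetical).

    'miss'        -> outside the hittable range 149
    'off_balance' -> inside 149 but outside the range image 147
    'strong'      -> within a sweet spot modeled on 148R / 148L
    'normal'      -> inside 147 but outside the sweet spots
    """
    dx = ball_pos[0] - char_pos[0]
    dy = ball_pos[1] - char_pos[1]
    d = hypot(dx, dy)
    if d > outer_r:
        return 'miss'
    if d > inner_r:
        return 'off_balance'
    # sweet spots 148R / 148L modeled as bands to the character's sides
    if abs(dy) <= sweet_half_width and abs(dx) > inner_r * 0.5:
        return 'strong'
    return 'normal'
```

A 'strong' result would correspond to the stronger-than-normal hit described above; an 'off_balance' result to the posture-breaking motion of state (D).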
- the camera arrangement control unit 194 receives an input operation designating the moving direction of the game character, moves the virtual camera according to the moving direction received by the input operation, and displays the game character in the area of the touch screen 130 in which the game character can be displayed.
- in the direction different from the moving direction toward the certain area extending from at least one side of the touch screen 130 toward the center, the area in which the game character can be displayed on the touch screen 130 is set wider than in the moving direction toward that certain area.
- the camera arrangement control unit 194 moves the virtual camera according to the moving direction of the game character when the game character is moved in a direction beyond the area in which the game character can be displayed.
- the moving direction toward the certain area is the direction in which the game character moves while being displayed with a certain area left open from the near side of the touch screen 130 toward the center (movement in the up-down direction); in the direction different from this (movement in the left-right direction), a wider area in which the game character can be displayed on the touch screen 130 is set.
- when the game character moves in the vertical direction, the virtual camera follows the movement of the game character in the vertical direction of the touch screen 130. By contrast, when the game character moves in the left-right direction, a margin is provided compared with the vertical movement of the virtual camera, so that the virtual camera does not immediately follow even if the game character switches back from moving right to moving left. In this way, the frequency with which the in-game screen shakes in the left-right direction in accordance with the movement of the game character is reduced, and game play becomes more comfortable.
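The asymmetric camera follow described above (exact vertical tracking, a horizontal dead zone to suppress left-right shake) might be sketched as follows; the margin value and function shape are assumptions:

```python
def follow_camera(cam_x, cam_y, char_x, char_y, margin_x=2.0):
    """One-frame camera update: the camera tracks the character
    exactly in the vertical (rally) direction, but moves horizontally
    only once the character leaves a dead zone of +/- margin_x around
    the camera centre (margin_x is a hypothetical value).
    """
    new_y = char_y  # vertical: follow immediately
    dx = char_x - cam_x
    if dx > margin_x:
        new_x = char_x - margin_x   # clamp to right edge of dead zone
    elif dx < -margin_x:
        new_x = char_x + margin_x   # clamp to left edge of dead zone
    else:
        new_x = cam_x               # inside dead zone: no horizontal shake
    return new_x, new_y
```

A quick switch-back from rightward to leftward movement keeps the character inside the dead zone, so the camera stays still, which is the comfort effect the passage describes.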
- when the state transitions from a first state, in which the game character is moved in response to an input operation on the game character, to a second state, in which the game character performs a predetermined action involving movement of the game character, the camera arrangement control unit 194 moves the virtual camera according to the action of the game character in the second state, regardless of the area in which the game character can be displayed on the touch screen 130.
- for example, the camera arrangement control unit 194 moves the virtual camera in accordance with the rendering of the game character breaking its posture, regardless of the area in which the game character can be displayed on the touch screen 130.
- FIG. 12 is a diagram illustrating processing in which the control unit 190 according to the fourth embodiment displays an object on the display 132 according to the direction in which the game character is moved based on the detection result of the moving operation detection unit 193.
- state (A) in FIG. 12 shows a situation in which the camera arrangement control unit 194 receives from the user an input operation for moving the player character 134. It is assumed that the user performs a drag operation in the direction indicated by the dotted line, starting from the pressed position 30E of the user's finger shown in state (A).
- the state (B) in FIG. 12 shows a situation in which the user performs a drag operation and inputs a direction in which the player character 134 is moved.
- the user's finger has moved from the pressed position 30E to the pressed position 30F.
- the movement operation detection unit 193 detects an input operation for designating the direction in which the user moves the game character based on the coordinates of the pressed position 30F and the coordinates of the pressed position 30E of the touch screen 130.
- the control unit 190 displays on the touch screen 130 an arrow object 186, an arrow-shaped object indicating the direction in which the own character 134 is moved, whose starting point is the coordinates of the pressed position 30E and whose end point is an arbitrary point on the line segment from the pressed position 30E to the pressed position 30F. While the drag operation is being performed, the control unit 190 detects the input operation designating the direction in which the player character 134 is moved and causes the touch screen 130 to display an object corresponding to the moving direction.
- state (C) in FIG. 12 shows a state in which the elastic object 187, an elastically deformable object, is displayed on the touch screen 130 as the object indicating the direction in which the user moves the game character.
- the elastic object 187 is an object that starts from the coordinates of the pressed position 30E and ends at an arbitrary point on the line segment from the pressed position 30E to the pressed position 30F.
- the control unit 190 controls the display content of the touch screen 130 so that the user can visually recognize the arrow object 186 while accepting the user's input operation on the touch screen 130. In this way, the control unit 190 can accept an input operation for moving the player character 134 while letting the user recognize the direction in which the player character 134 will move.
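The direction detection from the pressed positions 30E and 30F can be sketched as below; the dead-zone threshold and the unit-vector return convention are assumptions, not taken from the patent:

```python
import math

def drag_direction(start, current, dead_zone=8.0):
    """Derive the character movement direction from a drag, using the
    initial press 30E (`start`) and the current press 30F (`current`)
    in touch-screen coordinates. Returns a unit vector, or None while
    the drag is still inside a small dead zone (pixel value is a
    hypothetical choice).
    """
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    length = math.hypot(dx, dy)
    if length < dead_zone:
        return None  # too small to count as a direction input
    return (dx / length, dy / length)
```

The same vector could drive both the character movement and the length of the arrow object 186 or the stretch of the elastic object 187.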
- as described above, the game program 151 causes the touch screen 130 to display a screen for causing the game character (own character 134) to perform the first action, with the game character displayed leaving the fixed area 139 open, and the camera work is controlled by the camera arrangement control unit 194.
- the camera placement control unit 194 controls the camera work in accordance with the type of input operation accepted for operating the game character.
- the game program 151 causes the player character 134 to perform a serve operation by a tap operation.
- the control unit 190 restricts the movement of the player character 134 in the vertical direction of the touch screen 130 and accepts a tap operation when causing the player character 134 to perform a serve action.
- the game program 151 accepts an input of the moving direction of the player character 134 from the user by a drag operation after the player character 134 performs a serve action.
- when the input of the moving direction of the player character 134 is accepted by a drag operation from the user without restriction in the up, down, left, and right directions, the game program 151 controls the camera work of the virtual camera so that the player character 134 is arranged with the fixed area 139 left open.
- when the user needs to perform a drag operation without restriction in the up, down, left, and right directions (that is, when the game program 151 accepts the input of the moving direction of the game character by a drag operation without such restriction), displaying the own character 134 with space left on the near side of the touch screen 130 makes the input operation on the touch screen 130 easier to perform than when the character is displayed without such space. The operability of the game can thereby be further improved.
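The split of camera work by input-operation type (tap for the serve action, drag for free movement) could be dispatched as in this sketch; the mode names are hypothetical labels, not terms from the patent:

```python
def camera_mode_for_input(op_type, serving):
    """Pick the camera-work mode from the type of input operation,
    following the tap-serve / drag-move split described above.
    """
    if op_type == 'tap' and serving:
        return 'fixed_near_side'     # serve: restrict vertical camera work
    if op_type == 'drag':
        return 'follow_with_margin'  # rally: keep fixed area 139 open
    return 'default'
```

The returned mode would then select between the fixed camera of the serve screen and the margin-following camera used during a rally.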
- the mobile terminal 10 receives the user's input operation on the touch screen 130 and transmits the received operation content to the server 20. The server 20 receives the user's input operation from the mobile terminal 10, performs each process for advancing the game, generates a display screen to be shown on the mobile terminal 10 based on the placement of the virtual camera, and sequentially delivers the generated display screen to the mobile terminal 10.
- 10 mobile terminal, 20 server, 22 communication IF, 23 input / output IF, 25 memory, 26 storage, 29 processor, 80 network, 81 wireless base station, 82 wireless router, 110 antenna, 120 wireless communication IF, 130 touch screen, 131 touch panel, 132 display, 134 own character, 136 opponent character, 138 hit object, 150 storage unit, 151 game program, 153 buffer memory, 155 initial touch position coordinates, 160 voice processing unit, 170 microphone, 180 speaker, 190 control unit, 191 input operation reception unit, 192 game progress processing unit, 193 movement operation detection unit, 194 camera placement control unit, 195 object control unit, 196 display control unit, 220 communication unit, 250 storage unit, 251 game program, 252 game information, 253 user information, 290 control unit, 291 reception unit, 292 server processing unit, 293 data management unit, 294 matching unit, 295 measurement unit.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
In an information processing device comprising a touch screen, the present invention aims to further improve the appeal of a game that displays a screen based on the position of a virtual camera in a virtual space. With a game program executed on an information processing device, when an input operation is accepted for causing a game character to perform a serve action as a first action, the game program causes the information processing device to determine the camera work of a virtual camera so as to position a player character (134) on the near side of a touch screen (130). Upon accepting from a user a pressing operation for causing the player character (134) to perform the first action, the information processing device moves the virtual camera so that the player character (134) performs the first action and a given area (139) is left open. To cause a hit ball object (138) to be rallied between the player character (134) and an opponent character (136), the player character (134) is moved based on an input drag operation.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016073379A JP6002344B1 (ja) | 2016-03-31 | 2016-03-31 | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 |
JP2016-073379 | 2016-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017169748A1 true WO2017169748A1 (fr) | 2017-10-05 |
Family
ID=57048596
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/010258 WO2017169748A1 (fr) | 2016-03-31 | 2017-03-14 | Programme de jeu, procédé, et dispositif de traitement d'informations comprenant un écran tactile |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6002344B1 (fr) |
WO (1) | WO2017169748A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020105269A1 (fr) * | 2018-11-19 | 2020-05-28 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003093735A (ja) * | 2002-08-05 | 2003-04-02 | Namco Ltd | ゲーム装置および情報記憶媒体 |
JP2005349174A (ja) * | 2004-05-10 | 2005-12-22 | Nintendo Co Ltd | ゲームプログラムおよびゲーム装置 |
JP2009000383A (ja) * | 2007-06-22 | 2009-01-08 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
JP2013039232A (ja) * | 2011-08-16 | 2013-02-28 | Sega Corp | コンピュータゲーム装置、コンピュータゲーム装置を制御するための制御方法及びゲームプログラム、並びにゲームプログラムを記録した記録媒体 |
JP2013056095A (ja) * | 2011-09-09 | 2013-03-28 | Nintendo Co Ltd | ゲームシステム、携帯型ゲーム装置、情報処理部の制御方法、および情報処理部の制御プログラム |
JP2015123244A (ja) * | 2013-12-26 | 2015-07-06 | 株式会社バンダイナムコエンターテインメント | プログラム及びゲーム装置 |
JP2016013410A (ja) * | 2014-06-13 | 2016-01-28 | 株式会社コナミデジタルエンタテインメント | ゲーム制御装置、ゲームシステム及びプログラム |
-
2016
- 2016-03-31 JP JP2016073379A patent/JP6002344B1/ja active Active
-
2017
- 2017-03-14 WO PCT/JP2017/010258 patent/WO2017169748A1/fr active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003093735A (ja) * | 2002-08-05 | 2003-04-02 | Namco Ltd | ゲーム装置および情報記憶媒体 |
JP2005349174A (ja) * | 2004-05-10 | 2005-12-22 | Nintendo Co Ltd | ゲームプログラムおよびゲーム装置 |
JP2009000383A (ja) * | 2007-06-22 | 2009-01-08 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
JP2013039232A (ja) * | 2011-08-16 | 2013-02-28 | Sega Corp | コンピュータゲーム装置、コンピュータゲーム装置を制御するための制御方法及びゲームプログラム、並びにゲームプログラムを記録した記録媒体 |
JP2013056095A (ja) * | 2011-09-09 | 2013-03-28 | Nintendo Co Ltd | ゲームシステム、携帯型ゲーム装置、情報処理部の制御方法、および情報処理部の制御プログラム |
JP2015123244A (ja) * | 2013-12-26 | 2015-07-06 | 株式会社バンダイナムコエンターテインメント | プログラム及びゲーム装置 |
JP2016013410A (ja) * | 2014-06-13 | 2016-01-28 | 株式会社コナミデジタルエンタテインメント | ゲーム制御装置、ゲームシステム及びプログラム |
Non-Patent Citations (2)
Title |
---|
"THE Tennis", GEKKAN APPLI STYLE, vol. 1, no. 6, 28 November 2014 (2014-11-28), pages 61 * |
"Very Hard no Kabe!", POWER SMASH CHALLENGE [PAWASUMA] PART.9, 10 April 2016 (2016-04-10), Retrieved from the Internet <URL:https://www.youtube.com/watch?v=qvWKHdeX4_M> * |
Also Published As
Publication number | Publication date |
---|---|
JP2017176764A (ja) | 2017-10-05 |
JP6002344B1 (ja) | 2016-10-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018113653A1 (fr) | Procédé de commutation de scène basé sur un terminal mobile et terminal mobile | |
CN112402971B (zh) | 虚拟对象控制方法、装置、计算机设备及存储介质 | |
JP6450875B1 (ja) | ゲームプログラム、ゲーム方法、および情報処理装置 | |
JP2020022574A (ja) | ゲームプログラム、方法、および情報処理装置 | |
JP6002345B1 (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP7510290B2 (ja) | ゲームプログラム、およびゲーム方法 | |
JP6924564B2 (ja) | ゲームプログラム | |
JP2017192538A (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP2017192537A (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP6002344B1 (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP6336168B1 (ja) | 情報処理方法、装置、及び当該情報処理方法をコンピュータに実行させるためのプログラム | |
JP6174202B1 (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP6668425B2 (ja) | ゲームプログラム、方法、および情報処理装置 | |
TWI729323B (zh) | 互動式遊戲系統 | |
US20220355189A1 (en) | Game program, game method, and information processing device | |
JP6480056B1 (ja) | ゲームプログラム、方法、および情報処理装置 | |
JP2020110603A (ja) | ゲームプログラム、方法、および情報処理装置 | |
JP2017185189A (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP2018020150A (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP2018033972A (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP2017221676A (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP6209243B1 (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP2020110352A (ja) | ゲームプログラム、ゲーム方法、および情報処理装置 | |
JP7216154B2 (ja) | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 | |
JP2019181167A (ja) | ゲームプログラム、方法、および情報処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17774292 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17774292 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |