KR20130112586A - Game device and controlling method for the same - Google Patents

Game device and controlling method for the same

Info

Publication number
KR20130112586A
Authority
KR
South Korea
Prior art keywords
touch
game
input
weapon
module
Prior art date
Application number
KR1020120035081A
Other languages
Korean (ko)
Inventor
임일교
Original Assignee
주식회사 드래곤플라이
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 드래곤플라이
Priority to KR1020120035081A
Publication of KR20130112586A

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, using a touch screen

Abstract

The present invention relates to a game device that allows a game to be operated effectively through a touch screen, and a control method thereof. A game device according to an embodiment of the present invention includes a display module for displaying the virtual space in the game on the touch screen from the viewpoint of the character in the game; a manipulation module configured to generate a shooting event in which the character fires a weapon in the virtual space when a first touch is maintained on the touch screen and a second touch is input; and a shooting module for firing the weapon according to the shooting event.

Description

Game device and controlling method for the same

The present invention relates to a game device and a control method thereof, and more particularly, to a game device and a control method thereof that allow a game to be operated effectively through a touch screen.

Games generally refer to activities for recreation or entertainment, and recently games are increasingly played using electronic devices such as personal computers, mobile devices, and console game machines. A game device, that is, an electronic device for running a game, has recently taken the form of various mobile devices equipped with a touch screen, such as mobile phones and portable game machines. Game devices equipped with such a touch screen need to be operated through the touch screen.

The problem to be solved by the present invention is to provide a game device and a control method thereof that allow a game to be operated effectively through a touch screen.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

In order to achieve the above object, a game device according to an embodiment of the present invention includes a display module for displaying the virtual space in the game on the touch screen from the viewpoint of the character in the game; a manipulation module configured to generate a shooting event in which the character fires a weapon in the virtual space when a first touch is maintained on the touch screen and a second touch is input; and a shooting module for firing the weapon according to the shooting event.

In order to achieve the above object, a game control method according to an embodiment of the present invention includes displaying the virtual space in the game on the touch screen from the viewpoint of a character in the game; inputting a first touch to the touch screen; maintaining the input of the first touch on the touch screen and inputting a second touch; and generating a shooting event in which the character fires a weapon in the virtual space when the first touch is maintained and the second touch is input.

The details of other embodiments are included in the detailed description and drawings.

According to the game device and the control method of the present invention, there are one or more of the following effects.

First, there is an advantage that an event in which the character in the game fires a weapon can be generated by a simple touch screen operation.

Second, there is an advantage that the character in the game can change its viewpoint while continuously firing a weapon.

Third, there is an advantage that even if a drag is input with each of a plurality of touches, the resulting operation is not confused.

The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the description of the claims.

FIG. 1 is a perspective view of a game device according to an embodiment of the present invention.
FIG. 2 is a hardware block diagram of a game device according to an embodiment of the present invention.
FIG. 3 is a block diagram of a game device according to an embodiment of the present invention.
FIG. 4 is a game screen displayed by the game device according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating the movement of a character in a game executed by the game device according to an embodiment of the present invention.
FIGS. 6 to 8 are diagrams showing a shooting event of a game device according to an embodiment of the present invention.

Advantages and features of the present invention and methods for achieving them will become apparent with reference to the embodiments described below in detail together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout.

Hereinafter, the present invention will be described with reference to the drawings for describing a game device and a method of controlling the same according to embodiments of the present invention.

The suffix "module" for components used in the following description is to be given or mixed with consideration only for ease of specification, and does not have a meaning or role that distinguishes itself.

At this point, it will be understood that each block of the process flow diagrams, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. Since these computer program instructions can be loaded onto a processor of a general purpose computer, special purpose computer, or other programmable data processing equipment, the instructions executed through the processor of the computer or other programmable data processing equipment create means for performing the functions described in the flowchart block(s). These computer program instructions may also be stored in a computer usable or computer readable memory that can direct a computer or other programmable data processing apparatus to implement the functionality in a particular manner, so that the instructions stored in the computer usable or computer readable memory can also produce an article of manufacture containing instruction means that perform the functions described in the flowchart block(s). The computer program instructions may also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are performed on the computer or other programmable data processing equipment to produce a computer-executed process, and the instructions that run on the computer or other programmable data processing equipment can also provide steps for executing the functions described in the flowchart block(s).

In addition, each block may represent a portion of a module, segment, or code that includes one or more executable instructions for executing a specified logical function (s). It should also be noted that in some alternative implementations, the functions mentioned in the blocks may occur out of order. For example, two blocks shown in succession may actually be executed substantially concurrently, or the blocks may sometimes be performed in reverse order according to the corresponding function.

FIG. 1 is a perspective view of a game device according to an embodiment of the present invention.

The game device 10 is a device in which a user plays a game. The game device 10 may include various devices such as a PDA, a mobile phone, a tablet device, a pad device, a portable game machine, and the like, provided with a touch screen 13.

The touch screen 13 displays an image, and a user can input a command on it by a touch operation. The touch screen 13 may be formed as a layered structure in which a pressure-sensitive or capacitive touch pad is combined with a display.

FIG. 2 is a hardware block diagram of a game device according to an embodiment of the present invention.

The central processing unit 11 is a device that executes software instructions. The central processing unit 11 receives information from the outside, stores it, interprets the instructions of a program, and outputs the results. Central processing units are broadly divided into CISC (Complex Instruction Set Computer) designs, which provide a variety of instruction formats through microprogramming, such as x86-compatible processors, and RISC (Reduced Instruction Set Computer) designs with simplified control logic, such as PowerPC (Performance Optimization With Enhanced RISC-Performance Computing) and ARM (Advanced RISC Machine) processors.

The main memory device 16 is a device that temporarily stores programs and other data loaded from the auxiliary memory device 17 when a program is executed, so that the data can be exchanged directly with the central processing unit 11. The main memory device 16 includes RAM and ROM.

The auxiliary memory device 17 is a device that stores programs and other data for long-term keeping. An operating system (OS) such as Windows, iOS, Unix, Linux, or Android for running the system is installed on the auxiliary memory device 17, which also stores various data such as game programs and other user information. The auxiliary memory device 17 includes a hard disk drive (HDD), a solid state drive (SSD), a CD-ROM, a flash memory, and the like.

The image processing apparatus 12 is a device that processes image data output to the monitor 13 and outputs the processed image data. The image processing apparatus 12 may be integrally formed with the central processing unit 11, may be configured as a separate chipset such as a graphics processing unit (GPU), or may be configured as separate hardware such as a graphics card.

The touch screen 13 is connected to the image processing apparatus 12 to display images. The touch screen 13 typically uses an LCD panel to display light and color, and newer panels such as AMOLED have also come into use recently.

The touch screen 13 is a device through which a user inputs commands by touch, and the user touches it to operate the game. A touch is an action in which a part of the user's body or a tool contacts the screen. Inputs using a touch on the touch screen 13 can be divided into a tap, in which the touch is released immediately after contact; a drag, in which the touch point moves while the touch is maintained; a hold, in which the touch is kept in one place after contact; and a double tap, in which the screen is tapped twice in quick succession.

The touch screen 13 may receive a multi-touch in which a plurality of touches are made simultaneously. Two or more touches may be input to the touch screen 13 at the same time, or one or more additional touches may be input while one or more touches are maintained. The touch screen 13 may recognize different inputs according to the number of touches.
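
As a rough illustration of the touch vocabulary described above (tap, drag, hold, double tap), the following Kotlin sketch classifies a single observed touch. The thresholds and all type and function names are assumptions for illustration, not part of the patent.

```kotlin
enum class TouchGesture { TAP, DRAG, HOLD, DOUBLE_TAP }

// One observed touch, summarised after it ends (or while it is still held).
data class TouchSample(
    val durationMs: Long,          // how long the finger has been (or was) down
    val movedDistancePx: Float,    // total movement of the touch point
    val stillDown: Boolean,        // true while the touch is maintained
    val msSincePreviousTap: Long?  // null if there was no recent tap
)

fun classify(
    s: TouchSample,
    moveThresholdPx: Float = 12f,
    holdThresholdMs: Long = 400,
    doubleTapWindowMs: Long = 300
): TouchGesture = when {
    s.movedDistancePx > moveThresholdPx -> TouchGesture.DRAG                    // touch point moved while held
    s.stillDown && s.durationMs >= holdThresholdMs -> TouchGesture.HOLD          // kept in one place after contact
    s.msSincePreviousTap != null && s.msSincePreviousTap <= doubleTapWindowMs ->
        TouchGesture.DOUBLE_TAP                                                  // two quick taps in succession
    else -> TouchGesture.TAP                                                     // released right after contact
}
```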

The sound processing device 14 is a device for processing the voice data output to the speaker 15 and outputting the processed voice data. The sound processing apparatus 14 may be configured as a separate chipset, or may be configured in the form of a sound card. The speaker 15 is a device connected to the sound processing device 14 to output sound.

The communication device 18 is a device that transmits and receives network data and processes the network data. The communication device 18 may be configured as a separate chipset, or may be separately configured in the form of a network card. The communication device 18 can transmit and receive data by a communication method according to various standards such as Ethernet, wireless LAN, and cellular phone communication.

The auxiliary input device 19 is a device through which a user inputs commands, such as a keyboard, a game pad, a joystick, a touch pad, or a button, and may also include a camera for receiving images, a microphone for receiving audio, and the like.

FIG. 3 is a block diagram of a game device according to an embodiment of the present invention, FIG. 4 is a game screen displayed by the game device according to an embodiment of the present invention, and FIG. 5 is a diagram illustrating the movement of a character in a game executed by the game device according to an embodiment of the present invention.

A game device according to an embodiment of the present invention includes a game module 110 that runs the game as a whole, a display module 120 that displays the game screen on the touch screen 13, a manipulation module 130 that generates events according to commands input to the touch screen 13, a movement module 140 that moves the character in the game within the virtual space in the game, a viewpoint change module 150 that changes the viewpoint of the character in the game, and a firing module 160 that fires the weapon of the character in the virtual space in the game.
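
For illustration only, the following is a minimal Kotlin sketch of how this module decomposition might be wired together. The patent names the modules but not their interfaces, so every type, function, and parameter name below is an assumption.

```kotlin
// Types below are illustrative; the patent specifies modules, not an API.
data class Vector2(val x: Float, val y: Float)

data class TouchEvent(val pointerId: Int, val x: Float, val y: Float, val action: Action) {
    enum class Action { DOWN, MOVE, UP }
}

sealed class GameEvent {
    data class Move(val direction: Vector2) : GameEvent()                          // movement event
    data class ViewChange(val yawDelta: Float, val pitchDelta: Float) : GameEvent() // viewpoint change event
    object Fire : GameEvent()                                                       // shooting event
    object StopFire : GameEvent()                                                   // shooting interruption event
}

interface DisplayModule { fun render() }
interface ManipulationModule { fun onTouch(event: TouchEvent): GameEvent? }
interface MovementModule { fun move(direction: Vector2) }
interface ViewpointChangeModule { fun rotate(yawDelta: Float, pitchDelta: Float) }
interface FiringModule {
    fun fire()
    fun stopFiring()
}

// The game module receives touch input, lets the manipulation module translate it
// into a game event, dispatches the event to the responsible module, and finally
// asks the display module to redraw the game screen.
class GameModule(
    private val display: DisplayModule,
    private val manipulation: ManipulationModule,
    private val movement: MovementModule,
    private val viewpoint: ViewpointChangeModule,
    private val firing: FiringModule
) {
    fun handleTouch(event: TouchEvent) {
        when (val gameEvent = manipulation.onTouch(event)) {
            is GameEvent.Move -> movement.move(gameEvent.direction)
            is GameEvent.ViewChange -> viewpoint.rotate(gameEvent.yawDelta, gameEvent.pitchDelta)
            GameEvent.Fire -> firing.fire()
            GameEvent.StopFire -> firing.stopFiring()
            null -> Unit
        }
        display.render()
    }
}
```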

The game module 110 is stored in the auxiliary memory device 17 as the main program that runs the game; when the game is executed, it is loaded into the main memory device 16 and processed by the central processing unit 11. The game module 110 processes data according to input from the touch screen 13, the auxiliary input device 19, and/or the communication device 18, and outputs data to the sound processing device 14, the image processing device 12, the auxiliary memory device 17, or the communication device 18.

In the present embodiment, the game run by the game module 110 is a three-dimensional first-person shooting game in which a character manipulated by the user moves through a three-dimensional virtual space and uses a weapon such as a gun or a grenade to kill opponent characters or destroy objects.

The display module 120 displays the game screen on the touch screen 13 through the image processing apparatus 12. The display module 120 displays a game screen made up of a graphical user interface (GUI) together with the virtual space in the game, the character in the game, the weapon of the character, the projectile fired from the weapon, and the like. The display module 120 displays the game screen containing the virtual space in the game from the viewpoint of the character in the game.

The game module 110 loads game screen data about a virtual space, a character, a weapon, and other objects from the main memory 16 and / or the auxiliary memory 17 as the game progresses, and transmits the game screen data to the display module 120. The display module 120 renders the game screen data in three dimensions through the central processing unit 11 and / or the image processing apparatus 12 and displays the game screen data on the touch screen 13.

The display module 120 displays a GUI on the touch screen 13 so that a user can input a command through the touch screen 13. The display module 120 loads GUI data stored in the main memory device 16 and / or the auxiliary memory device 17 and displays the GUI data on the touch screen 13 through the image processing device 12.

In the present embodiment, the GUI displayed by the display module 120 includes a movement area 210 through which a movement command for moving the character in the game within the virtual space in the game is input, a viewpoint change area 220 through which a viewpoint change command for changing the viewpoint of the character in the game is input, other input buttons 230 through which other commands such as weapon exchange, game pause, and game menu call are input, and a crosshair 240 indicating the viewpoint of the character and the aiming direction of the weapon that the user manipulates.

The movement area 210 is an area where a movement command is input, and a user may input the movement command by touching the movement area 210. The movement area 210 is preferably displayed at the lower left of the game screen, and in some embodiments may be displayed at the lower right. The movement area 210 may be formed in the form of a virtual joystick represented by two concentric circles, or, in some embodiments, in the shape of a cross-shaped game pad.

The viewpoint change area 220 is an area where a viewpoint change command is input, and a user may input a viewpoint change command by touching the viewpoint change area 220. The viewpoint change area 220 consists of all areas of the game screen displayed on the touch screen 13 except for the movement area 210 and the other input buttons 230. It is preferable that the viewpoint change area 220 occupy the largest area on the touch screen 13.

A shooting command, by which the user's character fires a weapon in the virtual space, is also input in the viewpoint change area 220. The user may input the shooting command by touching two places in the viewpoint change area 220 at the same time. A detailed description thereof will be given later with reference to FIGS. 6 to 8.

The other input buttons 230 form an area where miscellaneous commands such as weapon exchange, game pause, and game menu call are input. The other input buttons 230 may include a weapon exchange button, a game pause button, a game menu call button, and the like. The other input buttons 230 are preferably displayed at the upper right of the game screen.

The crosshair 240 indicates the viewpoint of the character and/or the aiming direction of the weapon; it is not touched or manipulated by the user, but merely marks the viewpoint of the character and/or the aiming direction of the weapon. The crosshair 240 is preferably displayed at the center of the game screen.
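
As an illustration of how a touch might be routed to the GUI regions just described (the movement area 210, the other input buttons 230, and the viewpoint change area 220 covering everything else), here is a minimal Kotlin sketch; the rectangle placement and every name in it are assumptions.

```kotlin
enum class Region { MOVEMENT_AREA, OTHER_INPUT_BUTTON, VIEWPOINT_CHANGE_AREA }

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

class GameGui(screenW: Float, screenH: Float) {
    // Assumed placement: movement area at the lower left, other buttons at the upper right.
    private val movementArea = Rect(0f, screenH * 0.6f, screenW * 0.3f, screenH)
    private val otherButtons = Rect(screenW * 0.7f, 0f, screenW, screenH * 0.15f)

    // Everything that is neither the movement area nor a button counts as the
    // viewpoint change area, which therefore occupies most of the screen.
    fun regionAt(x: Float, y: Float): Region = when {
        movementArea.contains(x, y) -> Region.MOVEMENT_AREA
        otherButtons.contains(x, y) -> Region.OTHER_INPUT_BUTTON
        else -> Region.VIEWPOINT_CHANGE_AREA
    }
}
```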

The operation module 130 generates an event according to a user's command input to the touch screen 13. When the user touches the GUI displayed on the touch screen 13, an event is generated accordingly and transmitted to the game module 110.

The manipulation module 130 generates a movement event in which the character moves when a movement command is input to the movement area 210. Referring to FIG. 5, the movement refers to movement on the horizontal plane in the forward (F), backward (B), left (L), and right (R) directions, and combinations thereof.

The movement command may be input by various methods of touching the movement area 210. In the present embodiment, the movement command is input by touching the movement area 210 and dragging in a specific direction. The drag may leave the movement area 210 and continue into the viewpoint change area 220 or the other input buttons 230. When a drag in a specific direction is input after a touch is input to the movement area 210, the manipulation module 130 generates a movement event to move the character in the drag direction.
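
A minimal Kotlin sketch of how such a drag could be turned into a movement event follows; the dead-zone value and all names are assumptions rather than anything specified in the patent.

```kotlin
import kotlin.math.sqrt

// Unit direction of movement on the horizontal plane (forward/backward/left/right mix).
data class MoveEvent(val dirX: Float, val dirY: Float)

fun moveEventFromDrag(
    startX: Float, startY: Float,     // where the touch entered the movement area
    currentX: Float, currentY: Float, // current drag position (may be outside the area)
    deadZonePx: Float = 8f
): MoveEvent? {
    val dx = currentX - startX
    val dy = currentY - startY
    val len = sqrt(dx * dx + dy * dy)
    if (len < deadZonePx) return null // too small to count as a drag
    // The drag direction maps directly to the character's movement direction,
    // even if the finger later leaves the movement area.
    return MoveEvent(dx / len, dy / len)
}
```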

According to an exemplary embodiment, the movement command may be input by tapping or holding the movement area 210. In this case, the movement area 210 is formed in the shape of a game pad on which directions are displayed, and when a displayed direction is tapped or held, a movement event is generated so that the character moves in the tapped or held direction.

The manipulation module 130 generates a viewpoint change event for changing the viewpoint of the character when a viewpoint change command is input to the viewpoint change area 220. Referring to FIG. 5, the viewpoint change refers to rotation around the character, that is, movement on a sphere, according to upward rotation (UT), downward rotation (DT), left rotation (LT), right rotation (RT), and combinations thereof. Here, the upward rotation (UT) and the downward rotation (DT) correspond to pitch, and the left rotation (LT) and the right rotation (RT) correspond to yaw. A change of viewpoint coincides with a change in the direction in which the weapon points; that is, the crosshair 240 rotates in the virtual space.

The viewpoint change command may be input by various methods of touching the viewpoint change area 220. In the present embodiment, the viewpoint change command is input by touching the viewpoint change area 220 and dragging in a specific direction. The drag may leave the viewpoint change area 220 and continue into the movement area 210 or the other input buttons 230. When a drag in a specific direction is input after a touch is input to the viewpoint change area 220, the manipulation module 130 generates a viewpoint change event in which the viewpoint of the character is changed in the drag direction.
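The following Kotlin sketch illustrates one way a drag in the viewpoint change area could be mapped to yaw and pitch deltas; the sensitivity constant and the names are assumptions.

```kotlin
data class ViewChangeEvent(val yawDeltaDeg: Float, val pitchDeltaDeg: Float)

fun viewChangeFromDrag(
    dxPx: Float, dyPx: Float,          // drag movement since the last touch sample
    degreesPerPixel: Float = 0.2f      // assumed sensitivity
): ViewChangeEvent =
    // Horizontal drag turns the character left/right (yaw),
    // vertical drag turns it up/down (pitch); the crosshair follows.
    ViewChangeEvent(
        yawDeltaDeg = dxPx * degreesPerPixel,
        pitchDeltaDeg = -dyPx * degreesPerPixel
    )
```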

When a shooting command is input to the viewpoint change area 220, the manipulation module 130 generates a shooting event for firing the weapon toward the crosshair 240, which is a direction point in the virtual space.

The shooting command is input by touching two places in the viewpoint change area 220 at the same time. A detailed description thereof will be given later with reference to FIGS. 6 to 8.

When a miscellaneous command is input via the other input buttons 230, the manipulation module 130 generates the corresponding event. The manipulation module 130 generates a weapon exchange event when a weapon exchange command is input via the other input buttons 230, generates a game pause event when a game pause command is input, and generates a game menu call event when a game menu call command is input.

The movement module 140 moves the character in the game within the virtual space in the game. When the manipulation module 130 generates a movement event, the game module 110 receives the movement event and transmits it to the movement module 140. When the movement module 140 receives the movement event, it moves the character in the corresponding direction at an appropriate speed. The movement module 140 transmits the data reflecting the movement of the character to the display module 120 through the game module 110, and the display module 120 displays the game screen in which the character has moved on the touch screen 13.

The viewpoint change module 150 changes the viewpoint of the character in the game. When the manipulation module 130 generates a viewpoint change event, the game module 110 receives the viewpoint change event and transmits it to the viewpoint change module 150. When the viewpoint change module 150 receives the viewpoint change event, it changes the viewpoint of the character in the corresponding direction at an appropriate angular velocity. The viewpoint change module 150 transmits the data reflecting the changed viewpoint of the character to the display module 120 through the game module 110, and the display module 120 displays the game screen in which the viewpoint of the character has changed on the touch screen 13.

The firing module 160 fires the weapon of the character in the game within the virtual space in the game. When the manipulation module 130 generates a shooting event, the game module 110 receives the shooting event and delivers it to the firing module 160. The firing module 160 fires the weapon toward the crosshair 240 according to the shooting event. The firing module 160 transmits the data for firing the weapon to the display module 120 through the game module 110, and the display module 120 displays the game screen in which the weapon is fired on the touch screen 13.

In the present embodiment, the weapon may be a gun, a cannon, or a grenade, and the display module 120 displays the projectile that is fired, such as a bullet, a laser beam, a grenade, or a bomb.

FIGS. 6 to 8 are diagrams showing a shooting event of a game device according to an embodiment of the present invention.

As shown in FIG. 6, when the first touch 310 and the second touch 320 are input to the viewpoint change area 220, a shooting event occurs. The first touch 310 and the second touch 320 may be input at exactly the same time, or one touch may be maintained while the other touch is input. In the present exemplary embodiment, the first touch 310 is input to the viewpoint change area 220 of the touch screen 13, and while the first touch 310 is maintained, the second touch 320 is input to the viewpoint change area 220 of the touch screen 13.

When the first touch 310 and the second touch 320 are input, the manipulation module 130 generates a shooting event for firing the weapon toward the crosshair 240, which is the aiming point of the user's character in the virtual space.

When a shooting event occurs, the firing module 160 fires the weapon of the character in the game toward the crosshair 240 in the virtual space in the game, and the display module 120 displays the game screen in which the weapon is fired on the touch screen 13.

When at least one of the first touch 310 and the second touch 320 is released, the manipulation module 130 generates a shooting interruption event in which the firing of the weapon is stopped. When the shooting interruption event occurs, the firing module 160 stops firing the weapon, and the display module 120 displays the game screen in which the firing of the weapon has stopped on the touch screen 13.
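
The two-touch firing behaviour described above (fire while two touches are held, stop when either is released) could be tracked as in the following Kotlin sketch; the class and callback names are illustrative assumptions.

```kotlin
class FireController(
    private val onFire: () -> Unit,      // called when the shooting event starts
    private val onStopFire: () -> Unit   // called on the shooting interruption event
) {
    private val heldTouches = mutableSetOf<Int>()   // pointer ids currently down
    private var firing = false

    fun touchDown(pointerId: Int) {
        heldTouches += pointerId
        if (!firing && heldTouches.size >= 2) {      // first + second touch both present
            firing = true
            onFire()
        }
    }

    fun touchUp(pointerId: Int) {
        heldTouches -= pointerId
        if (firing && heldTouches.size < 2) {        // either touch released
            firing = false
            onStopFire()
        }
    }
}
```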

As shown in FIG. 7, when the first touch 310 and the second touch 320 are maintained and a drag is input with the first touch 310, the manipulation module 130 generates a continuous shooting event in which the direction of the weapon is changed in the drag direction while the weapon keeps firing. The changed aiming direction of the weapon corresponds to the crosshair 240, which is the viewpoint of the character. Therefore, the continuous shooting event substantially amounts to the shooting event being maintained while a viewpoint change event is performed at the same time.

When the continuous shooting event occurs, the firing module 160 changes the direction of the weapon according to the drag direction and continuously fires the weapon. That is, when the continuous shooting event occurs, the viewpoint change module 150 changes the viewpoint of the character according to the drag direction, and the firing module 160 continuously fires the weapon toward the crosshair 240. The display module 120 displays the game screen in which the direction of the weapon changes and the weapon is fired on the touch screen 13.

As shown in FIG. 8, when the first touch 310 and the second touch 320 are maintained and a drag is input with the first touch 310 and a drag is also input with the second touch 320, the manipulation module 130 generates a continuous shooting event in which the direction of the weapon is changed according to the drag direction of the first touch 310, which was input first, and the weapon is fired.

When the continuous shooting event occurs with drags in different directions, the firing module 160 changes the direction of the weapon according to the drag direction of the first touch 310 and continuously fires the weapon. That is, the viewpoint change module 150 changes the viewpoint of the character according to the drag direction of the first touch 310, and the firing module 160 continuously fires the weapon toward the crosshair 240. The display module 120 displays the game screen in which the direction of the weapon changes and the weapon is fired on the touch screen 13.
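
The continuous shooting behaviour of FIGS. 7 and 8, in which the drag of the first touch steers the weapon while a drag of the second touch is ignored, might be handled as in the following Kotlin sketch; all names and the sensitivity value are assumptions.

```kotlin
class ContinuousFireController(
    private val rotateView: (yawDeltaDeg: Float, pitchDeltaDeg: Float) -> Unit,
    private val fire: () -> Unit,
    private val degreesPerPixel: Float = 0.2f   // assumed drag sensitivity
) {
    var firstTouchId: Int? = null   // recorded when the first touch goes down

    fun touchMoved(pointerId: Int, dxPx: Float, dyPx: Float, bothTouchesHeld: Boolean) {
        if (!bothTouchesHeld) return            // continuous fire needs both touches held
        if (pointerId == firstTouchId) {
            // Only the first touch steers: viewpoint change and shooting happen
            // together, so the crosshair turns while the weapon keeps firing toward it.
            rotateView(dxPx * degreesPerPixel, -dyPx * degreesPerPixel)
        }
        fire()                                  // keep the shooting event alive
    }
}
```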

As used herein, the term 'module' refers to software or to a hardware component such as an FPGA or an ASIC, and a module performs certain roles. However, a module is not limited to software or hardware. A module may be configured to reside on an addressable storage medium and to execute on one or more processors. Thus, by way of example, a module may include components such as software components, object-oriented software components, class components and task components, as well as processes, functions, attributes, procedures, microcode, circuitry, data, databases, data structures, tables, arrays, and variables, as will be appreciated by those skilled in the art. The functionality provided by the components and modules may be combined into a smaller number of components and modules or further separated into additional components and modules. In addition, the components and modules may be implemented to execute on one or more CPUs in a device or on a secure multimedia card.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments; on the contrary, various modifications may be made by those skilled in the art without departing from the spirit and scope of the present invention.

13: touch screen 110: game module
120: display module 130: manipulation module
140: movement module 150: viewpoint change module
160: firing module 210: movement area
220: viewpoint change area 230: other input buttons
240: crosshair 310: first touch
320: second touch

Claims (11)

A display module for displaying a virtual space in a game on a touch screen from the viewpoint of a character in the game;
a manipulation module configured to generate a shooting event in which the character fires a weapon in the virtual space when a first touch is maintained on the touch screen and a second touch is input; and
a shooting module for firing the weapon according to the shooting event.
The game device of claim 1,
wherein the display module displays, on the touch screen, a viewpoint change area in which the viewpoint of the character changes when a touch is dragged, and
wherein the first touch and the second touch are input to the viewpoint change area.
The game device of claim 1,
wherein, when the first touch and the second touch are maintained and a drag is input with the first touch, the manipulation module generates a continuous shooting event in which the direction of the weapon is changed in the drag direction and the weapon is fired.
The game device of claim 1,
wherein the manipulation module generates a viewpoint change event in which the viewpoint of the character is changed in the drag direction when the first touch is maintained and a drag is input with the first touch.
The game device of claim 1,
wherein the manipulation module generates a shooting interruption event in which the firing of the weapon is stopped when at least one of the first touch and the second touch is released.
Displaying a virtual space in a game on a touch screen from the viewpoint of a character in the game;
inputting a first touch to the touch screen;
maintaining the input of the first touch on the touch screen and inputting a second touch; and
generating a shooting event in which the character fires a weapon in the virtual space when the first touch is maintained and the second touch is input.
The method according to claim 6,
wherein the first touch and the second touch are input to a viewpoint change area in which the viewpoint of the character changes when a touch on the touch screen is dragged.
The method according to claim 6,
further comprising maintaining the first touch and the second touch and, when a drag is input with the first touch, changing the direction of the weapon in the drag direction and generating a continuous shooting event for firing the weapon.
The method according to claim 6,
further comprising generating a viewpoint change event in which the viewpoint of the character is changed in the drag direction when the first touch is maintained and a drag is input with the first touch.
The method according to claim 6,
further comprising displaying, on the touch screen, that the weapon is fired when the shooting event occurs.
The method according to claim 6,
further comprising generating a shooting interruption event in which the firing of the weapon is stopped when at least one of the first touch and the second touch is released.
KR1020120035081A 2012-04-04 2012-04-04 Game device and controlling method for the same KR20130112586A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120035081A KR20130112586A (en) 2012-04-04 2012-04-04 Game device and controlling method for the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120035081A KR20130112586A (en) 2012-04-04 2012-04-04 Game device and controlling method for the same

Publications (1)

Publication Number Publication Date
KR20130112586A true KR20130112586A (en) 2013-10-14

Family

ID=49633564

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120035081A KR20130112586A (en) 2012-04-04 2012-04-04 Game device and controlling method for the same

Country Status (1)

Country Link
KR (1) KR20130112586A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101525799B1 (en) * 2014-01-29 2015-06-03 주식회사 두바퀴소프트 Control system for game in touch screen device
KR101580210B1 (en) * 2015-01-08 2016-01-04 라인플러스 주식회사 Game method and system for providing smart control for touch type
US10561933B2 (en) 2015-01-08 2020-02-18 Line Up Corporation Game methods for controlling game using virtual buttons and systems for performing the same
CN105688409A (en) * 2016-01-27 2016-06-22 网易(杭州)网络有限公司 Game control method and device
KR102067121B1 (en) 2019-07-01 2020-01-16 김상섭 Linear coordinate detection knowledge shooting game device
CN111659118A (en) * 2020-07-10 2020-09-15 腾讯科技(深圳)有限公司 Prop control method and device, storage medium and electronic equipment
CN111659118B (en) * 2020-07-10 2021-04-09 腾讯科技(深圳)有限公司 Prop control method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application