KR101525799B1 - Control system for game in touch screen device - Google Patents

Control system for game in touch screen device

Info

Publication number
KR101525799B1
KR101525799B1 (Application KR1020140011234A)
Authority
KR
South Korea
Prior art keywords
game
area
touch screen
information
touch
Prior art date
Application number
KR1020140011234A
Other languages
Korean (ko)
Inventor
권원석
Original Assignee
주식회사 두바퀴소프트
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 두바퀴소프트 filed Critical 주식회사 두바퀴소프트
Priority to KR1020140011234A priority Critical patent/KR101525799B1/en
Application granted granted Critical
Publication of KR101525799B1 publication Critical patent/KR101525799B1/en

Links

Images

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 — Input arrangements for video game devices
    • A63F13/21 — Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 — Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 — Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens

Abstract

A game operating system on a touch screen is provided. According to an exemplary embodiment of the present invention, the system includes a touch screen that senses touch information input from a client and outputs a game progress area as image information, and a character object disposed on the touch screen, wherein the game progress area is moved in correspondence with touch information sensed in a lever area that includes a part of the character object.

Description


BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a game operation system on a touch screen, and more particularly, to a game operation system capable of improving operational convenience when controlling objects in a game implemented on a touch screen.

A game generally refers to any activity for recreation and entertainment using electronic devices such as personal computers, portable devices, and console game machines. Recently, the game industry has been moving away from online games played on computers and expanding into the mobile game market, which users can access easily.

Touch screens are often used in electronic display systems as alternatives or additions to conventional keyboards or mice. Touch screens are generally intuitive to use and require relatively little training to operate. For example, a user can invoke a complex command sequence simply by pressing the touch screen at the location identified by the appropriate icon. Such touch screens are used as both input interface and display device for various games.

Recently, a variety of games using mobile terminals equipped with touch screens, such as the widely popular smartphones and smart pads, are being developed.

In a conventional game using a touch screen, due to the nature of the touch screen, the button region for operating the character and its direction is disposed in the same image information as, but separately from, the object region implemented in the game. Such a configuration hides part of the game screen, which hinders immersion during play or makes accurate touch input difficult.

SUMMARY OF THE INVENTION

The present invention has been made in view of the above problems, and it is an object of the present invention to provide a game operation system on a touch screen in which the object corresponding to the player's avatar and the operation button are integrated.

It is another object of the present invention to provide a game operation system on a touch screen in which a game screen can be easily moved and switched.

Another object of the present invention is to provide a game operating system on a touch screen capable of informing the player in advance of target object information randomly generated in the map information provided during game play.

The technical objects of the present invention are not limited to those mentioned above; other technical objects not mentioned will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a system including a touch screen for sensing touch information input from a client and outputting a game progress area as image information, and a character object disposed on the touch screen, wherein the game progress area is moved in correspondence with touch information sensed in a lever area including a part of the character object.

The system may further include a map management unit for generating and managing map information provided when a game is played, wherein the map information may include the game progress area.

The map information may further include a plurality of spatial objects and a target object generated in randomly selected spatial objects, and the spatial objects may be selected at predetermined time intervals.

The system may further include a target predicting unit disposed in a first area of the touch screen, and the target predicting unit may display an expected generation position or an expected generation order of the target object.

The system may further include a mini-map display unit disposed in a second area of the touch screen, wherein the mini-map display unit displays all of the plurality of spatial objects arranged in the map information in a reduced form.

The game progress region may be moved corresponding to touch information sensed on a spatial object arranged in a predetermined area of the mini-map display unit.

The touch information may include at least one of touch time information, touch duration information, repeated touch count information, drag direction information, and drag speed information.

The lever region may be divided into a first lever region and a second lever region.

When the touch information is sensed in the first lever region, the game progress region may be moved in a first direction, and when the touch information is sensed in the second lever region, the game progress region may be moved in a second direction.

While the game progress area is moved corresponding to the touch information, the character object may be fixed on the touch screen.

According to the present invention as described above, direction information is input by touching the lever area that includes a part of the character object, which provides the effect of directly manipulating the character; and since no separate operation button needs to be disposed on the touch screen, the space on the touch screen can be used efficiently.

In addition, immersion and a sense of achievement can be improved by providing the expected generation order and position of randomly generated target objects in advance, and a remotely distant area can be reached using the mini-map without any special operation.

FIG. 1 is a block diagram showing a schematic configuration of a game operation system on a touch screen according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating various embodiments in which the lever regions of FIG. 1 are disposed.
FIGS. 3 to 4 are views showing an example of operating a game progress region with the lever region of FIG. 1.
FIGS. 5 to 6 are views showing an example of operating a game progress region using the mini-map display unit of FIG. 1.
FIGS. 7 to 9 are views showing an example in which the target predicting unit of FIG. 1 is applied in a game.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described in detail below together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Unless defined otherwise, all terms (including technical and scientific terms) used herein may be used in a sense commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In the present specification, the singular form includes the plural form unless otherwise specified. The terms "comprises" and/or "comprising" used in the specification do not exclude the presence or addition of one or more other elements in addition to the stated elements.

Hereinafter, a game operation system on a touch screen according to embodiments of the present invention will be described with reference to the drawings.

Referring to FIG. 1, a game operating system on a touch screen in accordance with one embodiment of the present invention is shown. The game operating system on the touch screen according to the present embodiment includes a touch screen 100 that senses touch information input from a client and outputs a game progress area 200 as image information, and a character object 110 disposed on the touch screen 100, and the game progress area 200 is moved in correspondence with touch information sensed in a lever area 120 that includes a part of the character object 110.

The touch screen 100 is a screen capable of receiving input data directly from the screen: when a user's hand or an object touches a character or a specific position displayed on the screen, the touch is recognized at that position.

That is, the touch screen 100 can sense the touch information input from the client and output the game service provided by the server as image information. In the present embodiment, the touch screen 100 may be provided in a mobile device capable of wireless data communication, but the present invention is not limited thereto; the touch screen 100 may also be provided in a general monitor, through which the image information can be output.

The game server 300 may be a computing device that receives game data from a game device equipped with the touch screen 100, processes the game data, and transmits the result back to the game device. The game server 300 may include various servers such as a web server, a database server, and a streaming server in addition to a server for playing the game. The game server 300 may also receive game data from a plurality of game devices, collect and process that data, and transmit game data to each of the plurality of game devices.

In addition, the game server 300 may include a map management unit 310 for generating and managing the map information provided when a game is played. The map information may include terrain information and objects generated on that terrain. In this embodiment, the map information further includes a plurality of spatial objects and a target object generated in a spatial object randomly selected from among the plurality of spatial objects.
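
The map structure just described — a map holding several spatial objects, one of which is randomly chosen to host a target object — can be sketched as a simple data model. This is an illustrative reconstruction, not the patent's implementation; all names and values (`SpatialObject`, `MapInfo`, the pixel positions) are assumptions.

```python
import random
from dataclasses import dataclass, field

@dataclass
class SpatialObject:
    """A door- or room-like location where a target object may appear."""
    object_id: int
    x: int                     # horizontal position within the map (illustrative)
    has_target: bool = False

@dataclass
class MapInfo:
    """Map information generated and managed by the map management unit (310)."""
    width: int
    spatial_objects: list = field(default_factory=list)

    def spawn_target(self, rng=random):
        """Create a target object in a randomly selected spatial object."""
        chosen = rng.choice(self.spatial_objects)
        chosen.has_target = True
        return chosen

# Six spatial objects, mirroring 30_1 to 30_6 in the figures.
map_info = MapInfo(width=3000,
                   spatial_objects=[SpatialObject(i, i * 500) for i in range(1, 7)])
spawned = map_info.spawn_target(random.Random(42))
assert spawned in map_info.spatial_objects and spawned.has_target
```

In a client-installed variant (as the next paragraph describes), the same `MapInfo` could be built locally by a map management unit bundled with the application instead of being received from the server.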

In some other embodiments, the map management unit 310, or the map information generated by it, may be transmitted together when the game service is delivered to the client in the form of an application program. Accordingly, the map information provided at installation can be used, or map information can be generated by the map management unit 310 included in the installed application program, so that the map information necessary for game progress can be obtained even without receiving it from the server.

The spatial object may be located anywhere the target object is created; in this embodiment, the spatial object may take the form of a door or a room, but is not limited thereto. For example, if the spatial object is in the form of a door, the game may be such that the door is closed before the target object is created, and the target object appears when the door opens.

The target object may be the target of the character object's game progress or mission performance. Thus, the target object may be implemented in the form of a monster, an enemy, a shooting target, and the like.

The map management unit 310 may transmit the map information, which includes the terrain information, the spatial objects, and the target object, to the terminal device or game device provided with the touch screen 100 during the course of the game. The terminal device receiving the map information can output a part of the map information to the game progress area 200 using the touch screen 100.

The character object 110 is disposed in a predetermined area of the game progress area 200 that is output as image information through the touch screen 100, and can execute a specific command such as a movement or attack command. The character object 110 may be an object programmed in a computer language to provide the game service. The character object 110 may be generated in the game server 300 and transmitted to the touch screen 100, or may be separately generated and disposed in the game device having the touch screen 100.

The lever area 120 may serve as an operation button that allows the client to move the game progress area 200, that is, the currently displayed image information, while the game progresses. The direction in which the game progress area 200 is moved may be determined according to the area where the touch information is input, by sensing the touch information of the client detected in the lever area 120.

That is, when a client touches a predetermined area of the touch screen 100, the touch panel included in the touch screen 100 determines in which area of the touch screen 100 the event occurred, and an input signal can be generated. In this embodiment, the touch information may include a touch point, a touch duration, a number of repeated touches, a drag direction, and a drag speed; however, the present invention is not limited thereto, and various other types of motion information may be included.
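
The touch information fields enumerated above can be modeled as a simple record. The field set and names are assumptions for illustration; an actual touch panel driver would populate such a structure from raw events.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TouchInfo:
    """Touch information produced by the touch panel (illustrative field set)."""
    point: Tuple[int, int]                 # touch point on the screen
    duration_ms: int = 0                   # how long the touch was held
    repeat_count: int = 1                  # number of repeated touches (taps)
    drag_direction: Optional[str] = None   # e.g. "left" or "right"
    drag_speed: float = 0.0                # pixels per second while dragging

# A triple-tap followed by a leftward drag, as a single sample event:
info = TouchInfo(point=(120, 560), repeat_count=3,
                 drag_direction="left", drag_speed=450.0)
assert info.drag_direction == "left"
```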

In some other embodiments, the game progress area 200 output to the touch screen 100 may further include a target predicting unit 130 or a mini-map display unit 140.

The target predicting unit 130 may display a scheduled generation position or a scheduled generation order of the target object included in the map information. The mini-map display unit 140 may be in the form of a mini-map that displays all spatial objects disposed in the map information in a reduced form.

The game progress area 200 output on the touch screen 100 cannot display all of the map information, so the game can be progressed smoothly using the target predicting unit 130 or the mini-map display unit 140. The target predicting unit 130 and the mini-map display unit 140 will be described in detail later.

Referring to Figure 2, various embodiments are shown in which the lever region 120 of Figure 1 is disposed.

As described above, the lever region 120 is an object that includes the character object 110 and is disposed in a predetermined region of the game progress region 200 output on the touch screen 100. In the present embodiment, the character object 110 and the lever region 120 may be disposed at the lower end of the touch screen 100.

As shown, the character object 110 may be an object in the form of a bust of a person, but is not limited thereto. In addition, the lever region 120 may be similar to the outline of the character object 110, as shown in FIG. 2A. The area where the lever area 120 is disposed may include all areas where the character object 110 is disposed.

In some other embodiments, the lever region 120 may include only a portion of the character object 110, as shown in FIG. 2B. Since the game progress area 200 can be moved by the touch information sensed in the lever area 120, touch information can be detected even in an area a predetermined distance away from the character object 110, as long as the lever area 120 is disposed there. For example, even in a region slightly to the left or right of the neck portion of the character object 110, the lever region 120 includes that region, so the touch information for direction manipulation can be sensed.
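
The idea that the lever area covers the character plus a small surrounding margin can be sketched as a hit test. The rectangle-plus-margin shape and the margin value are assumptions; FIG. 2 suggests the lever region may instead follow the character's outline.

```python
def in_lever_area(touch_x, touch_y, char_rect, margin=24):
    """Return True if a touch falls within the lever area: the character's
    bounding rectangle dilated by `margin` pixels on every side, so that
    touches slightly outside the character (e.g. near the neck) still count.
    char_rect is (left, top, right, bottom); margin is an assumed value."""
    left, top, right, bottom = char_rect
    return (left - margin <= touch_x <= right + margin
            and top - margin <= touch_y <= bottom + margin)

char = (100, 400, 180, 600)               # illustrative character bounding box
assert in_lever_area(150, 500, char)      # inside the character
assert in_lever_area(95, 500, char)       # slightly outside, still lever area
assert not in_lever_area(10, 500, char)   # far away: not lever area
```

A virtual (undrawn) lever region, as FIG. 2C describes, would use the same test; the region simply is not rendered as image information.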

In another embodiment, the lever region 120 may be disposed in the form shown in FIG. 2C. In addition, the lever region 120 may be arranged in a virtual form and not displayed as image information.

Accordingly, the client can feel as if directly manipulating the character object 110, and since the lever region 120 is not displayed, the space utilization of the game progress area 200 can be improved. Further, even if the character object 110 is small, the lever region 120 can complement it, further improving operational convenience.

Referring to FIGS. 3 to 4, an example of operating the game progress area 200 with the lever area 120 of FIG. 1 is shown.

As shown in FIG. 3, the lever region may be divided into a first lever region 121 and a second lever region 122, and the game progress region 200 may be moved in different directions depending on which region senses the touch information.

In a general side-scrolling game, the player is provided with a game screen that progresses to the left or right. In this embodiment, when touch information is sensed in the first lever region 121, the game progress region 200 is moved to the left, and when touch information is sensed in the second lever region 122, the game progress region 200 is moved to the right, but the present invention is not limited thereto.

In addition, the character object 110 may be displayed in the game progress area 200 while the first lever area 121 and the second lever area 122 are not displayed. Accordingly, although the client inputs touch information to the area where the character object 110 is output as image information, the touch information is actually detected in the first and second lever areas 121 and 122, which are not themselves displayed.

Meanwhile, the map information 350 includes a plurality of spatial objects 30_1 to 30_6, but in the actual game progress area 200 output on the touch screen, only two complete spatial objects 30_3 and 30_4, together with two partially displayed spatial objects 30_2 and 30_5, may be shown.

Therefore, if the client wishes to view all of the second spatial object 30_2, the game progress area 200 must be moved to the left. For this purpose, the client can input touch information to the left of center within the area where the character object 110 is disposed.

When the client inputs the touch information, it can be sensed in the first lever area 121 arranged on the left side of the character object 110, and the distance or speed at which the game progress area 200 is moved can be determined corresponding to the touch information. For example, the distance the game progress area 200 moves to the left may increase in proportion to the number of repeated touches in the first lever area 121, or the game progress area 200 may be moved to the left according to the direction and speed of a drag that follows a touch.
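
The proportional movement rules described here — distance scaling with the number of repeated taps, or with drag speed — might look like the following sketch. The scaling constants are invented tuning values, not taken from the patent; negative values mean leftward movement.

```python
def scroll_distance(repeat_count=0, drag_direction=None, drag_speed=0.0,
                    base_step=80, speed_factor=0.2):
    """Compute how far (signed, in pixels) to move the game progress area.
    A drag moves the area in the drag direction in proportion to its speed;
    otherwise repeated taps on the left lever move it left in proportion to
    the tap count. base_step and speed_factor are assumed tuning values."""
    if drag_direction is not None:
        sign = -1 if drag_direction == "left" else 1
        return sign * int(drag_speed * speed_factor)   # speed-proportional
    return -base_step * repeat_count                   # tap-count-proportional

assert scroll_distance(repeat_count=3) == -240                      # 3 taps: 3x base step left
assert scroll_distance(drag_direction="left", drag_speed=500) == -100
assert scroll_distance(drag_direction="right", drag_speed=250) == 50
```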

The lever region divided into the first lever region 121 and the second lever region 122 may include a part of the character object 110, or may be formed along the outline of the character object 110. Therefore, even if the client starts a touch within the character object 110 and drags into the area outside it, the touch information can still be detected throughout the area covered by the lever region.

In some other embodiments, the character object 110 may be fixed in position on the touch screen 100 while the game progress area 200 is moved corresponding to the touch information. That is, even if the game progress area 200 moves left and right as in a side-scrolling game, the character object 110 may remain fixed at its initial position regardless of the movement, or may be inclined at a predetermined angle in the direction of movement. Even in this case, the area in which the character object 110 is disposed does not change significantly.

As shown in FIG. 4, when the touch information of the client is sensed in the first lever area 121, the game progress area 200 may be shifted a predetermined distance to the left within the map information 350, corresponding to the touch information. Accordingly, the second spatial object 30_2 can be displayed completely in the game progress area 200, while the fourth spatial object 30_4, which was previously displayed completely, may now be only partially displayed.

Referring to FIGS. 5 to 6, an example of operating the game progress area 200 using the mini-map display unit 140 of FIG. 1 is shown.

As shown in FIG. 5, a part of the map information 350, including the first spatial object 30_1 to the sixth spatial object 30_6, may be displayed in the game progress area 200. In addition, a mini-map display unit 140 capable of displaying all of the map information 350 may be disposed in a predetermined area of the game progress area 200. The mini-map display unit 140 may display the map information 350 and the spatial objects included therein in reduced form, but the present invention is not limited thereto; it may instead display a separate icon identifying each of the plurality of spatial objects 30_1 to 30_6.

The mini-map display unit 140 may further include a current position display unit 141, which indicates which area of the map information 350 the game progress area 200 currently displays.

The client can move the game progress area 200 by inputting touch information to the mini-map display unit 140, in addition to operating the character object 110. For example, the game progress area 200 in which the fourth to sixth spatial objects 30_4 to 30_6 are displayed can be moved to the left side of the map information 350, where the first spatial object 30_1 is disposed.

At this time, when the client inputs touch information on the reduced spatial object 32_1 disposed in the mini-map display unit 140, or on a separate icon linked to the first spatial object 30_1, the game progress area 200 can be moved so that the first spatial object 30_1 becomes its reference point, as shown in FIG. 6. In addition, the current position display unit 141 may also be moved in correspondence with the moved game progress area 200.

As described with reference to FIGS. 3 to 4, the game progress area 200 can be moved linearly by touching the lever area that includes a part of the character object 110.

On the other hand, when the game progress area 200 is moved using the mini-map display unit 140, it can be moved directly to the area corresponding to the touch information. That is, the game progress area before and after the movement may be discontinuous. However, the movement process of the game progress area 200 using the mini-map display unit 140 is not limited thereto; the game progress area 200 may instead be moved linearly, as in the movement method using the character object 110.
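
The difference between lever-style (linear) movement and mini-map-style (direct jump) movement can be illustrated with two viewport functions. Coordinates and sizes are illustrative assumptions; the "reference point" here is taken to be the left edge of the viewport.

```python
def move_viewport_linear(viewport_x, delta, map_width, viewport_width):
    """Lever-style movement: shift the viewport by delta, clamped to the map."""
    return max(0, min(viewport_x + delta, map_width - viewport_width))

def move_viewport_minimap(target_object_x, map_width, viewport_width):
    """Mini-map-style movement: jump so the tapped spatial object becomes the
    viewport's reference point (its left edge), clamped to the map."""
    return max(0, min(target_object_x, map_width - viewport_width))

# Illustrative values: a 3000 px map viewed through a 1000 px screen.
assert move_viewport_linear(1500, -240, 3000, 1000) == 1260
assert move_viewport_minimap(500, 3000, 1000) == 500     # direct jump
assert move_viewport_minimap(2800, 3000, 1000) == 2000   # clamped at map edge
```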

Referring to FIGS. 7 to 9, an example in which the target predicting unit 130 of FIG. 1 is applied in a game is shown.

As described above, the game progress area 200 may include the character object 110, the mini-map display unit 140, and a plurality of spatial objects 30_2 to 30_4, and may further include the target predicting unit 130.

The target predicting unit 130 may display an expected generation position or an expected generation order of the target object randomly generated among the plurality of spatial objects 30_1 to 30_6.

As described above, the target object may be created in a specific spatial object that is randomly selected, at predetermined time intervals, from among the plurality of spatial objects 30_1 to 30_6 disposed in the map information 350. The player, i.e., the client, can proceed with the game aiming to eliminate the target object created at the randomly selected spatial objects.

However, since the game progress area 200 by its nature cannot display all of the map information 350, the target predicting unit 130 allows the client to grasp information about target objects generated anywhere in the map information 350.

In the present embodiment, the target predicting unit 130 may sequentially list the expected generation locations of the target objects in order of generation time, but the present invention is not limited thereto; the spatial objects of the map information 350 may be displayed in reduced form, with the reduced spatial object information displayed differently according to the expected generation time of the target object.

As shown in FIG. 7, the target predicting unit 130 may sequentially list the target objects to be generated, starting from the left side of the target predicting unit 130. For example, after a target object is generated in the third spatial object 30_3, a target object is generated in the fifth spatial object 30_5 after a predetermined time, and again after a predetermined time a target object can be generated in the first spatial object 30_1.

The target predicting unit 130 may display the spatial objects in reduced form, or may use separate icons or the like to sequentially list the spatial objects in which target objects will be generated (31_3, 31_5, 31_1).
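
The random selection of a spawn location at each interval, which produces the listing shown by the target predicting unit, might be sketched as follows. The function name and id scheme are hypothetical; consecutive repeats of the same spatial object are allowed.

```python
import random

def plan_spawns(spatial_ids, count, rng=random):
    """Pick, for each of `count` intervals, a random spatial object id for
    the next target object. The resulting list is the expected-generation
    order that the target predicting unit lists from left to right."""
    return [rng.choice(spatial_ids) for _ in range(count)]

queue = plan_spawns([1, 2, 3, 4, 5, 6], count=3, rng=random.Random(7))
assert len(queue) == 3
assert all(sid in [1, 2, 3, 4, 5, 6] for sid in queue)
```

In a live game, each entry would be consumed as its spawn time arrives, while new entries are appended as the server schedules further targets.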

The client can move the game progress area 200 to the third spatial object 30_3 based on the prediction information displayed on the target predicting unit 130. The player can move the game progress area 200 to display the third spatial object 30_3, and then, while the target object is being generated, plan the next movement route with reference to the target predicting unit 130.

As shown in FIG. 8, the target object 33 is generated in the third spatial object 30_3, and the client can fire at the target object 33 to proceed with the game. When touch information is input in the area where the target object 33 is generated, a processing event such as shooting at the target object may occur. When the target object 33 is created or processed, the target predicting unit 130 deletes the spatial object information in which that target was generated and displays the next spatial object information 31_5 with the highest priority.
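
The update rule described here — remove the entry for the processed target and promote the next one to the highest-priority slot — behaves like a simple queue. The ids mirror the example sequence (third, fifth, first spatial object); the function name is hypothetical.

```python
from collections import deque

# Expected-generation order as listed by the target predicting unit:
# third, then fifth, then first spatial object (illustrative ids).
predictions = deque([3, 5, 1])

def process_target(predictions):
    """When the current target object is created/processed, drop its entry
    and promote the next spatial object to the highest-priority slot."""
    predictions.popleft()
    return predictions[0] if predictions else None

next_up = process_target(predictions)
assert next_up == 5                  # fifth spatial object is now first in line
assert list(predictions) == [5, 1]
```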

As shown in FIG. 9, after processing the target object 33 generated in the third spatial object 30_3, the client can refer to the target predicting unit 130 and move the game progress area 200 to the region where the fifth spatial object 30_5 is disposed. As described above, the client can move the game progress area 200 by inputting touch information in the lever area that includes a part of the character object 110, or in the area where the mini-map display unit 140 is disposed.

The target predicting unit 130 updates the prediction information 31_1 for the first spatial object 30_1, in which the next target object will be generated, and displays it at the leftmost side of the target predicting unit 130.

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, those skilled in the art will understand that the invention is not limited to the disclosed embodiments and may be embodied in other specific forms without departing from its essential characteristics. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive.

100: Touch screen
110: Character object
120: Lever area
130: target prediction unit
140: Mini map display unit
200: Game progress area
300: game server
310: map manager

Claims (10)

Claims 1 to 3. (Deleted)

4. A touch screen for sensing touch information input from a client and outputting a game progress area as image information;
A character object disposed on the touch screen;
A game server including a map management unit for generating and managing map information provided when a game is played; And
And a target predicting unit arranged in the first area of the touch screen,
Wherein the game progress area is moved corresponding to the touch information sensed in a lever area including a part of the character object,
Wherein the map information includes the game progress area,
Wherein the map information further includes a plurality of spatial objects and a target object generated in randomly selected spatial objects, wherein the spatial objects are selected at predetermined time intervals,
Wherein the target predicting unit displays an expected generation position or an expected generation order of the target object.
Claims 5 to 10. (Deleted)
KR1020140011234A 2014-01-29 2014-01-29 Control system for game in touch screen device KR101525799B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140011234A KR101525799B1 (en) 2014-01-29 2014-01-29 Control system for game in touch screen device


Publications (1)

Publication Number Publication Date
KR101525799B1 true KR101525799B1 (en) 2015-06-03

Family

ID=53505328

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140011234A KR101525799B1 (en) 2014-01-29 2014-01-29 Control system for game in touch screen device

Country Status (1)

Country Link
KR (1) KR101525799B1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130103228A (en) * 2012-03-09 2013-09-23 (주)네오위즈게임즈 Method of providing diving game
KR20130112586A (en) * 2012-04-04 2013-10-14 주식회사 드래곤플라이 Game device and controlling method for the same
JP2013246708A (en) * 2012-05-28 2013-12-09 Nintendo Co Ltd Display control system, display control method, display control device and display control program


Similar Documents

Publication Publication Date Title
US20240062610A1 (en) Graphical user interface for a gaming system
US8545325B2 (en) Communication game system
US11290543B2 (en) Scene switching method based on mobile terminal
KR101398086B1 (en) Method for processing user gesture input in online game
KR101570967B1 (en) Game interface method and apparatus for mobile shooting game
JP7150108B2 (en) Game program, information processing device, information processing system, and game processing method
CN108153475B (en) Object position switching method and mobile terminal
KR101407483B1 (en) Method and system for playing on-line game using mobile phone equipped with a touch screen
KR101987859B1 (en) A program, a game system, an electronic device, a server, and a game control method for improving operability of user input
KR102495259B1 (en) Method and apparatus for targeting precisely at objects in on-line game
KR101525799B1 (en) Control system for game in touch screen device
KR102609293B1 (en) Apparatus and method for determining game action
KR102584901B1 (en) Apparatus and method for sending event information, apparatus and method for displayng event information
KR102557808B1 (en) Gaming service system and method for sharing memo therein
KR102614708B1 (en) Method for selecting target object and gaming device for executint the method
JP6459308B2 (en) Program and game device
KR20200080818A (en) Method for outputting screen and display device for executing the same
JP7170454B2 (en) System, terminal device and server
JP6668425B2 (en) Game program, method, and information processing device
KR102369251B1 (en) Method for providing user interface and terminal for executing the same
KR102369256B1 (en) Method for providing user interface and terminal for executing the same
KR101461211B1 (en) System and method for game control using timeline
KR20160126848A (en) Method for processing a gesture input of user
JP2018033617A (en) Execution method of game, program, and recording medium

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20181212

Year of fee payment: 4

R401 Registration of restoration