KR101582296B1 - Automatic aiming system and method for mobile game - Google Patents

Automatic aiming system and method for mobile game

Info

Publication number
KR101582296B1
Authority
KR
South Korea
Prior art keywords
game
aiming
target
mobile
unit
Prior art date
Application number
KR1020150091144A
Other languages
Korean (ko)
Inventor
오승택
Original Assignee
주식회사 레드덕
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 레드덕
Priority to KR1020150091144A
Application granted
Publication of KR101582296B1
Priority to PCT/KR2016/006544
Priority to TW105119496A

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 - Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/53 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/837 - Shooting of targets

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a system and a method for automatic aiming in a mobile game, and more specifically to a system and a method that improve the convenience of aiming at an object in a mobile game operated on the basis of touch input to a mobile terminal. By performing automatic aiming through a touch movement rather than by having the user directly touch an object to aim at it, the invention keeps the aiming difficulty above a certain level so that the user does not lose interest in the game, and at the same time it reacts precisely to the touch movement toward the object the user designates as the aiming target, increasing the accuracy of object aiming and greatly improving the responsiveness to touch input.

Description

[0001] The present invention relates to an automatic aiming system and method for a mobile game, and more particularly, to an automatic aiming system and method that improve the convenience of aiming at an object in a mobile game operated on the basis of touch input to a mobile terminal.

With the development of programming and graphics processing technology, a variety of games offering a high degree of realism are emerging, and game genres have diversified into categories such as RPG, MMORPG, FPS, AOS, and sports games.

In recent years, different game genres have been blended with one another, so that a single game provides a variety of experiences to meet users' needs.

A system element common to the control schemes of these various games, and the most basic one, is the function of aiming at an object displayed in the game.

In general, in a game operated through a separate input device such as a mouse, keyboard, or joystick, the user can aim at an object by manipulating the aiming point displayed in the game.

However, in a mobile game developed for a mobile terminal that uses touch input on a touch screen as the input means instead of such an input device, the aiming point must be operated by touch.

Therefore, when the aiming point is operated through touch input, it is difficult to fine-tune it onto an object, unlike manipulation of the aiming point through a dedicated input device. Conversely, if the aiming point is automatically positioned on an object whenever the user touches it, aiming becomes too easy and interest in the game is reduced.

In recent years, game systems have been developed that provide an in-game interface with a button for changing the aiming target, so that the targeted object is switched through the button input and aimed at automatically.

However, in such a system an unwanted object may be selected as the aiming target, and targets must be switched sequentially according to their positions. In a game requiring an immediate reaction, such as an FPS, this delays the response to the user's intent and puts the player at a disadvantage in operating the game.

To solve this problem, a system is needed that improves the responsiveness of aiming at the object the user actually wants, conveniently assists automatic aiming, and at the same time keeps the difficulty of operating the aiming point above a certain level.

Korean Patent No. 10-0900689

An object of the present invention is to allow a desired object to be targeted easily, by taking into account parameters such as the direction, distance, or speed of the touch input when the target is changed during touch-based object aiming in a mobile game, while adequately preserving the aiming difficulty and ensuring ease of operation.

Another object of the present invention is to support designating and changing the aiming target according to the position of an object in three-dimensional space, as perceived from the user's viewpoint, when the mobile game is operated in a three-dimensional environment.

An automatic aiming system for a mobile game according to an embodiment of the present invention includes: a game execution unit that generates a game image according to the execution of a mobile game run on a touch input based mobile terminal and provides the game image through the display of the mobile terminal; a target generating unit that, in cooperation with the game execution unit, generates one or more objects to serve as targets and displays each object on the game image as the mobile game is executed; a vector operation unit that, in cooperation with the game execution unit, provides through the game image a game interface for changing input information corresponding to the user's touch movement into operation information corresponding to the mobile game, and calculates a vector value from the operation information generated in response to the touch input; and an aiming control unit that, based on a reference point set by a preset reference point setting method and the vector value calculated by the vector operation unit, automatically determines the aiming target among the objects displayed in the image or automatically changes the aiming target.

In one embodiment of the present invention, the vector value includes a direction according to the operation information, and may further include at least one of a moving distance and a speed.

In one embodiment of the present invention, the preset reference point setting method sets the center of the game image as the reference point, or sets the object that is the current aiming target as the reference object.

In one embodiment of the present invention, the aiming control unit sets a predetermined center point of the object that is the current aiming target as the reference point.

In one embodiment of the present invention, the aiming control unit may set an identification mark on a part of the area corresponding to the object determined as the aiming target, to distinguish it from objects that are not the aiming target.

In one embodiment of the present invention, the aiming control unit sets a coordinate system corresponding to the game image, calculates a moved coordinate position by applying the vector value to the coordinate position corresponding to the reference point, and automatically determines the aiming target based on the moved coordinate position.

In one embodiment of the present invention, the aiming control unit determines the object closest to the moved coordinate position as the aiming target.

In one embodiment of the present invention, the aiming control unit sets a three-dimensional coordinate system corresponding to the game image in cooperation with the game execution unit, and calculates the moved coordinate position on the three-dimensional coordinate system by applying the vector value with reference to the reference point.

In one embodiment of the present invention, the aiming control unit displays an aiming point moving in accordance with the operation information in the game image, and displays the movement of the aiming point according to the vector value.

As an example related to the present invention, the touch movement may be any one of a drag, a swipe, and a slide.

An automatic aiming method for a mobile game, performed by a game application unit configured in a touch input based mobile terminal according to an embodiment of the present invention, includes: generating a game image according to the execution of the mobile game and providing it through the display of the mobile terminal; generating one or more objects to serve as targets and displaying each object on the game image as the mobile game is executed; providing, through the game image, a game interface for changing input information corresponding to the user's touch movement into operation information corresponding to the mobile game, and calculating a vector value from the operation information generated in response to the touch input through the game interface; and, based on a reference point set by a preset reference point setting method and the calculated vector value, automatically determining the aiming target among the objects displayed in the image or automatically changing the aiming target.

The present invention performs automatic aiming through a touch movement rather than by having the user directly touch an object in the game image, so the aiming difficulty is kept above a certain level and interest in the game is not diminished; at the same time, it reacts precisely to the touch movement toward the object the user designates as the aiming target, improving the accuracy of object aiming and the responsiveness to touch input.

In addition, for a mobile game provided as a three-dimensional image, the present invention reacts precisely to input made within the three-dimensional space the user perceives, automatically and accurately aiming at the desired object, thereby providing realistic operation, ensuring convenience, and enhancing interest in the game.

FIG. 1 is a block diagram of an automatic aiming system for a mobile game according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating an example of providing a game image in an automatic aiming system for a mobile game according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an example of touch-input-based automatic aiming in an automatic aiming system for a mobile game according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of touch-input-based automatic aiming in an automatic aiming system for a mobile game according to another embodiment of the present invention.
FIG. 5 is an exemplary view illustrating an automatic selection criterion for the aiming target in an automatic aiming system for a mobile game according to an embodiment of the present invention.
FIG. 6 illustrates an example of target selection based on a vector value in an automatic aiming system for a mobile game according to an embodiment of the present invention.
FIG. 7 illustrates an example of target selection in three-dimensional space in an automatic aiming system for a mobile game according to an embodiment of the present invention.
FIG. 8 is a flowchart of an automatic aiming method for a mobile game according to an embodiment of the present invention.

Hereinafter, detailed embodiments of the present invention will be described with reference to the drawings.

FIG. 1 is a block diagram of an automatic aiming system for a mobile game according to an embodiment of the present invention. As shown in FIG. 1, the system may include an input unit 11, a display unit 12, an audio output unit 13, a storage unit 14, and a communication unit 15 configured in the mobile terminal 10, and a game application unit 100 connected to these components.

The game application unit 100 may include a control unit for controlling each component configured in the mobile terminal 10 and performs overall control functions using the programs and data stored in the storage unit 14. For example, the game application unit 100 may include RAM, ROM, a CPU, a GPU, and a bus, with the RAM, ROM, CPU, and GPU connected to one another via the bus. The game application unit 100 may boot using an operating system (OS) stored in the storage unit 14 and may perform various operations using the programs, content, and data stored in the storage unit 14.

In addition, each component including the game application unit 100 may be implemented by a hardware circuit (e.g., a CMOS-based logic circuit), firmware, software, or a combination thereof. For example, it can be implemented utilizing transistors, logic gates, and electronic circuits in the form of various electrical structures.

In addition, the storage unit 14 may store various data for executing a mobile game, and the game application unit 100 may read the game data stored in the storage unit 14 to execute a mobile game.

The input unit 11 receives commands or control signals generated by user operations such as button presses, selection of a function, or touch/scroll operations on the displayed screen, and various devices such as a touch pad (resistive/capacitive), a touch screen, a stylus pen, and a touch pen can be used.

In addition, the display unit 12 can display various contents including a game image according to the execution of the game data stored in the storage unit 14 under the control of the game application unit 100.

The input unit 11 and the display unit 12 may be configured as components constituting a touch screen, and may include a touch sensor for sensing a user's touch gesture. The touch sensor may be one of various types such as an electrostatic type, a pressure sensitive type, a piezoelectric type, and the like.

An audio output unit 13 for outputting audio information contained in signals processed by the game application unit 100 may be configured in the mobile terminal 10. The audio output unit 13 may include a receiver, a speaker, a buzzer, and the like.

In addition, the mobile terminal 10 described in the present invention may be a smart phone, a portable terminal, a mobile terminal, a personal digital assistant (PDA), a portable multimedia player (PMP) terminal, a telematics terminal, a navigation terminal, a personal computer, a notebook computer, a slate PC, a tablet PC, an ultrabook, a wearable device (e.g., a smartwatch, a glasses-type terminal, or a head-mounted display), a WiBro terminal, an IPTV (Internet Protocol Television) terminal, a smart TV, a digital broadcasting terminal, an AVN (Audio Video Navigation) terminal, an A/V (Audio/Video) system, or a flexible terminal.

Meanwhile, the mobile game described in the present invention is explained using a first-person shooter (FPS) game as an example. However, the present invention is not limited thereto and may be applied to mobile games of other genres such as RPG (Role Playing Game), MMORPG (Massive Multiplayer Online Role Playing Game), AOS (Aeon of Strife), TPS (Third-Person Shooter), sports games, and the like.

According to the above-described configuration, the game application unit 100 may process frames according to the execution of the game data to provide a game image, and may display one or more target objects in the game image.

The game application unit 100 determines which of the target objects is to be aimed at based on the input information from the user's touch movement received through the input unit 11, fires at the aimed object in response to a further touch input, decreases the health value set for that object when it is hit, and can remove the object from the game when its health value is exhausted.

In the above-described configuration, aiming at an object is very difficult when the game is played through touch-based input, and accuracy is lower than with an input device capable of fine adjustment such as a mouse or keyboard, so an automatic aiming function must be provided to assist the player. Here, a target object in the present invention means an object that can be aimed at.

However, if aiming is achieved merely by touching the position of an object displayed in the game image, as in the prior art, the game difficulty drops sharply and interest in the game decreases. If instead an interface with a button for changing the aiming target is provided, an unwanted object may be selected as the target, or objects must be switched sequentially according to their positions; in a game requiring an immediate reaction, such as an FPS, this sharply reduces the responsiveness of aiming at the object the user actually wants.

Accordingly, the game application unit 100 keeps the difficulty of aiming at an object above a certain level so that interest in the game is not diminished, while at the same time performing automatic aiming that guarantees aiming convenience for the object the user wants to aim at. This is described in detail below with reference to the drawings, based on the configuration of the game application unit 100 shown in FIG. 1.

As shown in FIG. 1, the game application unit 100 may include a game execution unit 110, a target generation unit 120, a vector operation unit 130, and an aiming control unit 140.

First, the game execution unit 110 may load and execute the game data for the mobile game stored in the storage unit 14, as described above. The game execution unit 110 may also receive input information and provide a game image by processing frames in real time once the game starts. The game image may be a game graphic image composed of polygons.

Meanwhile, the target generating unit 120 may generate one or more target objects in cooperation with the game execution unit 110 when the game starts, display those objects in the game image as shown in FIG. 2, and move them within the game by controlling their movement.

The vector operation unit 130, in cooperation with the game execution unit 110, converts the input information corresponding to the user's touch movement received through the input unit 11 of the mobile terminal 10 into operation information corresponding to the mobile game, and provides a game interface for this purpose through the game image.

More specifically, the vector operation unit 130 may display a button for each of the various functions of the mobile game through the game interface. For example, in the case of an FPS game as shown in FIG. 2, a shooting button (fire button), a menu button, a firearm selection button for selecting a desired firearm among those currently held, and the like may be displayed through the game interface.

In addition, the vector operation unit 130 allocates a predetermined area of the game interface to each button, recognizes the button corresponding to the area containing the coordinates selected by a touch input through the input unit 11, and provides operation information for that button to the game execution unit 110, which can then execute the function corresponding to the button.

That is, the vector operation unit 130 recognizes the coordinates selected by the touch input, and when the area containing those coordinates corresponds to the shooting button, generates operation information for the shooting button and provides it to the game execution unit 110. The game execution unit 110 then performs the shooting function corresponding to the shooting button based on the operation information, firing at the aimed object, and the health value of that object can be deducted.
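By way of illustration only, the following Kotlin sketch shows one way such button hit-testing could work: each button is given a rectangular region of the game interface, and a touch coordinate is mapped to operation information for the button whose region contains it. The names ButtonRegion, OperationInfo, and mapTouchToOperation are hypothetical and do not appear in the patent.

```kotlin
// Hypothetical sketch: map a touch coordinate to a game-interface button region.
data class ButtonRegion(val id: String, val left: Float, val top: Float,
                        val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class OperationInfo(val buttonId: String)

// Returns operation information for the first button whose region contains the touch point,
// or null if the touch did not land on any button.
fun mapTouchToOperation(x: Float, y: Float, buttons: List<ButtonRegion>): OperationInfo? =
    buttons.firstOrNull { it.contains(x, y) }?.let { OperationInfo(it.id) }
```

A touch landing inside the region registered for the shooting button would then yield operation information that the game execution unit 110 could use to perform the shooting function, as described above.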

The game interface may be game data composed of a GUI. In addition to the buttons, it may include various content related to gameplay, such as the number of remaining bullets, the score obtained, and the remaining health, and this content can be displayed in the game image.

The vector operation unit 130 and the aiming control unit 140 can automatically determine or change the aiming target when the user wants to select a desired object from among the objects generated by the target generating unit 120 as the target, while at the same time guaranteeing that the aiming difficulty remains above a predetermined level.

As shown in the figures, when the user touches an arbitrary position on the display unit 12, which is a touch screen, and continues the touch in a desired direction, the input unit 11 that senses the touch input may generate input information based on the resulting touch movement and provide it to the vector operation unit 130.

At this time, the touch movement may be a touch input in which continuous coordinate changes occur such as a drag, a swipe, a slide, and the like.

If the input information includes consecutive coordinates, the vector operation unit 130 may convert it into operation information for determining or changing the aiming target through the game interface.

In other words, the vector operation unit 130 can treat the input information on the touch movement produced by the continuous touch input as operation information corresponding to the determination or change of the aiming target.

Next, the vector operation unit 130 may calculate the vector value of the touch movement based on the operation information and provide it to the aiming control unit 140. The vector value may include the moving distance, moving speed, and moving direction determined by the start and end points of the vector.
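As an illustration of how such a vector value could be derived from the start and end points of a touch movement, consider the following Kotlin sketch. The TouchSample and TouchVector types, the field names, and the millisecond-based speed are assumptions made for this example only.

```kotlin
import kotlin.math.atan2
import kotlin.math.hypot

// Hypothetical sampled touch point: screen coordinates plus a timestamp in milliseconds.
data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

// Hypothetical vector value of a touch movement: displacement, direction (radians),
// distance (pixels), and speed (pixels per millisecond).
data class TouchVector(val dx: Float, val dy: Float,
                       val direction: Float, val distance: Float, val speed: Float)

fun computeVector(start: TouchSample, end: TouchSample): TouchVector {
    val dx = end.x - start.x
    val dy = end.y - start.y
    val distance = hypot(dx, dy)
    val elapsed = (end.timeMs - start.timeMs).coerceAtLeast(1L)  // avoid division by zero
    return TouchVector(dx, dy, atan2(dy, dx), distance, distance / elapsed)
}
```

The direction, distance, and speed fields of such a vector correspond to the components of the vector value described in this embodiment and are reused in the sketches that follow.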

The aiming control unit 140 receives the vector value from the vector operation unit 130 and automatically selects the aiming target from among the objects displayed in the game image according to the vector value and a reference point set by a preset reference point setting method.

For example, as shown in FIG. 3(a), when the operation information generated from the user's touch movement indicates motion to the right, the aiming control unit 140 may set the center point of the current game image as the reference point and automatically aim at the object corresponding to target 1, located to the right of the reference point.

Likewise, as shown in FIG. 3(b), when the operation information generated from the user's touch movement indicates upward motion, the aiming control unit 140 can select the object corresponding to target 2, located in the upward direction, as the aiming target and aim at it automatically.

In this case, the vector operation unit 130 may calculate the vector value from an arbitrary touch-input start point and end point, allowing the aiming control unit 140 to aim automatically based on that vector value.

In other words, the present invention lets the user make touch inputs while avoiding the area where the target objects are located. Unlike the case where the target object must be designated and aimed at by touching it directly, in which its position may be obscured or the aim may be missed, the user can aim easily while watching the target objects move in real time, maximizing the convenience of aiming.
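A minimal sketch of direction-based selection with the reference point at the screen center might look as follows, reusing the hypothetical TouchVector direction from the previous sketch. The Target type and the 45-degree angular tolerance are assumptions for illustration; the patent does not specify such a threshold.

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2

// Hypothetical target object with a center point in screen coordinates.
data class Target(val id: Int, val cx: Float, val cy: Float)

// Smallest absolute difference between two angles, in radians.
fun angleDiff(a: Float, b: Float): Float {
    val twoPi = (2 * PI).toFloat()
    val d = abs(a - b) % twoPi
    return if (d > PI.toFloat()) twoPi - d else d
}

// Pick the target whose bearing from the reference point best matches the swipe direction.
fun selectByDirection(refX: Float, refY: Float, swipeDirection: Float,
                      targets: List<Target>,
                      maxAngleDiff: Float = (PI / 4).toFloat()): Target? {
    val best = targets.minByOrNull { t ->
        angleDiff(atan2(t.cy - refY, t.cx - refX), swipeDirection)
    } ?: return null
    val bestDiff = angleDiff(atan2(best.cy - refY, best.cx - refX), swipeDirection)
    return if (bestDiff <= maxAngleDiff) best else null
}
```

With the screen center as refX/refY, a rightward swipe would select a target lying to the right of the center, consistent with the behavior described for FIG. 3(a).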

As shown in FIG. 4, the aiming control unit 140 may automatically aim at an arbitrary object when the target objects appear, and may automatically set the center point of that object as the reference point.

Accordingly, when a touch movement occurs in the left direction as shown in FIG. 4(a), the aiming control unit 140 receives the vector value corresponding to that touch movement from the vector operation unit 130, sets the center point of the object corresponding to target 1, the current aiming target, as the reference point, and, based on the vector value relative to that reference point, automatically sets the object corresponding to target 3, located on the right side, as the aiming target, so that the target is changed automatically from target 1 to target 3.

In the above-described configuration, the aiming control unit 140 may interwork with the game execution unit 110 and provide it with aiming-target information about the aimed object. When the shooting button is pressed, the game execution unit 110 receives the operation information provided from the vector operation unit 130, performs the shooting function on the aimed object, deducts the preset health value assigned to that object, and can remove or disable the target object in the game image when its health value is exhausted.

As shown in FIGS. 3 to 6, the aiming control unit 140 may interwork with the game execution unit 110 to set an identification mark A on a part of the area corresponding to the aimed object, distinguishing it from objects that are not aimed at. For example, the color or thickness of the outline of the aimed object can be changed, or a separate aiming marker can be displayed on the aimed object.

In addition, the aiming control unit 140 may display the movement of the aiming point according to the vector value. For example, the aiming control unit 140 may move the aiming point in real time in response to the swipe input of the user.

Meanwhile, in the above-described configuration, the aiming control unit 140 sets a predetermined two-dimensional grid (two-dimensional coordinate system) whose coordinate values correspond to the different points (positions) of the game image, and on that grid can calculate the coordinate position corresponding to the reference point and the moved coordinate position obtained by applying the vector value.

In this case, the vector operation unit 130 may also calculate the vector value on the basis of the two-dimensional grid, and the aiming control unit 140 may calculate the moved coordinate position on the two-dimensional grid by applying the vector value to the coordinate position of the reference point.
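In a sketch, the moved coordinate position is simply the reference point offset by the displacement of the touch vector, reusing the hypothetical TouchVector above; the gridScale factor, which converts screen pixels to grid units, is an assumed tuning parameter rather than anything specified in the patent.

```kotlin
// Hypothetical: offset the reference point by the touch vector's displacement on the 2D grid.
fun movedCoordinate(referenceX: Float, referenceY: Float,
                    vector: TouchVector, gridScale: Float = 1f): Pair<Float, Float> =
    Pair(referenceX + vector.dx * gridScale, referenceY + vector.dy * gridScale)
```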

The aiming control unit 140 can perform automatic aiming even when the moved coordinate position (moved coordinate value) obtained from the vector value does not fall within the area of any target object, as described with reference to FIG. 5.

As shown in FIG. 5, when the moved coordinate position corresponding to the end point of the vector, taken from the reference point, is not contained in the area of any object displayed in the game image, the coordinate position of each target object's center point can be compared with the moved coordinate position, and the closest object designated (set) as the aiming target.

For example, among the objects corresponding to target 3 and target 4, which are near the moved coordinate position, the aiming control unit 140 calculates the distance between each object's center-point coordinate and the moved coordinate position, and can automatically determine and aim at the object with the smallest distance, here the object corresponding to target 3.
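A hedged sketch of this nearest-center selection, reusing the hypothetical Target type and the Pair-based moved coordinate from the earlier sketches:

```kotlin
import kotlin.math.hypot

// Choose the target whose center point is closest to the moved coordinate position.
fun selectNearest(moved: Pair<Float, Float>, targets: List<Target>): Target? =
    targets.minByOrNull { hypot(it.cx - moved.first, it.cy - moved.second) }
```

If the moved coordinate already falls inside a target's area, that containment check could be performed first, with the nearest-center rule applied only as the fallback described above.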

Thus, even if the coordinate position obtained from the vector value of the user's touch movement does not fall within any object's area, automatic aiming can still be performed, ensuring aiming convenience.

In addition, if, owing to a user input error, the vector value points to a location unrelated to the positions of all the target objects, the aiming control unit 140 may also order the objects by their distance from the moved coordinate position and designate the object corresponding to that order as the automatic aiming target.

In addition, the aiming control unit 140 may set a preset range region for each object, centered on the object's center point, and if the moved coordinate position does not fall within any of these range regions, it may ignore the vector value and keep the existing aiming target.

In the above-described embodiments, only touch movements in the four cardinal directions (east, west, south, and north) are described as examples. However, the aiming control unit 140 and the vector operation unit 130 can of course also determine or automatically change the aiming target for diagonal touch movements, and can measure the displacement of a curved touch movement to determine a vector value and determine or change the aiming target accordingly.

In addition, in a game image rendered in three-dimensional space, the game execution unit 110 may change the size of an object as it moves closer to the screen, in cooperation with the target generation unit 120, and the aiming control unit 140 can of course identify the object's center point by tracking it according to the object's movement and size.

Even when a plurality of objects lie in the same direction from the reference point and the object the user wants is at the outer edge or in the middle of that group, the aiming control unit 140 can aim directly at the desired object, improving aiming responsiveness. This is described in detail with reference to FIG. 6.

As shown in the figure, when the user's touch input produces a touch movement to the left and the objects corresponding to target 3 and target 4 are both located in that direction, the aiming control unit 140 can determine the moved coordinate position relative to the reference point using not only the direction of the vector value calculated by the vector operation unit 130 but also at least one of its distance and speed.

For example, when the moving distance is short or the moving speed is slow, the aiming control unit 140 automatically targets the object corresponding to target 3, as shown in FIG. 6(a). If the moving distance of the currently input touch movement is longer than the distance that selects target 3, or its moving speed is faster than the speed that selects target 3, automatic aiming is performed directly on the object corresponding to target 4, skipping target 3, as shown in FIG. 6(b).

In this case, the aiming control unit 140 calculates the moved coordinate position from the coordinate position of the reference point using the direction of the vector value together with at least one of its distance and speed, and can omit automatic aiming at target 3 and instead automatically aim at the object corresponding to target 4, which is closest to the moved coordinate position.

As another example, the aiming control unit 140 may compute the momentum of the touch movement using a preset formula applied to the vector value, derive the degree of movement from the strength of that momentum to obtain the moved coordinate position, and automatically aim at the object whose area contains the moved coordinate position or, otherwise, the object closest to it.
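One way to realize such a momentum-weighted movement, offered as an assumption rather than the patent's own formula, is to scale the swipe displacement by its speed before computing the moved coordinate, reusing the hypothetical TouchVector from the earlier sketches:

```kotlin
// Hypothetical momentum weighting: a faster swipe of the same length lands the moved
// coordinate farther from the reference point, so a nearer target (e.g., target 3)
// can be skipped in favor of a farther one (e.g., target 4). The linear gain formula
// and speedWeight value are assumptions; the patent only states that a preset formula is used.
fun movedCoordinateWithMomentum(referenceX: Float, referenceY: Float,
                                vector: TouchVector,
                                speedWeight: Float = 0.5f): Pair<Float, Float> {
    val gain = 1f + speedWeight * vector.speed
    return Pair(referenceX + vector.dx * gain, referenceY + vector.dy * gain)
}
```

Feeding this moved coordinate into the nearest-center selection sketched earlier would pick the farther target for a fast or long swipe and the nearer one for a short, slow swipe, in line with the behavior of FIG. 6.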

As described above, according to the present invention, because automatic aiming of an object in the game image is performed through a touch movement, the aiming difficulty is guaranteed to stay above a certain level, and the system responds precisely to the touch movement toward the object the user wants to aim at, improving the accuracy of object aiming and greatly improving the responsiveness to touch input.

In addition, even when a plurality of objects are aligned in the same direction, the present invention can automatically aim at the one the user intends, greatly improving operational convenience and responsiveness to the user's input.

Meanwhile, as shown in FIG. 1, the game execution unit 110 configured in the mobile terminal 10 may support a network game based on communication with other mobile terminals 10. For example, the game execution unit 110 may create a game room for the network game through the communication unit 15 configured in the mobile terminal 10 and may communicate with a game server that supports the game for each of the plurality of mobile terminals 10.

The game server transmits game data related to the match room to the communication unit 15 over a communication network so that the mobile game is executed through the game execution unit 110, and information about the other mobile terminals 10 participating in the match room may be provided to the target generating unit 120 through the communication unit 15.

Here, examples of the communication network include wireless LAN (WLAN), DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), CDMA (Code Division Multiple Access), CDMA2000, EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), IEEE 802.16, LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), WMBS (Wireless Mobile Broadband Service), Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, NFC (Near Field Communication), USC (Ultra Sound Communication), VLC (Visible Light Communication), Wi-Fi, Wi-Fi Direct, and the like.

Accordingly, the target generating unit 120 may display an object corresponding to the other mobile terminal 10 in the game image.

In addition, the game execution unit 110 may transmit the automatic-aiming-related input information generated from the user's touch input via the vector operation unit 130 and the aiming control unit 140 to the game server through the communication unit 15, and the game server can automatically aim at another user's object corresponding to the other mobile terminal 10 based on that information.

At this time, the game server may include a part of the configuration of the game application unit 100.

According to the above-described configuration, the game application unit 100 can apply an automatic aiming system for a mobile game according to an embodiment of the present invention in a network game.

Meanwhile, the aiming control unit 140 can determine the aiming target on the basis of a touch input made according to the sense of space the user perceives from a two-dimensional or three-dimensional game image, which is described in detail with reference to FIG. 7.

First, as shown in FIG. 7(a), the aiming control unit 140 determines, in cooperation with the game execution unit 110, whether the game image is provided in two dimensions. If the game image is two-dimensional, a two-dimensional grid can be applied to the game image, as described above, to calculate coordinates based on the reference point and the vector value.

For example, when the aiming control unit 140 receives a vector value for a diagonal touch movement while target 1 is the aiming target, it may automatically aim at the object corresponding to target 3, located in that diagonal direction, as described above.

As shown in FIG. 7(b), the aiming control unit 140 may also determine, in cooperation with the game execution unit 110, whether the game image is provided in three dimensions. If it is, a three-dimensional grid is set over the game image at predetermined coordinate positions (points) as shown in the drawing, and, in cooperation with the vector operation unit 130, the user's touch movement is applied on the three-dimensional grid to calculate a three-dimensional vector value.

The game execution unit 110 may provide the game image as a three-dimensional (3D) image. For example, it may generate synchronized game images corresponding to the user's left and right eyes and present them simultaneously through the display unit 12, thereby providing the game image as a stereoscopic image.

Accordingly, the aiming control unit 140 may determine the object corresponding to target 2, located along the three-dimensional direction given by the three-dimensional vector value, and aim at it automatically.
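For the three-dimensional case, the same idea extends to the 3D grid. The sketch below, with hypothetical types and no claim to match the patent's internal representation, compares each target's 3D bearing from the reference point with the 3D direction derived from the vector value:

```kotlin
import kotlin.math.acos
import kotlin.math.sqrt

// Hypothetical 3D target used for selection on the three-dimensional grid.
data class Target3D(val id: Int, val x: Float, val y: Float, val z: Float)

// Angle between two 3D vectors, in radians (0 if either vector has zero length).
fun angleBetween(a: FloatArray, b: FloatArray): Float {
    val dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    val na = sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2])
    val nb = sqrt(b[0] * b[0] + b[1] * b[1] + b[2] * b[2])
    if (na == 0f || nb == 0f) return 0f
    return acos((dot / (na * nb)).coerceIn(-1f, 1f))
}

// Choose the target whose bearing from the reference point makes the smallest angle
// with the three-dimensional direction of the vector value.
fun selectByDirection3D(ref: FloatArray, direction: FloatArray,
                        targets: List<Target3D>): Target3D? =
    targets.minByOrNull { t ->
        angleBetween(floatArrayOf(t.x - ref[0], t.y - ref[1], t.z - ref[2]), direction)
    }
```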

As described above and as shown in FIG. 7, even when the same vector value is generated, the object determined as the aiming target can differ between the two-dimensional grid (two-dimensional coordinate system) and the three-dimensional grid (three-dimensional coordinate system).

In addition, the present invention can automatically and precisely aim at the object the user intends by reacting accurately to input within the three-dimensional space the user perceives in a mobile game provided as a three-dimensional image.

8 is a flowchart of an automatic aiming method for a mobile game according to an embodiment of the present invention.

As shown in the figure, the game application unit 100, which is configured in the touch input based mobile terminal 10 and executes the mobile game, generates a game image according to the execution of the mobile game and provides it through the display of the mobile terminal 10 (S1).

Next, the game application unit 100 generates one or more objects to be a target, and displays each of the objects on the game image according to the execution of the mobile game (S2).

Next, the game application unit 100 provides, through the game image, a game interface for changing input information corresponding to the user's touch movement on the mobile terminal 10 into operation information corresponding to the mobile game (S3), and calculates a vector value according to the operation information generated in response to the touch input through the game interface (S4).

Next, the game application unit 100 may automatically determine the aiming target among the objects displayed in the image, or automatically change the aiming target, according to the calculated vector value with reference to a reference point set by a preset reference point setting method (S5).
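Putting steps S1 to S5 together, a hedged end-to-end sketch of the flow might look like the following, reusing the hypothetical Target, TouchSample, TouchVector, computeVector, movedCoordinate, and selectNearest definitions from the earlier sketches; none of these names come from the patent.

```kotlin
// Hypothetical end-to-end flow corresponding to steps S1 to S5.
class AutoAimFlow(private val targets: MutableList<Target>,
                  private val screenCenter: Pair<Float, Float>) {

    private var currentTarget: Target? = null

    // S1/S2: rendering the game image and the target objects is assumed to happen elsewhere.

    // S3/S4: convert a completed touch movement into a vector value, then update the aim.
    fun onTouchMovement(start: TouchSample, end: TouchSample) {
        val vector = computeVector(start, end)
        updateAim(vector)                      // S5
    }

    // S5: determine or change the aiming target from the reference point and the vector value.
    private fun updateAim(vector: TouchVector) {
        // Reference point: the current target's center if one exists, otherwise the screen center.
        val reference = currentTarget?.let { Pair(it.cx, it.cy) } ?: screenCenter
        val moved = movedCoordinate(reference.first, reference.second, vector)
        currentTarget = selectNearest(moved, targets) ?: currentTarget
    }
}
```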

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or essential characteristics thereof. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the scope of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of equivalents should be construed as falling within the scope of the present invention.

10: mobile terminal 11: input unit
12: display unit 13: audio output unit
14: storage unit 15: communication unit
100: game application unit 110: game execution unit
120: target generating unit 130: vector operation unit
140: aiming control unit

Claims (11)

A game execution unit for generating a game image according to execution of a mobile game executed in a mobile terminal based on a touch input and providing the game image through a display of the mobile terminal;
A target generator for generating at least one object to be a target in cooperation with the game executing unit and displaying each of the objects on a game image according to the execution of the mobile game;
A vector operation unit for providing, in association with the game execution unit and through the game image, a game interface for changing input information corresponding to a user's touch movement on the mobile terminal into operation information corresponding to the mobile game, and for calculating a vector value of the touch movement based on the operation information; and
An aiming control unit for automatically determining an aiming target among the objects displayed in the image, or automatically changing the aiming target, according to the vector value calculated by the vector operation unit with reference to a reference point set by a predetermined reference point setting method,
Wherein the vector value includes a moving distance, a moving speed, and a moving direction according to a start point and an end point of the vector due to the touch movement,
Wherein the preset reference point setting method sets a center in the game image as the reference point or sets the object that is the current aiming target as a reference object.
delete
delete
The system according to claim 1,
Wherein the aiming control unit sets a predetermined center point corresponding to the object to be aimed at as a reference point.
The system according to claim 1,
Wherein the aiming control unit sets an identification mark, on a part of an area corresponding to the object determined as the aiming target, for distinguishing it from objects that are not the aiming target.
The system according to claim 1,
Wherein the aiming control unit sets a coordinate system corresponding to the game image, calculates a moved coordinate position by applying the vector value to a coordinate position corresponding to the reference point, and automatically determines the aiming target based on the moved coordinate position.
The system according to claim 6,
Wherein the aiming control unit determines an object that is closest to the movement coordinate position as an object to be aimed.
The system according to claim 6,
Wherein the aiming control unit sets a three-dimensional coordinate system corresponding to the game image in cooperation with the game execution unit, and calculates the moved coordinate position on the three-dimensional coordinate system according to the vector value with reference to the reference point.
The system according to claim 1,
Wherein the aiming control unit displays an aiming point moving in accordance with the operation information in the game image and displays the movement of the aiming point according to the vector value.
The system according to claim 1,
Wherein the touch movement is one of a drag, a swipe, and a slide.
An automatic aiming method for a mobile game, performed by a game application unit that is configured in a touch input based mobile terminal and executes the mobile game, the method comprising:
Generating a game image according to the execution of the mobile game and providing the game image through the display of the mobile terminal;
Creating at least one object as a target, and displaying each of the objects on a game image according to execution of the mobile game;
Providing, through the game image, a game interface for changing input information corresponding to a user's touch movement on the mobile terminal into operation information corresponding to the mobile game, and calculating a vector value of the touch movement based on the operation information generated in response to the touch input through the game interface; and
Automatically determining an aiming target among the objects displayed in the image, or automatically changing the aiming target, according to the calculated vector value with reference to a reference point set by a predetermined reference point setting method,
Wherein the vector value includes a moving distance, a moving speed, and a moving direction according to a start point and an end point of the vector due to the touch movement,
Wherein the predetermined reference point setting method sets a center in the game image as the reference point or sets the object that is the current aiming target as a reference object.
KR1020150091144A 2015-06-26 2015-06-26 Automatic aiming system and method for mobile game KR101582296B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020150091144A KR101582296B1 (en) 2015-06-26 2015-06-26 Automatic aiming system and method for mobile game
PCT/KR2016/006544 WO2016208930A1 (en) 2015-06-26 2016-06-20 Automatic aiming system and method for mobile game
TW105119496A TW201701932A (en) 2015-06-26 2016-06-22 Automatic aiming system and method for mobile game

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150091144A KR101582296B1 (en) 2015-06-26 2015-06-26 Automatic aiming system and method for mobile game

Related Child Applications (1)

Application Number Title Priority Date Filing Date
KR1020150145427A Division KR20170001539A (en) 2015-10-19 2015-10-19 Automatic aiming system and method for mobile game

Publications (1)

Publication Number Publication Date
KR101582296B1 true KR101582296B1 (en) 2016-01-04

Family

ID=55164424

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150091144A KR101582296B1 (en) 2015-06-26 2015-06-26 Automatic aiming system and method for mobile game

Country Status (3)

Country Link
KR (1) KR101582296B1 (en)
TW (1) TW201701932A (en)
WO (1) WO2016208930A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101834986B1 (en) * 2017-08-28 2018-03-07 주식회사 솔트랩 Game system and method supporting disappearance processing
KR20180028046A (en) * 2016-09-07 2018-03-15 이철우 Device, method and program for making multi-dimensional reactive video, and method and program for playing multi-dimensional reactive video
WO2018048227A1 (en) * 2016-09-07 2018-03-15 이철우 Device, method and program for generating multidimensional reaction-type image, and method and program for reproducing multidimensional reaction-type image
CN109550243A (en) * 2018-11-20 2019-04-02 网易(杭州)网络有限公司 Control method and device, storage medium and the electronic device of game role
KR20190125456A (en) * 2017-07-19 2019-11-06 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Method and device for fixing target object in game scene, electronic device, and storage medium

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108144300B (en) * 2017-12-26 2020-03-03 网易(杭州)网络有限公司 Information processing method in game, electronic device and storage medium
CN108404407B (en) * 2018-01-05 2021-05-04 网易(杭州)网络有限公司 Auxiliary aiming method and device in shooting game, electronic equipment and storage medium
CN108837506A (en) * 2018-05-25 2018-11-20 网易(杭州)网络有限公司 Control method, device and the storage medium of virtual item in a kind of race games
CN109224439B (en) * 2018-10-22 2022-07-29 网易(杭州)网络有限公司 Game aiming method and device, storage medium and electronic device
CN109445662B (en) * 2018-11-08 2022-02-22 腾讯科技(深圳)有限公司 Operation control method and device for virtual object, electronic equipment and storage medium
CN112354181B (en) * 2020-11-30 2022-12-30 腾讯科技(深圳)有限公司 Open mirror picture display method and device, computer equipment and storage medium
TWI792304B (en) * 2021-05-07 2023-02-11 辰晧電子股份有限公司 Rfid bullet

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100900689B1 (en) 2007-06-13 2009-06-01 엔에이치엔(주) Online game method and system
JP2011215886A (en) * 2010-03-31 2011-10-27 Namco Bandai Games Inc Program, information storage medium, and image generation device
KR20140112102A (en) * 2012-10-10 2014-09-23 (주)네오위즈게임즈 Method and device of providing touch user interface and storage media storing the same
US20150157940A1 (en) * 2013-12-11 2015-06-11 Activision Publishing, Inc. System and method for playing video games on touchscreen-based devices

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130027621A (en) * 2011-06-03 2013-03-18 김성진 Smartphone fps control key technology idea
JP5727655B1 (en) * 2014-09-17 2015-06-03 株式会社Pgユニバース Information processing apparatus, information processing method, and program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100900689B1 (en) 2007-06-13 2009-06-01 엔에이치엔(주) Online game method and system
JP2011215886A (en) * 2010-03-31 2011-10-27 Namco Bandai Games Inc Program, information storage medium, and image generation device
KR20140112102A (en) * 2012-10-10 2014-09-23 (주)네오위즈게임즈 Method and device of providing touch user interface and storage media storing the same
US20150157940A1 (en) * 2013-12-11 2015-06-11 Activision Publishing, Inc. System and method for playing video games on touchscreen-based devices

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180028046A (en) * 2016-09-07 2018-03-15 이철우 Device, method and program for making multi-dimensional reactive video, and method and program for playing multi-dimensional reactive video
WO2018048227A1 (en) * 2016-09-07 2018-03-15 이철우 Device, method and program for generating multidimensional reaction-type image, and method and program for reproducing multidimensional reaction-type image
KR102051981B1 (en) 2016-09-07 2019-12-04 이철우 Device, method and program for making multi-dimensional reactive video, and method and program for playing multi-dimensional reactive video
US11003264B2 (en) 2016-09-07 2021-05-11 Chui Woo Lee Device, method and program for generating multidimensional reaction-type image, and method and program for reproducing multidimensional reaction-type image
US11360588B2 (en) 2016-09-07 2022-06-14 Chui Woo Lee Device, method, and program for generating multidimensional reaction-type image, and method, and program for reproducing multidimensional reaction-type image
US12086335B2 (en) 2016-09-07 2024-09-10 Momenti, Inc. Device, method and program for generating multidimensional reaction-type image, and method and program for reproducing multidimensional reaction-type image
KR20190125456A (en) * 2017-07-19 2019-11-06 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Method and device for fixing target object in game scene, electronic device, and storage medium
KR102317522B1 (en) * 2017-07-19 2021-10-25 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Method and apparatus, electronic device, and storage medium for fixing a target object in a game scene
KR101834986B1 (en) * 2017-08-28 2018-03-07 주식회사 솔트랩 Game system and method supporting disappearance processing
WO2019045173A1 (en) * 2017-08-28 2019-03-07 주식회사 솔트랩 Game system and method for supporting loss processing
CN109550243A (en) * 2018-11-20 2019-04-02 网易(杭州)网络有限公司 Control method and device, storage medium and the electronic device of game role
CN109550243B (en) * 2018-11-20 2022-08-19 网易(杭州)网络有限公司 Game character control method and device, storage medium and electronic device

Also Published As

Publication number Publication date
TW201701932A (en) 2017-01-16
WO2016208930A1 (en) 2016-12-29

Similar Documents

Publication Publication Date Title
KR101582296B1 (en) Automatic aiming system and method for mobile game
US10850196B2 (en) Terminal device
KR101398086B1 (en) Method for processing user gesture input in online game
JP7150108B2 (en) Game program, information processing device, information processing system, and game processing method
US20130217498A1 (en) Game controlling method for use in touch panel medium and game medium
US11266904B2 (en) Game system, game control device, and information storage medium
US12048872B2 (en) Apparatus and method for controlling user interface of computing apparatus
JP6217000B2 (en) GAME DEVICE AND PROGRAM
WO2019207898A1 (en) Game control device, game system, and program
US9229614B2 (en) Storage medium storing information processing program, information processing device, information processing system, and method for calculating specified position
JP2015150215A (en) Movement control device and program
JP2023082039A (en) Game program, game processing method and game terminal
CN114404986A (en) Method and device for controlling player character, electronic device and storage medium
JP5918285B2 (en) Movement control apparatus and program
KR20170001539A (en) Automatic aiming system and method for mobile game
CN113663326B (en) Aiming method and device for game skills
KR101834986B1 (en) Game system and method supporting disappearance processing
CN114404944A (en) Method and device for controlling player character, electronic device and storage medium
JP2018153467A (en) Information processing method, apparatus, and program for implementing that information processing method in computer
JP2020089496A (en) Game program, game processing method and game terminal
JP2019188118A (en) Game controller, game system and program
JP2019080928A (en) Game system, game control device, and program
JP6668425B2 (en) Game program, method, and information processing device
CN118512760A (en) Game interaction method and device and electronic equipment
CN117085317A (en) Interactive control method, device and equipment for game and storage medium

Legal Events

Date Code Title Description
AMND Amendment
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20181224

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20191120

Year of fee payment: 5