WO2015130042A1 - Method for controlling object, device for controlling object, and recording medium - Google Patents

Method for controlling object, device for controlling object, and recording medium Download PDF

Info

Publication number
WO2015130042A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
operation information
movement path
drag
manipulation
Prior art date
Application number
PCT/KR2015/001441
Other languages
French (fr)
Korean (ko)
Inventor
최민식
Original Assignee
(주)네오위즈게임즈
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)네오위즈게임즈
Publication of WO2015130042A1 publication Critical patent/WO2015130042A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present invention relates to an object control method, an object control apparatus, and a recording medium.
  • A conventional object control apparatus displays a plurality of objects on the screen and controls the movement of an object by receiving operation information for moving an object, such as a character or an item, from its current position to another position, or for targeting another object, such as a character or a structure, located at a different position.
  • When an obstacle exists between the current position of the object to be moved and the target final destination position, so that the final destination cannot be reached by moving the object in a straight line, such a conventional object control apparatus provides a manipulation method for moving the object along a curve and the corresponding object movement control.
  • However, in the conventional object control apparatus, because the movement distance and the movement trajectory are determined by only one user operation, there has been a problem in that the object cannot be moved precisely to the target position; for example, the object may move too far beyond, or stop short of, the target position.
  • Another object of the present invention is to provide an object control method, an object control apparatus, and a recording medium capable of accurately moving an object to a target position through two different pieces of operation information.
  • In one aspect, the present invention provides an object control method of an object control apparatus, including an object operation information input step in which the input unit of the object control apparatus receives first operation information according to a first operation on an object, or receives second operation information according to a second operation on the object that is distinguished from the first operation;
  • an object movement path defining step in which the controller of the object control apparatus, depending on whether the second operation information has been input, defines an object movement path of the object according to the first operation information, or according to the first operation information and the second operation information; and an object display control step in which the controller controls screen display of the object on the display unit according to the defined object movement path.
  • In another aspect, the present invention provides a computer-readable recording medium recording a program for executing the object control method of the object control apparatus, the program implementing a function of receiving first operation information according to a first operation on an object, or receiving second operation information according to a second operation on the object that is distinguished from the first operation; a function of defining an object movement path of the object according to the first operation information, or according to the first operation information and the second operation information; and a function of displaying the object moving according to the defined object movement path.
  • In yet another aspect, the present invention provides an object control apparatus including an input unit for receiving first operation information according to a first operation on an object, or second operation information according to a second operation on the object that is distinguished from the first operation;
  • a control unit defining an object movement path of the object according to the first operation information, or defining an object movement path of the object according to the first operation information and the second operation information;
  • and a display unit which displays the object moving according to the defined object movement path.
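  • As a rough illustration of how these three units could fit together in software, the following TypeScript sketch declares hypothetical interfaces for the input unit, control unit, and display unit; all identifiers and types are assumptions made for illustration and are not taken from the patent.

```typescript
// Hypothetical interfaces mirroring the input unit, control unit, and display
// unit described above. Names and types are assumptions, not the patent's API.
interface Point { x: number; y: number; }

interface FirstOperationInfo { direction: Point; dragLength: number; }
interface SecondOperationInfo { dragLength: number; }

type ObjectMovementPath = Point[]; // sampled positions along the path

interface InputUnit {
  // Resolves with the first operation info and, when a second touch is still
  // held at the moment the first touch is released, the second operation info.
  readOperations(): Promise<{ first: FirstOperationInfo; second?: SecondOperationInfo }>;
}

interface ControlUnit {
  // Defines a linear path from the first operation alone, or a curved path
  // from the first and second operations together.
  definePath(first: FirstOperationInfo, second?: SecondOperationInfo): ObjectMovementPath;
}

interface DisplayUnit {
  // Animates the object along the defined object movement path.
  showMovingObject(path: ObjectMovementPath): void;
}
```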
  • FIG. 1 is a schematic block diagram of an object control apparatus according to embodiments of the present invention.
    • FIG. 2 illustrates a system environment of an object control apparatus according to embodiments of the present invention.
  • FIG. 3 is a flowchart of an object control method according to embodiments of the present invention.
    • FIGS. 4 and 5 are diagrams illustrating an object control method according to an embodiment of the present invention.
  • FIG. 6 is a view showing an object control method according to another embodiment of the present invention.
  • FIG. 1 is a schematic block diagram of an object control apparatus 100 according to embodiments of the present invention.
  • The object control apparatus 100 according to embodiments of the present invention is an apparatus for controlling the movement path of an object, and includes an input unit 110 for receiving either one piece of user operation information (first operation information) or two pieces of operation information (first operation information and second operation information) for controlling the object movement path;
  • a control unit 120 that defines the object movement path based on the received one piece of operation information (first operation information) or two pieces of operation information (first operation information and second operation information);
  • and a display unit 130 that displays the object moving along the object movement path defined by the control unit 120.
  • The input unit 110 described above receives first operation information according to one operation (first operation) on the object, or second operation information according to another operation (second operation) on the object that is distinguished from the first operation.
  • Here, the first operation information and the second operation information, which are two pieces of operation information, may be different pieces of operation information input at the same time.
  • For example, the first operation information and the second operation information may be information about different drag operations input together at the same time.
  • To this end, the display unit 130 includes a touch screen that supports multi-touch.
  • Depending on whether the second operation information, which is distinguished from the first operation information, has been input, the controller 120 defines the object movement path of the object according to the input first operation information alone when the second operation information is not input,
  • or defines the object movement path of the object according to the two pieces of input operation information (the first operation information and the second operation information) when the second operation information is input.
  • When the second operation information distinguished from the first operation information is not input, the controller 120 may define the object movement path of the object as a linear movement path according to the single piece of input operation information (the first operation information).
  • When the second operation information distinguished from the first operation information is input, the controller 120 may define the object movement path of the object as a curved movement path using the two pieces of input operation information, that is, the first operation information and the second operation information for that object.
  • In more detail, the controller 120 determines the movement direction and movement distance of the linear movement component of the object according to the first operation information, determines the curve trajectory of the object according to the second operation information, and, based on the movement direction, movement distance, and curve trajectory thus determined, defines a curved movement path as the object movement path. That is, the curved movement path may be defined by the movement direction, the movement distance, and the curve trajectory.
  • When the object movement path has been defined by the controller 120 in this way, the display unit 130 displays the object moving along the defined object movement path (the linear movement path or the curved movement path).
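  • A minimal sketch of this path-definition logic is given below. It assumes a quadratic Bezier curve as one possible curve trajectory and treats the bend amount as zero when no second operation information was input; the specific curve family and the function names are assumptions for illustration, not the patent's implementation.

```typescript
// Sketch: define the object movement path from the first operation (movement
// direction and distance) and, optionally, the second operation (bend amount).
// The quadratic Bezier curve used here is an assumed choice of curve trajectory.
interface Vec2 { x: number; y: number; }

function defineObjectMovementPath(
  start: Vec2,        // initial position P1
  moveDir: Vec2,      // unit vector, opposite to the first drag direction
  moveDist: number,   // movement distance D derived from the first drag length
  bendAmount = 0,     // 0 when no second operation information was input
  samples = 32,
): Vec2[] {
  const end: Vec2 = {
    x: start.x + moveDir.x * moveDist,
    y: start.y + moveDir.y * moveDist,
  };
  if (bendAmount === 0) {
    return [start, end]; // linear movement path
  }
  // Offset the midpoint perpendicular to the movement direction by the bend
  // amount, then sample a quadratic Bezier through that control point.
  const normal: Vec2 = { x: -moveDir.y, y: moveDir.x };
  const ctrl: Vec2 = {
    x: (start.x + end.x) / 2 + normal.x * bendAmount,
    y: (start.y + end.y) / 2 + normal.y * bendAmount,
  };
  const path: Vec2[] = [];
  for (let i = 0; i <= samples; i++) {
    const t = i / samples;
    const a = (1 - t) * (1 - t);
    const b = 2 * (1 - t) * t;
    const c = t * t;
    path.push({
      x: a * start.x + b * ctrl.x + c * end.x,
      y: a * start.y + b * ctrl.y + c * end.y,
    });
  }
  return path; // curved movement path
}
```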
  • The object control apparatus 100 according to the embodiments of the present invention illustrated in FIG. 1 may be, for example, a game device that receives one or two pieces of operation information according to a user manipulation method for controlling the movement path of an object and controls the movement path of the object on that basis, and it may be implemented in the form of a user terminal.
  • The object control apparatus 100 may provide the object control method on its own without interworking with other devices such as a server or a terminal, or may provide the object control method through interworking with such other devices.
  • When the object control apparatus 100 according to the embodiments of the present invention provides the object control method through interworking with other devices such as a server or a terminal, the system may be configured as shown schematically in FIG. 2.
  • FIG. 2 is a diagram illustrating a system environment of an object control apparatus 100 according to embodiments of the present invention.
  • In the system environment shown in FIG. 2, when the object control apparatus 100 provides the object control method through interworking with another device 200 such as a server or a terminal, the object information and object image of the object, as well as the information and images of various other objects, structures, and the like, may be stored in the object control apparatus 100 or in the other device 200.
  • In addition, after the movement of the object is controlled, result information, for example information about a collision between the object and another object or structure caused by the movement, may be stored in the object control apparatus 100 or the other device 200, or displayed on the screen.
  • Meanwhile, the other device 200 or the object control apparatus 100 may store various information for performing object control or functions related thereto.
  • For example, such information may further include the user's object retention information, information on each object, and information related to the user (for example, information about a user ID, user service history, points, and experience values).
  • The object referred to in the present specification may be, for example, an in-game character, an avatar, an object, an item, or an item possessed by a character (e.g., a weapon). In such a case, the screen mentioned in the present specification may be a game screen.
  • The object control apparatus 100 according to embodiments of the present invention may be, for example, a mobile terminal such as a smartphone, tablet PC, or portable terminal, or a user terminal such as a general computer, for example a personal computer (PC) or laptop.
  • The object control method according to embodiments of the present invention may be a method provided through the execution of a game application.
  • FIG. 3 is a flowchart of an object control method according to embodiments of the present invention.
  • Referring to FIG. 3, the object control method according to embodiments of the present invention includes an object manipulation information input step (S310), an object movement path definition step (S320), and an object display control step (S330).
  • In the object manipulation information input step (S310), the input unit 110 of the object control apparatus 100 receives first manipulation information according to a first operation on the object, or receives second manipulation information according to a second operation on the object that is distinguished from the first operation.
  • Here, the first manipulation information and the second manipulation information may be, for example, information about different drag operations input together at the same time.
  • In the object movement path definition step (S320), the control unit 120 of the object control apparatus 100, depending on whether the second manipulation information has been input, may define the object movement path of the object according to the first manipulation information, or according to the first manipulation information and the second manipulation information.
  • In the object movement path definition step (S320), the control unit 120 defines a linear movement path as the object movement path according to the first manipulation information when the second manipulation information is not input, and may define a curved movement path as the object movement path according to the first manipulation information and the second manipulation information when the second manipulation information is input.
  • In the object display control step (S330), the control unit 120 of the object control apparatus 100 controls the screen display of the object on the display unit 130 according to the object movement path defined in the previous step.
  • The first manipulation information input in the object manipulation information input step (S310) may include, for example, information about the drag direction and drag length with which the user drags the corresponding object.
  • Accordingly, the controller 120 determines the direction opposite to the drag direction of the user operation identified from the first manipulation information as the movement direction of the linear movement component of the object.
  • Also, the controller 120 may determine a length corresponding to the drag length of the user operation identified from the first manipulation information as the movement distance of the linear movement component of the object.
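  • This relationship can be captured in a few lines; in the sketch below the constant K and the helper name are assumptions for illustration (the embodiments only require that the movement distance correspond to the drag length, for example D = K * L1 with K > 1).

```typescript
// Sketch: derive the linear movement component from the first drag operation.
// The movement direction is the opposite of the drag direction, and the
// movement distance is proportional to the drag length. K is an assumed value.
interface Vec2 { x: number; y: number; }

const K = 2; // assumed scale factor: movement distance D = K * drag length L1

function linearMovementComponent(dragStart: Vec2, dragEnd: Vec2) {
  const dx = dragEnd.x - dragStart.x;
  const dy = dragEnd.y - dragStart.y;
  const dragLength = Math.hypot(dx, dy); // L1
  if (dragLength === 0) return null;     // no drag, nothing to derive
  return {
    // Movement direction: opposite of the drag direction (P1' -> P1).
    direction: { x: -dx / dragLength, y: -dy / dragLength },
    // Movement distance: a length corresponding to the drag length (D = K * L1).
    distance: K * dragLength,
  };
}
```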
  • As described above, once the movement direction and movement distance of the linear movement component of the object are determined according to the first manipulation information, the final destination at which the object arrives after completing its movement from the initial position is fixed, regardless of the movement pattern (movement path) along which the object moves.
  • If no obstacle that blocks the movement of the object exists on the line segment connecting the initial position of the object and the final destination position reached after the movement is completed, it would be most desirable for the user to input only the first manipulation information through the first operation, so that the object moves from the initial position to the final destination position along a linear movement path (object movement path).
  • However, if an obstacle that blocks the movement of the object exists on that line segment, the object may not be able to move in a straight line from the initial position to the final destination position.
  • In this case, a manipulation method is needed in which the user performs the first operation and the second operation together to input the first manipulation information and the second manipulation information at the same time, thereby creating a curved movement path (object movement path) that allows the object to avoid the obstacle when moving from the initial position to the final destination position.
  • For example, suppose the object is a character or a weapon item, and the object is moved in a straight line from the initial position to the final destination position using only the first manipulation information in order to attack another character or structure at the final destination. An obstacle that exists on the line connecting the initial position and the final destination position reached after the movement is completed may then prevent the object from striking the other character or structure at the final destination.
  • In such a case as well, the user needs a manipulation method in which the first operation and the second operation are performed together, so that the first manipulation information and the second manipulation information are input at the same time and a curved movement path (object movement path) is created along which the object can move from the initial position to the final destination position without colliding with the obstacle.
  • To this end, in addition to the user's first operation that determines the movement direction and movement distance of the linear movement component of the object, the present invention can further provide another user operation (second operation) that creates a movement trajectory allowing the object to move around obstacles.
  • The input unit 110 receives the first operation information when the touch used for the first operation, performed by touching and dragging a first point of the screen, is released from the screen.
  • At the time the first operation information is input, the input unit 110 determines whether there is a second operation in which a second point of the screen has been touched and dragged and the touch has not yet been released from the screen; if so, the second operation information according to that second operation may be input at the same time.
  • In other words, if a second operation that touched and dragged a second point of the screen before the input time of the first operation information is maintained without being released from the screen even at the time the first operation information is input, the input unit 110 receives the second operation information according to that second operation at the same time as the first operation information.
  • Conversely, the input unit 110 receives the first operation information when the first operation of touching and dragging a first point of the screen ends with the touch being released from the screen; if the second operation that touched and dragged a second point of the screen is released from the screen before the input time of the first operation information, the second operation information according to the second operation is not input at the time the first operation information is input.
  • Since the first operation and the second operation that create the curved movement path are touch operations performed at the same time, the object control apparatus should support multi-touch, for example using a mutual capacitance touch method.
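  • The sketch below shows one way such multi-touch input handling could be structured: when the touch that carried the first operation leaves the screen, the tracker reports the first operation information and attaches second operation information only if another touch is still held at that instant. The TouchSample type and the tracker API are assumptions for illustration, not the patent's or any platform's API.

```typescript
// Sketch: decide, at the moment the first touch is released, whether second
// operation info should accompany the first operation info, per the rule above.
interface TouchSample { id: number; x: number; y: number; }

class OperationTracker {
  private active = new Map<number, { start: TouchSample; last: TouchSample }>();

  touchDown(t: TouchSample): void {
    this.active.set(t.id, { start: t, last: t });
  }

  touchMove(t: TouchSample): void {
    const s = this.active.get(t.id);
    if (s) s.last = t;
  }

  // Called when a touch is lifted. The lifted touch supplies the first
  // operation; any touch still held on the screen at this instant supplies the
  // second operation info. If the other touch was already released, only the
  // first operation info is reported.
  touchUp(t: TouchSample): { first: TouchSample[]; second?: TouchSample[] } | null {
    const released = this.active.get(t.id);
    if (!released) return null;
    this.active.delete(t.id);
    const stillHeld = this.active.values().next().value; // another finger, if any
    return {
      first: [released.start, t],
      second: stillHeld ? [stillHeld.start, stillHeld.last] : undefined,
    };
  }
}
```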
  • FIGS. 4 and 5 are diagrams illustrating an object control method according to an embodiment of the present invention.
  • FIG. 4 illustrates the user's first operation 410 and second operation 420 for creating a curved movement path 400 along which the object A moves from its current position (first position, P1) to the final destination position P3.
  • First, in order to determine the final destination position P3 of the object A, the user performs a drag operation 410 (first operation) on the object A over a certain distance.
  • Accordingly, the direction P1' -> P1 opposite to the drag direction P1 -> P1' of the drag operation 410 (first operation) on the object A is determined as the movement direction of the object A, and the movement distance D of the object A is determined.
  • The movement distance D may be equal to the drag length L1 of the first operation 410, or may be a constant (K) multiple of the drag length L1 (D = K * L1, K > 1).
  • In other words, the drag operation 410 (first operation) on the object A determines the final destination position P3 at which the object A will arrive after completing its movement.
  • Then, depending on which object movement path the user wants the object A to follow from the current position P1 to the final destination position P3, the user may perform the second operation 420 (e.g., a drag operation), or may operate so that no second operation information is input.
  • If the user wants to move the object A from the current position P1 to the final destination position P3 along a linear movement path 440, the second operation 420 is not performed at all, or the touch used for the second operation 420 is released from the screen before the touch used for the first operation 410 is released. In that case, only the first operation information is input at the moment the touch used for the first operation 410 leaves the screen. Accordingly, the input unit 110 receives only the first operation information, and the control unit 120 defines the linear movement path 440 using only the first operation information, according to the drag direction P1 -> P1' and the drag length L1 of the first operation 410.
  • On the other hand, if the user wants to move the object A from the current position P1 to the final destination position P3 along the curved movement path 400, the second operation information according to the second operation 420 should also be input in addition to the first operation information according to the first operation 410.
  • For example, when the user performs the first operation 410 with a finger of the left hand and, at the same time, performs the second operation 420 with a finger of the right hand by dragging the movement direction indicator 430 of the object A displayed according to the first operation, or the object A itself, the curved movement path 400 bent in the drag direction of the second operation 420 is determined.
  • In this case, the input unit 110 receives the first operation information and the second operation information at the same time, and the control unit 120 defines the curved movement path 400 using the first operation information and the second operation information together.
  • Referring to FIG. 4, the longer the drag length L2 of the second operation 420, the greater the degree of bending (curvature) of the curved movement path. To this end, the second operation information of the second operation 420, input together with the first operation information of the first operation 410, may include, for example, information about the drag length L2.
  • According to the drag length L2 identified from the second operation information of the second operation 420, the control unit 120 determines the curve trajectory, and defines the curved movement path 400 corresponding to the curve trajectory along which the object A moves from the initial position P1 to the final destination position P3.
  • Here, the curved movement path 400 may mean the same thing as the curve trajectory, or may mean the portion of the curve trajectory that connects the initial position P1 and the final destination position P3.
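  • A simple way to realize this relationship is to scale the bend of the curve trajectory with the drag length L2, as in the sketch below; the linear scaling, the clamp value, and the function names are assumptions for illustration.

```typescript
// Sketch: the longer the drag length L2 of the second operation, the greater
// the bending of the curved movement path. Scale factor and clamp are assumed.
function bendFromSecondDrag(dragLengthL2: number, scale = 0.5, maxBend = 300): number {
  return Math.min(dragLengthL2 * scale, maxBend); // longer L2 -> larger bend
}

// The sign of the bend can follow the drag direction of the second operation,
// so that dragging downward bends the path downward and dragging upward bends
// it upward, as illustrated in FIGS. 4 and 5.
function signedBend(dragLengthL2: number, draggedUpward: boolean): number {
  const magnitude = bendFromSecondDrag(dragLengthL2);
  return draggedUpward ? magnitude : -magnitude;
}
```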
  • FIG. 5 shows a case in which an obstacle 500 exists not only on the linear movement path 440 from the current position P1 to the final destination position P3, but also on the curved movement path 400 of the object A shown in FIG. 4. In order to move the object A to the target B (e.g., a character, a structure, a base, etc.) located at the final destination position P3, or to strike the target B located at the final destination position P3, FIG. 5 illustrates a second operation 420' that produces a curved movement path 400' whose degree of bending is greater than that of the curved movement path 400 of FIG. 4, so that the object A can move without being blocked by the obstacle 500.
  • The curved movement paths 400 and 400' illustrated in FIGS. 4 and 5 bend downward; if the movement direction indicator 430 or the object A is dragged upward instead, the curved movement paths 400 and 400' bend upward.
  • FIG. 6 is a view showing an object control method according to another embodiment of the present invention.
  • FIG. 6 illustrates a user operation method for creating object movement paths 600, 600', 600'', ... along which the object A moves from the current position P1 to the final destination position P3.
  • First, in order to determine the final destination position P3 of the object A, the user drags the object A by a certain distance L1 in a certain direction P1 -> P1' (610; first operation).
  • Accordingly, the direction P1' -> P1 opposite to the drag direction P1 -> P1' of the drag operation 610 (first operation) on the object A is determined as the movement direction of the object A, and the movement distance D of the object A is determined.
  • The movement distance D may be equal to the drag length L1 of the first operation, or may be a constant (K) multiple of the drag length L1 (D = K * L1, K > 1).
  • Then, depending on which object movement path the user wants the object A to follow from the current position P1 to the final destination position P3, the user may perform a second operation 620 (e.g., a drag operation), or may operate so that no second operation information according to the second operation 620 is input.
  • If the user wants to move the object A from the current position P1 to the final destination position P3 along a linear movement path 600, the second operation 620 is not performed at all, or the touch used for the second operation 620 is released from the screen before the touch used for the first operation 610 is released. In that case, only the first operation information is input at the moment the touch used for the first operation 610 leaves the screen. Accordingly, the input unit 110 receives only the first operation information, and the control unit 120 defines the linear movement path 600 using only the first operation information, according to the drag direction P1 -> P1' and the drag length L1 of the first operation 610.
  • On the other hand, if the user wants to move the object A from the current position P1 to the final destination position P3 along a curved movement path, the second operation information according to the second operation should also be input in addition to the first operation information according to the first operation.
  • To this end, the user performs a second operation 620 of dragging, by a specific length, the point 630 displayed around the object A according to the first operation 610.
  • The position of the point 630 dragged according to the second operation 620 may correspond to one of the curve bending level values (Lv 0 to Lv 10) of the various stages on the curve bending level setting bar 640, and the user may drag the point 630 by the length corresponding to the curve bending level value of the desired stage.
  • Accordingly, the second operation information of the second operation 620 includes drag operation information on the point 630 displayed around the object A, and this drag operation information may include drag position information (e.g., a, b, c, d, e) or the corresponding curve bending level value (e.g., Lv 0 to Lv 10).
  • For example, second operation information according to the second operation 620 may be input that includes the drag position information c, or the curve bending level value Lv 5 corresponding to the drag position information c.
  • In this case, based on the drag position information (c) included in the drag operation information of the second operation information of the second operation 620, or the corresponding curve bending level value (Lv 5), the control unit 120 determines the curve trajectory corresponding to the curve bending level value Lv 5, and can thereby define the curved movement path 600' along which the object A moves from the initial position P1 to the final destination position P3.
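  • A sketch of how the curve bending level could be derived from the drag on the point 630 and converted into a bend for the curve trajectory follows; the bar length in pixels and the bend added per level are assumed values, while the eleven levels Lv 0 to Lv 10 come from the embodiment described above.

```typescript
// Sketch: quantize the second-operation drag on the point 630 into one of the
// curve bending levels Lv 0 .. Lv 10 on the setting bar 640, then turn the
// chosen level into a bend amount for the curve trajectory.
const LEVEL_COUNT = 11;      // Lv 0 .. Lv 10, as in the figure
const BAR_LENGTH_PX = 200;   // assumed on-screen length of the setting bar
const BEND_PER_LEVEL = 30;   // assumed bend (in pixels) added per level

// Map how far the point 630 has been dragged along the bar to a level value.
function bendLevelFromDrag(dragDistancePx: number): number {
  const clamped = Math.max(0, Math.min(dragDistancePx, BAR_LENGTH_PX));
  return Math.round((clamped / BAR_LENGTH_PX) * (LEVEL_COUNT - 1));
}

// Convert the level (e.g. Lv 5 for drag position "c") into the bend amount
// used when building the curved movement path.
function bendFromLevel(level: number): number {
  return level * BEND_PER_LEVEL;
}

// Example: a drag of 100 px along the assumed 200 px bar lands on Lv 5, and
// bendFromLevel(5) yields a bend of 150 px.
```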
  • The object control method described above may be executed by an application basically installed in the object control apparatus 100 (for example, an application included in a platform or an operating system basically mounted on the object control apparatus 100), or by an application (that is, a program) that the user installs directly on the object control apparatus 100 through an application providing server such as an application store server, or an application or web server associated with the corresponding service, and that is compatible with the operating system of the object control apparatus 100.
  • Here, the operating system of the object control apparatus 100 may be an operating system such as Windows or Macintosh installed on a general PC such as a desktop, or a mobile operating system such as iOS or Android installed on a mobile terminal such as a smartphone or tablet PC.
  • In this sense, the object control method according to the above-described embodiments of the present invention may be implemented as an application (that is, a program) basically installed in the object control apparatus 100 or directly installed by the user, and may be recorded on a recording medium readable by a computer such as the object control apparatus 100.
  • The program implementing the object control method according to the embodiments of the present invention implements a function of receiving the first operation information according to the first operation on the object, or receiving the second operation information according to the second operation on the object that is distinguished from the first operation; a function of defining the object movement path of the object according to the first operation information, or according to the first operation information and the second operation information; and a function of displaying the object moving along the defined object movement path. In addition, the program can execute all functions corresponding to the object control method according to the embodiments of the present invention described above.
  • Such a program is recorded on a recording medium that can be read by a computer such as the object control apparatus 100, and is read and executed by the computer, so that the above functions can be executed.
  • To this end, the above-described program may be coded in a computer language, such as C, C++, JAVA, or machine language, that a processor (CPU) of the computer can read.
  • Such code may include functional code related to the functions that define the capabilities described above, and may include execution-procedure-related control code necessary for the processor of the computer to execute those capabilities according to a predetermined procedure.
  • In addition, the code may further include memory-reference-related code indicating at which location (address) of the computer's internal or external memory any additional information or media required for the processor of the computer to execute the above-described functions should be referenced.
  • Moreover, when the processor of the computer needs to communicate with another remote computer or server in order to execute the above-described functions, the code may further include communication-related code indicating how to communicate with that remote computer or server using the computer's wired and/or wireless communication module, and what information or media should be transmitted and received during the communication.
  • Such codes and the code segments associated with them may be easily inferred or modified by programmers in the technical field to which the present invention belongs, in consideration of the system environment of the computer that reads the recording medium and executes the program.
  • The computer-readable recording medium on which such a program is recorded may be distributed to computer systems connected through a network, so that the computer-readable code is stored and executed in a distributed manner.
  • In this case, any one or more of the plurality of distributed computers may execute some of the functions presented above and transmit the results of that execution to one or more of the other distributed computers, and a computer receiving those results may likewise execute some of the functions presented above and provide its results to the other distributed computers.
  • The computer-readable recording medium recording the program for executing the object control method according to the embodiments of the present invention may be, for example, a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, or an optical media storage device.
  • In addition, the computer-readable recording medium recording the application, which is the program for executing the object control method according to the embodiments of the present invention, may be a storage medium (e.g., a hard disk) included in an application provider server such as an application store server, or an application or web server associated with the corresponding service; it may also be the application providing server itself, another computer recording the program, or the storage medium of such a computer.
  • The computer that can read the recording medium recording the application, which is the program for executing the object control method according to the embodiments of the present invention, may include not only a general PC such as a desktop or laptop but also mobile terminals such as smartphones, tablet PCs, personal digital assistants (PDAs), and mobile communication terminals, and should be interpreted as encompassing all computing devices.
  • When the computer that can read the recording medium recording the application, which is the program for executing the object control method, is a mobile terminal such as a smartphone, tablet PC, personal digital assistant (PDA), or mobile communication terminal, the mobile terminal may download and install the corresponding application from an application providing server such as an application store server or a web server; in some cases, the application may be downloaded from the application providing server to a general PC and then installed on the mobile terminal.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method for controlling an object that can accurately move the object to a targeted position, a device for controlling an object, and a recording medium.

Description

Object control method, object control device, and recording medium
The present invention relates to an object control method, an object control apparatus, and a recording medium.
A conventional object control apparatus displays a plurality of objects on the screen and controls the movement of an object by receiving operation information for moving an object, such as a character or an item, from its current position to another position, or for targeting another object, such as a character or a structure, located at a different position.
When an obstacle exists between the current position of the object to be moved and the target final destination position, so that the final destination cannot be reached by moving the object in a straight line, such a conventional object control apparatus provides a manipulation method for moving the object along a curve and the corresponding object movement control.
However, in the conventional object control apparatus, because the movement distance and the movement trajectory are determined by only one user operation, there has been a problem in that the object cannot be moved precisely to the target position; for example, the object may move too far beyond, or stop short of, the target position.
Against this background, an object of the present invention is to provide an object control method, an object control apparatus, and a recording medium capable of accurately moving an object to a target position.
Another object of the present invention is to provide an object control method, an object control apparatus, and a recording medium capable of accurately moving an object to a target position through two different pieces of operation information.
In order to achieve the above object, in one aspect, the present invention provides an object control method of an object control apparatus, including: an object operation information input step in which an input unit of the object control apparatus receives first operation information according to a first operation on an object, or receives second operation information according to a second operation on the object that is distinguished from the first operation; an object movement path defining step in which a controller of the object control apparatus, depending on whether the second operation information has been input, defines an object movement path of the object according to the first operation information, or according to the first operation information and the second operation information; and an object display control step in which the controller controls screen display of the object on a display unit according to the defined object movement path.
In another aspect, the present invention provides a computer-readable recording medium recording a program for executing the object control method of the object control apparatus, the program implementing: a function of receiving first operation information according to a first operation on an object, or receiving second operation information according to a second operation on the object that is distinguished from the first operation; a function of defining an object movement path of the object according to the first operation information, or according to the first operation information and the second operation information; and a function of displaying the object moving according to the defined object movement path.
In yet another aspect, the present invention provides an object control apparatus including: an input unit for receiving first operation information according to a first operation on an object, or second operation information according to a second operation on the object that is distinguished from the first operation; a control unit for defining an object movement path of the object according to the first operation information, or according to the first operation information and the second operation information; and a display unit for displaying the object moving according to the defined object movement path.
As described above, according to the present invention, there is an effect of providing an object control method, an object control apparatus, and a recording medium capable of accurately moving an object to a target position.
In addition, according to the present invention, there is an effect of providing an object control method, an object control apparatus, and a recording medium capable of accurately moving an object to a target position through two different pieces of operation information.
FIG. 1 is a schematic block diagram of an object control apparatus according to embodiments of the present invention.
FIG. 2 illustrates a system environment of an object control apparatus according to embodiments of the present invention.
FIG. 3 is a flowchart of an object control method according to embodiments of the present invention.
FIGS. 4 and 5 are diagrams illustrating an object control method according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an object control method according to another embodiment of the present invention.
Hereinafter, some embodiments of the present invention will be described in detail with reference to the accompanying exemplary drawings. In adding reference numerals to the components of each drawing, it should be noted that the same components are given the same reference numerals as far as possible, even when they appear in different drawings. In addition, in describing the present invention, when it is determined that a detailed description of a related well-known configuration or function may obscure the gist of the present invention, the detailed description thereof is omitted.
In describing the components of the present invention, terms such as first, second, A, B, (a), and (b) may be used. These terms are only intended to distinguish one component from other components, and the nature, sequence, or order of the corresponding components is not limited by the terms. When a component is described as being "connected", "coupled", or "linked" to another component, that component may be directly connected or linked to the other component, but it should be understood that yet another component may also be "connected", "coupled", or "linked" between the two components.
FIG. 1 is a schematic block diagram of an object control apparatus 100 according to embodiments of the present invention.
Referring to FIG. 1, the object control apparatus 100 according to embodiments of the present invention is an apparatus for controlling the movement path of an object, and includes an input unit 110 for receiving either one piece of user operation information (first operation information) or two pieces of operation information (first operation information and second operation information) for controlling the object movement path; a control unit 120 that defines the object movement path based on the received one piece of operation information (first operation information) or two pieces of operation information (first operation information and second operation information); and a display unit 130 that displays the object moving along the object movement path defined by the control unit 120.
The input unit 110 described above receives first operation information according to one operation (first operation) on the object, or second operation information according to another operation (second operation) on the object that is distinguished from the first operation.
Here, the first operation information and the second operation information, which are two pieces of operation information, may be different pieces of operation information input at the same time.
For example, the first operation information and the second operation information may be information about different drag operations input together at the same time.
To this end, the display unit 130 includes a touch screen that supports multi-touch.
Depending on whether the second operation information, which is distinguished from the first operation information, has been input, the control unit 120 described above defines the object movement path of the object according to the input first operation information alone when the second operation information is not input, or defines the object movement path of the object according to the two pieces of input operation information (the first operation information and the second operation information) when the second operation information is input.
When the second operation information distinguished from the first operation information is not input, the control unit 120 may define the object movement path of the object as a linear movement path according to the single piece of input operation information (the first operation information).
When the second operation information distinguished from the first operation information is input, the control unit 120 may define the object movement path of the object as a curved movement path using the two pieces of input operation information, that is, the first operation information and the second operation information for that object.
In more detail, the control unit 120 determines the movement direction and movement distance of the linear movement component of the object according to the first operation information, determines the curve trajectory of the object according to the second operation information, and, based on the movement direction, movement distance, and curve trajectory thus determined, defines a curved movement path as the object movement path. That is, the curved movement path may be defined by the movement direction, the movement distance, and the curve trajectory.
When the object movement path has been defined by the control unit 120 in this way, the display unit 130 displays the object moving along the defined object movement path (the linear movement path or the curved movement path).
The object control apparatus 100 according to the embodiments of the present invention illustrated in FIG. 1 may be, for example, a game device that receives one or two pieces of operation information according to a user manipulation method for controlling the movement path of an object and controls the movement path of the object on that basis, and it may be implemented in the form of a user terminal.
The object control apparatus 100 according to the embodiments of the present invention illustrated in FIG. 1 may provide the object control method on its own without interworking with other devices such as a server or a terminal, or may provide the object control method through interworking with such other devices.
When the object control apparatus 100 according to the embodiments of the present invention provides the object control method through interworking with other devices such as a server or a terminal, the system may be configured as shown schematically in FIG. 2.
FIG. 2 is a diagram illustrating a system environment of an object control apparatus 100 according to embodiments of the present invention.
In the system environment shown in FIG. 2, when the object control apparatus 100 according to embodiments of the present invention provides the object control method through interworking with another device 200 such as a server or a terminal, the object information and object image of the object, as well as the information and images of various other objects, structures, and the like, may be stored in the object control apparatus 100 or in the other device 200.
In addition, after the movement of the object is controlled, result information, for example information about a collision between the object and another object or structure caused by the movement, may be stored in the object control apparatus 100 or the other device 200, or displayed on the screen.
Meanwhile, the other device 200 or the object control apparatus 100 may store various information for performing object control or functions related thereto. For example, such information may further include the user's object retention information, information on each object, and information related to the user (for example, information about a user ID, user service history, points, and experience values).
The object referred to in the present specification may be, for example, an in-game character, an avatar, an object, an item, or an item possessed by a character (e.g., a weapon). In such a case, the screen mentioned in the present specification may be a game screen.
In addition, the object control apparatus 100 according to embodiments of the present invention may be, for example, a mobile terminal such as a smartphone, tablet PC, or portable terminal, or a user terminal such as a general computer, for example a personal computer (PC) or laptop.
The object control method according to embodiments of the present invention may be a method provided through the execution of a game application.
Below, the object control method provided by the object control apparatus 100 according to the embodiments of the present invention briefly described above is described in more detail with reference to FIGS. 3 to 6.
도 3은 본 발명의 실시예들에 따른 객체 제어 방법의 흐름도이다. 3 is a flowchart of an object control method according to embodiments of the present invention.
도 3을 참조하면, 본 발명의 실시예들에 따른 객체 제어 방법은, 객체 조작정보 입력 단계(S310), 객체 이동경로 정의 단계(S320), 객체 표시 제어 단계(S330) 등을 포함한다. Referring to FIG. 3, an object control method according to embodiments of the present disclosure includes an object manipulation information input step S310, an object movement path definition step S320, an object display control step S330, and the like.
In the object manipulation information input step (S310), the input unit 110 of the object control apparatus 100 receives first manipulation information according to a first manipulation of an object, or receives second manipulation information according to a second manipulation of the object that is distinguished from the first manipulation.
Here, the first manipulation information and the second manipulation information may be, for example, information on different drag manipulations that are input together at the same point in time.
In the object movement path definition step (S320), depending on whether the second manipulation information has been input, the control unit 120 of the object control apparatus 100 defines the object movement path of the object according to the first manipulation information alone, or according to both the first manipulation information and the second manipulation information.
In the object movement path definition step (S320), when the second manipulation information has not been input, the control unit 120 defines a straight movement path as the object movement path according to the first manipulation information; when the second manipulation information has been input, the control unit 120 defines a curved movement path as the object movement path according to the first manipulation information and the second manipulation information.
In the object display control step (S330), the control unit 120 of the object control apparatus 100 controls the on-screen display of the object on the display unit 130 according to the object movement path defined in the previous step.
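Purely as an illustration (the disclosure does not prescribe any particular implementation), the decision made in step S320 could be sketched in Python roughly as follows; the names OperationInfo and define_path_kind are assumptions introduced here for clarity, not part of the disclosed embodiment.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    Point = Tuple[float, float]

    @dataclass
    class OperationInfo:
        """Drag information captured for one touch (assumed structure)."""
        start: Point  # touch-down position
        end: Point    # position when the touch was released (or last sampled)

    def define_path_kind(first_op: OperationInfo,
                         second_op: Optional[OperationInfo]) -> str:
        """S320: a straight path when only the first manipulation was input,
        a curved path when first and second manipulation info arrive together."""
        if second_op is None:
            return "straight"   # placeholder for the straight-path definition
        return "curved"         # placeholder for the curved-path definition

    # second_op is None when the second touch was lifted (or never made)
    # before the first-manipulation touch was released.
    print(define_path_kind(OperationInfo((0, 0), (3, 0)), None))                     # straight
    print(define_path_kind(OperationInfo((0, 0), (3, 0)),
                           OperationInfo((5, 5), (5, 7))))                           # curved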
The first manipulation information input in the object manipulation information input step (S310) may include, for example, information on the drag direction and the drag length with which the user dragged the object.
Accordingly, in the object movement path definition step (S320), the control unit 120 determines the direction opposite to the drag direction of the user manipulation identified from the first manipulation information as the movement direction of the straight-line movement component of the object.
Also, in the object movement path definition step (S320), the control unit 120 may determine a length corresponding to the drag length of the user manipulation identified from the first manipulation information as the movement distance of the straight-line movement component of the object.
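A minimal sketch of this determination, assuming a 2-D screen coordinate system and a scaling constant k chosen by the implementer (the disclosure only requires that the distance "correspond to" the drag length):

    import math
    from typing import Tuple

    Point = Tuple[float, float]

    def linear_component(drag_start: Point, drag_end: Point,
                         k: float = 1.0) -> Tuple[Tuple[float, float], float]:
        """Return (unit movement direction, movement distance) for the object.

        The movement direction is the opposite of the drag direction, and the
        movement distance is the drag length scaled by an assumed constant k.
        """
        dx = drag_end[0] - drag_start[0]
        dy = drag_end[1] - drag_start[1]
        drag_len = math.hypot(dx, dy)
        if drag_len == 0:
            return (0.0, 0.0), 0.0
        direction = (-dx / drag_len, -dy / drag_len)  # opposite of the drag
        return direction, k * drag_len

    # Example: dragging the object 100 px to the right moves it to the left.
    print(linear_component((0, 0), (100, 0), k=2.0))  # ((-1.0, -0.0), 200.0)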
As described above, once the movement direction and movement distance of the straight-line movement component of the object have been determined according to the first manipulation information, the final destination that the object reaches after completing its movement from its initial position is fixed, regardless of the shape of the movement pattern (movement path) along which the object moves.
If no obstacle obstructing the movement of the object exists on the line segment connecting the initial position of the object and the final destination position after the movement is completed, it will generally be most desirable for the user to input only the first manipulation information through the first manipulation, so that the object moves from the initial position to the final destination position along a straight movement path (object movement path).
However, if an obstacle obstructing the movement of the object exists on the line segment connecting the initial position of the object and the final destination position after the movement is completed, the object may not be able to move in a straight line from the initial position to the final destination position.
In this case, a manipulation method is needed that allows the user to perform the first manipulation and the second manipulation together, inputting the first manipulation information and the second manipulation information at the same time, so as to create a curved movement path (object movement path) along which the object can move from the initial position to the final destination position while avoiding the obstacle.
For example, suppose the object is a character or a weapon item, and the object is moved in a straight line from the initial position to the final destination position using only the first manipulation information in order to attack another character or structure at the final destination. If the attack cannot destroy the other character or structure at the final destination because of an obstacle on the line segment connecting the initial position of the object and the final destination position after the movement is completed, the user needs a manipulation that performs the first manipulation and the second manipulation together, inputting the first manipulation information and the second manipulation information at the same time, so as to create a curved movement path (object movement path) along which the object moves from the initial position to the final destination position without colliding with the obstacle.
Accordingly, in addition to the first manipulation by which the user determines the movement direction and movement distance of the straight-line movement component of the object, the present invention can further provide another user manipulation (the second manipulation) that shapes the movement trajectory so that the object can move around an obstacle.
Meanwhile, in the object manipulation information input step (S310), when a first manipulation of touching and dragging a first point on the screen is performed and the touch is then released from the screen, the input unit 110 receives the first manipulation information.
At this point, the input unit 110 determines whether, at the input time of the first manipulation information, there exists a second manipulation that has touched and dragged a second point on the screen and is still being held without the touch having been released from the screen; if such a manipulation is determined to exist, the input unit 110 can simultaneously receive the second manipulation information according to the second manipulation whose touch has not been released from the screen.
In other words, when a first manipulation of touching and dragging a first point on the screen is performed and the touch is then released, the input unit 110 receives the first manipulation information; and if a second manipulation that touched and dragged a second point on the screen before the input time of the first manipulation information is still being held without its touch having been released at the input time of the first manipulation information, the input unit 110 simultaneously receives the second manipulation information according to the second manipulation at the input time of the first manipulation information.
Conversely, when a first manipulation of touching and dragging a first point on the screen is performed and the touch is then released, the input unit 110 receives the first manipulation information; but if the second manipulation that touched and dragged the second point was released from the screen before the input time of the first manipulation information, the input unit 110 does not receive second manipulation information at the input time of the first manipulation information.
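The capture rule above could be sketched as follows; this is an assumption-laden illustration (the touch ids, the TouchTracker class, and the event names are invented here), not the disclosed implementation.

    from dataclasses import dataclass, field
    from typing import Dict, Optional, Tuple

    Point = Tuple[float, float]

    @dataclass
    class ActiveTouch:
        start: Point
        current: Point

    @dataclass
    class TouchTracker:
        """Tracks touches still on the screen, keyed by an assumed touch id."""
        active: Dict[int, ActiveTouch] = field(default_factory=dict)

        def touch_down(self, touch_id: int, pos: Point) -> None:
            self.active[touch_id] = ActiveTouch(start=pos, current=pos)

        def touch_move(self, touch_id: int, pos: Point) -> None:
            self.active[touch_id].current = pos

        def touch_up(self, touch_id: int) -> Tuple[ActiveTouch, Optional[ActiveTouch]]:
            """Called when the first-manipulation touch is released.

            Returns (first_op, second_op); second_op is None unless another
            touch is still held on the screen at this moment.
            """
            first_op = self.active.pop(touch_id)
            second_op = next(iter(self.active.values()), None)
            return first_op, second_op

    # Usage: the second touch is still down when the first is released.
    tracker = TouchTracker()
    tracker.touch_down(1, (10, 10)); tracker.touch_move(1, (60, 10))
    tracker.touch_down(2, (200, 200)); tracker.touch_move(2, (200, 260))
    first, second = tracker.touch_up(1)
    print(second is not None)  # True -> both manipulation infos are input together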
Below, a user manipulation method for creating a curved movement path is described with reference to FIGS. 4 to 6. Note that the first manipulation and the second manipulation for creating a curved movement path are touch manipulations performed at the same time, so the object control apparatus should be able to support multi-touch using, for example, a mutual-capacitance touch method.
FIGS. 4 and 5 are diagrams illustrating an object control method according to an embodiment of the present invention.
FIG. 4 illustrates the manipulation method for each of the user's first manipulation 410 and second manipulation 420 in the case where the user wishes to move the object A from its current position (initial position, P1) to the final destination position P3 along a curved movement path 400.
Referring to FIG. 4, the user first performs a drag manipulation 410 (the first manipulation), dragging the object A in a certain direction by a certain distance in order to set the final destination position P3 of the object A.
Accordingly, the direction (P1'->P1) opposite to the drag direction (P1->P1') of the drag manipulation 410 (the first manipulation) of the object A is determined as the movement direction of the object A.
Also, the movement length D of the object A is determined according to the drag length L1 of the drag manipulation 410 (the first manipulation) of the object A. Here, the movement length D of the object A may be equal to the drag length of the first manipulation 410 (L1 = |P1 - P1'|), or may be a constant multiple K of the drag length L1 of the first manipulation (D = K * L1, K > 1).
In this way, the drag manipulation 410 (the first manipulation) of the object A determines the final destination position P3 after the object A completes its movement.
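As a small worked sketch of how P3 follows from the first manipulation alone (K is the scaling constant mentioned above; the coordinate values are illustrative assumptions):

    from typing import Tuple

    Point = Tuple[float, float]

    def final_destination(p1: Point, p1_prime: Point, k: float = 1.0) -> Point:
        """P3 = P1 moved by D = k*|P1 - P1'| in the direction P1' -> P1,
        i.e. P3 = P1 + k * (P1 - P1')."""
        return (p1[0] + k * (p1[0] - p1_prime[0]),
                p1[1] + k * (p1[1] - p1_prime[1]))

    # Dragging the object from P1=(100, 100) to P1'=(140, 100) with k=2
    # sends it 80 px in the opposite direction, to P3=(20, 100).
    print(final_destination((100, 100), (140, 100), k=2.0))  # (20.0, 100.0)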
If the user wants the object A to move from the current position P1 to the final destination position P3 along the straight movement path 440, the user may manipulate so that no second manipulation information according to a second manipulation 420 (e.g., a drag manipulation) is input.
That is, when the user wants to move the object A from the current position P1 to the final destination position P3 along the straight movement path 440, the user either does not perform the second manipulation 420 at all, or releases the touch of the second manipulation 420 from the screen before the touch of the first manipulation 410 is released; then, when the touch of the first manipulation 410 leaves the screen, only the first manipulation information is input. Accordingly, the input unit 110 receives only the first manipulation information, and the control unit 120 defines the straight movement path 440 according to the drag direction (P1->P1') and the drag length L1 of the first manipulation 410, using only the first manipulation information.
If the user wants to move the object A from the current position P1 to the final destination position P3 along the curved movement path 400, the user must input not only the first manipulation information according to the first manipulation 410 but also the second manipulation information according to the second manipulation 420.
In this regard, if the user performs the first manipulation 410 with a finger of the left hand and, at the same time, performs a second manipulation 420 of dragging, with a finger of the right hand, the movement direction indicator 430 of the object A displayed according to the first manipulation, or the object A itself, a curved movement path 400 bent in the drag direction of the second manipulation 420 is determined.
More specifically, while performing the first manipulation 410 with a finger of the left hand, the user must perform, with a finger of the right hand, the second manipulation 420 of dragging the movement direction indicator 430 of the object A or the object A itself. Then, at the moment the user releases the touch of the first manipulation 410 that dragged the object A from the screen (the input time of the first manipulation information), the user only needs to keep the touch of the second manipulation 420 on the screen without releasing it. Accordingly, the input unit 110 receives the first manipulation information and the second manipulation information at the same time, and the control unit 120 defines the curved movement path 400 using the first manipulation information and the second manipulation information together.
At this time, the degree of bending of the curved movement path (how strongly the curve bends) can vary according to the drag length (L2 = |P2 - P2'|) of the second manipulation 420. For example, the longer the drag length (L2 = |P2 - P2'|) of the second manipulation 420, the greater the degree of bending of the curved movement path can be. That is, referring to FIG. 4, the longer the drag length (L2 = |P2 - P2'|) of the second manipulation 420, the higher the height H of the curved movement path can be.
According to the above, the second manipulation information of the second manipulation 420 input together with the first manipulation information of the first manipulation 410 in the object manipulation information input step (S310) may include, for example, information on the drag length L2 by which the user dragged the movement direction indicator 430 displayed on the object A toward the movement direction (the direction (P1'->P1) opposite to the drag direction (P1->P1') included in the first manipulation information), or the drag length L2 by which the user dragged the object A.
After this object manipulation information input step (S310), in the object movement path definition step (S320), the control unit 120 determines a curve trajectory according to the drag length L2 identified from the second manipulation information of the second manipulation 420, thereby defining the curved movement path 400 corresponding to the curve trajectory along which the object A moves from the initial position P1 to the final destination position P3. The curved movement path 400 may have the same meaning as the curve trajectory, or may mean the portion of the curve trajectory that connects the initial position P1 and the final destination position P3.
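The disclosure does not specify how the curve trajectory is parameterized. One plausible sketch, assumed here purely for illustration, is a quadratic Bezier curve whose control point is offset perpendicular to the straight P1-P3 segment by an amount proportional to L2, so a longer second drag yields a taller curve H, and dragging up versus down flips the side of the bend; the constant bend_per_px is an assumed tuning value.

    from typing import List, Tuple

    Point = Tuple[float, float]

    def curved_path(p1: Point, p3: Point, l2: float, side: float = 1.0,
                    bend_per_px: float = 0.5, samples: int = 20) -> List[Point]:
        """Quadratic Bezier from p1 to p3; the control point sits at the midpoint,
        pushed sideways by an offset proportional to the second-drag length l2.
        `side` (+1/-1) reflects the drag direction of the second manipulation."""
        mx, my = (p1[0] + p3[0]) / 2, (p1[1] + p3[1]) / 2
        dx, dy = p3[0] - p1[0], p3[1] - p1[1]
        length = (dx * dx + dy * dy) ** 0.5 or 1.0
        nx, ny = -dy / length, dx / length           # unit normal to P1 -> P3
        offset = side * bend_per_px * l2             # curve "height" grows with L2
        cx, cy = mx + nx * offset, my + ny * offset  # Bezier control point
        path = []
        for i in range(samples + 1):
            t = i / samples
            x = (1 - t) ** 2 * p1[0] + 2 * (1 - t) * t * cx + t ** 2 * p3[0]
            y = (1 - t) ** 2 * p1[1] + 2 * (1 - t) * t * cy + t ** 2 * p3[1]
            path.append((x, y))
        return path

    # A longer L2 pushes the path further from the straight P1-P3 segment.
    print(curved_path((0, 0), (100, 0), l2=40)[10])  # (50.0, 10.0) at the midpoint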
FIG. 5 illustrates a second manipulation 420' for the case where an obstacle 500 exists not only on the straight movement path 440 from the current position P1 to the final destination position P3, but also on the curved movement path 400 of the object A as in FIG. 4. When the user wants to move the object A to the target B (e.g., a character, a structure, or a base) located at the final destination position P3, or to attack the target B located at the final destination position P3 with the object A, the second manipulation 420' makes the curve bend more strongly than the curved movement path 400 of FIG. 4, creating a curved movement path 400' along which the object A can move without being obstructed by the obstacle 500.
Referring to FIG. 5, as an example, when performing the second manipulation 420 of dragging the movement direction indicator 430 or the object A, the user can make the drag length (L2' = |P2 - P2''|) longer than the drag length L2 in FIG. 4, so that the resulting curved movement path 400' bends more strongly than the curved movement path 400 of FIG. 4.
Meanwhile, when the user performs the second manipulation 420, dragging the movement direction indicator 430 or the object A downward creates the curved movement paths 400 and 400' in the downward direction, as shown in FIGS. 4 and 5, while dragging the movement direction indicator 430 or the object A upward creates the curved movement paths 400 and 400' in the upward direction.
Below, another manipulation method related to the user's second manipulation, which creates a curved movement path so that the object A can move around an obstacle, is described.
FIG. 6 is a diagram illustrating an object control method according to another embodiment of the present invention.
FIG. 6 illustrates user manipulations for creating an object movement path (600, 600', 600'', ...) along which the object A moves from the current position P1 to the final destination position P3.
Referring to FIG. 6, the user first performs a drag manipulation 610 (the first manipulation), dragging the object A in a certain direction (P1->P1') by a certain distance L1 in order to set the final destination position P3 of the object A.
Accordingly, the direction (P1'->P1) opposite to the drag direction (P1->P1') of the drag manipulation 610 (the first manipulation) of the object A is determined as the movement direction of the object A.
Also, the movement length D of the object A is determined according to the drag length L1 of the drag manipulation 610 (the first manipulation) of the object A. Here, the movement length D of the object A may be equal to the drag length of the first manipulation (L1 = |P1 - P1'|), or may be a constant multiple K of the drag length L1 of the first manipulation (D = K * L1, K > 1).
Thereafter, the user may or may not perform a second manipulation 620 (e.g., a drag manipulation) in order to decide along which object movement path the object A will move from the current position P1 to the final destination position P3.
When the user wants to move the object A from the current position P1 to the final destination position P3 along the straight movement path 600, the user may manipulate so that no second manipulation information according to the second manipulation 620 is input.
That is, when the user wants to move the object A from the current position P1 to the final destination position P3 along the straight movement path 600, the user either does not perform the second manipulation 620 at all, or releases the touch of the second manipulation 620 from the screen before the touch of the first manipulation 610 is released; then, when the touch of the first manipulation 610 leaves the screen, only the first manipulation information is input. Accordingly, the input unit 110 receives only the first manipulation information, and the control unit 120 defines the straight movement path 600 according to the drag direction (P1->P1') and the drag length L1 of the first manipulation 610, using only the first manipulation information.
If the user wants to move the object A from the current position P1 to the final destination position P3 along a curved movement path (e.g., 600'), the user must input not only the first manipulation information according to the first manipulation 610 but also the second manipulation information according to the second manipulation 620.
For example, the user performs the first manipulation 610 with a finger of the left hand and, at the same time, performs a second manipulation 620 of dragging, by a certain length, a point 630 displayed around the object A according to the first manipulation 610.
The position of the point 630 dragged according to this second manipulation 620 may correspond to one of several curve-bending level values (Lv 0 to Lv 10) on a curve-bending level setting bar 640.
Therefore, the user only needs to drag the point 630 by the length corresponding to the curve-bending level value of the desired level.
That is, the second manipulation information of the second manipulation 620 includes drag manipulation information on the point 630 displayed around the object A.
Here, the drag manipulation information on the point 630 displayed around the object A may include drag position information (e.g., a, b, c, d, e) or the corresponding curve-bending level value (e.g., one of Lv 0 to Lv 10).
Referring to the example of FIG. 6, the user can perform a second manipulation 620 of dragging the point 630 to the position corresponding to position c on the curve-bending level setting bar 640, thereby inputting second manipulation information according to the second manipulation 620 that includes the drag position information (c) or the curve-bending level value (Lv 5) corresponding to the drag position information (c).
Accordingly, in the object movement path definition step (S320), the control unit 120 can determine the curve trajectory corresponding to the curve-bending level value (Lv 5) on the basis of the drag position information (c) or the corresponding curve-bending level value (Lv 5) included in the drag manipulation information contained in the second manipulation information of the second manipulation 620, thereby defining the curved movement path 600' along which the object A moves from the initial position P1 to the final destination position P3.
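A sketch of how the dragged position of the point 630 might be quantized onto the level bar 640 and then turned into a curvature for the path; the bar length, the 0-10 level range, and the per-level offset are all assumptions introduced for illustration, not values taken from the disclosure.

    def bending_level(drag_px: float, bar_length_px: float = 200.0,
                      max_level: int = 10) -> int:
        """Map how far the point was dragged along the level bar to a discrete
        curve-bending level (Lv 0 .. Lv max_level)."""
        frac = max(0.0, min(1.0, drag_px / bar_length_px))
        return round(frac * max_level)

    def bend_offset_for_level(level: int, offset_per_level_px: float = 12.0) -> float:
        """Turn a bending level into the sideways offset used for the curve
        (e.g., the Bezier control-point offset in the earlier sketch)."""
        return level * offset_per_level_px

    # Dragging the point to position "c" (assumed here to be halfway along the bar)
    # yields Lv 5, which in turn fixes the curvature of path 600'.
    lv = bending_level(drag_px=100.0)
    print(lv, bend_offset_for_level(lv))  # 5 60.0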
The object control method according to the above-described embodiments of the present invention may be executed by an application installed by default on the object control apparatus 100 (which may be included in, or compatible with, the platform or operating system basically mounted on the object control apparatus 100), and may also be executed by an application (i.e., a program) that is compatible with the operating system of the object control apparatus 100 and that the user installs directly on the object control apparatus 100 through an application providing server such as an application store server or a web server associated with the application or the corresponding service. Here, the operating system of the object control apparatus 100 may be an operating system such as Windows or Macintosh installed on a general PC such as a desktop, or a mobile operating system such as iOS or Android installed on a mobile terminal such as a smartphone or a tablet PC.
In this sense, the object control method according to the above-described embodiments of the present invention may be implemented as an application (i.e., a program) installed by default on the object control apparatus 100 or installed directly by the user, and may be recorded on a computer-readable recording medium of the object control apparatus 100 or the like.
A program implementing the object control method according to embodiments of the present invention executes a function of receiving first manipulation information according to a first manipulation of an object or receiving second manipulation information according to a second manipulation of the object that is distinguished from the first manipulation; a function of defining an object movement path of the object according to the first manipulation information, or according to the first manipulation information and the second manipulation information; and a function of displaying the object moving along the defined object movement path. In addition, the program can execute all functions corresponding to the object control method according to the embodiments of the present invention described above.
Such a program is recorded on a recording medium that can be read by a computer such as the object control apparatus 100 and is executed by a computer such as the object control apparatus 100, so that the functions described above can be executed.
In this way, in order for a computer such as the object control apparatus 100 to read the program recorded on the recording medium and execute the object control method according to the embodiments of the present invention implemented as a program, the above-described program may include code written in a computer language, such as C, C++, JAVA, or machine language, that can be read by the processor (CPU) of the computer.
Such code may include functional code related to the functions defining the operations described above, and may include execution-procedure-related control code necessary for the processor of the computer to execute the functions described above according to a predetermined procedure.
In addition, such code may further include memory-reference-related code indicating at which location (address) of the computer's internal or external memory the additional information or media necessary for the processor of the computer to execute the functions described above should be referenced.
In addition, when the processor of a computer such as the object control apparatus 100 needs to communicate with another remote computer or server in order to execute the functions described above, the code may further include communication-related code indicating how the processor of the computer should communicate with the remote computer or server using the communication module of the computer (e.g., a wired and/or wireless communication module), and what information or media should be transmitted and received during the communication.
Further, the functional program for implementing the present invention and the code and code segments related thereto may easily be inferred or modified by programmers in the technical field to which the present invention belongs, in consideration of the system environment of the computer that reads the recording medium and executes the program.
The computer-readable recording medium on which the program as described above is recorded may also be distributed over computer systems connected through a network, so that computer-readable code can be stored and executed in a distributed manner. In this case, one or more of the distributed computers may execute some of the functions presented above and transmit the execution results to one or more of the other distributed computers, and a computer that receives the results may likewise execute some of the functions presented above and provide its results to other distributed computers as well.
Examples of the computer-readable recording medium on which the program for executing the object control method according to the embodiments of the present invention as described above is recorded include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical media storage device.
In addition, the computer-readable recording medium on which the application, i.e., the program for executing the object control method according to the embodiments of the present invention, is recorded may be a storage medium (e.g., a hard disk) included in an application provider server such as an application store server or a web server associated with the application or the corresponding service, may be the application provider server itself, or may be another computer on which the program is recorded or its storage medium.
The computer capable of reading the recording medium on which the application, i.e., the program for executing the object control method according to the embodiments of the present invention, is recorded may be the object control apparatus 100, and may include not only general PCs such as desktops and laptops but also mobile terminals such as smartphones, tablet PCs, PDAs (Personal Digital Assistants), and mobile communication terminals; moreover, it should be interpreted as any device capable of computing.
If the computer capable of reading the recording medium on which the application, i.e., the program for executing the object control method according to the embodiments of the present invention, is recorded is a mobile terminal such as a smartphone, a tablet PC, a PDA (Personal Digital Assistant), or a mobile communication terminal, the mobile terminal can download and install the application from an application provider server including an application store server, a web server, or the like; in some cases, the application may be downloaded from the application provider server to a general PC and then installed on the mobile terminal through a synchronization program.
As described above, the present invention has the effect of providing an object control method, an object control apparatus, and a recording medium capable of moving an object precisely to a targeted position.
In addition, the present invention has the effect of providing an object control method, an object control apparatus, and a recording medium capable of moving an object precisely to a targeted position by means of two different pieces of manipulation information.
Even though all the components constituting the embodiments of the present invention have been described above as being combined into one or operating in combination, the present invention is not necessarily limited to such embodiments. That is, within the scope of the object of the present invention, all of the components may operate by being selectively combined into one or more. In addition, although each of the components may be implemented as an independent piece of hardware, some or all of the components may be selectively combined and implemented as a computer program having program modules that perform some or all of the functions combined in one or more pieces of hardware. The code and code segments constituting such a computer program can easily be inferred by those skilled in the art. Such a computer program may be stored on a computer-readable storage medium and read and executed by a computer, thereby implementing the embodiments of the present invention. The storage medium of the computer program may include a magnetic recording medium, an optical recording medium, and the like.
In addition, terms such as "include", "comprise", or "have" described above mean that the corresponding component may be present unless specifically stated otherwise, and should therefore be construed as possibly further including other components rather than excluding them. All terms, including technical and scientific terms, have the same meanings as commonly understood by those of ordinary skill in the art to which the present invention belongs, unless otherwise defined. Commonly used terms, such as those defined in dictionaries, should be interpreted as being consistent with their contextual meanings in the related art, and should not be interpreted in an ideal or excessively formal sense unless expressly defined as such in the present invention.
The above description is merely an illustrative explanation of the technical idea of the present invention, and those of ordinary skill in the art to which the present invention belongs will be able to make various modifications and variations without departing from the essential characteristics of the present invention. Therefore, the embodiments disclosed in the present invention are intended to describe, not to limit, the technical idea of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be interpreted according to the following claims, and all technical ideas within a scope equivalent thereto should be interpreted as being included in the scope of rights of the present invention.
CROSS-REFERENCE TO RELATED APPLICATION
This patent application claims priority under 35 U.S.C. § 119(a) to Korean Patent Application No. 10-2014-0022517, filed in the Republic of Korea on February 26, 2014, the entire contents of which are incorporated herein by reference. In addition, if this patent application claims priority for the same reasons in countries other than the United States, the entire contents thereof are likewise incorporated into this patent application by reference.

Claims (11)

  1. An object control method of an object control apparatus, the method comprising:
    an object manipulation information input step in which an input unit of the object control apparatus receives first manipulation information according to a first manipulation of an object, or receives second manipulation information according to a second manipulation of the object that is distinguished from the first manipulation;
    an object movement path definition step in which a control unit of the object control apparatus defines, depending on whether the second manipulation information has been input, an object movement path of the object according to the first manipulation information, or an object movement path of the object according to the first manipulation information and the second manipulation information; and
    an object display control step in which the control unit controls screen display of the object on a display unit according to the defined object movement path.
  2. The method of claim 1, wherein,
    in the object movement path definition step, the control unit
    defines a straight movement path as the object movement path according to the first manipulation information when the second manipulation information has not been input, and
    defines a curved movement path as the object movement path according to the first manipulation information and the second manipulation information when the second manipulation information has been input.
  3. The method of claim 2, wherein,
    in the object movement path definition step, the control unit
    determines a movement direction and a movement distance of a straight-line movement component of the object according to the first manipulation information, determines a curve trajectory of the object according to the second manipulation information, and
    defines the curved movement path as the object movement path on the basis of the determined movement direction, movement distance, and curve trajectory.
  4. The method of claim 3, wherein
    the first manipulation information
    includes information on a drag direction and a drag length with which the object was dragged.
  5. The method of claim 4, wherein,
    in the object movement path definition step, the control unit
    determines a direction opposite to the drag direction as the movement direction of the straight-line movement component of the object, and determines a length corresponding to the drag length as the movement distance of the straight-line movement component of the object.
  6. The method of claim 3, wherein
    the second manipulation information
    includes information on a drag length by which a movement direction indicator, displayed on the object toward the movement direction according to the first manipulation information, was dragged, or a drag length by which the object was dragged.
  7. The method of claim 6, wherein,
    in the object movement path definition step, the control unit
    determines the curve trajectory of the object according to the drag length.
  8. The method of claim 3, wherein
    the second manipulation information
    includes drag manipulation information on a point displayed around the object,
    wherein the drag manipulation information includes drag position information according to the second manipulation or a curve-bending level value corresponding to the drag position information.
  9. The method of claim 8, wherein,
    in the object movement path definition step, the control unit
    determines the curve trajectory of the object on the basis of the drag position information or the curve-bending level value included in the drag manipulation information.
  10. A computer-readable recording medium on which a program for implementing the object control method according to claim 1 is recorded.
  11. An object control apparatus comprising: an input unit that receives first manipulation information according to a first manipulation of an object, or receives second manipulation information according to a second manipulation of the object that is distinguished from the first manipulation;
    a control unit that defines an object movement path of the object according to the first manipulation information, or defines an object movement path of the object according to the first manipulation information and the second manipulation information; and
    a display unit that displays the object moving along the defined object movement path.
PCT/KR2015/001441 2014-02-26 2015-02-12 Method for controlling object, device for controlling object, and recording medium WO2015130042A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140022517A KR101566323B1 (en) 2014-02-26 2014-02-26 Method, apparatus, and recording medium for controlling object
KR10-2014-0022517 2014-02-26

Publications (1)

Publication Number Publication Date
WO2015130042A1 true WO2015130042A1 (en) 2015-09-03

Family

ID=54009311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/001441 WO2015130042A1 (en) 2014-02-26 2015-02-12 Method for controlling object, device for controlling object, and recording medium

Country Status (2)

Country Link
KR (1) KR101566323B1 (en)
WO (1) WO2015130042A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009056181A (en) * 2007-08-31 2009-03-19 Sega Corp Game machine
JP2009142510A (en) * 2007-12-14 2009-07-02 Namco Bandai Games Inc Program, information storage medium, and game device
KR20110050606A (en) * 2011-04-07 2011-05-16 이정대 Soccer simulation game to control when a person
KR20130127146A (en) * 2012-05-14 2013-11-22 삼성전자주식회사 Method for processing function correspond to multi touch and an electronic device thereof

Also Published As

Publication number Publication date
KR20150101166A (en) 2015-09-03
KR101566323B1 (en) 2015-11-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15754752

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15754752

Country of ref document: EP

Kind code of ref document: A1