WO2017145570A1 - Game program - Google Patents

Game program

Info

Publication number
WO2017145570A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
game program
position information
flick
action
Prior art date
Application number
PCT/JP2017/001232
Other languages
French (fr)
Japanese (ja)
Inventor
真一 上山
Original Assignee
株式会社コロプラ
Priority date
Filing date
Publication date
Application filed by 株式会社コロプラ
Publication of WO2017145570A1 publication Critical patent/WO2017145570A1/en

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/812Ball games, e.g. soccer or baseball

Definitions

  • the present invention relates to a game program for operating an object in a game space in response to a specific touch operation on a touch display.
  • Patent Document 1 discloses a technique for advancing a soccer game in which the user designates, each time on a touch display, the character to act and a target position.
  • It also discloses a technique for performing an action (pass, dribble, shoot, etc.) on a moving body (ball) according to the destination of the moving body designated by a touch operation and the type of the touch operation.
  • Patent Document 2 discloses a technique in which, when a touch operation moves from touchdown by a predetermined stroke distance at a predetermined speed or more, the touch operation is determined to be a flick operation and the player character is moved toward the movement target.
  • An object of the present invention is to provide a game program capable of operating a game object at will while maintaining simple operability.
  • A game program that, when executed by a computer having a touch display, causes the computer to execute a method comprising: a step of detecting arbitrary position information on the touch display and repeatedly acquiring the touch position information at each predetermined frame rate; a step of determining first operation information based on a predetermined number of pieces of the touch position information; a step of performing a first action on a moving body in the game space based on the first operation information; a step of determining a touch-off from the touch display; a step of determining second operation information using the touch position information in at least one frame immediately before the touch-off; a step of determining a flick operation based on the second operation information; and a step of releasing the first action in response to the flick operation and performing a second action on the moving body based on the second operation information. The first action moves the moving body so as to move in the game space together with a character, and the second action moves the moving body so as to move in the game space away from the character. A game program having this configuration is thereby obtained.
  • According to the present invention, there is provided a game program capable of operating a game object at will while maintaining simple operability.
  • a game program according to an embodiment of the present invention has the following configuration.
  • (Item 1) A game program that, when executed by a computer having a touch display, causes the computer to execute a method comprising: a step of detecting arbitrary position information on the touch display and repeatedly acquiring the touch position information at each predetermined frame rate; a step of determining first operation information based on a predetermined number of pieces of the touch position information; a step of performing a first action on a moving body in the game space based on the first operation information; a step of determining a touch-off from the touch display; a step of determining second operation information using the touch position information in at least one frame immediately before the touch-off; a step of determining a flick operation based on the second operation information; and a step of releasing the first action in response to the flick operation and performing a second action on the moving body based on the second operation information, wherein the first action moves the moving body so as to move in the game space together with a character, and the second action moves the moving body so as to move in the game space away from the character.
  • The second operation information includes a flick direction in the at least one immediately preceding frame, and in the step of executing the second action, the second action is executed in a direction in the game space corresponding to the flick direction.
  • The second operation information includes a flick distance in the at least one immediately preceding frame; a flick operation type is determined based on the flick distance; and in the step of executing the second action, an action associated with the flick operation type is executed.
  • The step of performing the second action on the moving body further includes: determining whether the position of the character is included in a predetermined action area in the game space; and associating the second action with a specific action type when the position of the character is included in the predetermined action area. In the step of executing the second action, an action associated with the specific action type is executed.
  • the game program is applied to a soccer game,
  • the first action is a “dribbling” action performed on the moving body by the character;
  • the second action is a “kick” action performed on the moving body by the character; and
  • the specific action type is associated with a “shoot” action.
  • The flick operation type includes a “strong” flick operation and a “weak” flick operation. In the second action, the “strong” flick operation is associated with a “high trajectory” and the “weak” flick operation with a “low trajectory”.
  • In the game program according to any one of items 1 to 9, the method further includes a step of, in accordance with the step of repeatedly acquiring the touch position information, generating and displaying an image in association with first touch position information detected at touch-on and second touch position information currently detected at each frame rate.
  • The image is an image of an elastic body having a base and a tip; the base is associated with the first touch position information and the tip with the second touch position information for each frame, so that the image is displayed such that the elastic body deforms as the second touch position information changes through a slide operation.
  • In the game program according to item 10 or 11, the step of generating and displaying the image further includes a step of performing visual processing on the image of the elastic body according to the distance between the base and the tip.
  • The step of generating and displaying the image further includes a step of displaying, together with the image, an index indicating the slide direction based on the current second touch position information.
  • FIG. 1 shows a block diagram of a basic configuration of a computer 100 according to an embodiment of the present invention.
  • the computer 100 is preferably a device including a touch display such as a smartphone or a tablet computer. It will be understood by those skilled in the art that the term “touch display” is interchangeable with “touch panel” and “touch screen” herein.
  • the computer 100 includes a touch display 102, a processing unit 108, a storage unit 110, and a communication unit 112.
  • the touch display 102 includes a display unit 104 for displaying various data stored in the storage unit 110 and various images generated by the processing unit 108.
  • The touch display 102 also includes the contact detection unit 106, which detects contact of an object such as a user's finger or a stylus with the display unit 104 (mainly physical contact through touch, swipe, slide, flick, drag, and tap operations).
  • Structurally, the contact detection unit 106 may be disposed as a layer over the display unit 104, which may be a liquid crystal panel, an organic EL display, or a plasma display.
  • the contact detection unit 106 may employ a pressure detection method, a resistance film method, a capacitance method, an electromagnetic induction method, or the like.
  • When the contact detection unit 106 receives input by contact of an object such as a finger or a stylus, it detects the amount of change in pressure, electrical resistance, capacitance, elastic-wave energy, or the like at the contact position, and the corresponding contact position coordinates on the display unit 104 are specified.
  • The processing unit 108 may be configured as a processor; it reads a computer program, such as a user interface image display program or the game program according to the embodiment of the present invention, from the storage unit 110 or the like, and executes processing according to the computer program. The specific configuration and operation of the processing unit 108 will be described later.
  • The storage unit 110 can temporarily or permanently store game programs and various data related to users and the game, and executes registration, update, deletion, and the like of information in response to commands from the processing unit 108. Examples of the storage unit 110 include, but are not limited to, main storage such as ROM (Read Only Memory) and RAM (Random Access Memory), and auxiliary storage such as a hard disk, flash memory, or optical disk.
  • the communication unit 112 can be used to exchange game-related data with an external server (not shown) or the like via a communication network such as the Internet.
  • FIG. 2 is a block diagram showing the detailed configuration of the computer 100 according to an embodiment of the present invention.
  • the computer 100 includes the touch display 102 including the display unit 104 and the contact detection unit 106, the processing unit 108, the storage unit 110, and the communication unit 112.
  • the processing unit 108 includes a touch position information acquisition unit 201, an operation information determination unit 202, a dribble operation execution unit 203, a flick operation determination unit 204, a kick operation execution unit 205, and a UI image generation unit 206 as functional blocks. Details of processing executed by each component will be described later.
  • Although the processing unit 108 also includes basic function units, such as an animation generation unit that generates animations during game progress, their illustration is omitted.
  • Each functional block is implemented in software and is realized by the processing unit 108 shown in FIG. 1 reading and executing each program module stored in the storage unit 110.
  • FIG. 3 is an example of a game screen when the game program of one embodiment is applied to a soccer game.
  • A soccer field FD, captured from above (the vertical direction of the space) by a virtual camera (not shown), is displayed as the game space.
  • the score “1-0” and the elapsed time “second half 01:12” are displayed as additional information.
  • Various game objects that can be operated by the user are arranged. Examples of objects include a moving body (soccer ball) MB, own characters CH11 to CH12 of the own team, opponent characters CH21 to CH23 of the opponent team, the goal GL, and the like. Each character is displayed with a position such as “FW”.
  • The ball MB is associated with the player character CH12, and an active mark MK (a triangle mark in the figure) is displayed on the player character CH12.
  • The own character CH12 on which the active mark MK is displayed can be operated through the user's touch operations. Examples of the player character's actions include dribbling and kicking the ball, and movement in the field (positioning). The kick includes various actions such as pass, clear, centering, and shoot.
  • Field coordinates are defined as the game space coordinates. That is, each arranged object, such as the ball, the own characters CH11 to CH12, the opponent characters CH21 to CH23 of the opponent team, and the goal GL, has coordinate information.
  • a user interface image corresponding to a touch operation by the user (hereinafter referred to as “UI image”) is displayed at the bottom of the screen.
  • the UI image is associated with a motion command to the player character CH12.
  • In a slide operation, the UI image may relate the slide direction to the character's movement direction in the game space and the slide distance to the character's movement speed, which makes it possible to provide the user with intuitive operability.
  • The “game object” (also simply referred to as “object”) mainly includes the moving body (ball) and the characters arranged in the game space, and can be operated by the user.
  • The term “slide” operation is a generic term for a series of touch operations in which the user touches the touch display 102 with an object such as a finger (touch-on), maintains the contact while changing its position (touch-now), and then releases the object (touch-off).
  • An operation of sliding an object on the touch display 102 for a certain period after touch-on is referred to as a “swipe” operation.
  • An operation of touching off after sliding an object on the touch display 102 at a predetermined speed or more is referred to as a “flick” operation. That is, in this specification, the concept of the “slide” operation encompasses the concepts of the “swipe” operation and the “flick” operation.
  • FIGS. 4 and 5 show an overview of information processing for operating an object in the game space.
  • FIG. 4 shows a schematic operation example of an object on the two-dimensional field FD according to one embodiment
  • FIG. 5 shows a flowchart of information processing corresponding to the object operation.
  • the character CH12 of the own team is activated and operated with the ball (or away from the ball). That is, in the state (a), the character CH12 is associated with the ball MB.
  • Through the swipe operation, the ball MB is moved together with the character CH12 in the slide direction (the lower-right direction in the figure) on the two-dimensional field FD.
  • the ball MB is moved by the “dribbling” operation of the character CH12.
  • the association of the character CH12 with the ball MB is released in response to the flick operation on the touch display. That is, the “dribbling” operation is released.
  • The ball MB is moved away from the character CH12 in the flick direction (the right direction in the figure) on the two-dimensional field FD. That is, only the ball MB is moved away from the character CH12 by the “kick” action on the ball MB.
  • The user can thus perform a series of operations, from “dribbling” to “kick” on the ball, through a slide operation on the touch display, which provides the user with simple and intuitive touch operability.
  • the character CH11 is associated with the ball MB automatically or with a predetermined user action.
  • The information processing includes the stages of a touch-on phase (S301 to S302), a touch-now phase (S303 to S307), and a touch-off phase (S308 to S311).
  • The contact detection unit 106 detects contact at each predetermined frame rate, and touch-on, touch-now, and touch-off are determined accordingly (S301).
  • the touch position information acquisition unit 201 acquires the touch position information as a touch-on state and stores it in the storage unit 110 (S302).
  • the touch position information in the touch-on state can be an arbitrary position in the touch area on the touch display 102, and the user may touch a favorite position.
  • the touch now phase corresponds to the swipe operation state in the slide operation by the user.
  • the contact detection unit 106 continuously detects contact at every predetermined frame rate (for example, 30 fps) (S303). Then, the touch position information acquisition unit 201 repeatedly acquires touch position information for each frame rate as a touch now state (S304). In particular, among the acquired touch position information in the touch now state, the latest predetermined number of touch position information is stored in the storage unit 110.
  • the operation information determination unit 202 determines swipe operation information corresponding to each frame using the stored touch position information (S305).
  • the UI image generation unit 206 continuously generates UI images for each frame rate and displays them on the touch display 102 (S306).
  • the dribble motion execution unit 203 performs an operation on the object in the game space based on the swipe operation information (S307).
  • the character CH12 is caused to perform a “dribbling” action in the swipe direction (lower right direction in the figure) with the ball MB.
  • the processes of S304 to S307 are repeatedly performed for each predetermined frame rate.
  • When the user releases the finger from the touch display (S308), the touch position information acquisition unit 201 can no longer acquire touch position information and therefore determines the touch-off state. At that time, the UI image disappears (not shown).
  • The operation information determination unit 202 determines flick (slide) operation information using the stored touch position information in at least one frame immediately before the touch-off (S309). Based on the flick operation information, the flick operation determination unit 204 determines whether the immediately preceding operation is a flick operation and, if so, the flick operation type (S310). In the case of a flick operation, the kick action execution unit 205 performs an action on an object in the game space based on the flick operation information (S311). In the example of FIG. 4B, the character CH12 is caused to “kick” the ball MB and move the ball MB in the flick direction.
  • FIGS. 6A and 6B show a storage example when the touch position information acquired by the touch position information acquisition unit 201 in S302, S304, and S309 is stored.
  • a predetermined number of pieces of touch position information are stored in the array table fp in the storage unit 110 for each predetermined frame rate (for example, 30 fps).
  • the initial position coordinates are stored in the table for each slide operation.
  • In one embodiment, the touch position information may be acquired as xy position coordinates in the touch area on the touch display 102. In general, the touch area (the size of the touch display 102) differs for each portable terminal 100 used.
  • Accordingly, the xy position coordinates are preferably adjusted so that, for example, the center of the touch area is the origin (0, 0), the vertical upper end is +1.0 and the vertical lower end is −1.0, and the horizontal touch area range is determined on the same scale.
  • For example, when the portable terminal is held vertically, the horizontal touch area range is approximately −0.6 (left end) to +0.6 (right end).
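  As an illustrative sketch (not part of the claimed disclosure), the coordinate adjustment described above might look like the following; the function name and the 1080x1920 example screen are assumptions:

```python
def normalize_touch(px, py, width, height):
    """Map raw pixel coordinates to device-independent touch coordinates.

    The origin is the centre of the touch area; the vertical axis spans
    -1.0 (lower end) to +1.0 (upper end), and the horizontal axis uses
    the same scale, so its range depends on the aspect ratio.
    """
    half_h = height / 2.0
    x = (px - width / 2.0) / half_h
    y = (height / 2.0 - py) / half_h  # pixel y grows downward
    return (x, y)

# On a 9:16 portrait screen the horizontal range is about -0.56 to +0.56,
# close to the roughly -0.6 to +0.6 range mentioned above.
```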
  • swipe operation information is determined in S305, and flick operation information is determined in S309.
  • As the array table fp, a storage area is provided for storing eleven pieces of touch position information, one per frame rate, from fp[0] to fp[10].
  • the number of pieces of touch position information is not limited and can be an arbitrary number.
  • First, the position coordinates (x0, y0) at touch-on in the touch area are stored in fp[0]. At this point, fp[1] to fp[10] still hold null (empty) values, but position coordinates are stored sequentially at each frame rate as the touch-now state progresses. When the storage areas are exhausted, the position coordinate data for each frame rate is sequentially overwritten again from fp[0], and the oldest position coordinate data is deleted in turn.
  • The touch-off determination is performed when no position value is physically acquired and a null value is stored. In FIG. 6A, at least fp[5] has been overwritten and updated with a null value, so the touch-off determination is performed. If touch-off is determined, the acquisition of touch position information for the series of slide operations is completed.
  • In one embodiment, the determination logic may be configured to determine touch-off only when null values are stored continuously over a plurality of frames.
  • The reason for this configuration is that, when the slide operation is complicated, it can readily be assumed that the finger separates from the touch display for a moment. For example, in a “dribbling” action in a soccer game (for example, FIG. 4A), turning actions of the character and the like are performed frequently, so complicated slide operations are expected.
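  A minimal sketch of such a tolerant touch-off check, assuming touch-off is declared only after a chosen number of consecutive frames without a position sample (the default of two frames is an assumed parameter, not a value from the disclosure):

```python
def is_touch_off(recent_samples, required_nulls=2):
    """Declare touch-off only when the last `required_nulls` frames all
    lack a position sample (stored as None), so that a momentary loss
    of contact during a complicated slide is not treated as touch-off."""
    if len(recent_samples) < required_nulls:
        return False
    return all(p is None for p in recent_samples[-required_nulls:])
```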
  • FIG. 6B shows, following the touch-off determination of FIG. 6A, an example of storing two patterns: (1) the case where touch-off is determined and the touch-on state is then entered again, and (2) the case where touch-off is not determined and the slide operation is regarded as continuing. In both (1) and (2), (x16, y16) and (x17, y17) are not stored in fp[5] and fp[6]; instead, a null value is stored.
  • When the next touch position coordinates (x18, y18) are received, (x18, y18) are stored in fp[7]. Thereafter, the position coordinates are sequentially stored from fp[8] at each frame rate (arrows in the figure).
  • In case (1), the initial position coordinates are overwritten and updated with (x18, y18), whereas in case (2) the initial position coordinates are maintained as (x0, y0).
  • It should be noted that the array table fp is not initialized even after the touch-off determination in (1). That is, it is preferable to maintain the latest touch position information of the previous slide operation, such as (x14, y14) and (x15, y15), so that it can be used in separate processing.
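  The array table fp described above can be pictured as a small ring buffer that overwrites its oldest slot and is never re-initialized; the class and method names below are illustrative, not from the disclosure:

```python
class TouchRing:
    """Fixed-size buffer of the most recent touch positions, one entry
    per frame; None marks a frame in which no contact was detected."""

    def __init__(self, size=11):          # fp[0] .. fp[10]
        self.slots = [None] * size
        self.index = 0                    # next slot to overwrite

    def push(self, position):
        self.slots[self.index] = position
        self.index = (self.index + 1) % len(self.slots)

    def last(self, n):
        """Return the n most recently stored entries, oldest first."""
        size = len(self.slots)
        start = self.index - n
        return [self.slots[(start + i) % size] for i in range(n)]
```

  Because old slots are only overwritten, the latest positions of the previous slide operation remain available after touch-off, as noted above.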
  • FIG. 7 shows details of the determination of the flick operation information in S309. FIGS. 8 and 9 show details of the flick operation determination in S310. FIGS. 10 to 13 show details of the action performed on the object in S311. FIGS. 14 to 16 show details of the UI image generation in S306.
  • FIG. 7 shows a detailed flowchart of the flick operation information determination process (S309) by the operation information determination unit 202.
  • When the touch-off state is identified, first, the association so far between the character and the ball is released (S401). As a result, the character's “dribbling” action involving the ball MB is released, and the “kick” action corresponding to the flick operation becomes possible (for example, FIG. 4B).
  • Touch position information in at least one frame immediately before the touch-off is extracted from the storage unit (S402). Using the extracted touch position information, a slide direction and a slide distance are determined as flick operation information (S403).
  • In the examples of FIGS. 6A and 6B, flick operation information can be determined for up to the latest ten frames. Specifically, the flick operation information of the tenth frame immediately before the touch-off is determined using the touch position information (position coordinates) (x14, y14) of fp[3] and (x15, y15) of fp[4]. Further, the flick operation information of the ninth frame immediately before is determined using the position coordinates (x13, y13) of fp[2] and (x14, y14) of fp[3].
  • The flick operation information includes, for example, the xy-coordinate movement direction (that is, the flick direction), the xy-coordinate movement amount (that is, the flick distance), and the xy-coordinate movement speed (that is, the flick speed, calculated from the coordinate movement amount per frame).
  • the flick operation information is preferably calculated using the immediately preceding two or more frames.
  • the flick direction is preferably used so as to correspond to the direction in which the object is operated in the game space
  • the flick distance is preferably used so as to determine the speed of the action on the object.
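  A sketch of deriving the flick direction, distance, and speed from two touch positions stored immediately before touch-off; the 30 fps frame interval is an assumption taken from the example earlier in the description:

```python
import math

def flick_info(p_prev, p_last, frame_dt=1.0 / 30):
    """Compute flick direction (unit vector), distance, and speed from
    two consecutive touch positions in normalized coordinates."""
    dx = p_last[0] - p_prev[0]
    dy = p_last[1] - p_prev[1]
    distance = math.hypot(dx, dy)
    direction = (dx / distance, dy / distance) if distance else (0.0, 0.0)
    speed = distance / frame_dt           # movement amount per second
    return direction, distance, speed
```

  The direction can then drive the direction of the action in the game space, and the distance the strength of the action, as described above.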
  • FIG. 8 shows the detailed processing flow of S310.
  • FIG. 9 shows a graph example of flick (slide) operation information as an example.
  • the horizontal axis of the graph indicates the frame F in a predetermined frame rate unit, and the vertical axis indicates the slide distance (swipe distance or flick distance).
  • In FIG. 9, the touch-on state is detected in the first frame F[1], and the slide (swipe) distance is plotted as the touch-now state up to the (n−1)th frame F[n−1]. Since touch position information is not identified in the nth frame F[n], the slide distance cannot be calculated (N/A). As a result, touch-off is detected.
  • First, it is determined whether the slide distance in the flick operation information is greater than or equal to the threshold value TH1 (S501). Specifically, it is determined whether the slide (flick) distance plotted in the immediately preceding frame F[n−1] or the frame F[n−2] before it is greater than or equal to the threshold value TH1. If “Yes”, it is determined that a flick operation has been established, and the action on the object is executed through the subsequent processing. If “No”, it is determined that no flick operation has been established; no action is executed on the object, and the process ends as a swipe process.
  • The number of immediately preceding frames used for the determination may be any number from two up to the number of frames that can be traced back (ten in the example of FIGS. 6A and 6B). The reason for using a plurality of immediately preceding frames is to distinguish the flick from a tap operation, for which only the operation information of the single immediately preceding frame would be determined.
  • the flick direction may be determined using the frame used in the above determination.
  • For example, the direction in the latest frame may be determined as the flick direction; alternatively, when a plurality of frames are used for the determination, the direction in the oldest frame may be determined as the flick direction.
  • the average direction of each frame may be calculated and determined as the flick direction.
  • the flick operation type is further determined based on the flick distance.
  • In one embodiment, the flick operation type includes a “strong” flick operation and a “weak” flick operation. That is, it is determined whether the flick distance is greater than or equal to a further threshold value TH2 (> TH1) (S502). If “Yes”, the flick operation type is set to the “strong” flick operation (S503); if “No”, it is set to the “weak” flick operation (S504).
  • When a plurality of immediately preceding frames are used, the maximum flick distance among them is compared with the threshold value TH2. In the example of FIG. 9, the flick distance of F[n−1] is the maximum value and is greater than TH2, so the flick operation type is set to the “strong” flick operation.
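  The two-threshold determination of S501 to S504 might be sketched as follows; the numeric values of TH1 and TH2 are assumptions, and the peak per-frame distance over the immediately preceding frames is compared as described above:

```python
TH1 = 0.05  # minimum per-frame distance for a flick (assumed value)
TH2 = 0.15  # boundary between "weak" and "strong" (assumed value)

def classify_flick(recent_distances):
    """Return None when no flick is established (the slide ends as a
    plain swipe); otherwise return "strong" or "weak" based on the
    peak per-frame slide distance of the immediately preceding frames."""
    peak = max(recent_distances, default=0.0)
    if peak < TH1:
        return None                       # swipe only, no flick
    return "strong" if peak >= TH2 else "weak"
```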
  • the flick operation type is associated with the operation to be performed.
  • FIG. 10 shows a detailed processing flow of S311.
  • FIG. 11 shows an outline of an operation region (shoot range SR) related to determination of a specific action type.
  • FIG. 12 shows a character table 200 for defining character characteristics, and
  • FIG. 13 shows a list 300 to 500 of action tables related to the action of the character on the ball.
  • the table group shown in FIG. 12 and FIG. 13 is reflected on the mobile terminal 100 by downloading from the game server together with the game program through the communication unit 112 of the mobile terminal.
  • First, the character position in the game space at the time of touch-off is determined (S601), and it is determined whether the character position is included in the action area (S602). If “Yes”, it is further determined whether the direction in the game space corresponding to the slide direction acquired in S403 is toward the goal (S603). If “Yes”, the action of the specific action type “shoot” in the flick direction is associated with the flick operation (S604). If “No” in S602 or S603, the action of the default action type “kick” in the flick direction is associated with the flick operation (S605).
  • After the action type is associated with the flick operation, the actual action is executed according to the action type and the flick operation type set in S503 or S504 (S606).
  • When the action is executed, an animation of the character or the ball acting in the flick direction is reproduced according to the content of the action.
  • the position data of each object such as a character or a ball placed on the field FD is managed by field coordinates (XY coordinates).
  • The determinations in S602 and S603 are also preferably performed in this XY coordinate system.
  • the XY coordinate system may be any as long as it can define a two-dimensional plane in an orthogonal coordinate system, such as using the center of the field or the goal of the own team as the origin.
  • the player character CH12 is arranged so as to be included in the shoot range SR. In the action region of the shoot range SR, the player character CH12 performs a “shoot” action in response to the flick operation.
  • the shoot range SR includes the goal area GA and the penalty area PA, but is not necessarily limited to this. Further, the shape of the shoot range SR is not limited to a rectangle.
  • the shoot range SR is preferably configured differently for each character according to the characteristics of the character (player). Character characteristics are managed by a character table 200 in the storage unit 110, as shown in FIG. 12. Specifically, in addition to general character information such as the character ID, player name, team affiliation (own team / opponent team), and position, in one embodiment a shooting-power parameter (for example, 60/100) is also managed as a character characteristic. That is, the shoot range SR is preferably set for each character based on the shooting power.
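As a hedged sketch of how a per-character shoot range SR could be derived from the shooting-power parameter (the linear scaling and rectangular shape below are purely illustrative; the patent only states that SR is set per character based on shooting power):

```python
def shoot_range(goal_x, base_depth, base_width, shoot_power):
    """Return a rectangular shoot range (x_min, x_max, y_min, y_max) in
    field coordinates, scaled by the character's shooting power (0-100).

    The linear scaling is an assumption for illustration only.
    """
    scale = shoot_power / 100.0
    depth = base_depth * scale           # how far the range extends from the goal line
    half_width = base_width * scale / 2  # half of the range's width across the field
    return (goal_x - depth, goal_x, -half_width, half_width)

def in_range(rect, pos):
    """Check whether a field-coordinate position falls inside the range."""
    x_min, x_max, y_min, y_max = rect
    x, y = pos
    return x_min <= x <= x_max and y_min <= y <= y_max
```

A character with higher shooting power thus gets a larger area in which a flick resolves to "shoot" rather than the default "kick".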
  • the goal direction in S603 is defined as a direction toward the goal direction area GDA, which is formed by widening the directions from the position of the player character CH12 toward the goal posts (gd1, gd2, dotted lines) outward by an angle θ (solid lines).
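The goal-direction test of S603 can be sketched as follows, assuming field coordinates and ignoring angle wrap-around (which holds in the usual case where the character is in front of the goal); the function and parameter names are illustrative:

```python
import math

def is_goal_direction(char_pos, slide_dir, post1, post2, theta):
    """Check whether slide_dir from char_pos falls inside the goal
    direction area GDA: the angular sector between the two goal-post
    directions gd1 and gd2, widened outward by the angle theta.

    Assumes the sector spans less than pi radians, so simple angle
    comparison (no wrap-around handling) suffices.
    """
    def angle_to(p):
        return math.atan2(p[1] - char_pos[1], p[0] - char_pos[0])

    a1, a2 = sorted((angle_to(post1), angle_to(post2)))
    a = math.atan2(slide_dir[1], slide_dir[0])
    return (a1 - theta) <= a <= (a2 + theta)
```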
  • the operation according to the flick operation type and the action type is determined from the operation table group of FIG. 13 stored in the storage unit 110 as an example.
  • the table group includes a slide operation / motion table 300, a flick operation / motion table 400, and a flick operation / specific motion table 500, but is not necessarily limited thereto.
  • the slide operation / motion table 300 defines a default motion corresponding to a slide operation type.
  • for the slide operation type “flick”, a “kick” action is associated; that is, in the case of a flick operation, the “kick” action is determined as the default action.
  • the “dribbling” operation associated with the swipe operation is also determined by referring to the slide operation / operation table 300.
  • the flick operation / action table 400 defines default actions according to the flick operation type. A “high trajectory” is associated with the “strong flick” operation, and a “low trajectory” is associated with the “weak flick” operation.
  • the flick operation / specific action table 500 defines a specific action type according to the target action area. As an example, a specific action type “shoot” is associated with the flick operation in the shoot range SR. In the own penalty area PA2, a specific action type “clear” is associated with the flick operation.
  • various specific action types can be associated; for example, “centering” can be associated as a specific action type by defining an opponent-side corner area. If the player character CH is not located in any of the action areas defined in the flick operation / specific action table 500, the default action “kick” from the slide operation / action table 300 is performed as is.
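A minimal sketch of how tables 300, 400, and 500 could be combined at lookup time. The dictionary keys and the override order follow the examples above; everything else (names, the function signature) is assumed:

```python
# Minimal dictionaries standing in for tables 300, 400, and 500;
# the keys and values follow the examples in the text.
SLIDE_ACTION = {"swipe": "dribble", "flick": "kick"}           # table 300
FLICK_TRAJECTORY = {"strong": "high trajectory",
                    "weak": "low trajectory"}                   # table 400
AREA_ACTION = {"shoot_range": "shoot",
               "own_penalty_area": "clear"}                     # table 500

def resolve_action(slide_type, flick_strength=None, area=None):
    """Look up the action the way the three tables are combined:
    a specific action from table 500 overrides the default from
    table 300, and table 400 supplies the trajectory for flicks."""
    action = AREA_ACTION.get(area, SLIDE_ACTION[slide_type])
    trajectory = FLICK_TRAJECTORY.get(flick_strength)
    return action, trajectory
```

Keeping the mappings as data rather than branching code matches the patent's description of downloadable table groups held in the storage unit 110.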
  • FIG. 14 shows a schematic diagram of a UI image generated according to an embodiment.
  • FIG. 15 shows a transition example of a UI image displayed on the touch display 102 through a series of slide operations from touch-on to touch-off.
  • FIG. 16 shows a detailed processing flow of S306.
  • the UI image is formed as an image I of a waterdrop-shaped elastic body having a base BP, a tip TP, and a connecting portion CP between them.
  • the initial shape of the elastic body is circular.
  • the shape of the elastic body can be formed by a set (mesh) of a plurality of polygons made of a square or a triangle. That is, the water droplet shape of the elastic body can be reproduced by substituting each vertex coordinate of each polygon into a predetermined mathematical expression.
  • the base BP of the elastic-body UI is preferably fixed at the touch start point, with the tip TP following the finger's touch position through a series of slide operations, as shown in the figures.
  • a UI image in which the elastic body elastically deforms and expands and contracts is displayed.
  • a predetermined upper limit value is provided for the length from the base BP to the tip TP. If the upper limit value is exceeded, the UI image is displayed so that the tip TP is separated from the finger position.
  • the touch position information (the coordinates of the touch start point) acquired in the touch-on state by the touch position information acquisition unit 201 is stored. Further, the touch position information (the current coordinates of the touch point) acquired in the touch-now state by the touch position information acquisition unit 201 is stored every frame.
  • a distance d between the touch start point and the current touch point is calculated (S701). It is determined whether the distance d is less than or equal to the upper limit value D (S702). In the case of “Yes” (for example, FIG. 15(b)), the base BP of the elastic body is associated with the touch start point (S703), and the tip TP of the elastic body is associated with the current touch point (S705).
  • that is, the base BP of the elastic body is aligned with the touch start point, and the tip TP with the current touch point. The base BP and the tip TP are then displayed so as to be connected by the connecting portion CP, generating the elastic-body UI image I (S707; FIGS. 15(b) and (c)). In this way, the UI image is displayed through the slide operation such that the elastic body deforms in accordance with changes of the current touch point.
  • in the case of “No” in S702 (for example, FIG. 15), the base BP of the elastic body is likewise associated with the touch start point (S704), but the tip TP is associated with the point separated from the touch start point toward the current touch point by the upper-limit distance D (S706). The elastic-body UI image I is then generated (S707).
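The clamping of the tip TP described in S701–S706 can be sketched as follows (a minimal illustration; the function name is assumed):

```python
import math

def elastic_tip(start, current, limit):
    """Place the tip TP of the elastic UI: at the current touch point while
    the distance d stays within the upper limit D, otherwise on the segment
    from the touch start point toward the current touch point, at distance D
    (S701-S706)."""
    dx, dy = current[0] - start[0], current[1] - start[1]
    d = math.hypot(dx, dy)           # S701: distance between the two points
    if d <= limit or d == 0:         # S702: within the upper limit D?
        return current               # S705: tip follows the finger
    scale = limit / d
    return (start[0] + dx * scale, start[1] + dy * scale)  # S706: clamped tip
```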
  • the upper limit D is set for the length from the base BP to the tip TP because, particularly with a “dribbling” operation in a soccer game (for example, FIG. 4(a)), a complicated slide operation makes the operation distance long. In soccer, many player characters are on the field FD; if the elastic-body UI image fully followed a long operation distance, it would cover the player characters and hide them on the game screen.
  • after the UI image is generated in S707, visual processing is further performed on the elastic-body UI image according to the distance d between the touch start point and the current touch point (S708).
  • for example, depending on the distance d, color processing is performed so that the UI image becomes white, or different color processing is performed so that it becomes light blue.
  • the slide direction of the current slide operation is determined based on at least the coordinates of the current touch point (S709). More specifically, the slide direction of the current slide operation is determined by using at least two immediately preceding touch position coordinates as shown in FIG. 6A.
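Determining the slide direction from the two most recent touch coordinates (S709) amounts to normalizing their difference; a minimal sketch (the function name is assumed):

```python
import math

def slide_direction(prev, curr):
    """Direction (unit vector) of the current slide, computed from the two
    most recent touch position coordinates (S709)."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    d = math.hypot(dx, dy)
    if d == 0:
        return None      # no movement between the two frames
    return (dx / d, dy / d)
```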
  • the indicator IND (arrows in the figures), which indicates the slide direction, is displayed in association with the tip TP together with the UI image.
  • as described above, the present invention provides the user with simple and intuitive touch operability. In addition, while maintaining simple operability, it enables complex operations on game objects and smooth progress of the game.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a game program that maintains simple operability while making it possible to move an object as desired. Provided is a game program which, when executed by a computer provided with a touch display, causes the computer to execute a method including: a step of detecting arbitrary position information on the touch display and repeatedly acquiring touch position information at a prescribed frame rate; a step of deciding first operation information on the basis of a prescribed number of pieces of touch position information; a step of performing a first movement of a moving body in a game space on the basis of the first operation information; a step of determining a touch-off from the touch display; a step of deciding second operation information using the touch position information in at least one frame directly before the touch-off; a step of determining a flick operation on the basis of the second operation information; and a step of canceling the first movement in response to the flick operation and performing a second movement of the moving body based on the second operation information, wherein the first movement causes the moving body to move within the game space in conjunction with a character, and the second movement causes the moving body to move within the game space separately from the character.

Description

Game program
The present invention relates to a game program for operating an object in a game space in response to a specific touch operation on a touch display.
In recent years, game programs for mobile terminals equipped with a touch display (for example, smartphones) have been distributed to users through the Internet. Many game programs control the movement of objects in the game space through touch operations on the touch display. Patent Document 1 discloses a technique for advancing a soccer game while the user designates the character to operate and its position on the touch display each time. In particular, it discloses a technique for performing an action on a moving body (a ball), such as a pass, dribble, or shot, according to the designated destination of the moving body and the type of touch operation. Patent Document 2 discloses a technique in which, when a touch operation moves from touchdown at or above a predetermined speed over a predetermined stroke distance, the touch operation is determined to be a flick operation and the player character is moved toward a movement target.
Patent Document 1: JP 2014-79643 A
Patent Document 2: International Publication No. WO 2013/024771
In a game application that requires many characters to be operated together with a moving body, such as a soccer game, it is necessary to provide the user with simple and intuitive touch operability. An object of the present invention is to provide a game program capable of operating a game object at will while maintaining simple operability.
According to the present invention, there is provided a game program which, when executed by a computer having a touch display, causes the computer to execute a method including: detecting arbitrary position information on the touch display and repeatedly acquiring touch position information at a predetermined frame rate; determining first operation information based on a predetermined number of pieces of touch position information; performing a first action on a moving body in the game space based on the first operation information; determining a touch-off from the touch display; determining second operation information using the touch position information in at least one frame immediately before the touch-off; determining a flick operation based on the second operation information; and releasing the first action in response to the flick operation and performing a second action on the moving body based on the second operation information, wherein the first action moves the moving body in the game space together with a character, and the second action moves the moving body in the game space away from the character.
According to the present invention, there is provided a game program capable of operating a game object at will while maintaining simple operability.
FIG. 1 is a block diagram of the basic configuration of a computer according to an embodiment. FIG. 2 is a block diagram of the detailed configuration of the computer according to an embodiment. FIG. 3 is an example of a game screen generated by the game program of an embodiment. FIG. 4 shows a schematic example of actions on an object according to an embodiment. FIG. 5 is a flowchart outlining the overall processing according to an embodiment. FIGS. 6A and 6B are schematic array diagrams of touch position information acquired according to an embodiment. FIGS. 7 and 8 are flowcharts showing examples of detailed processing according to an embodiment. FIG. 9 is an example graph of slide operation information determined according to an embodiment. FIG. 10 is a flowchart showing an example of detailed processing according to an embodiment. FIG. 11 is a schematic diagram showing an example of in-field areas of an embodiment. FIGS. 12 and 13 are schematic diagrams showing example table configurations of an embodiment. FIG. 14 is an example UI image generated according to an embodiment. FIG. 15 is an example transition of the UI image display according to an embodiment. FIG. 16 is a flowchart showing an example of detailed processing according to an embodiment.
[Description of Embodiments of the Present Invention]
First, the contents of embodiments of the present invention will be listed and described. A game program according to an embodiment of the present invention has the following configurations.
(Item 1) A game program which, when executed by a computer having a touch display, causes the computer to execute a method including:
detecting arbitrary position information on the touch display and repeatedly acquiring touch position information at a predetermined frame rate;
determining first operation information based on a predetermined number of pieces of the touch position information;
performing a first action on a moving body in a game space based on the first operation information;
determining a touch-off from the touch display;
determining second operation information using the touch position information in at least one frame immediately before the touch-off;
determining a flick operation based on the second operation information; and
releasing the first action in response to the flick operation and performing a second action on the moving body based on the second operation information,
wherein the first action moves the moving body in the game space together with a character, and
the second action moves the moving body in the game space away from the character.
(Item 2) The game program according to Item 1, wherein in the step of determining the touch-off, the touch-off is determined when the touch position information cannot be acquired continuously over a plurality of frame-rate units.
(Item 3) The game program according to Item 1 or 2, wherein the second operation information includes a flick direction in the at least one immediately preceding frame, and in the step of performing the second action, the second action is performed in the direction of the game space corresponding to the flick direction.
(Item 4) The game program according to any one of Items 1 to 3, wherein the second operation information includes a flick distance in the at least one immediately preceding frame, in the step of determining the flick operation, a flick operation type is determined based on the flick distance, and in the step of performing the second action, an action associated with the flick operation type is performed.
(Item 5) The game program according to Item 4, wherein the flick operation type is determined based on whether the maximum value of the flick distances in the immediately preceding two or more consecutive frames is within a range defined by a predetermined threshold.
(Item 6) The game program according to Item 4 or 5, wherein the step of performing the second action on the moving body further includes: determining whether the position of the character is included in a predetermined action area in the game space; and associating the second action with a specific action type when the position of the character is included in the predetermined action area, and in the step of performing the second action, an action associated with the specific action type is performed.
(Item 7) The game program according to Item 6, wherein the predetermined action area is configured based on characteristics of the character.
(Item 8) The game program according to Item 6 or 7, wherein the game program is applied to a soccer game, the first action is a "dribble" action on the moving body by the character, the second action is a "kick" action on the moving body by the character, and in the second action, the specific action type is associated with a "shoot" action.
(Item 9) The game program according to Item 8, wherein the flick operation type includes a "strong" flick operation and a "weak" flick operation, and in the second action, the "strong" flick operation is associated with a "high trajectory" and the "weak" flick operation is associated with a "low trajectory".
(Item 10) The game program according to any one of Items 1 to 9, wherein the method further includes, in response to the step of repeatedly acquiring the touch position information, generating and displaying an image for each frame in association with first touch position information detected at touch-on and second touch position information currently detected, wherein the image is an image of an elastic body having a base and a tip, and the base is associated with the first touch position information and the tip is associated with the second touch position information for each frame, so that the image is displayed such that the elastic body deforms with changes in the second touch position information through a slide operation.
(Item 11) The game program according to Item 10, wherein in the step of generating and displaying the image, when the distance between the first touch position information and the second touch position information exceeds a predetermined upper limit, the tip is associated with position information separated from the first touch position information toward the second touch position information by the predetermined upper limit.
(Item 12) The game program according to Item 10 or 11, wherein the step of generating and displaying the image further includes performing visual processing on the image of the elastic body according to the distance between the base and the tip.
(Item 13) The game program according to any one of Items 10 to 12, wherein the step of generating and displaying the image further includes displaying, together with the image, an indicator showing a slide direction based on the current second touch position information, in association with the tip.
[Details of Embodiments of the Present Invention]
A specific example of a game program according to an embodiment of the present invention will be described below with reference to the drawings. The present invention is not limited to these examples; it is defined by the claims and is intended to include all modifications within the meaning and scope equivalent to the claims. In the following description, the same elements are given the same reference numerals in the description of the drawings, and redundant descriptions are omitted.
FIG. 1 shows a block diagram of the basic configuration of a computer 100 according to an embodiment of the present invention. The computer 100 is preferably a device equipped with a touch display, such as a smartphone or a tablet computer. Those skilled in the art will understand that in this specification the term "touch display" is interchangeable with "touch panel" and "touch screen". As illustrated, the computer 100 includes a touch display 102, a processing unit 108, a storage unit 110, and a communication unit 112. The touch display 102 includes a display unit 104 for displaying various data stored in the storage unit 110, various images generated by the processing unit 108, and the like. The touch display 102 also includes a contact detection unit 106 that detects contact operations on the display unit 104 by an object such as a user's finger or a stylus (mainly physical contact operations such as touch, swipe, slide, flick, drag, and tap operations). Owing to the structure of the touch display 102, the contact detection unit 106 may be disposed above the display unit 104, which may be a liquid crystal panel, an organic EL display, a plasma display, or the like. The contact detection unit 106 may employ a pressure detection method, a resistive film method, a capacitive method, an electromagnetic induction method, or the like. When input by contact of an object such as a finger or a stylus is received on the contact detection unit 106, the amount of change in pressure, electrical resistance, capacitance, elastic-wave energy, or the like at the contact position is detected, and the corresponding contact position coordinates on the display unit 104 are specified.
The processing unit 108 may be configured as a processor; it reads computer programs such as a user interface image display program and a game program according to embodiments of the present invention from the storage unit 110 or the like and causes the computer 100 to execute processing according to them. The specific configuration and operation of the processing unit 108 will be described later. The storage unit 110 can temporarily or permanently store the game program and various data related to users and the game, and registers, updates, and deletes information in response to commands from the processing unit 108. Examples of the storage unit 110 include, but are not limited to, main storage such as ROM (Read Only Memory) and RAM (Random Access Memory), and auxiliary storage such as a hard disk, flash memory, or optical disk. The communication unit 112 can be used to exchange game-related data with an external server (not shown) or the like via a communication network such as the Internet.
FIG. 2 is a block diagram showing the detailed configuration of the computer 100 according to an embodiment of the present invention. As described with reference to FIG. 1, the computer 100 includes the touch display 102 with the display unit 104 and the contact detection unit 106, the processing unit 108, the storage unit 110, and the communication unit 112. The processing unit 108 has, as functional blocks, a touch position information acquisition unit 201, an operation information determination unit 202, a dribble action execution unit 203, a flick operation determination unit 204, a kick action execution unit 205, and a UI image generation unit 206. Details of the processing executed by each component will be described later. The processing unit 108 also includes basic functional units such as an animation generation unit that generates animations as the game progresses, but these are not shown. Each functional block is implemented in software and is realized by the processor 108 shown in FIG. 1 reading and executing program modules stored in the memory 110.
FIG. 3 is an example of a game screen when the game program of an embodiment is applied to a soccer game. The touch display 102 displays, as the game space, a soccer field FD imaged by a virtual camera (not shown) from the vertical direction of the space. At the top of the touch display 102, the score "1-0" and the elapsed time "second half 01:12" are displayed as additional information. Various game objects that the user can operate are arranged on the soccer field FD. Examples of objects include a moving body (soccer ball) MB, own-team characters CH11 and CH12, opponent-team characters CH21 to CH23, and a goal GL. Each character is displayed together with a position such as "FW". In the center of the display, the ball MB is associated with the player character CH12, and a ▽-shaped active mark MK is displayed above the character. The player character CH12, on which the active mark MK is displayed, can be operated through the user's touch operations. Examples of the player character's actions include dribbling and kicking the ball, and moving within the field (positioning). Kicks include various actions such as pass, clear, centering, and shoot. Field coordinates are defined over the field FD as game-space coordinates; that is, each arranged object, such as the ball, the own-team characters CH11 and CH12, the opponent-team characters CH21 to CH23, and the goal GL, has coordinate information. At the bottom of the screen, a user interface image (hereinafter, "UI image") corresponding to the user's touch operation is displayed. The UI image is associated with action commands to the player character CH12. According to one embodiment, in a slide operation, the UI image preferably relates the slide direction to the character's movement direction in the game space and the slide distance to the character's movement speed. This makes it possible to provide a UI image with intuitively excellent operability.
In this specification, "game object" (sometimes simply "object") mainly includes the moving body (the ball) and the characters arranged in the game space; each of them can be operated by the user, either together or separately. In this specification, the term "slide" operation collectively refers to a series of touch operations in which the user touches the touch display 102 with an object such as a finger (touch-on), maintains contact while changing position (touch-now), and then releases the object (touch-off). In particular, in this specification, an operation of sliding an object on the touch display 102 over a certain period after touch-on is referred to as a "swipe" operation. Similarly, an operation of sliding an object on the touch display 102 at a predetermined speed or higher and then touching off is referred to as a "flick" operation. That is, in this specification, the concept of a "slide" operation encompasses the concepts of "swipe" and "flick" operations.
(A) Overview of the overall processing of actions on objects
FIGS. 4 and 5 give an overall outline of the information processing that moves an object in the game space. FIG. 4 shows a schematic example of object movement on the two-dimensional field FD according to one embodiment, and FIG. 5 shows a flowchart of the information processing corresponding to that movement. In FIG. 4, the character CH12 of the player's team is activated and moves together with (or apart from) the ball. In state (a), the character CH12 is associated with the ball MB. In response to the user's swipe operation on the touch display 102, the ball MB moves, together with the character CH12, across the two-dimensional field FD in the slide direction (the lower-right direction in the figure); that is, the ball MB is moved by the "dribble" action of the character CH12. In state (b), in response to a flick operation on the touch display, the association between the character CH12 and the ball MB is released; that is, the "dribble" action is canceled. The ball MB then moves away from the character CH12 across the two-dimensional field FD in the flick direction (the right direction in the figure); that is, the character CH12 moves only the ball MB by a "kick" action on the ball MB. In this way, the user can carry out a series of actions on the ball, from "dribble" to "kick", through slide operations on the touch display, which provides the user with simple and intuitive touch operability.
In the flowchart of FIG. 5, at the start of the processing for a slide operation, the character CH12 is associated with the ball MB, either by a predetermined user action or automatically. The information processing includes a touch-on phase (S301 to S302), a touch-now phase (S303 to S307), and a touch-off phase (S308 to S310). Touch-on, touch-now, and touch-off are determined through the contact detection unit 106 detecting contact at every predetermined frame interval. In the touch-on phase, when the contact detection unit 106 first detects contact (S301), the touch position information acquisition unit 201 acquires touch position information as the touch-on state and stores it in the storage unit 110 (S302). The touch position in the touch-on state can be any position in the touch area of the touch display 102; the user may touch wherever he or she likes.
The touch-now phase corresponds to the swipe-operation portion of the user's slide operation. The contact detection unit 106 continues to detect contact at every predetermined frame interval (for example, at 30 fps) (S303). The touch position information acquisition unit 201 then repeatedly acquires touch position information in each frame as the touch-now state (S304). In particular, of the touch position information acquired in the touch-now state, the latest predetermined number of entries are stored in the storage unit 110. The operation information determination unit 202 determines swipe operation information corresponding to each frame using the stored touch position information (S305). The UI image generation unit 206 continuously generates a UI image in each frame and displays it on the touch display 102 (S306). In response to the swipe operation, the dribble action execution unit 203 causes the action on the object in the game space to be carried out based on the swipe operation information (S307). In the example of FIG. 4(a), the character CH12 is made to perform a "dribble" action in the swipe direction (the lower-right direction in the figure) while accompanied by the ball MB. In the touch-now phase, the processing of S304 to S307 is repeated at every predetermined frame interval.
In the touch-off phase, when the contact detection unit 106 does not detect contact (S308), the touch position information acquisition unit 201 cannot identify touch position information and therefore determines the touch-off state. At that point the UI image disappears (not shown). The operation information determination unit 202 determines flick (slide) operation information using the stored touch position information for at least one frame immediately before the touch-off (S309). Based on the flick operation information, the flick operation determination unit 204 determines whether the immediately preceding operation was a flick operation and, if so, its flick operation type (S310). In the case of a flick operation, the kick action execution unit 205 causes the action on the object in the game space to be carried out based on the flick operation information (S311). In the example of FIG. 4(b), the character CH12 is made to perform a "kick" action on the ball MB, moving the ball MB in the flick direction.
(B) Storage of touch position information (S302, S304, and S309)
FIGS. 6(a) and 6(b) show an example of how the touch position information acquired by the touch position information acquisition unit 201 in S302, S304, and S309 is stored. Through the slide operation, a predetermined number of pieces of touch position information are stored in the array table fp in the storage unit 110 at every predetermined frame interval (for example, at 30 fps). The initial position coordinates of each slide operation are also stored in the table. The touch position information is preferably acquired as xy position coordinates of the touch area on the touch display 102. In general, the touch area (the size of the touch display 102) differs for each mobile terminal 100 the user uses. The xy position coordinates may, for example, be normalized so that the center of the touch area is the origin (0, 0), the top edge is +1.0, and the bottom edge is -1.0, with the horizontal range of the touch area determined accordingly. As an example, for a typical mobile terminal held in portrait orientation, the horizontal range of the touch area is roughly -0.6 (left edge) to +0.6 (right edge). Based on the touch position information, the swipe operation information is determined in S305 and the flick operation information in S309.
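The normalization described above can be sketched as a small coordinate transform. The function name and the pixel inputs are assumptions for illustration; the mapping itself follows the text (center origin, vertical range ±1.0, horizontal range scaled by the same factor, which yields about ±0.6 on a 3:5 portrait screen).

```python
def normalize_touch(px, py, width_px, height_px):
    """Map raw pixel coordinates to the device-independent xy system:
    origin at the center of the touch area, y from -1.0 (bottom edge)
    to +1.0 (top edge), and x scaled by the same factor so the aspect
    ratio of the touch area is preserved."""
    half_h = height_px / 2.0
    x = (px - width_px / 2.0) / half_h   # same scale as y
    y = (height_px / 2.0 - py) / half_h  # flip: pixel y grows downward
    return (x, y)

# On a 1080x1800 portrait screen, the right edge maps to x = +0.6.
print(normalize_touch(1080, 900, 1080, 1800))  # (0.6, 0.0)
```

Because the scale is tied to the display height, the same normalized coordinates refer to the same relative touch position on terminals of different sizes.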
In the storage example of FIG. 6(a), the array table fp provides storage for 11 pieces of touch position information, fp[0] through fp[10], one per frame. The number of pieces of touch position information is not limited to 11 and can be any number. In the initial touch-on state, the position coordinates (x0, y0) at the moment the touch area is touched are stored in fp[0]. In this touch-on state, fp[1] through fp[10] are still null (empty), but as the touch-now state progresses, position coordinates are stored sequentially, one per frame. For example, in touch-now state 1, five position coordinates are stored in order from fp[1] to fp[5], and in touch-now state 2, five more are stored in order from fp[6] to fp[10]. Since the array table fp holds only 11 entries, old data must be deleted in sequence once the table is full. Note that the touch-on coordinates (x0, y0) are saved separately as the initial position coordinates so that the swipe direction can always be detected. The initial position coordinates are preferably not deleted during the slide operation until the touch-off state is reached.
As the touch-now state progresses beyond touch-now state 2 and up to the next touch-off determination, the position coordinate data for each frame again overwrites the table sequentially starting from fp[0], so that the oldest position coordinate data is deleted in turn. The touch-off determination is made when no position coordinates are physically acquired and a null value is stored. In FIG. 6(a), at least fp[5] has been overwritten with a null value, so the touch-off determination is made. When touch-off is determined, the acquisition of touch position information for the series of slide operations is complete. The determination logic may be configured to determine touch-off not when touch position information fails to be acquired just once (a single null value) but only when it cannot be acquired over several consecutive frames (two or more consecutive null values). This configuration is used because, when the slide operation is complex, it is quite conceivable that the finger leaves the touch display for just an instant. For example, in the "dribble" action of a soccer game (e.g., FIG. 4(a)), turning maneuvers and the like are performed frequently, so complex slide operations are to be expected.
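The array table fp described above behaves as a fixed-size ring buffer with a separately kept touch-on point and a tolerance for isolated missed frames. The following is a sketch under stated assumptions: the class name, method name, and default sizes (11 slots, 2 consecutive nulls) are taken from the text's example but are otherwise illustrative.

```python
class TouchBuffer:
    """Sketch of the array table fp: a ring buffer of the latest touch
    positions, with the touch-on coordinates kept separately and
    touch-off declared only after consecutive empty frames."""

    def __init__(self, size=11, nulls_for_touch_off=2):
        self.fp = [None] * size
        self.i = 0                  # next slot to overwrite
        self.initial = None         # touch-on coordinates (x0, y0)
        self.null_run = 0
        self.nulls_for_touch_off = nulls_for_touch_off

    def push(self, pos):
        """Record one frame's sample; pos is (x, y) or None.
        Returns True when touch-off is determined."""
        if pos is not None and self.initial is None:
            self.initial = pos      # saved separately, never overwritten
        self.fp[self.i] = pos       # the oldest entry is overwritten
        self.i = (self.i + 1) % len(self.fp)
        self.null_run = 0 if pos is not None else self.null_run + 1
        return self.null_run >= self.nulls_for_touch_off
```

With `nulls_for_touch_off=2`, a single missed frame does not end the stroke, matching the tolerance for a finger briefly leaving the display during a complex dribble.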
FIG. 6(b) shows storage examples for two patterns following the touch-off determination of FIG. 6(a): (1) touch-off is determined and the touch-on state then begins again, and (2) touch-off is not determined and the touch-now state continues as touch-now state 3. In both (1) and (2), null values are stored in fp[5] and fp[6] instead of (x16, y16) and (x17, y17). When the next touch position coordinates (x18, y18) are received, they are stored in fp[7]. Thereafter, position coordinates continue to be stored sequentially from fp[8] on, one per frame (arrow). On the other hand, in case (1) the initial position coordinates are overwritten with the new value (x18, y18), whereas in case (2) the initial position coordinates are maintained as (x0, y0). Not only in case (2) but also after the touch-off determination in case (1), it is preferable that the array table fp not be reinitialized. That is, it is preferable to retain the most recent touch position information of the previous slide operation, such as (x14, y14) and (x15, y15), so that it remains available for separate processing.
Some of the above steps S301 to S311 are described individually in more detail below. FIG. 7 shows the details of the determination of the flick operation information in S309. FIGS. 8 and 9 show the details of the flick operation determination in S310. FIGS. 10 to 13 show the details of carrying out the action on the object in S311. FIGS. 14 to 16 show the details of the UI image generation in S306.
(C) Determination of flick operation information (S309)
FIG. 7 shows a detailed flowchart of the flick operation information determination process (S309) by the operation information determination unit 202. When the touch-off state is identified, first the existing association between the character and the ball is released (S401). This cancels the character's "dribble" action accompanying the ball MB and makes possible a "kick" action corresponding to the flick operation (e.g., FIG. 4(b)). The touch position information for at least one frame immediately before the touch-off is extracted from the storage unit (S402). Using the extracted touch position information, the slide direction and slide distance are determined as flick operation information (S403).
In the touch-off state example of FIG. 6(b)(2) above, slide operation information for the latest 10 frames (the 1st through 10th frames) is determined by referring to the latest 11 position coordinates. Specifically, the flick operation information of the 10th frame, immediately before touch-off, is determined using the touch position information (position coordinates) (x14, y14) of fp[3] and (x15, y15) of fp[4]. The flick operation information of the 9th frame, immediately before that, is determined using the position coordinates (x13, y13) of fp[2] and (x14, y14) of fp[3]. As flick operation information, for example, the xy-coordinate movement direction (i.e., the flick direction), the xy-coordinate movement amount (i.e., the flick distance), and the xy-coordinate movement speed (i.e., the flick speed, computed as the coordinate movement amount divided by the frame interval) are calculated. The flick operation information is preferably calculated using the immediately preceding two or more frames. As an example, the flick direction is preferably used to correspond to the direction in which the object is moved in the game space, and the flick distance is preferably used to determine the speed of the action on the object. By acquiring only a limited number of position coordinates in this way, various operation parameters of the slide operation can be identified.
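The per-frame flick parameters above can be sketched from two consecutive stored positions. This is a minimal illustration, assuming 30 fps and normalized coordinates; the function name and return convention are hypothetical.

```python
import math

def flick_info(p_prev, p_last, frame_dt=1.0 / 30.0):
    """Derive one frame's flick parameters from two consecutive touch
    positions, e.g. fp[3] = (x14, y14) and fp[4] = (x15, y15) for the
    frame just before touch-off. frame_dt assumes 30 fps."""
    dx = p_last[0] - p_prev[0]
    dy = p_last[1] - p_prev[1]
    distance = math.hypot(dx, dy)    # flick distance (movement amount)
    direction = math.atan2(dy, dx)   # flick direction, in radians
    speed = distance / frame_dt      # flick speed = amount / interval
    return direction, distance, speed

d, dist, v = flick_info((0.0, 0.0), (0.3, 0.0))
print(round(dist, 3), round(v, 3))  # 0.3 9.0
```

Computing these values only for the last few frames keeps the cost per touch-off constant regardless of how long the swipe lasted.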
(D) Flick operation determination and flick operation type determination (S310)
The flick operation determination and flick operation type determination process (S310) by the flick operation determination unit 204 is further described with reference to FIGS. 8 and 9. FIG. 8 shows the detailed processing flow of S310, and FIG. 9 shows an example graph of flick (slide) operation information. The horizontal axis of the graph shows the frame F in predetermined frame units, and the vertical axis shows the slide distance (swipe distance or flick distance). The touch-on state is detected in the 1st frame F[1], and the slide (swipe) distance is plotted as the touch-now state up to the (n-1)th frame F[n-1]. In the nth frame F[n], no touch position information is identified, so the slide distance cannot be calculated (N/A). This is the situation in which touch-off is detected.
Following the determination of the flick operation information for at least one immediately preceding frame in S309, it is determined whether the slide distance in the flick operation information is greater than or equal to a threshold TH1 (S501). Specifically, it is determined whether the slide (flick) distance plotted at the immediately preceding (n-1)th frame F[n-1], or at the (n-2)th frame F[n-2] just before it, is greater than or equal to the threshold TH1. If "yes", the flick operation is determined to have been established, and an action on the object is carried out through the subsequent processing. If "no", the flick operation is determined not to have been established, no action is carried out on the object, and the processing ends as a swipe. When the two flick distances at F[n-1] and F[n-2] are used, the flick operation is preferably determined to be established only when both are greater than or equal to the threshold TH1; that is, even if only the single-frame flick distance at F[n-1] is greater than or equal to TH1, the flick operation is not determined to be established. The number of immediately preceding frames used for the determination may be any number from 2 up to the number of frames that can be traced back (10 in the examples of FIGS. 6(a) and 6(b)). Multiple immediately preceding frames are used in order to distinguish a flick from a tap operation, for which operation information can be determined for only the single immediately preceding frame. Note that not only the flick distance but also the flick direction is preferably determined using the frames used in this determination. The direction in the most recent frame may be taken as the flick direction; when multiple frames are used for the determination, the direction in the oldest frame may be taken as the flick direction; alternatively, the average direction over the frames may be calculated and taken as the flick direction.
When S501 determines that the flick operation is established, the flick operation type is further determined based on the flick distance. The flick operation types include a "strong" flick operation and a "weak" flick operation. That is, it is determined whether the flick distance is greater than or equal to a further threshold TH2 (> TH1) (S502). If "yes", the flick operation type is set to the "strong" flick operation (S503); if "no", the flick operation type is set to the "weak" flick operation (S504). When the flick distances of several consecutive frames are used, the determination is preferably made by comparing the maximum of them with the threshold TH2. In the example of FIG. 9, the flick distance at F[n-1] is the maximum and exceeds TH2, so the flick operation type is set here to the "strong" flick operation. The flick operation type is associated with the action that will be carried out.
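The two-threshold logic of S501 to S504 can be sketched as follows. The threshold values, the function name, and the list-of-distances input are assumptions for illustration; the structure (every recent frame must reach TH1, and the maximum is compared against TH2) follows the text.

```python
TH1 = 0.05   # minimum per-frame distance for a flick (assumed value)
TH2 = 0.15   # strong/weak boundary (assumed value), TH2 > TH1

def judge_flick(last_distances):
    """S501-S504 sketch: last_distances holds the per-frame slide
    distances for the frames just before touch-off, newest last
    (e.g. [F[n-2], F[n-1]]). Returns None (no flick: a swipe or a
    tap), 'weak', or 'strong'."""
    if len(last_distances) < 2:
        return None                  # a tap yields only one frame
    if any(d < TH1 for d in last_distances):
        return None                  # S501: every frame must reach TH1
    # S502: compare the maximum distance against TH2
    return "strong" if max(last_distances) >= TH2 else "weak"

print(judge_flick([0.06, 0.20]))  # strong
print(judge_flick([0.06, 0.08]))  # weak
print(judge_flick([0.01, 0.20]))  # None: one frame below TH1
```

Requiring all recent frames to clear TH1 rejects a single noisy sample at release, while the maximum-based TH2 test lets the fastest moment of the gesture decide the strength.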
(E) Carrying out the action on the object (S311)
The process (S311) in which the kick action execution unit 205 carries out the action on the object (i.e., the character's action on the ball) in the soccer game program is further described with reference to FIGS. 10 to 13. FIG. 10 shows the detailed processing flow of S311. FIG. 11 outlines an action area (the shoot range SR) involved in determining a specific action type. FIG. 12 shows a character table 200 defining character characteristics, and FIG. 13 shows a set of action tables 300 to 500 concerning the character's actions on the ball. The tables shown in FIGS. 12 and 13 are downloaded from the game server together with the game program through the communication unit 112 of the mobile terminal and are thereby reflected on the mobile terminal 100.
First, the character's position in the game space at the time of touch-off is determined (S601). It is then determined whether the character's position is included in the action area (S602). If "yes", it is further determined whether the direction in the game space corresponding to the slide direction acquired in S403 points toward the goal (S603). If "yes", the action of the specific action type "shoot" in the flick direction is associated with the flick operation (S604). If the answer in S602 or S603 is "no", the action of the default action type "kick" in the flick direction is associated with the flick operation (S605). Once an action type has been associated with the flick operation, the actual action is carried out according to that action type and the flick operation type set in S503 or S504 (S606). When the action is carried out, an animation of the character's or ball's action in the flick direction is played according to its content.
The position data of each object arranged on the field FD, such as a character or the ball, is managed in field coordinates (XY coordinates). S602 and S603 are also preferably decided in the XY coordinate system. Any XY coordinate system that can define a two-dimensional plane as an orthogonal coordinate system may be used, for example one whose origin is the center of the field or the player's own goal. Regarding S602, in the example of the field FD in FIG. 11, the player character CH12 is positioned so as to be included within the shoot range SR. In the action area called the shoot range SR, the player character CH12 performs a "shoot" action in response to a flick operation. The shoot range SR includes the goal area GA and the penalty area PA, but is not necessarily limited to these, and the shape of the shoot range SR is not limited to a rectangle. The shoot range SR is preferably configured differently for each character according to the character's (player's) characteristics. Character characteristics are managed, as an example, in the character table 200 in the storage unit 110 shown in FIG. 12. Specifically, in addition to general character information such as character ID, player name, team (own team / opposing team), and position, in one embodiment a parameter such as shooting power (for example, 60/100) is also preferably managed as a character characteristic. That is, the shoot range SR is preferably set for each character based on shooting power; for example, a player who is good at mid-range shots has high shooting power, and his shoot range SR is preferably set wide. The items of the character table 200 are merely examples and are not limited to these; for instance, the dominant foot (right-footed / left-footed) may also be managed and reflected in the configuration of the shoot range SR. The goal direction in S603 is defined, for example in FIG. 11, as a direction toward the goal direction area GDA, which is formed by opening an angle α (solid lines) outward from the directions toward the left and right ends of the goal (the goalpost directions gd1 and gd2 (dotted lines)) as seen from the position of the player character CH12.
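The S602/S603 decision can be sketched in field coordinates as follows. This is a simplified illustration under stated assumptions: the shoot range is taken as an axis-aligned rectangle (the text notes it need not be), the goal is reduced to a single center point rather than the gd1/gd2 goalpost directions, and the angle α, the function name, and all coordinate values are hypothetical.

```python
import math

def choose_action(char_pos, shoot_range, goal_pos, flick_dir,
                  alpha=math.radians(15)):
    """S601-S605 sketch in field (XY) coordinates. shoot_range is a
    rectangle (x_min, y_min, x_max, y_max), goal_pos the goal center,
    flick_dir the flick direction in radians, alpha the half-width of
    the goal-direction area GDA (all illustrative simplifications)."""
    x_min, y_min, x_max, y_max = shoot_range
    in_range = (x_min <= char_pos[0] <= x_max and
                y_min <= char_pos[1] <= y_max)            # S602
    if not in_range:
        return "kick"                                     # S605 default
    goal_dir = math.atan2(goal_pos[1] - char_pos[1],
                          goal_pos[0] - char_pos[0])
    # Smallest angular difference between flick and goal directions
    diff = abs((flick_dir - goal_dir + math.pi) % (2 * math.pi) - math.pi)
    return "shoot" if diff <= alpha else "kick"           # S603-S604

# In range and flicking straight at the goal -> "shoot"
print(choose_action((40.0, 0.0), (30.0, -20.0, 52.5, 20.0),
                    (52.5, 0.0), 0.0))
```

A per-character shoot range would simply pass a wider or narrower rectangle (or another region test) derived from the character's shooting power.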
Regarding S606, the action corresponding to the flick operation type and the action type is determined, as an example, from the set of action tables of FIG. 13 stored in the storage unit 110. The tables include, but are not necessarily limited to, the slide operation/action table 300, the flick operation/action table 400, and the flick operation/specific action table 500.
The slide operation/action table 300 defines a default action for each slide operation type. A flick operation is associated with a "kick" action; that is, for a flick operation, the "kick" action is determined as the default action. In S307 of FIG. 5, the "dribble" action associated with a swipe operation is likewise determined by referring to the slide operation/action table 300. The flick operation/action table 400 defines a default action for each flick operation type: a "high trajectory" is associated with the "strong flick" operation, and a "low trajectory" with the "weak flick" operation. Based on the slide operation/action table 300 and the flick operation/action table 400, for example, when the flick operation type is set to the "strong flick" operation in S503 of FIG. 8, a "high trajectory" "kick" is determined as the action; similarly, when the "weak flick" operation is set in S504, a "low trajectory" "kick" is determined as the action. In addition, the flick operation/specific action table 500 defines a specific action type for each target action area. As an example, in the shoot range SR the specific action type "shoot" is associated with the flick operation, and in the player's own penalty area PA2 the specific action type "clear" is associated with the flick operation. Various other specific action types can also be associated; for example, an opposing corner area may be defined and associated with the specific action type "centering". If the player character CH is not located in any of the action areas defined in the flick operation/specific action table 500, the default action "kick" from the slide operation/action table 300 is carried out as-is.
 In the example of FIG. 13, at least the following six action patterns are possible depending on the contents of "flick operation type" in the flick operation/action table 400 and "specific action type" in the flick operation/specific action table 500: a "high trajectory" "shoot", a "high trajectory" "clear", a "high trajectory" "kick", a "low trajectory" "shoot", a "low trajectory" "clear", and a "low trajectory" "kick". In S606, one of these at least six action patterns is performed toward the direction in the game space corresponding to the flick direction. Note that the above is merely an example, and the patterns are not necessarily limited to these.
 In this way, simple and intuitive touch operability can be provided to the user, in the sense that various actions can be assigned to the user's flick operation. Moreover, while maintaining simple operability, the character can make the ball perform complex actions, allowing the game to proceed smoothly.
 (F) UI image generation process (S306)
 With reference to FIGS. 14 and 15, the process (S306) in which a UI image is generated by the UI image generation unit 206 and displayed on the touch display is further described. FIG. 14 shows a schematic view of a UI image generated according to one embodiment. FIG. 15 shows an example transition of the UI image displayed on the touch display 102 through a series of slide operations from touch-on to touch-off. FIG. 16 shows the detailed processing flow of S306. As shown in FIG. 14, the UI image is formed as an image I of a waterdrop-shaped elastic body having a base BP, a tip TP, and a connecting portion CP between them. As shown in FIG. 15(a), the initial shape of the elastic body is circular. Those skilled in the art will understand that the shape of the elastic body can be formed by a set (mesh) of polygons, each square or triangular. That is, the waterdrop shape of the elastic body can be reproduced by substituting the vertex coordinates of each polygon into a predetermined formula. As shown in FIGS. 15(a) to (c), through a series of slide operations the UI image is preferably formed so that the base BP of the elastic body is fixed at the touch start point while the tip TP follows the finger's touch position. At each frame, a UI image is displayed in which the elastic body elastically deforms, stretching and contracting. A predetermined upper limit is placed on the length from the base BP to the tip TP; when this upper limit is exceeded, the UI image is displayed with the tip TP separated from the finger position.
 In the UI image generation process of S306, the touch position information acquired by the touch position information acquisition unit 201 in the touch-on state (the coordinates of the touch start point) is stored. In addition, the touch position information currently being acquired by the touch position information acquisition unit 201 in the touch-now state (the coordinates of the current touch point) is stored each time. First, the distance d between the touch start point and the current touch point is calculated (S701). It is determined whether the distance d is less than or equal to the upper limit D (S702). If so (e.g., FIG. 15(b)), the base BP of the elastic body is associated with the touch start point (S703) and the tip TP is associated with the current touch point (S705). That is, the base BP is aligned with the touch start point and the tip TP with the current touch point. The base BP and tip TP are then displayed, further connected to each other by the connecting portion CP, generating the UI image I of the elastic body (S707 / FIGS. 15(b), (c)). In this way, through the slide operation, the UI image is displayed such that the elastic body deforms as the current touch point changes. If instead the answer at S702 is no (e.g., FIG. 15(d)), the base BP is still associated with the touch start point (S704), but the tip TP is now associated with the point located the upper-limit distance D away from the touch start point toward the current touch point (S706). The UI image I of the elastic body is then generated as above (S707). As in FIG. 15(d), the current touch point and the tip TP are then in separated positions. The upper limit D is placed on the length from the base BP to the tip TP because, particularly in a soccer game's "dribble" operation (e.g., FIG. 4(a)), complex slide operations make the operation distance long. In soccer, many player characters are present on the field FD. If the elastic-body UI image were displayed so as to fully follow even a long operation distance, player characters on the game screen would be covered and hidden by the UI image.
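The tip-placement logic of S701 to S706 amounts to clamping the tip to a maximum distance from the touch start point. A minimal sketch, assuming 2D screen coordinates as (x, y) tuples (the `tip_position` helper name is an assumption for illustration):

```python
import math

def tip_position(start, current, limit):
    """Place the elastic body's tip TP: it follows the current touch point,
    but is clamped to at most `limit` from the touch start point."""
    dx, dy = current[0] - start[0], current[1] - start[1]
    d = math.hypot(dx, dy)          # S701: distance d between start and current
    if d <= limit:                  # S702: within the upper limit D?
        return current              # S705: tip at the current touch point
    scale = limit / d               # S706: clamp toward the current touch point
    return (start[0] + dx * scale, start[1] + dy * scale)
```

With a limit of 100, a touch point 50 away leaves the tip on the finger, while a point 500 away pins the tip 100 from the start along the same direction, producing the separated state of FIG. 15(d).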
 Returning to FIG. 16, once the UI image has been generated (S707), visual processing is further applied to the elastic body's UI image according to the distance d between the touch start point and the current touch point (S708). For example, in the state of FIG. 15(b) the UI image is color-processed to appear white, whereas in the state of FIG. 15(d), where the distance d has reached the upper limit D, different color processing turns the UI image light blue. The slide direction of the current slide operation is determined based at least on the coordinates of the current touch point (S709). More specifically, the slide direction of the current slide operation is determined by using at least the two immediately preceding consecutive touch position coordinates, as shown in FIG. 6(a). An indicator IND for the determined slide direction (the arrows in FIGS. 15(b) to (d)) is associated with the tip TP. That is, the indicator IND is aligned with the tip TP of the elastic body and displayed together with the UI image (S710). In this way, the present invention provides the user with simple and intuitive touch operability. It also makes it possible to perform complex actions on game objects and advance the game smoothly while maintaining simple operability.
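The direction estimation of S709 from two consecutive touch positions can be sketched as below. The angle convention and the `slide_direction` helper are assumptions; the patent only requires that the two most recent touch coordinates yield a direction for the indicator IND.

```python
import math

def slide_direction(prev, current):
    """Estimate the slide direction (degrees, counterclockwise from +x)
    from the two most recent touch positions. Returns None when the
    points coincide and no direction can be derived."""
    dx, dy = current[0] - prev[0], current[1] - prev[1]
    if dx == 0 and dy == 0:
        return None                          # no movement between frames
    return math.degrees(math.atan2(dy, dx))  # direction of the latest movement
```

In practice the returned angle would orient the arrow indicator IND at the tip TP each frame.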
 The embodiments described above are merely examples to facilitate understanding of the present invention and are not to be construed as limiting it. The present invention can be modified and improved without departing from its spirit, and it goes without saying that its equivalents are included in the present invention.

Claims (13)

  1.  A game program that, when executed by a computer comprising a touch display, causes the computer to execute a method comprising:
     detecting arbitrary position information on the touch display and repeatedly acquiring touch position information at each predetermined frame rate;
     determining first operation information based on a predetermined number of pieces of the touch position information;
     performing a first action on a moving object in a game space based on the first operation information;
     determining a touch-off from the touch display;
     determining second operation information using the touch position information in at least one frame immediately before the touch-off;
     determining a flick operation based on the second operation information; and
     releasing the first action in response to the flick operation and performing a second action on the moving object based on the second operation information,
     wherein the first action moves the moving object so as to move through the game space together with a character, and
     the second action moves the moving object so as to move through the game space away from the character.
  2.  The game program according to claim 1, wherein in the step of determining the touch-off, the touch-off is determined when the touch position information cannot be acquired continuously over a plurality of frame-rate units.
  3.  The game program according to claim 1 or 2, wherein
     the second operation information includes a flick direction in the immediately preceding at least one frame, and
     in the step of performing the second action, the second action is performed in a direction of the game space corresponding to the flick direction.
  4.  The game program according to any one of claims 1 to 3, wherein
     the second operation information includes a flick distance in the immediately preceding at least one frame,
     in the step of determining the flick operation, a flick operation type is determined based on the flick distance, and
     in the step of performing the second action, an action associated with the flick operation type is performed.
  5.  The game program according to claim 4, wherein the flick operation type is determined based on whether the maximum of the flick distances in the immediately preceding two or more consecutive frames is within a range defined by a predetermined threshold.
  6.  The game program according to claim 4 or 5, wherein the step of performing the second action on the moving object further comprises:
     determining whether the position of the character in the game space is included in a predetermined action area; and
     associating the second action with a specific action type when the position of the character is included in the predetermined action area,
     wherein in the step of performing the second action, an action associated with the specific action type is performed.
  7.  The game program according to claim 6, wherein the predetermined action area is configured based on characteristics of the character.
  8.  The game program according to claim 6 or 7, wherein the game program is applied to a soccer game,
     the first action is a "dribble" action on the moving object by the character,
     the second action is a "kick" action on the moving object by the character, and
     in the second action, the specific action type is associated with a "shoot" action.
  9.  The game program according to claim 8, wherein
     the flick operation type includes a "strong" flick operation and a "weak" flick operation, and
     in the second action, the "strong" flick operation is associated with a "high trajectory" and the "weak" flick operation with a "low trajectory".
  10.  The game program according to any one of claims 1 to 9, wherein the method further comprises:
     in response to the step of repeatedly acquiring the touch position information, generating and displaying, at each frame rate, an image associated with first touch position information detected at touch-on and with second touch position information currently being detected,
     wherein the image is an image of an elastic body having a base and a tip, and
     the base is associated with the first touch position information and the tip is associated, frame by frame, with the second touch position information, so that through a slide operation the image is displayed such that the elastic body deforms as the second touch position information changes.
  11.  The game program according to claim 10, wherein in the step of generating and displaying the image, when the distance between the first touch position information and the second touch position information exceeds a predetermined upper limit, the tip is associated with position information located the predetermined upper limit away from the first touch position information toward the second touch position information.
  12.  The game program according to claim 10 or 11, wherein the step of generating and displaying the image further comprises performing visual processing on the image of the elastic body according to the distance between the base and the tip.
  13.  The game program according to any one of claims 10 to 12, wherein the step of generating and displaying the image further comprises displaying, together with the image, an indicator showing a slide direction based on the current second touch position information, associated with the tip.
PCT/JP2017/001232 2016-02-25 2017-01-16 Game program WO2017145570A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016033747A JP6043448B1 (en) 2016-02-25 2016-02-25 Game program
JP2016-033747 2016-02-25

Publications (1)

Publication Number Publication Date
WO2017145570A1 true WO2017145570A1 (en) 2017-08-31

Family

ID=57543930

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/001232 WO2017145570A1 (en) 2016-02-25 2017-01-16 Game program

Country Status (2)

Country Link
JP (1) JP6043448B1 (en)
WO (1) WO2017145570A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10413814B2 (en) * 2017-06-09 2019-09-17 Supercell Oy Apparatus and method for controlling user interface of computing apparatus
TWI733089B (en) * 2018-03-13 2021-07-11 日商古河電氣工業股份有限公司 Copper alloy plate and manufacturing method thereof, heat dissipation part and shielding shell for electric and electronic equipment
JP7345093B2 (en) * 2018-12-04 2023-09-15 株式会社Mixi Game program, game processing method, and game terminal
JP7256942B2 (en) 2018-12-04 2023-04-13 株式会社Mixi Game program, game processing method and game terminal
JP7034538B2 (en) * 2020-05-01 2022-03-14 株式会社コロプラ Game program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005279165A (en) * 2004-03-31 2005-10-13 Nintendo Co Ltd Game program which changes movement of game object relating to input location
JP2009018202A (en) * 2004-12-28 2009-01-29 Sega Corp Image processor and its method
JP2010035887A (en) * 2008-08-06 2010-02-18 Konami Digital Entertainment Co Ltd Game device, game device control method and program
JP2013039232A (en) * 2011-08-16 2013-02-28 Sega Corp Computer game device, control method and game program for controlling computer game device, and storage medium on which game program is stored
JP2015150215A (en) * 2014-02-14 2015-08-24 株式会社コナミデジタルエンタテインメント Movement control device and program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Famitsu App iPhone", FIFA14, vol. 10, 17 October 2013 (2013-10-17), p. 22 *

Also Published As

Publication number Publication date
JP6043448B1 (en) 2016-12-14
JP2017148257A (en) 2017-08-31


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17756012

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17756012

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP