WO2015151640A1 - User interface program and game program - Google Patents
User interface program and game program
- Publication number
- WO2015151640A1 (PCT/JP2015/054783)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- elastic object
- contact point
- user interface
- touch panel
- elastic
- Prior art date
Links
- 230000033001 locomotion Effects 0.000 claims description 36
- 230000009471 action Effects 0.000 claims description 29
- 230000004044 response Effects 0.000 claims description 3
- 238000000034 method Methods 0.000 description 53
- 230000008569 process Effects 0.000 description 49
- 238000010586 diagram Methods 0.000 description 26
- 230000015572 biosynthetic process Effects 0.000 description 9
- 230000006870 function Effects 0.000 description 8
- 238000009825 accumulation Methods 0.000 description 7
- 239000004973 liquid crystal related substance Substances 0.000 description 5
- 230000005540 biological transmission Effects 0.000 description 3
- 230000008859 change Effects 0.000 description 3
- 230000002860 competitive effect Effects 0.000 description 3
- 230000005489 elastic deformation Effects 0.000 description 3
- 230000003796 beauty Effects 0.000 description 1
- 238000007796 conventional method Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000000750 progressive effect Effects 0.000 description 1
- 230000035807 sensation Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A63F13/25—Output arrangements for video game devices
-
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
-
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
-
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
-
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/822—Strategy games; Role-playing games
-
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
Definitions
- the present invention relates to a user interface program. More specifically, it relates to a user interface program, used in a game executed on a smartphone, that deforms and displays the shape of an elastic object on a touch panel, and to a game program used in a game that controls and displays the action of a character in a virtual space in accordance with operations on the touch panel.
- a virtual joystick as shown in FIG. 1 is known and is adopted in many smartphone games released today.
- the user interface shown in FIG. 1 displays two concentric circles, one large and one small; when the user performs a slide operation, the small circle moves in the direction of the slide. This lets the user confirm the moving direction of the game character.
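The small-circle behavior of this conventional virtual joystick can be sketched as follows (a minimal illustration only; the function name and the clamp-to-outer-circle rule are assumptions, not taken from the prior art documents):

```python
import math

def joystick_knob(center, touch, outer_radius):
    """Position of the small circle (knob) of a virtual joystick.

    The knob follows the touch point but is clamped to the outer circle,
    so its direction from the center indicates the character's moving
    direction even for long slides.
    """
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist <= outer_radius:
        return touch  # finger still inside the outer circle
    scale = outer_radius / dist
    return (center[0] + dx * scale, center[1] + dy * scale)
```

For example, a touch 10 units to the right of the center with an outer radius of 5 leaves the knob on the outer circle, still pointing right.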
- by simulating physical operation keys, virtual joysticks provide, on a smartphone, a game experience like that of a conventional home game console (such as Nintendo's Family Computer (registered trademark)), but nothing more. Moreover, since there is no tactile feedback, the user's feeling of operation remains unsatisfactory.
- to address this, Patent Document 2 adopts the cursor shown in FIG. 2.
- the user interface shown in FIG. 2 is displayed by arranging two circles, one large and one small, at one end on the start-point side and the other end on the end-point side of a drag operation, and connecting the two circles with lines to form a cursor (see also paragraphs [0071] to [0076] and FIG. 5 of Patent Document 2). Further, as shown in FIG. 2, the cursor is narrowed as it lengthens so that its area remains constant (see also paragraph [0018] of Patent Document 2).
- with the technology disclosed in Patent Document 2, the user can perform a drag operation while recognizing the positions of its start and end points, the distance from the start point to the end point, and the direction from the start point to the end point (see also paragraph [0006] of Patent Document 2).
- however, Patent Document 2 only discloses that the two circles are connected, and discloses no specific principle or implementation for how to connect them. In this regard, the cursor disclosed in Patent Document 2 cannot be said to realize a user interface with sufficient aesthetics for the user (for example, the beauty of a smooth curve).
- An object of the present invention is to provide a user interface technology that can eliminate such dissatisfaction with virtual joysticks and provide a new game experience with a pseudo-tactile feel.
- another object of the present invention is to provide a user interface technology in which an object formed as a very smooth curve, obtained by extending a circular shape, behaves as an elastically deformable body.
- the user interface targeted by the present invention is realized by the following user interface programs and game programs. That is, according to the present invention, a first user interface program deforms and displays the shape of an elastic object on the touch panel of a mobile terminal. It causes the mobile terminal to function as a first forming unit that forms and displays a first elastic object having a circular shape around a first contact point on the touch panel, and as a first determination unit that determines whether a slide operation with an object has been performed on the touch panel.
- when the first determination unit determines a slide operation from the first contact point to a second contact point, a deformed second elastic object is formed and displayed by extending the circular shape toward the second contact point. Here, the first elastic object is accommodated in a mesh region composed of a plurality of meshes, and the mobile terminal is caused to function as a second forming unit that deforms the circular shape of the first elastic object by extending each mesh progressively longer the closer it lies to the second contact point along the direction of the slide operation.
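As an illustrative sketch of this mesh-based extension (the function name and the linear weighting along the slide axis are assumptions for illustration, not part of the disclosure; the slide is taken along the x-axis for simplicity), vertices nearer the second contact point are displaced progressively more:

```python
def stretch_meshes(mesh_xs, target_x):
    """Extend mesh vertex x-coordinates toward the slide target.

    Vertices on the far side of the object (weight 0) stay fixed while
    vertices on the near side (weight 1) reach the target, so meshes
    closer to the second contact point are extended progressively more.
    """
    lo, hi = min(mesh_xs), max(mesh_xs)
    span = hi - lo
    pull = target_x - hi  # extra distance the nearest vertex must travel
    out = []
    for x in mesh_xs:
        w = (x - lo) / span if span else 1.0  # 0..1 along the slide axis
        out.append(x + w * pull)
    return out
```

For a circle spanning x = 0..2 dragged so the second contact point lies at x = 4, the base stays fixed while the tip reaches the contact point.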
- in a ninth user interface program, the second forming unit sets a reference point on a line extending from the first contact point in the slide-operation direction on the touch panel and, in accordance with the slide-operation distance, deforms the circular shape of the first elastic object by forming a plurality of expanded meshes accommodating the second elastic object based on the distances, from the reference point, of the plurality of meshes accommodating the first elastic object.
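A minimal sketch of such reference-point-based expansion (the function name and the linear distance weighting are illustrative assumptions): each mesh vertex is pushed away from the reference point by a factor that grows with both the slide distance and the vertex's own distance from the reference point.

```python
import math

def expand_mesh(vertices, ref_point, slide_distance, base_radius):
    """Expand mesh vertices away from a reference point.

    Vertices farther from the reference point are displaced more, so the
    circular mesh region accommodating the first elastic object becomes
    the elongated region accommodating the second elastic object.
    """
    out = []
    for (x, y) in vertices:
        dx, dy = x - ref_point[0], y - ref_point[1]
        d = math.hypot(dx, dy)
        if d == 0.0:
            out.append((x, y))  # the reference point itself does not move
            continue
        # scale factor grows with slide distance and with d
        k = 1.0 + (slide_distance / base_radius) * (d / base_radius)
        out.append((ref_point[0] + dx * k, ref_point[1] + dy * k))
    return out
```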
- a third user interface program further causes the mobile terminal to function as a third forming unit that, when the first determination unit further determines a slide operation from the second contact point to a third contact point, forms and displays a third elastic object by deforming the shape of the second elastic object: the second elastic object is rotated about the first contact point by the angle between the second contact point and the third contact point, and the shape of the rotated second elastic object is extended toward the third contact point.
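The rotation step can be sketched as follows (function names are illustrative assumptions; angles are in radians):

```python
import math

def follow_angle(first, second, third):
    """Angle between the second and third contact points as seen from
    the first contact point (the pivot of the rotation)."""
    a_second = math.atan2(second[1] - first[1], second[0] - first[0])
    a_third = math.atan2(third[1] - first[1], third[0] - first[0])
    return a_third - a_second

def rotate_about(point, pivot, angle):
    """Rotate a vertex of the second elastic object around the pivot
    (the first contact point) by `angle` radians."""
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + dx * c - dy * s, pivot[1] + dx * s + dy * c)
```

After rotating every vertex by `follow_angle(...)`, the rotated shape can be extended toward the third contact point in the same way as it was originally extended toward the second.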
- a fourth user interface program further causes the mobile terminal to function as a second determination unit that determines whether the object has left the touch panel at the second contact point, and as a fourth forming unit that, when the second determination unit determines that the object has left the touch panel, restores and displays the first elastic object by gradually shrinking the deformed second elastic object toward the first contact point.
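One way to sketch the gradual shrinking is per-frame interpolation back toward the initial circular shape (the function name and per-frame rate are illustrative assumptions):

```python
def restore_step(current, initial, rate=0.3):
    """One animation frame of the restoration: each vertex moves a fixed
    fraction of its remaining offset back toward its position in the
    initial circular shape, so the stretched object gradually shrinks
    toward the first contact point.
    """
    return [(cx + (ix - cx) * rate, cy + (iy - cy) * rate)
            for (cx, cy), (ix, iy) in zip(current, initial)]
```

Calling this once per frame produces a smooth exponential-style contraction rather than an instantaneous snap-back.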
- a fifth user interface program deforms and displays the shape of an elastic object on the touch panel of a mobile terminal. It causes the mobile terminal to function as a first forming unit that forms and displays a first elastic object having a circular shape around a first contact point on the touch panel; a first determination unit that determines whether a slide operation has been performed on the touch panel; and a second forming unit that, when the first determination unit determines a slide operation from the first contact point to a second contact point, forms and displays a deformed second elastic object by extending the circular shape toward the second contact point. Here, the first elastic object is accommodated in a mesh region including a plurality of meshes and is divided into a first part and a second part based on the mesh region; in accordance with the distance of the slide operation, the first part is enlarged around the first contact point and the second part is moved around the second contact point, and the second forming unit forms the deformed second elastic object as a curved shape by connecting the enlarged first part and the moved second part with a series of curves.
- a first game program is used in a game that controls and displays the action of a character in a virtual space in accordance with operations on the touch panel of a mobile terminal. It causes the mobile terminal to function as a first forming unit that forms and displays a first elastic object having a circular shape around a first contact point on the touch panel; a first determination unit that determines whether a slide operation has been performed on the touch panel; and a second forming unit that, when the first determination unit determines a slide operation from the first contact point to a second contact point, forms and displays a deformed second elastic object by extending the circular shape toward the second contact point, with the first elastic object accommodated in a mesh region composed of a plurality of meshes, each of which is extended progressively longer the closer it lies to the second contact point along the direction of the slide operation.
- according to the present invention, when the user operates the touch panel, an object that behaves as an elastic body is displayed and is elastically deformed according to the operation. Such a user interface is associated with an action of the character in the virtual space (for example, movement or an attack of a character in a competitive RPG), so that the user can control the character's action and the feeling of operation can be improved.
- FIG. 1 shows an example of a prior art user interface.
- FIG. 2 shows an example of a prior art user interface.
- FIG. 3 is a schematic diagram of a portable terminal for executing a user interface program according to the embodiment of the present invention.
- FIG. 4 is a block diagram schematically showing the configuration of the mobile terminal of FIG.
- FIG. 5 is a block diagram showing an outline of input / output in the mobile terminal of FIG.
- FIG. 6 is a schematic diagram of a user interface displayed at the contact start point of the slide operation.
- FIG. 7 is a schematic diagram of a user interface displayed at the contact end point of the slide operation.
- FIG. 8 is a schematic diagram of a user interface displayed after the end of the slide operation.
- FIG. 9 is a functional block diagram implemented using the user interface program according to the embodiment of the present invention.
- FIG. 10 is a user interface image of the elastic object displayed at the contact start point of the slide operation.
- FIG. 11 is a schematic diagram of polygons forming a part of the elastic object of FIG.
- FIG. 12 is a schematic diagram showing a change of a polygon when a part of the elastic object of FIG. 11 is elastically deformed.
- FIG. 13 is a schematic diagram relating to the elastic object deformation process of the first embodiment.
- FIG. 14 is a schematic diagram relating to a polygon direction adjustment process performed when the elastic object is deformed.
- FIG. 15 is a schematic diagram showing the elastic object deformation process of the first embodiment over time.
- FIG. 16 is a schematic diagram illustrating a case where a further sliding operation is performed with respect to the elastic object deformation process according to the first embodiment.
- FIG. 17 is a schematic diagram relating to the elastic object deformation process of the second embodiment.
- FIG. 18 is a schematic process flowchart relating to the elastic object deformation process according to the embodiment of the present invention.
- FIG. 19 is a detailed process flowchart regarding the elastic object deformation process of the second embodiment.
- FIG. 20 is a schematic diagram relating to the elastic object deformation process of the third embodiment.
- FIG. 21 is a detailed process flowchart regarding the elastic object deformation process according to the third embodiment.
- FIG. 22 is a schematic diagram showing the elastic object deformation process of the third embodiment over time.
- FIG. 23 is a schematic diagram relating to the elastic object deformation process of the fourth embodiment.
- FIG. 24 is a screen diagram showing a game application example of the user interface implemented in the first example or the second example.
- FIG. 25 is a screen diagram showing a game application example of the user interface implemented in the first example or the second example.
- FIG. 26 is a screen diagram showing a game application example of the user interface implemented in the first example or the second example.
- FIG. 27 is a screen view showing a game application example of the user interface implemented according to the fourth embodiment.
- FIG. 28 is a schematic diagram showing another game application example of the user interface formed by executing the user interface program according to the embodiment of the present invention.
- FIG. 29 is a schematic diagram showing deformation of the elastic object shown in FIG. 28.
- FIG. 30 is a schematic diagram related to an accumulation movement process based on the user interface shown in FIG.
- FIG. 31 is a process flowchart regarding the accumulation movement process based on the user interface shown in FIG.
- FIG. 32 is a screen diagram showing a game application example of the user interface corresponding to FIG.
- in the following, a user interface program for deforming and displaying the shape of an elastic object on the touch panel of a mobile terminal, and a game program that uses the elastic object in a game controlling and displaying the action of a character in a virtual space according to operations on the touch panel, will be described.
- in the drawings, the same components are denoted by the same reference numerals.
- the user interface program according to the embodiment of the present invention mainly constitutes a part of a game program for a smartphone game. More specifically, the user interface program is used as a part of the game program to control the action of a game character (game object) in a virtual space.
- the smartphone 1 includes a touch panel 2, and the user can control the action of the game character through a user operation on the touch panel 2.
- the mobile terminal that executes the user interface program according to the present embodiment is not limited to the smartphone 1; it can be any terminal equipped with a touch panel, such as a PDA or a tablet computer.
- the smartphone 1 includes a CPU 3, a main memory 4, an auxiliary memory 5, a transmission / reception unit 6, a display unit 7, and an input unit 8 that are connected to each other via a bus.
- the main memory 4 is composed of, for example, a DRAM.
- the auxiliary memory 5 is composed of, for example, an HDD.
- the auxiliary memory 5 is a recording medium capable of recording the user interface program and the game program according to the present embodiment.
- the user interface program stored in the auxiliary memory 5 is loaded into the main memory 4 and executed by the CPU 3.
- the main memory 4 also temporarily stores data generated while the CPU 3 is operating according to the user interface program and the game program and data used by the CPU 3.
- the transmission / reception unit 6 establishes connection (wireless connection and / or wired connection) between the smartphone 1 and the network under the control of the CPU 3 and transmits / receives various information.
- the display unit 7 displays various information to be presented to the user under the control of the CPU 3.
- the input unit 8 detects input operations performed by the user on the touch panel 2 (mainly physical contact operations such as touch, slide (swipe), and tap operations).
- the display unit 7 and the input unit 8 correspond to the touch panel 2 described above.
- the touch panel 2 includes a touch sensing unit 301 corresponding to the input unit 8 and a liquid crystal display unit 302 corresponding to the display unit 7.
- the touch panel 2 displays images under the control of the CPU 3, receives interactive touch operations by the player (physical contact operations on the touch panel 2), and displays corresponding graphics on the liquid crystal display unit 302 based on the control of the control unit 303.
- the touch sensing unit 301 outputs an operation signal corresponding to the touch operation by the user to the control unit 303.
- the touch operation can be performed by any object.
- the touch operation may be performed by a user's finger, a stylus, or the like.
- the touch sensing unit 301 can employ, for example, a capacitance type, but is not limited thereto.
- when the control unit 303 detects an operation signal from the touch sensing unit 301, it interprets the signal as an operation instruction for the user's character and processes a display signal so that a graphic (not shown) corresponding to the instructed operation is displayed on the liquid crystal display unit 302.
- the liquid crystal display unit 302 displays a graphic corresponding to the display signal.
- the operation object 400 includes a fixed circle 410 and an elastic object 420 positioned inside the fixed circle 410.
- the operation object 400 is displayed on the touch panel 2 when it is detected that the user's finger has touched the touch panel 2.
- the elastic object 420 is formed to have a circular shape around a contact point on the touch panel 2 as an initial shape at the time of finger contact.
- although the circular shape is formed centered on the contact point here, it is not necessarily limited to this; for example, it may be formed shifted upward or downward by a certain distance. Shifting it by a certain distance can prevent the circle from being hidden by the user's finger while the circular shape is displayed.
- the elastic object according to the present embodiment behaves like an elastic body in response to a user operation on the touch panel.
- the elastic object 420 is pulled by the user's finger.
- the elastic object 420 comprises a base portion 430, whose position is fixed near the point where the slide operation starts, and a tip portion 450 that follows the contact end point of the slide operation (the point where the finger remains in contact).
- the base portion 430, the connecting portion 440, and the tip portion 450 may be collectively referred to as the elastic object 420.
- the elastic object is formed so as to stretch elastically in the direction of the slide operation. That is, the elastically deformed elastic object 420' is formed and displayed by extending the initial circular shape toward the contact end point.
- the elastic object 420 is formed so that the base portion 430 is larger than the tip portion 450, but the invention is not limited to this; conversely, the tip portion 450 may be formed larger than the base portion 430.
- the tip portion 450 further moves following the movement, and the extending direction of the elastic object 420 also changes.
- when the finger is released, the elastically deformed elastic object is displayed so as to gradually shrink toward the contact start point in accordance with its restoring force, restoring the initial shape shown in FIG. 6.
- the elastic object 420 is displayed so as to protrude from the fixed circle 410 in the contraction direction opposite to its extending direction, and is then restored to the initial shape.
- the illustrated elastic object 420 is deformed as it is restored, but this is not limiting. For example, the initial shape may be restored without such deformation, or the position of the elastic object may be shifted so that it vibrates in the contraction direction opposite to the extension direction.
- the function set includes a user operation unit 800 related to a user input operation through the touch panel, and a character operation unit 900 for operating the character by controlling the action of the character in the virtual space of the game according to the operation on the touch panel. including.
- each of the contact determination unit 810, the slide operation determination unit 830, and the non-contact determination unit 860 performs user input operation determination processing.
- according to the determination results, various object forming processes are executed in the corresponding initial shape object forming unit 820, polygon direction adjusting unit 840, deformed object forming unit 850, and restored object forming unit 870.
- the contact determination unit 810 determines whether or not there is contact by an object on the touch panel.
- the initial shape object forming unit 820 forms and displays an elastic object having a circular shape around the contact point on the touch panel.
- the slide operation determination unit 830 determines whether a slide operation with an object has been performed on the touch panel.
- the polygon direction adjustment unit 840 performs an adjustment process by rotating the polygon so that the polygon direction matches the moving direction of the object.
- the deformed object forming unit 850 forms and displays a deformed elastic object by extending the initial circular shape toward the contact end point.
- the non-contact determination unit 860 determines whether the object has left the touch panel at the contact end point of the slide operation. When the non-contact determination unit 860 determines that the object has left, the restored object forming unit 870 gradually shrinks the elastic object deformed by the deformed object forming unit 850 toward the contact start point, restoring and displaying the elastic object with the initial circular shape formed by the initial shape object forming unit 820.
- the icon image forming unit 920 further generates and displays at least one icon image around the elastic object when the initial circular object is formed by the initial shape object forming unit 820.
- the icon selection determination unit 930 determines whether the contact point on the touch panel corresponds to the icon image arrangement position. When the slide operation determination unit 830 determines a slide operation, and when the icon selection determination unit 930 determines that the contact end point corresponds to the arrangement position of the icon image, the character control unit 910 displays the character associated with the icon image. Perform an action.
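As a concrete illustration of the icon selection determination performed by the icon selection determination unit 930, the following is a minimal hit-test sketch. The circular hit region, hit radius, icon names, and coordinates are all illustrative assumptions; the source only requires that the contact end point correspond to the icon image's arrangement position.

```python
import math

def select_icon(end_point, icon_positions, hit_radius):
    """Return the name of the icon whose arrangement position contains the
    slide contact end point, or None if no icon is hit. The circular hit
    region of radius hit_radius is an assumption of this sketch."""
    ex, ey = end_point
    for name, (ix, iy) in icon_positions.items():
        if math.hypot(ex - ix, ey - iy) <= hit_radius:
            return name
    return None

# Hypothetical icon layout; points are in touch-panel pixel coordinates.
icons = {"SKILL1": (200, 80), "SKILL2": (240, 160)}
chosen = select_icon((205, 85), icons, hit_radius=20)  # lands on SKILL1
```

A rectangular hit region, or one matching the elliptical elastic objects 610 and 620, would work equally well under the same determination scheme.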
- FIG. 10 schematically shows a user interface image of an elastic object having a circular shape that is formed when a finger touches the touch panel 2.
- when the elastic object according to the present embodiment is displayed, it is generated as a substantially square user interface image 750 and superimposed as part of the game image.
- the user interface image 750 includes a translucent area 760 and a transparent area 770, and the translucent area 760 is displayed on the screen as a basic display area of the elastic object.
- FIGS. 11 and 12 schematically show part of an elastic object as an example.
- the elastic object according to the present embodiment expresses elastic deformation by moving the coordinates of each vertex 720 of the plate-like polygon 700 divided into a plurality of meshes 710.
- Each vertex 720 is arranged in a mesh shape.
- when the coordinates of one vertex 720A are moved, the coordinates of the other vertices 720 are also changed according to the movement vector (for example, the moving direction and moving distance) of the vertex 720A.
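The mesh-based deformation principle above, in which vertices arranged in a grid are moved and the other vertices follow a movement vector, can be sketched as follows. The uniform falloff factor and grid layout are illustrative assumptions, not the patent's exact rule.

```python
def make_mesh(cols, rows, spacing=1.0):
    """Return a list of (x, y) vertices arranged in a mesh grid,
    row by row, like the vertices 720 of the plate-like polygon 700."""
    return [(c * spacing, r * spacing) for r in range(rows) for c in range(cols)]

def move_vertex(vertices, index, dx, dy, falloff=0.5):
    """Move one vertex by the movement vector (dx, dy) and shift every
    other vertex by a damped fraction of the same vector, so the whole
    mesh follows the dragged vertex (falloff is an assumption)."""
    moved = []
    for i, (x, y) in enumerate(vertices):
        scale = 1.0 if i == index else falloff
        moved.append((x + dx * scale, y + dy * scale))
    return moved

mesh = make_mesh(3, 3)
deformed = move_vertex(mesh, 4, 0.0, 2.0)  # drag the centre vertex upward
```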
- FIG. 13 schematically shows an example of the elastic object 420a deformed based on the first embodiment.
- the deformed object forming unit 850 forms the deformed elastic object 420a' by extending the initial circular shape generated by the initial shape object forming unit 820 toward the end point along the direction of the slide operation.
- each of the plurality of meshes maintains the same rectangular shape for each column (# 1 to # 4).
- the meshes # 1A to # 1D all have the same rectangular shape.
- meshes in the row nearer the contact end point (# 1) are stretched progressively longer than meshes in the farther rows (# 4).
- the expansion rate of each row may be weighted and distributed according to the moving distance L by the slide operation.
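A minimal sketch of this row-wise stretching, assuming four mesh rows and a hand-picked weight vector; the concrete row heights, weights, and linear distribution rule are illustrative, not taken from the source.

```python
def stretch_rows(row_heights, L, weights):
    """Embodiment-1 style stretch: each mesh row keeps its rectangular
    shape, but rows nearer the contact end point receive a larger share
    of the slide distance L. `weights` is ordered from the row nearest
    the end point (#1) to the farthest (#4) and must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return [h + L * w for h, w in zip(row_heights, weights)]

# Four initially uniform rows, slide distance 80 px, weighted toward row #1.
new_heights = stretch_rows([10, 10, 10, 10], 80, [0.4, 0.3, 0.2, 0.1])
```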
- the polygon direction adjustment unit 840 performs an adjustment process (polygon direction adjustment process) by rotating the polygon so that the polygon direction matches the moving direction of the object.
- as schematically shown in FIG. 15, the deformed elastic object always has a constant width W (see FIG. 10) with respect to the slide operation direction, regardless of the slide movement distance, equal to the diameter of the circular translucent region 760 shown in FIG. 10.
- as those skilled in the art will understand by comparison with the prior art shown in FIG. 2, the elastic object deformed according to this embodiment can be configured to always maintain the constant width W of the initial circle.
- the above-described width W in FIG. 15 is not limited to being fixed to the diameter of the circle; for example, the width W may be gradually increased according to the slide operation. That is, the enlargement rate may be configured so that the width W gradually increases as the moving distance increases, as in FIGS. 15(a), 15(b), and 15(c). This makes it easier for the user to visually recognize the amount of the slide operation from the size of the elastic object.
- when the user, without lifting the finger from contact end point 1 (FIG. 16A), slides further to another contact end point 2 (FIG. 16B), the elastic object is deformed continuously. That is, the direction of the polygon is rotated by the angle between contact end point 1 and contact end point 2 with respect to the contact start point of the slide operation, and the shape of the rotated, deformed elastic object 420a' is then extended to end point 2 to form a further deformed elastic object 420a''.
- FIG. 17 schematically illustrates an example of the elastic object 420b deformed based on the second embodiment.
- the elastic object can be formed so as to have a smoother curved shape than the first embodiment.
- the polygon direction adjustment processing is first performed.
- the present embodiment is not limited to extending each mesh while maintaining the rectangular shape of each row (FIG. 15). That is, in this embodiment, first, one of the mesh points on the line in the slide operation direction extending from the contact start point of the slide operation on the touch panel is set as the reference point O (0, Ystd).
- the process of this embodiment starts in step S101. In step S102, the contact determination unit 810 determines contact with the touch panel. If contact by an object (finger) is determined, the process proceeds to step S103, and the initial shape object forming unit 820 forms an initial circular elastic object around the contact start point (see also FIG. 10). Subsequently, the process proceeds to step S104, and the slide operation determination unit 830 determines whether a slide operation by the object has been performed on the touch panel. If a slide operation is determined, the process proceeds to step S105, where the polygon direction adjusting unit 840 and the deformed object forming unit 850 form the deformed elastic object by extending the initial circular shape toward the contact end point.
- a reference point (0, Ystd) is set on a line in the slide operation direction (Y direction) extending from the contact start point on the touch panel.
- in step S204, the deformed object forming unit 850 moves the vertices P(x0, y0) of the plurality of meshes accommodating the initial circular elastic object when the elastic object is deformed; that is, it determines the corresponding vertices P'(x1, y1) of the stretched plurality of meshes.
- each corresponding vertex is calculated as x1 = x0 and y1 = y0 + L/R, where L is the moving distance and R is the distance of each point (x0, y0) from the reference point (0, Ystd). A vertex whose distance R from the reference point (0, Ystd) is larger is therefore calculated to move less in the Y direction, so distant vertices hardly move.
- the above step S204 is performed for all the vertices of the plurality of meshes to determine all the corresponding vertices of the stretched mesh, thereby forming a deformed elastic object.
- the elastic object formed by the deformable object forming unit 850 does not need to maintain a rectangular shape as in the first embodiment, and thus can form a smoother curved shape.
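The per-vertex rule of this embodiment, x1 = x0 and y1 = y0 + L/R with R the distance from the reference point (0, Ystd), can be sketched as follows; the epsilon guard against a vertex coinciding with the reference point is our own assumption, not stated in the source.

```python
import math

def deform_vertex(x0, y0, L, ystd, eps=1e-6):
    """Embodiment-2 vertex update: x1 = x0, y1 = y0 + L / R, where R is
    the distance of (x0, y0) from the reference point (0, Ystd) on the
    slide-direction line. Vertices farther from the reference point move
    less, which produces the smooth curved silhouette."""
    R = max(math.hypot(x0, y0 - ystd), eps)  # eps guard is an assumption
    return x0, y0 + L / R

# A vertex one unit from the reference point moves by the full distance L;
# a vertex ten units away moves by only L / 10.
x1, y1 = deform_vertex(1.0, 0.0, 5.0, 0.0)   # R = 1  -> y moves by 5.0
x2, y2 = deform_vertex(10.0, 0.0, 5.0, 0.0)  # R = 10 -> y moves by 0.5
```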
- the process then ends in step S106.
- the elastic object is further deformed. That is, the polygon direction adjusting unit 840 rotates the deformed elastic object by rotating the direction of the polygon by an angle between the contact end point 1 and the contact end point 2 with respect to the contact start point in the slide operation.
- the deformed object forming unit 850 extends the shape of the rotated elastic object to the end point 2 to form a further deformed elastic object.
- when, in step S106, the non-contact determination unit 860 determines that the user has lifted the finger from the touch panel, the restored object forming unit 870 performs the elastic object restoration described above.
- the deformed elastic object is shrunk stepwise toward the starting point according to its restoring force.
- as those skilled in the art will understand, the contraction process can be realized by re-selecting the reference point (0, Ystd) as appropriate for each stage and performing the calculation based on the above formula using the moving distance and the distance of each point (x0, y0) from the selected reference point (0, Ystd).
- FIG. 20 schematically shows an example of the elastic object 420c deformed based on the third embodiment.
- the elastic object 420c ′ can be formed so as to have a more dynamic curved shape in conjunction with the slide operation of the object.
- the polygon direction adjustment process is first performed.
- the overall outline of the deformation process in the present embodiment is substantially the same as the flowchart of FIG. 18 described in the second embodiment.
- the deformed object forming unit 850 does not stretch each of the plurality of meshes along the direction of the slide operation. Instead, in this embodiment, the deformed object forming unit 850 divides the initial circular elastic object into two parts based on the mesh region, enlarges the one mesh-region part, and moves the other mesh-region part around the contact end point. The two parts are then connected to form the deformed elastic object.
- in step S303, the deformed object forming unit 850 divides the initial circular elastic object into two mesh regions, an upper part and a lower part, based on the plurality of mesh regions.
- the two parts are formed by dividing the initial circular shape into semicircles along the direction (X direction) perpendicular to the slide operation direction (Y direction). Note that the upper and lower mesh regions may partially overlap, as in the illustrated example.
- in step S304, the deformed object forming unit 850 first enlarges the lower-part mesh region around the contact start point at an enlargement rate corresponding to the moving distance L of the slide operation.
- in step S305, the deformed object forming unit 850 moves the upper-part mesh region in the Y direction around the contact end point.
- the deformed object forming unit 850 then connects the lower part enlarged in step S304 with the semicircular upper mesh region moved in step S305 to form the deformed elastic object.
- the connection lines may be straight lines as shown in FIGS. 20B and 20C; alternatively, to form a smoother curved shape, effect processing that stretches the connection lines in the X direction may be performed.
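The split-enlarge-move-connect steps above can be sketched as follows. The semicircular split along y = 0, the linear enlargement rate, and the sample vertices are illustrative assumptions; only the overall procedure (enlarge the lower part around the start point, move the upper part to the end point, then connect) comes from the text.

```python
def deform_embodiment3(vertices, L, scale_per_px=0.01):
    """Embodiment-3 sketch: split the initial circle's vertices into a
    lower part, kept at the contact start point (the origin here) and
    enlarged with the slide distance L, and an upper part, translated to
    the contact end point. Connecting the two silhouettes yields the
    deformed object; this sketch simply concatenates the vertex lists."""
    lower = [(x, y) for x, y in vertices if y <= 0]
    upper = [(x, y) for x, y in vertices if y > 0]
    k = 1.0 + L * scale_per_px                  # assumed linear enlargement
    enlarged = [(x * k, y * k) for x, y in lower]  # grow around the origin
    moved = [(x, y + L) for x, y in upper]         # shift toward end point
    return enlarged + moved

circle = [(1, 0), (0, 1), (-1, 0), (0, -1)]
shape = deform_embodiment3(circle, L=100)  # lower part doubled, upper shifted
```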
- FIG. 22 schematically shows a series of elastic objects 420c formed as smoother curved shapes.
- FIG. 22A shows a state of a series of changes in the elastic object 420c in time series
- FIG. 22B shows each elastic object in an overlapping manner.
- since the lower part of the initial circular shape is enlarged more as the slide operation distance increases, those skilled in the art will understand that the greater the slide operation distance, the greater the width of the elastic object 420c.
- this differs in shape from Embodiments 1 and 2, in which the width is constant. In this embodiment, by increasing the width of the elastic object 420c, the elastic object 420c' can be formed to have a more dynamic curved shape in conjunction with the slide operation of the object.
- the shape of the elastic object 420d shown in FIG. 23 may be changed according to parameters used in the game. For example, in a competitive RPG game, its size and shape may be changed according to the magnitude of damage dealt to an enemy character, the type of weapon used by the game character, the occurrence of a continuous combo (attacking the enemy character continuously), and so on.
- FIGS. 24 to 27 are screen examples when the user interface program according to the present embodiment and a game program including the user interface program are executed.
- a game object (for example, a game character)
- the game program is implemented according to any of the various embodiments described above.
- a character action based on the moving direction and moving distance of the slide operation is executed, and the character action is displayed together with the deformed elastic object.
- (Game application example 1) FIGS. 24, 25, and 26 are game screen examples when a game program including a user interface program implemented according to the first or second embodiment is executed.
- the character control unit 910 performs a character movement operation based on the movement distance and movement direction of the slide operation, and displays the character movement operation together with the deformed elastic object.
- FIG. 24 and FIG. 25 are game screen examples in a two-dimensional virtual space
- the game screen in FIG. 26 is a game screen example in a three-dimensional virtual space.
- the elastic object 420 is displayed to be elastically deformed in the upper right direction.
- the game character 460 moves three-dimensionally in the upper right direction.
- FIG. 27 is a screen example when a game program including a user interface program implemented according to the fourth embodiment is executed. This screen example is an example in which an attack action by a character is executed and displayed in accordance with a user operation.
- the elastic object 420d mounted according to the fourth embodiment is displayed.
- the game character 460 executes an attacking action.
- FIG. 28 is an example of a user interface image displayed when at least a user interface program implemented according to any of the first to third embodiments of the present invention is executed.
- the user interface image example is displayed when the user touches the touch panel.
- in addition to the initial circular elastic object 420, the image includes icon images 510 and 520 labeled “SKILL”, arranged apart from the elastic object 420, and a set of substantially elliptical elastic objects 610 and 620 formed so as to encompass the icon images 510 and 520, superimposed as a user interface image.
- the user interface image is controlled by the icon image forming unit 920 so as to appear, for example, when the user's finger contact continues for a certain period of time (that is, when the user long-presses the touch panel).
- in this user interface image display state, when a slide operation with the user's finger is performed, the character can be moved according to the slide operation while the elastic object 420 deforms, as in game application example 1.
- when the contact end point reaches an icon image, the character control unit 910 interrupts the character's movement and executes the action of the selected “SKILL” icon. Specifically, it determines whether the slide contact end point on the touch panel is at the icon image arrangement position, and if so, executes the character action associated with the “SKILL” icon.
- “SKILL” is a character action associated with the icon images 510 and 520, and may be, for example, one of the attack actions performed by a game character in the game.
- the icon images 510 and 520 are continuously displayed unless the contact state with the touch panel is released.
- FIG. 30 is a schematic diagram showing the accumulation movement of the game character.
- a scenario is assumed in which the user presses and holds the touch panel for a long time at the contact start point, and then performs a slide operation to the contact end point 1 (slide operation 1) and further slides to the contact end point 2.
- both the elastic object 420 and the “SKILL” icon images 510 and 520 are displayed as user interface images. Even after slide operation 1, the user can trigger the “SKILL2” action by performing a further slide operation 2 to the “SKILL2” icon 520 without lifting the finger.
- in step S405, the slide operation determination unit 830 determines whether a slide operation by an object has been performed on the touch panel. If a slide operation is determined, the process proceeds to step S406, where the polygon direction adjusting unit 840 and the deformed object forming unit 850 form the deformed elastic object by extending the initial circular shape toward contact end point 1.
- step S406 is performed continuously during the slide operation.
- the process continues to step S407, and the icon selection determination unit 930 determines whether the contact end point 2 of the slide operation corresponds to the icon image arrangement position.
- when the slide operation from contact end point 1 to contact end point 2 is determined (step S405) and contact end point 2 is determined to correspond to the arrangement position of an icon image (step S407), the character control unit 910 executes, in step S408, the character action “SKILL” associated with the icon image. Finally, the process proceeds to step S409, and the accumulation movement process ends.
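The branch structure of steps S405 to S409 can be sketched as a small function; `icon_at` is a hypothetical lookup from a contact point to an icon name (or None), standing in for the icon selection determination unit 930.

```python
def accumulation_move(slide1_end, slide2_end, icon_at):
    """Sketch of the accumulation-movement flow: the first slide deforms
    the elastic object and moves the character toward contact end point 1
    (S406); if the further slide's end point 2 corresponds to an icon's
    arrangement position (S407), the associated "SKILL" character action
    is executed instead (S408)."""
    action = "move"                # S406: deform + move toward end point 1
    icon = icon_at(slide2_end)     # S407: does end point 2 hit an icon?
    if icon is not None:
        action = icon              # S408: execute the icon's character action
    return action                  # S409: accumulation movement process ends

icons = {(240, 160): "SKILL2"}     # hypothetical icon arrangement position
result = accumulation_move((200, 80), (240, 160), icons.get)
```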
- FIG. 32 shows an example of a game screen when a game program including a user interface program implementing the accumulation movement process is executed. FIG. 32 superimposes the user interface of FIG. 30 on the game screen.
- a sword icon image is displayed, and when this is selected, a powerful attack with a specific weapon is enabled.
- the game character can be moved by the elastic object while the icon image is displayed.
- the elastic object displayed by the user interface program of the present invention associates the amount of the user's slide operation (that is, the moving distance of the finger on the touch panel) with the moving distance of the game character. Displaying the elastic object therefore makes the magnitude (moving distance) of the movement instruction to the game character easier to recognize. It is also easier to see than a conventional virtual joystick controller, which tends to be hidden by the finger (see FIG. 1). Furthermore, by displaying the elastic object together with icon images, usability can be improved in smartphone games that demand operation speed.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
The basic configuration of the user interface according to the present embodiment will now be described with reference to FIGS. 6 to 8. As shown in FIG. 6, the operation object 400 according to the present embodiment includes a fixed circle 410 and an elastic object 420 positioned inside the fixed circle 410. The operation object 400 is displayed on the touch panel 2 when contact of the user's finger with the touch panel 2 is detected. As illustrated, the elastic object 420 is formed, as its initial shape at the time of finger contact, to have a circular shape around the contact point on the touch panel 2. Although in FIG. 6 the circular shape is centered on the contact point, this is not necessarily limiting; for example, the shape may be formed shifted upward or downward by a certain distance, which prevents the circle from being hidden by the user's finger when it is displayed.
A set of main functions implemented by the user interface program and the game program according to the present embodiment will be described with reference to FIG. 9. Through this function set, input in the form of operation signals is processed and output in the form of display signals is generated. The function set includes a user operation unit 800 related to user input operations through the touch panel, and a character operation unit 900 for operating a character by controlling its action in the virtual space of the game in accordance with operations on the touch panel. In the user operation unit 800, the contact determination unit 810, the slide operation determination unit 830, and the non-contact determination unit 860 each perform user input determination processing, and, according to the determination results, various object forming processes are executed in the corresponding initial shape object forming unit 820, polygon direction adjusting unit 840, deformed object forming unit 850, and restored object forming unit 870.
Next, with reference to FIGS. 10 to 23, the processing for deforming the elastic object according to the present embodiment, whose basic configuration was described above, will be described in detail through several examples.
(3-a) Basic principle of elastic object deformation processing
FIG. 10 schematically shows the user interface image of the circular elastic object formed when a finger touches the touch panel 2. As shown in FIG. 10(a), when the elastic object according to the present embodiment is displayed, it is generated as a substantially square user interface image 750 and superimposed as part of the game image. The user interface image 750 consists of a translucent area 760 and a transparent area 770, and the translucent area 760 is displayed on the screen as the basic display area of the elastic object.
(3-b) Embodiment 1
FIG. 13 schematically shows an example of the elastic object 420a deformed based on Embodiment 1. In this embodiment, the deformed object forming unit 850 forms the deformed elastic object 420a' by extending the initial circular shape generated by the initial shape object forming unit 820 toward the end point along the direction of the slide operation. In particular, in the example of FIG. 13, when the coordinates of the vertices of the divided meshes are moved, each of the meshes maintains the same rectangular shape per row (#1 to #4) (for example, meshes #1A to #1D all have the same rectangular shape), and meshes in the row nearer the contact end point (#1) are stretched progressively longer than those in the farther rows (#4). As one example, the stretch rate of each row may be weighted and distributed according to the moving distance L of the slide operation.
FIG. 17 schematically shows an example of the elastic object 420b deformed based on Embodiment 2. According to Embodiment 2, the elastic object can be formed to have an even smoother curved shape than in Embodiment 1. As in Embodiment 1, the polygon direction adjustment process is performed first. Unlike Embodiment 1, this embodiment is not limited to stretching each mesh while maintaining the rectangular shape of each row (FIG. 15). That is, in this embodiment, one of the mesh points on the line in the slide operation direction extending from the contact start point on the touch panel is first set as the reference point O(0, Ystd).
x1 = x0,
y1 = y0 + L/R
The corresponding vertices are calculated by the above formulas.
FIG. 20 schematically shows an example of the elastic object 420c deformed based on Embodiment 3. According to Embodiment 3, compared with Embodiments 1 and 2, the elastic object 420c' can be formed to have an even more dynamic curved shape in conjunction with the slide operation of the object. As in Embodiments 1 and 2, the polygon direction adjustment process is performed first. The overall outline of the deformation process in this embodiment is substantially the same as the flowchart of FIG. 18 described for Embodiment 2.
FIG. 23 schematically shows an example of the elastic object deformed based on Embodiment 4. As shown in FIG. 23, as a modification of the present embodiment, when a tap operation (a momentary contact operation on the touch panel 2) is performed, the elastic object is displayed at the tap point in an elastically deformed, squashed state. The outer periphery of the elastic object 420d is constituted by a sine curve expressed by the general formula below. In the present embodiment, the values of A, ω, and T are determined randomly within predetermined limits. This allows the shape of the elastic object to be deformed randomly, approximating the outline that actually appears when an elastic body is squashed.
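Since the general formula itself is not reproduced in this text, the following sketch assumes the common parameterisation A·sin(ω·θ + T) added to a base radius, with A, ω, and T drawn randomly within fixed limits as the paragraph describes; every concrete range here is an illustrative assumption.

```python
import math
import random

def squashed_outline(base_radius, n_points=64):
    """Embodiment-4 sketch: sample the radius of the tapped elastic
    object's outer periphery as base_radius + A*sin(omega*theta + T),
    with A, omega, T drawn randomly within assumed limits so each tap
    produces a slightly different squashed outline."""
    A = random.uniform(0.05, 0.2) * base_radius   # assumed amplitude range
    omega = random.choice([4, 5, 6, 7])           # integer so the curve closes
    T = random.uniform(0, 2 * math.pi)            # random phase
    return [
        base_radius + A * math.sin(omega * theta + T)
        for theta in (2 * math.pi * i / n_points for i in range(n_points))
    ]

radii = squashed_outline(50.0)  # one sampled outline, 64 radial samples
```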
The shape of the elastic object 420d shown in FIG. 23 may also be varied according to parameters used in the game. For example, in a competitive RPG game, the size and shape of the star may be changed according to the magnitude of damage dealt to an enemy character, the type of weapon used by the game character, the occurrence of a continuous combo (attacking the enemy character continuously), and so on.
Examples of applying the elastic object user interface according to the present embodiment to games will be described with reference to FIGS. 24 to 32. FIGS. 24 to 27 are screen examples when the user interface program according to the present embodiment, and a game program including it, are executed. As a smartphone game, a competitive RPG game is assumed in which a game object (for example, a game character) placed in the game's virtual space and displayed on the touch panel is operated with an object such as the user's finger. The game program is implemented according to any of the embodiments described above. In response to the user's slide operation, a character action based on the moving direction and moving distance of the slide operation is executed, and the character action is displayed together with the deformed elastic object.
FIGS. 24, 25, and 26 are game screen examples when a game program including a user interface program implemented according to Embodiment 1 or 2 is executed. In response to the user's slide operation, the character control unit 910 executes a character movement action based on the moving distance and moving direction of the slide operation, and displays the movement action together with the deformed elastic object.
FIG. 28 is an example of a user interface image when at least a user interface program implemented according to any of Embodiments 1 to 3 is executed, in accordance with an aspect of the present invention. This user interface image is displayed in a predetermined case where the user touches the touch panel. As illustrated, in addition to the initial circular elastic object 420, icon images 510 and 520 labeled "SKILL", arranged apart from the elastic object 420, and a set of substantially elliptical elastic objects 610 and 620 formed to encompass the icon images 510 and 520 are superimposed as a user interface image.
2 Touch panel
3 CPU
4 Main memory
5 Auxiliary memory
6 Transmission/reception unit
7 Display unit
8 Input unit
301 Touch sensing unit
302 Liquid crystal display unit
303 Control unit
400 Operation object
410 Fixed circle
420, 420a, 420b, 420c Elastic object
430 Base portion
440 Connecting portion
450 Tip portion
460 Game character
510, 520 Icon image
610, 620 Elastic object
750 User interface image
800 User operation unit
810 Contact determination unit
820 Initial shape object forming unit
830 Slide operation determination unit
840 Polygon direction adjustment unit
850 Deformed object forming unit
860 Non-contact determination unit
870 Restored object forming unit
900 Character operation unit
910 Character control unit
920 Icon image forming unit
930 Icon selection determination unit
- Claims (17)
- A user interface program for deforming and displaying the shape of an elastic object on a touch panel of a mobile terminal, the program causing the mobile terminal to function as:
a first forming unit that forms and displays a first elastic object having a circular shape around a first contact point on the touch panel;
a first determination unit that determines whether a slide operation by an object has been performed on the touch panel; and
a second forming unit that, when the first determination unit determines a slide operation from the first contact point to a second contact point, forms and displays a deformed second elastic object by extending the circular shape toward the second contact point, wherein
the first elastic object is accommodated in a mesh region consisting of a plurality of meshes, and
the circular shape of the first elastic object is deformed by stretching each of the plurality of meshes along the direction of the slide operation, progressively longer the nearer the mesh is to the second contact point.
- The user interface program according to claim 1, wherein the second elastic object formed by the second forming unit has a constant width with respect to the slide operation direction, the constant width being equal to or greater than the diameter of the circular shape of the first elastic object.
- The user interface program according to claim 1 or 2, wherein, in the second forming unit,
a reference point is set on a line in the slide operation direction extending from the first contact point on the touch panel, and
the circular shape of the first elastic object is deformed by forming the stretched plurality of meshes accommodating the second elastic object based on the distance of the slide operation and the distances of the plurality of meshes accommodating the first elastic object from the reference point.
- The user interface program according to claim 3, wherein, in the second forming unit,
an XY coordinate system is defined on the touch panel such that the first contact point is the origin and the slide operation direction is the Y-axis direction, and
each point (x1, y1) of the stretched plurality of meshes accommodating the second elastic object, corresponding to each point (x0, y0) of the plurality of meshes accommodating the first elastic object, is determined by
x1 = x0,
y1 = y0 + L/R
(where L is the slide operation distance and R is the distance of each point (x0, y0) from the reference point).
- The user interface program according to any one of claims 1 to 4, further causing the mobile terminal to function as:
a third forming unit that, when the first determination unit further determines a slide operation from the second contact point to a third contact point, forms and displays a third elastic object by further deforming the shape of the second elastic object, namely by
rotating the second elastic object by the angle between the second contact point and the third contact point with respect to the first contact point, and
extending the shape of the rotated second elastic object toward the third contact point to form and display the third elastic object.
- The user interface program according to any one of claims 1 to 5, further causing the mobile terminal to function as:
a second determination unit that determines whether the object has left the touch panel at the second contact point; and
a fourth forming unit that, when the second determination unit determines that the object has left the touch panel at the second contact point, restores and displays the first elastic object by shrinking the deformed second elastic object stepwise toward the first contact point.
- A user interface program for deforming and displaying the shape of an elastic object on a touch panel of a mobile terminal, the program causing the mobile terminal to function as:
a first forming unit that forms and displays a first elastic object having a circular shape around a first contact point on the touch panel;
a first determination unit that determines whether a slide operation has been performed on the touch panel; and
a second forming unit that, when the first determination unit determines a slide operation from the first contact point to a second contact point, forms and displays a deformed second elastic object by extending the circular shape toward the second contact point, wherein
the first elastic object is accommodated in a mesh region consisting of a plurality of meshes,
the first elastic object is divided into a first part and a second part based on the mesh region,
the first part is enlarged around the first contact point according to the distance of the slide operation,
the second part is moved around the second contact point, and
the enlarged first part and the moved second part are connected by curves to form the deformed second elastic object having a curved shape.
- The user interface program according to claim 7, wherein, in the second forming unit, the first part and the second part are formed by dividing the circular shape into semicircles in the direction perpendicular to the slide operation direction.
- The user interface program according to claim 7 or 8, wherein, in the second forming unit, the first part is enlarged in the slide operation direction and/or the direction perpendicular to the slide operation direction.
- The user interface program according to any one of claims 7 to 9, wherein, in the second forming unit, the moved second part is further enlarged based on the distance of the slide operation.
- The user interface program according to claim 10, wherein, in the second forming unit, the enlargement rate of the moved second part is smaller than the enlargement rate of the first part.
- The user interface program according to any one of claims 7 to 11, further causing the mobile terminal to function as:
a third forming unit that, when the first determination unit further determines a slide operation from the second contact point to a third contact point, forms and displays a third elastic object by further deforming the shape of the second elastic object, namely by
rotating the second elastic object by the angle between the second contact point and the third contact point with respect to the first contact point, and
extending the shape of the rotated second elastic object from the second contact point to the third contact point to form and display the third elastic object.
- The user interface program according to any one of claims 7 to 12, further causing the mobile terminal to function as:
a second determination unit that determines whether the object has left the touch panel at the second contact point; and
a fourth forming unit that, when the second determination unit determines that the object has left the touch panel at the second contact point, restores and displays the first elastic object by shrinking the deformed second elastic object stepwise toward the first contact point.
- A game program used in a game that controls and displays the action of a character in a virtual space in accordance with operations on a touch panel of a mobile terminal, the program causing the mobile terminal to function as:
a first forming unit that forms and displays a first elastic object having a circular shape around a first contact point on the touch panel;
a first determination unit that determines whether a slide operation has been performed on the touch panel;
a second forming unit that, when the first determination unit determines a slide operation from the first contact point to a second contact point, forms and displays a deformed second elastic object by extending the circular shape toward the second contact point, wherein the first elastic object is accommodated in a mesh region consisting of a plurality of meshes, and the circular shape of the first elastic object is deformed by stretching each of the plurality of meshes along the direction of the slide operation, progressively longer the nearer the mesh is to the second contact point; and
a character control unit that, in response to the slide operation, executes a first character action based on the slide operation direction and displays the character action together with the formed second elastic object.
- The game program according to claim 14, further causing the mobile terminal to function as:
a third forming unit that, when the first determination unit further determines a slide operation from the second contact point to a third contact point, forms and displays a third elastic object by further deforming the shape of the second elastic object, including
rotating the second elastic object by the angle between the second contact point and the third contact point with respect to the first contact point, and
extending the shape of the rotated second elastic object toward the third contact point to form the third elastic object.
- The game program according to claim 15, further causing the mobile terminal to function as:
a fourth forming unit that, when the circular first elastic object is formed by the first forming unit, further generates and displays at least one icon image around the first elastic object.
- The game program according to claim 16, further causing the mobile terminal to function as:
a second determination unit that determines whether a contact point on the touch panel corresponds to the arrangement position of the icon image,
wherein, when the first determination unit determines a slide operation from the second contact point to the third contact point and the second determination unit determines that the third contact point corresponds to the arrangement position of the one icon image, the character control unit executes a second character action associated with the icon image.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15773157.1A EP3128408A4 (en) | 2014-04-04 | 2015-02-20 | User interface program and game program |
KR1020167028509A KR101919349B1 (ko) | 2014-04-04 | 2015-02-20 | 사용자 인터페이스 프로그램 및 게임 프로그램 |
JP2015529730A JP5848857B1 (ja) | 2014-04-04 | 2015-02-20 | ユーザインターフェースプログラムおよびゲームプログラム |
AU2015241900A AU2015241900B2 (en) | 2014-04-04 | 2015-02-20 | User interface program and game program |
CN201580018855.XA CN106255952B (zh) | 2014-04-04 | 2015-02-20 | 用户界面程序以及游戏程序 |
US15/276,692 US20170007921A1 (en) | 2014-04-04 | 2016-09-26 | User interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-078244 | 2014-04-04 | ||
JP2014078244 | 2014-04-04 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/276,692 Continuation US20170007921A1 (en) | 2014-04-04 | 2016-09-26 | User interface |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015151640A1 true WO2015151640A1 (ja) | 2015-10-08 |
Family
ID=54239976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/054783 WO2015151640A1 (ja) | 2014-04-04 | 2015-02-20 | ユーザインターフェースプログラムおよびゲームプログラム |
Country Status (8)
Country | Link |
---|---|
US (1) | US20170007921A1 (ja) |
EP (1) | EP3128408A4 (ja) |
JP (7) | JP5848857B1 (ja) |
KR (1) | KR101919349B1 (ja) |
CN (1) | CN106255952B (ja) |
AU (1) | AU2015241900B2 (ja) |
TW (1) | TWI620589B (ja) |
WO (1) | WO2015151640A1 (ja) |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9017188B2 (en) | 2009-04-08 | 2015-04-28 | Shoot-A-Way, Inc. | System and method for improving a basketball player's shooting including a detection and measurement system |
JP5981617B1 (ja) * | 2015-08-20 | 2016-08-31 | 株式会社コロプラ | ユーザ・インタフェース画像表示のためのコンピュータ・プログラムおよびコンピュータ実装方法 |
JP5941207B1 (ja) * | 2015-09-10 | 2016-06-29 | 株式会社コロプラ | ユーザ・インタフェース・プログラムおよびコンピュータ実装方法 |
JP5993513B1 (ja) * | 2015-11-25 | 2016-09-14 | 株式会社コロプラ | 野球ゲームプログラム、ゲームプログラム及びコンピュータ |
TWI582681B (zh) * | 2015-12-31 | 2017-05-11 | 鴻海精密工業股份有限公司 | 三維物件創建方法及應用該方法的電子裝置 |
JP6000482B1 (ja) * | 2016-02-08 | 2016-09-28 | 株式会社コロプラ | ユーザ・インタフェース画像表示方法およびプログラム |
JP6097427B1 (ja) * | 2016-02-29 | 2017-03-15 | 株式会社コロプラ | ゲームプログラム |
US10292338B2 (en) * | 2016-06-08 | 2019-05-21 | Organized Thought LLC | Vertical cultivation system, components thereof, and methods for using same |
WO2018084169A1 (ja) * | 2016-11-01 | 2018-05-11 | 株式会社コロプラ | ゲーム方法およびゲームプログラム |
JP6216862B1 (ja) * | 2016-11-01 | 2017-10-18 | 株式会社コロプラ | ゲーム方法およびゲームプログラム |
JP6180610B1 (ja) * | 2016-11-01 | 2017-08-16 | 株式会社コロプラ | ゲーム方法およびゲームプログラム |
JP6189515B1 (ja) * | 2016-11-01 | 2017-08-30 | 株式会社コロプラ | ゲーム方法およびゲームプログラム |
JP6143934B1 (ja) | 2016-11-10 | 2017-06-07 | 株式会社Cygames | 情報処理プログラム、情報処理方法、及び情報処理装置 |
JP2017209573A (ja) * | 2017-09-08 | 2017-11-30 | 株式会社コロプラ | ゲームプログラム、方法及びタッチスクリーンを備える情報処理装置 |
CN107656620B (zh) * | 2017-09-26 | 2020-09-11 | 网易(杭州)网络有限公司 | 虚拟对象控制方法、装置、电子设备及存储介质 |
JP7093690B2 (ja) * | 2018-07-05 | 2022-06-30 | フォルシアクラリオン・エレクトロニクス株式会社 | 情報制御装置、及び表示変更方法 |
JP6672401B2 (ja) * | 2018-08-24 | 2020-03-25 | 株式会社コロプラ | ゲームプログラム、方法、および情報処理装置 |
JP6668425B2 (ja) * | 2018-08-24 | 2020-03-18 | 株式会社コロプラ | ゲームプログラム、方法、および情報処理装置 |
JP6641041B2 (ja) * | 2019-02-07 | 2020-02-05 | グリー株式会社 | 表示制御プログラム、表示制御方法、及び表示制御システム |
CN110025953B (zh) * | 2019-03-15 | 2022-06-10 | 网易(杭州)网络有限公司 | 一种游戏界面显示的方法、装置、存储介质及电子装置 |
CN110052034A (zh) * | 2019-04-12 | 2019-07-26 | 网易(杭州)网络有限公司 | 游戏中的信息标注方法、装置、介质及电子设备 |
JP7314622B2 (ja) * | 2019-05-29 | 2023-07-26 | 富士フイルムビジネスイノベーション株式会社 | 画像表示装置、及び画像表示プログラム |
US11577146B1 (en) | 2019-06-07 | 2023-02-14 | Shoot-A-Way, Inc. | Basketball launching device with off of the dribble statistic tracking |
US11400355B1 (en) | 2019-06-07 | 2022-08-02 | Shoot-A-Way, Inc. | Basketball launching device with a camera for detecting made shots |
JP7172017B2 (ja) * | 2020-01-07 | 2022-11-16 | 株式会社カプコン | ゲームプログラム、ゲーム装置およびゲームシステム |
CN112095708A (zh) * | 2020-09-23 | 2020-12-18 | 三一重机有限公司 | 挖掘机触控控制方法、系统及挖掘机 |
US11712610B1 (en) | 2023-01-11 | 2023-08-01 | Shoot-A-Way, Inc. | Ultrasonic shots-made detector for basketball launching device |
Family Cites Families (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2999019B2 (ja) * | 1991-06-21 | 2000-01-17 | 株式会社日立製作所 | 文字図形変形処理装置 |
JP3164617B2 (ja) * | 1991-11-07 | 2001-05-08 | 株式会社日立製作所 | 文字図形変形処理装置および方法 |
JP3546337B2 (ja) * | 1993-12-21 | 2004-07-28 | ゼロックス コーポレイション | 計算システム用ユーザ・インタフェース装置及びグラフィック・キーボード使用方法 |
JPH09204426A (ja) * | 1996-01-25 | 1997-08-05 | Sharp Corp | データの編集方法 |
JP3835005B2 (ja) * | 1998-07-22 | 2006-10-18 | 株式会社セガ | ゲーム装置及びゲーム制御方法及び記憶媒体 |
JP2001338306A (ja) * | 2000-05-29 | 2001-12-07 | Sony Corp | 編集ツール属性変更処理装置、編集ツール属性変更処理方法、および3次元モデル処理装置、3次元モデル処理方法、並びにプログラム提供媒体 |
US7231609B2 (en) * | 2003-02-03 | 2007-06-12 | Microsoft Corporation | System and method for accessing remote screen content |
JP3917532B2 (ja) * | 2003-02-10 | 2007-05-23 | 株式会社バンダイナムコゲームス | ゲーム装置及び情報記憶媒体 |
JP4792878B2 (ja) * | 2005-04-11 | 2011-10-12 | 株式会社セガ | 対戦ビデオゲーム制御プログラム |
JP2006314349A (ja) * | 2005-05-10 | 2006-11-24 | Nintendo Co Ltd | ゲームプログラムおよびゲーム装置 |
JP4832826B2 (ja) * | 2005-07-26 | 2011-12-07 | 任天堂株式会社 | オブジェクト制御プログラムおよび情報処理装置 |
US8062115B2 (en) * | 2006-04-27 | 2011-11-22 | Wms Gaming Inc. | Wagering game with multi-point gesture sensing device |
KR101003283B1 (ko) * | 2007-06-20 | 2010-12-21 | 엔에이치엔(주) | 온라인 게임 방법 및 시스템 |
US8313375B2 (en) * | 2007-09-28 | 2012-11-20 | Konami Digital Entertainment Co., Ltd. | Input instruction processing device, communication system therefor, computer program therefor, and information recording medium therewith |
JP5522902B2 (ja) * | 2008-03-24 | 2014-06-18 | 任天堂株式会社 | 情報処理プログラムおよび情報処理装置 |
JP2010067178A (ja) * | 2008-09-12 | 2010-03-25 | Leading Edge Design:Kk | 複数点入力可能な入力装置及び複数点入力による入力方法 |
JP2010088642A (ja) * | 2008-10-08 | 2010-04-22 | Namco Bandai Games Inc | プログラム、情報記憶媒体及びゲーム装置 |
KR20100136156A (ko) * | 2009-06-18 | 2010-12-28 | 삼성전자주식회사 | 터치스크린을 구비하는 휴대 단말기의 화면 스크롤 방법 및 장치 |
US8217246B2 (en) * | 2009-07-09 | 2012-07-10 | David Sangster | Sliding chord producing device for a guitar and method of use |
US8616971B2 (en) * | 2009-07-27 | 2013-12-31 | Obscura Digital, Inc. | Automated enhancements for billiards and the like |
US8292733B2 (en) * | 2009-08-31 | 2012-10-23 | Disney Enterprises, Inc. | Entertainment system providing dynamically augmented game surfaces for interactive fun and learning |
JP5072937B2 (ja) * | 2009-10-23 | 2012-11-14 | 株式会社コナミデジタルエンタテインメント | ゲームプログラム、ゲーム装置、ゲーム制御方法 |
JP5107332B2 (ja) * | 2009-12-01 | 2012-12-26 | 株式会社スクウェア・エニックス | ユーザインタフェース処理装置、およびユーザインタフェース処理プログラム |
JP4932010B2 (ja) * | 2010-01-06 | 2012-05-16 | 株式会社スクウェア・エニックス | ユーザインタフェース処理装置、ユーザインタフェース処理方法、およびユーザインタフェース処理プログラム |
JP5557316B2 (ja) * | 2010-05-07 | 2014-07-23 | Necカシオモバイルコミュニケーションズ株式会社 | 情報処理装置、情報生成方法及びプログラム |
JP5679977B2 (ja) * | 2010-06-25 | 2015-03-04 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 接触入力位置補正装置、入力装置、接触入力位置補正方法、プログラム、及び集積回路 |
JP5494337B2 (ja) * | 2010-07-30 | 2014-05-14 | ソニー株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
JP2012033058A (ja) * | 2010-07-30 | 2012-02-16 | Sony Corp | 情報処理装置、情報処理方法及び情報処理プログラム |
US9047006B2 (en) * | 2010-09-29 | 2015-06-02 | Sony Corporation | Electronic device system with information processing mechanism and method of operation thereof |
WO2012066591A1 (ja) * | 2010-11-15 | 2012-05-24 | 株式会社ソニー・コンピュータエンタテインメント | 電子機器、メニュー表示方法、コンテンツ画像表示方法、機能実行方法 |
JP5813948B2 (ja) * | 2010-12-20 | 2015-11-17 | 株式会社バンダイナムコエンターテインメント | プログラム及び端末装置 |
US20120326993A1 (en) * | 2011-01-26 | 2012-12-27 | Weisman Jordan K | Method and apparatus for providing context sensitive interactive overlays for video |
JP5379250B2 (ja) | 2011-02-10 | 2013-12-25 | 株式会社ソニー・コンピュータエンタテインメント | 入力装置、情報処理装置および入力値取得方法 |
JP5792971B2 (ja) * | 2011-03-08 | 2015-10-14 | 任天堂株式会社 | 情報処理システム、情報処理プログラム、および情報処理方法 |
KR101802760B1 (ko) * | 2011-06-27 | 2017-12-28 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
KR101948645B1 (ko) * | 2011-07-11 | 2019-02-18 | 삼성전자 주식회사 | 그래픽 오브젝트를 이용한 컨텐츠 제어 방법 및 장치 |
US20130044114A1 (en) * | 2011-08-17 | 2013-02-21 | Battelle Memorial Institute | Visual Representation of Data According to an Abstraction Hierarchy |
JP2013127683A (ja) * | 2011-12-16 | 2013-06-27 | Namco Bandai Games Inc | プログラム、情報記憶媒体、端末、サーバ及びネットワークシステム |
WO2013116926A1 (en) * | 2012-02-06 | 2013-08-15 | Hothead Games, Inc. | Virtual opening of boxes and packs of cards |
US9658695B2 (en) * | 2012-11-08 | 2017-05-23 | Cuesta Technology Holdings, Llc | Systems and methods for alternative control of touch-based devices |
US9205337B2 (en) * | 2013-03-04 | 2015-12-08 | Gree, Inc. | Server device, method for controlling the same, computer readable recording medium, and game system |
TW201435651A (zh) * | 2013-03-06 | 2014-09-16 | Ind Tech Res Inst | 行動通訊裝置以及其人機介面操作方法 |
JP2014182638A (ja) * | 2013-03-19 | 2014-09-29 | Canon Inc | 表示制御装置、表示制御方法、コンピュータプログラム |
JP6210911B2 (ja) * | 2013-03-26 | 2017-10-11 | 株式会社Nttドコモ | 情報端末、表示制御方法、及び表示制御プログラム |
US20140357356A1 (en) * | 2013-05-28 | 2014-12-04 | DeNA Co., Ltd. | Character battle system controlled by user's flick motion |
CN103472986B (zh) * | 2013-08-09 | 2018-03-30 | 深圳Tcl新技术有限公司 | 触摸滑动操作自适应控制方法、装置及触摸板 |
2015
- 2015-02-20 JP JP2015529730A patent/JP5848857B1/ja active Active
- 2015-02-20 WO PCT/JP2015/054783 patent/WO2015151640A1/ja active Application Filing
- 2015-02-20 CN CN201580018855.XA patent/CN106255952B/zh active Active
- 2015-02-20 KR KR1020167028509A patent/KR101919349B1/ko active IP Right Grant
- 2015-02-20 AU AU2015241900A patent/AU2015241900B2/en not_active Ceased
- 2015-02-20 EP EP15773157.1A patent/EP3128408A4/en not_active Withdrawn
- 2015-03-24 TW TW104109315A patent/TWI620589B/zh active
- 2015-08-04 JP JP2015154178A patent/JP5864810B2/ja active Active
- 2015-11-09 JP JP2015219273A patent/JP6449133B2/ja active Active
2016
- 2016-09-26 US US15/276,692 patent/US20170007921A1/en not_active Abandoned
2018
- 2018-11-29 JP JP2018223765A patent/JP6592171B2/ja active Active
2019
- 2019-09-18 JP JP2019169400A patent/JP6697120B2/ja active Active
2020
- 2020-04-20 JP JP2020074464A patent/JP6938706B2/ja active Active
2021
- 2021-08-31 JP JP2021141601A patent/JP7174820B2/ja active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001291118A (ja) * | 2000-04-07 | 2001-10-19 | Sony Corp | 3次元モデル処理装置および3次元モデル処理方法、並びにプログラム提供媒体 |
JP2006031715A (ja) * | 2004-07-21 | 2006-02-02 | Solidworks Corp | コンピュータ生成モデルの変形 |
JP2006181286A (ja) * | 2004-12-28 | 2006-07-13 | Sega Corp | 画像処理装置およびその方法 |
JP2008295970A (ja) * | 2007-06-04 | 2008-12-11 | Konami Digital Entertainment:Kk | ゲーム装置、ゲーム装置の制御方法及びプログラム |
WO2009044770A1 (ja) * | 2007-10-02 | 2009-04-09 | Access Co., Ltd. | 端末装置、リンク選択方法および表示プログラム |
JP2009240620A (ja) * | 2008-03-31 | 2009-10-22 | Sega Corp | オブジェクト表示制御方法、オブジェクト表示制御装置、記録媒体、及びプログラム |
JP2010088641A (ja) * | 2008-10-08 | 2010-04-22 | Namco Bandai Games Inc | プログラム、情報記憶媒体及びゲーム装置 |
JP2012033060A (ja) * | 2010-07-30 | 2012-02-16 | Sony Corp | 情報処理装置、表示制御方法及び表示制御プログラム |
Non-Patent Citations (2)
Title |
---|
KATSUYA IMURA, ILLUSTRATOR CS 4 SUPER-REFERENCE FOR MACINTOSH, 28 February 2009 (2009-02-28), pages 242 - 243, XP008185543 * |
See also references of EP3128408A4 * |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11003261B2 (en) | 2015-10-10 | 2021-05-11 | Tencent Technology (Shenzhen) Company Limited | Information processing method, terminal, and computer storage medium |
JP2018517449A (ja) * | 2015-10-10 | 2018-07-05 | テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド | 情報処理方法、端末及びコンピュータ記憶媒体 |
JP2020039880A (ja) * | 2015-10-10 | 2020-03-19 | テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド | 情報処理方法、端末及びコンピュータ記憶媒体 |
KR20180005222A (ko) * | 2015-10-10 | 2018-01-15 | 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 | 정보 처리 방법, 단말 및 컴퓨터 저장 매체 |
KR102041170B1 (ko) | 2015-10-10 | 2019-11-06 | 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 | 정보 처리 방법, 단말 및 컴퓨터 저장 매체 |
US10444871B2 (en) | 2015-10-10 | 2019-10-15 | Tencent Technology (Shenzhen) Company Limited | Information processing method, terminal, and computer storage medium |
JP2017084308A (ja) * | 2015-10-30 | 2017-05-18 | 京セラドキュメントソリューションズ株式会社 | 表示装置および表示制御プログラム |
WO2017215258A1 (zh) * | 2016-06-16 | 2017-12-21 | 中兴通讯股份有限公司 | 一种虚拟按键的触发方法、装置、终端及计算机存储介质 |
JP2018068781A (ja) * | 2016-10-31 | 2018-05-10 | 株式会社バンク・オブ・イノベーション | ビデオゲーム処理装置、及びビデオゲーム処理プログラム |
JP7337732B2 (ja) | 2018-08-24 | 2023-09-04 | 株式会社コロプラ | プログラム |
JP7409770B2 (ja) | 2018-12-28 | 2024-01-09 | 株式会社バンダイナムコエンターテインメント | ゲームシステム、プログラム及び端末装置 |
JP2022533321A (ja) * | 2020-04-15 | 2022-07-22 | ▲騰▼▲訊▼科技(深▲セン▼)有限公司 | 仮想オブジェクトの制御方法、装置、デバイス及びコンピュータプログラム |
JP7350088B2 (ja) | 2020-04-15 | 2023-09-25 | ▲騰▼▲訊▼科技(深▲セン▼)有限公司 | 仮想オブジェクトの制御方法、装置、デバイス及びコンピュータプログラム |
US12017141B2 (en) | 2020-04-15 | 2024-06-25 | Tencent Technology (Shenzhen) Company Limited | Virtual object control method and apparatus, device, and storage medium |
JP2022534661A (ja) * | 2020-04-28 | 2022-08-03 | テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド | 仮想オブジェクトの制御方法並びにその装置、端末およびコンピュータプログラム |
JP2023506106A (ja) * | 2020-11-13 | 2023-02-15 | テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド | 仮想オブジェクトの制御方法及び装置、電子機器並びにコンピュータプログラム |
JP7418554B2 (ja) | 2020-11-13 | 2024-01-19 | テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド | 仮想オブジェクトの制御方法及び装置、電子機器並びにコンピュータプログラム |
Also Published As
Publication number | Publication date |
---|---|
CN106255952A (zh) | 2016-12-21 |
JP2015222595A (ja) | 2015-12-10 |
KR20160145578A (ko) | 2016-12-20 |
JP2020116425A (ja) | 2020-08-06 |
JP6697120B2 (ja) | 2020-05-20 |
KR101919349B1 (ko) | 2018-11-19 |
JP6449133B2 (ja) | 2019-01-09 |
AU2015241900B2 (en) | 2017-09-28 |
JP7174820B2 (ja) | 2022-11-17 |
EP3128408A4 (en) | 2018-02-28 |
JPWO2015151640A1 (ja) | 2017-04-13 |
JP2022000769A (ja) | 2022-01-04 |
TWI620589B (zh) | 2018-04-11 |
EP3128408A1 (en) | 2017-02-08 |
CN106255952B (zh) | 2020-01-07 |
JP2016048571A (ja) | 2016-04-07 |
US20170007921A1 (en) | 2017-01-12 |
TW201542278A (zh) | 2015-11-16 |
JP5864810B2 (ja) | 2016-02-17 |
JP5848857B1 (ja) | 2016-01-27 |
JP2019040632A (ja) | 2019-03-14 |
JP6938706B2 (ja) | 2021-09-22 |
AU2015241900A1 (en) | 2016-09-22 |
JP2020072788A (ja) | 2020-05-14 |
JP6592171B2 (ja) | 2019-10-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6592171B2 (ja) | ユーザインターフェースプログラムおよびゲームプログラム | |
JP5865535B1 (ja) | ユーザ・インタフェース・プログラム | |
JP5995909B2 (ja) | ユーザインターフェースプログラム | |
JP2016134052A (ja) | インターフェースプログラム及びゲームプログラム | |
JP5676036B1 (ja) | ユーザインターフェースプログラム及び当該プログラムを備えたゲームプログラム | |
JP5815143B1 (ja) | タッチ入力によりゲームを進行するインタフェース・プログラム、及び端末 | |
JP6216862B1 (ja) | ゲーム方法およびゲームプログラム | |
JP2016202875A (ja) | ユーザ・インタフェース・プログラム | |
JP5856343B1 (ja) | ユーザ・インタフェース・プログラムおよびコンピュータ実装方法 | |
JP2015097583A (ja) | タッチパネルを備えるゲーム装置、その制御方法およびプログラム | |
JP2016004500A (ja) | ユーザインターフェースプログラム | |
JP5938501B1 (ja) | コンピュータプログラム及びタッチ操作によりゲームを進行するゲームプログラム | |
JP6073432B2 (ja) | タッチ入力によりゲームを進行するインタフェース・プログラム、及び端末 | |
JP2016002385A (ja) | ユーザインターフェースプログラム | |
JP2018069040A (ja) | ゲーム方法およびゲームプログラム | |
JP2017016622A (ja) | ユーザ・インタフェース・プログラムおよびコンピュータ実装方法 | |
JP2017086941A (ja) | タッチ入力によりゲームを進行するインタフェース・プログラム、及び端末 | |
JP2017055825A (ja) | タッチ入力によりゲームを進行する方法、インタフェース・プログラム、及び端末 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015529730 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15773157 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015241900 Country of ref document: AU Date of ref document: 20150220 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2015773157 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015773157 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20167028509 Country of ref document: KR Kind code of ref document: A |