JP5941207B1 - User interface program and computer-implemented method - Google Patents

User interface program and computer-implemented method

Info

Publication number
JP5941207B1
Authority
JP
Japan
Prior art keywords
image
value
feature amount
coordinate
touch panel
Prior art date
Legal status
Active
Application number
JP2015178708A
Other languages
Japanese (ja)
Other versions
JP2017051492A (en)
Inventor
洋平 三宅
Original Assignee
株式会社コロプラ
Priority date
Filing date
Publication date
Application filed by 株式会社コロプラ
Priority to JP2015178708A
Application granted
Publication of JP5941207B1
Publication of JP2017051492A
Application status: Active

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

Provided is a UI program that, for a UI image generated in conjunction with a touch operation, dynamically realizes varied and flexible image generation processing while limiting polygon vertex calculation and sufficiently reducing the processing load. The UI program causes a computer having a touch panel to function as: an area determination unit that determines an area corresponding to a user's touch operation on the touch panel; a coordinate definition unit that defines a predetermined range of a predetermined coordinate system in association with the area; a feature amount determination unit that determines a feature amount for each coordinate in the predetermined range based on its coordinate value; an image generation unit that generates an image associated with the area to which the feature amounts are applied; and an image display unit that displays the image in the area on the touch panel. [Selected figure] FIG. 5b

Description

  The present invention relates to a UI program and a computer-implemented method for displaying a user interface (hereinafter "UI") image. More specifically, the present invention relates to a UI program, and a computer-implemented method, used in a game executed on a smartphone (hereinafter "smartphone game") to effectively display a UI image on the touch panel in response to a touch operation with an object such as a finger.

  In recent user terminals equipped with a touch panel, such as smartphones, various UIs are used on the touch panel. For example, Patent Document 1 discloses a UI in which, in accordance with a slide operation on the touch panel, a cursor is displayed that extends from the start point to the end point of the slide operation and has a different size or shape at the end on the start-point side than at the end on the end-point side (see [Summary] of Patent Document 1). During the slide operation, visual deformation processing is also performed in which the cursor width is narrowed as the cursor lengthens so that the cursor area remains constant (see paragraph [0018] of Patent Document 1).

JP 2012-33060 A

  An object of the present invention is to provide a UI program capable of dynamically realizing varied and flexible image generation processing while sufficiently reducing the processing load, and a computer-implemented method therefor.

  In order to solve the above problems, the first invention provides a user interface (UI) program that causes a computer having a touch panel to function as: an area determination unit that determines an area corresponding to a user's touch operation on the touch panel; a coordinate definition unit that defines a predetermined range of a predetermined coordinate system in association with the area; a feature amount determination unit that determines a feature amount for each coordinate in the predetermined range based on its coordinate value; an image generation unit that generates an image associated with the area to which the feature amounts are applied; and an image display unit that displays the image in the area on the touch panel.

  The second invention provides a user interface (UI) program that displays a deformed shape of an elastic object on a touch panel and causes a portable terminal having the touch panel to function as: a first forming unit that generates and displays a first elastic object image around a first contact point on the touch panel; and a second forming unit that, when a slide operation from the first contact point to a second contact point is determined on the touch panel, forms and displays a second elastic object image in which the first elastic object image is deformed toward the second contact point in conjunction with the slide operation. Each of the first forming unit and the second forming unit is configured to determine a rectangular area including the first contact point, define a predetermined range of a predetermined coordinate system in association with the rectangular area, determine a feature amount for each coordinate in the predetermined range based on its coordinate value, and generate and display an image associated with the rectangular area to which the feature amounts are applied.

  Further, the third invention provides a method for user interface (UI) image display implemented on a computer comprising a touch panel, the method comprising: determining an area corresponding to a user's touch operation on the touch panel; defining a predetermined range of a predetermined coordinate system in association with the area; determining a feature amount for each coordinate in the predetermined range based on its coordinate value; generating an image associated with the area to which the feature amounts are applied; and displaying the image in the area on the touch panel.

  The features and advantages of the present invention will become apparent from the following detailed description of the invention, as well as from the accompanying drawings and claims.

FIG. 1 is a schematic diagram of an example user terminal capable of executing a UI program according to an embodiment of the present invention.
FIG. 2 is a schematic diagram showing the physical configuration of the user terminal of FIG. 1.
FIG. 3 is a schematic diagram showing input/output in the user terminal of FIG. 1.
FIG. 4 is an example game screen in which the UI program according to an embodiment of the present invention is applied to a game program.
FIG. 5a shows a series of operation examples for displaying a UI image on a touch panel in conjunction with a touch operation according to an embodiment of the present invention.
FIG. 5b shows a series of operation examples for displaying a UI image on the touch panel in conjunction with a touch operation according to an embodiment of the present invention.
FIG. 6 shows a UV coordinate system applied to the UI image generated in FIG. 5b.
FIG. 7 is a main functional block diagram implemented using a UI program according to an embodiment of the present invention.
FIG. 8a is a basic process flow diagram implemented by a UI program according to an embodiment of the present invention.
FIG. 8b is a detailed process flow diagram implemented by a UI program according to an embodiment of the present invention.
FIG. 9 is a schematic image diagram of the feature amount applied in the UI program according to an embodiment of the present invention.
FIG. 10a is a simplified example image generated using a UI program according to an embodiment of the present invention.
FIG. 10b is a flowchart of the luminance value calculation for generating the image illustrated in FIG. 10a.
FIG. 11a shows procedural images of the first example generated using a UI program according to an embodiment of the present invention.
FIG. 11b shows procedural images of the first example generated using a UI program according to an embodiment of the present invention.
FIG. 11c shows procedural images of the first example generated using a UI program according to an embodiment of the present invention.
FIG. 12 shows procedural images of the second example generated using a UI program according to an embodiment of the present invention.
FIG. 13 shows procedural images of the third example generated using a UI program according to an embodiment of the present invention.
FIG. 14 shows a procedural image of the fourth example generated using a UI program according to an embodiment of the present invention.
FIG. 15 shows a comparison of procedural images generated using a UI program according to an embodiment of the present invention with other methods.

[Description of Embodiment of the Present Invention]
First, the contents of the embodiment of the present invention will be listed and described. The UI program according to the embodiment of the present invention has the following configuration.

(Item 1)
An area determination unit for determining an area corresponding to a user's touch operation on the touch panel;
A coordinate defining section that associates and defines a predetermined range of a predetermined coordinate system for the region;
A feature amount determination unit that determines a feature amount of each coordinate based on each coordinate value for the predetermined range;
An image generation unit that generates an image that is associated with the region and that applies the feature amount;
A user interface (UI) program that causes a computer including the touch panel to function as the above units and as an image display unit that displays the image in the area on the touch panel.
According to this item, it is possible to generate an image with a minimum amount of calculation. Thereby, the memory size can be saved, and the CPU calculation amount can be reduced without degrading the image quality.

(Item 2)
The UI program according to item 1, wherein the touch operation is a slide operation, and the region is determined based on a slide direction and a slide distance.
According to this item, an image can be generated with a low processing load throughout the slide operation.

(Item 3)
The UI program according to item 1 or 2, wherein the image generation unit employs a luminance value as the feature amount.
According to this item, it is possible to generate an image having a gradation with excellent visual effect by applying the luminance value.

(Item 4)
The UI program according to any one of items 1 to 3, wherein an alpha value is adopted as the feature amount and used together with a predetermined RGB value in the image generation unit.
According to this item, it is possible to generate an image having a gradation with excellent visual effects by applying an alpha value.

(Item 5)
The coordinate system is a UV coordinate system, and each coordinate value in the predetermined range is defined by a u value and a v value in a range from 0 to 1,
In the feature amount determination unit, the feature amount of each coordinate is determined by applying a characterization rule based on each coordinate value,
The UI program according to any one of items 1 to 4, wherein the characterization rule includes a quadratic function calculation rule based on a u value and a v value.
According to this item, a visually superior curve can be expressed without applying an excessive processing load by applying a calculation rule of a quadratic function.

(Item 6)
In the feature amount determination unit, the feature amount of each coordinate is determined by applying a characterization rule based on each coordinate value,
The UI program according to any one of items 1 to 5, wherein the characterization rule includes an interpolation rule based on a u value and/or a v value and a slide distance when the touch operation is a slide operation.
According to this item, an image having a visually excellent gradation can be generated without applying a processing load by applying an interpolation rule.

(Item 7)
The UI program according to any one of items 1 to 6, wherein the image is an image formed from two polygons having a triangular shape.
According to this item, it is possible to generate an image from only two polygons having a triangular shape. That is, it is possible to generate an image with a minimum amount of calculation, and it is possible to realize saving of memory size and reduction of CPU calculation amount. Further, the image generated in this way is never blurred even when enlarged.

(Item 8)
A user interface (UI) program for deforming and displaying the shape of an elastic object on a touch panel,
A first forming unit that generates and displays a first elastic object image around a first contact point on the touch panel;
A second forming unit that, when a slide operation from the first contact point to a second contact point is determined on the touch panel, forms and displays a second elastic object image in which the first elastic object image is deformed toward the second contact point in conjunction with the slide operation;
the UI program causing a portable terminal including the touch panel to function as these units, wherein each of the first forming unit and the second forming unit is configured to:
determine a rectangular area including the first contact point;
define a predetermined range of a predetermined coordinate system in association with the rectangular area;
determine a feature amount for each coordinate in the predetermined range based on its coordinate value; and
generate and display an image associated with the rectangular area to which the feature amounts are applied.
According to this item, it is possible to generate an image with a minimum amount of calculation. Thereby, the memory size can be saved, and the CPU calculation amount can be reduced without degrading the image quality.
(Item 9)
The UI program according to item 8, wherein in the first forming unit,
The rectangular area is a square area around the first contact point;
The UI program, wherein the first elastic object image is a circular radial image formed in the square area and is an image formed from two polygons having a triangular shape.
According to this item, it is possible to generate a characteristic image having a visually excellent gradation without applying an excessive processing load.

(Item 10)
The UI program according to item 9, wherein in the second forming unit,
The rectangular region is a rectangular region around the first contact point and the second contact point;
The second elastic object image is an image further gradated so as to divide and separate the circular radial image within the rectangular region and to interpolate the portion between the separated images, and is an image formed from two polygons having a triangular shape.
According to this item, it is possible to generate a characteristic image having a visually excellent gradation without applying an excessive processing load.

(Item 11)
Determining an area corresponding to a user's touch operation on the touch panel;
Defining a predetermined range of a predetermined coordinate system in association with the area;
Determining a feature amount of each coordinate based on each coordinate value for the predetermined range;
Generating an image associated with the area to which the feature amounts are applied; and
Displaying the image in the area on the touch panel: a method for user interface (UI) image display implemented on a computer comprising the touch panel.
According to this item, it is possible to generate an image with a minimum amount of calculation. Thereby, the memory size can be saved, and the CPU calculation amount can be reduced without degrading the image quality.

(Item 12)
The method according to item 11, wherein the touch operation is a slide operation, and the region is determined based on a slide direction and a slide distance.
According to this item, an image can be generated with a low processing load throughout the slide operation.

(Item 13)
The method according to item 11 or 12, wherein the image generation unit employs a luminance value as the feature amount.
According to this item, it is possible to generate an image having a gradation with excellent visual effect by applying the luminance value.

(Item 14)
The method according to any one of items 11 to 13, wherein the image generation unit adopts an alpha value as the feature amount and uses it together with a predetermined RGB value.
According to this item, by applying the alpha value, it is possible to generate a gradation image with excellent visual effects.

[Details of the embodiment of the present invention]
A UI program and a computer-implemented method for displaying a UI image according to an embodiment of the present invention will now be described with reference to the drawings. In the figures, the same components are denoted by the same reference numerals. This UI program is mainly applicable as part of a game program for a smartphone game. More specifically, the UI program can be used, as part of the game program, to advance the game and to control operations on the virtual space and on the game character in the virtual space.

Hardware Configuration of Mobile Terminal
A smartphone 1 shown in FIG. 1 is an example of a mobile terminal and includes a touch panel 2. The user of the smartphone can control the progress of the game through user operations, including touch operations, on the touch panel 2. Note that the mobile terminal for executing the UI program according to the embodiment of the present invention is not limited to the smartphone 1; it will be appreciated that any device having a touch panel, such as a PDA or a tablet computer, can be used.

  As shown in FIG. 2, the smartphone 1 includes a CPU 3, a main memory 4, an auxiliary memory 5, a transmission/reception unit 6, a display unit 7, and an input unit 8 that are connected to each other via a bus. Of these, the main memory 4 is composed of, for example, a DRAM, and the auxiliary memory 5 is composed of, for example, an HDD. The auxiliary memory 5 is a recording medium capable of recording a UI program and a game program according to the embodiment of the present invention. The various programs stored in the auxiliary memory 5 are expanded on the main memory 4 and executed by the CPU 3. Data generated while the CPU 3 operates in accordance with the UI program, and data used by the CPU 3, are also temporarily stored on the main memory 4. The transmission/reception unit 6 establishes a connection (wireless and/or wired) between the smartphone 1 and a network under the control of the CPU 3 and transmits and receives various information. The display unit 7 displays various information to be presented to the user under the control of the CPU 3. The input unit 8 detects touch input operations by the user on the touch panel 2 (mainly physical contact operations such as a tap operation, a slide (swipe) operation, and a flick operation).

  The display unit 7 and the input unit 8 correspond to the touch panel 2 described above. As illustrated in FIG. 3, the touch panel 2 includes a touch sensing unit 301 corresponding to the input unit 8 and a liquid crystal display unit 302 corresponding to the display unit 7. The touch panel 2 displays an image under the control of the CPU 3 and receives an interactive touch operation (such as a physical contact operation on the touch panel 2) by a smartphone user. Then, based on the control by the control unit 303, the corresponding graphic is displayed on the liquid crystal display unit 302.

  More specifically, the touch sensing unit 301 outputs an operation signal corresponding to the user's touch operation to the control unit 303. The touch operation can be performed with any object; for example, it may be performed with the user's finger, or a stylus may be used. As the touch sensing unit 301, for example, a capacitive type can be adopted, but the invention is not limited thereto. When the control unit 303 detects an operation signal from the touch sensing unit 301, it interprets the signal as an operation instruction for the user's character and processes a corresponding graphic (not shown) so that it is transmitted to the liquid crystal display unit 302 as a display signal. The main functions implemented by the control unit 303 will be described later with reference to FIG. 7. The liquid crystal display unit 302 displays the graphic corresponding to the display signal.

Game Screen Example
With reference to the game screen example of FIG. 4, the operation of the UI program and the computer-implemented method according to the embodiment of the present invention will be described schematically. In the screen example of FIG. 4, the character 10 is arranged in a three-dimensional virtual game space 20. A field-of-view image obtained by photographing the character 10 from above and in front with a virtual camera (not shown) is displayed on the touch panel as a two-dimensional image. At the same time, two UI images (30 at the upper left of the screen and 40 at the lower right of the screen) are superimposed on the field-of-view image. The UI images 30 and 40 are displayed on the touch panel as a result of the user's touch operation on the touch panel (for example, a slide operation with a finger), and are generated by the UI program and the computer-implemented method according to the embodiment of the present invention.

  In a smartphone game, the user usually holds the smartphone vertically and operates it with one hand, but the invention is of course not limited to this; the terminal may also be held horizontally and operated with both hands. When the screen is operated with both hands, touch operations are accepted from the left hand and the right hand, respectively. The UI images 30 and 40 shown in the figure correspond to touch operations by the left hand and the right hand, respectively.

  Game progress commands are generated and executed through these UI images. For example, since the left-hand UI image 30 is labeled "Move", an instruction is executed to move the character 10 on the plane of the game space 20 in the direction (upper right) indicated by the UI image. Similarly, since the right-hand UI image 40 is labeled "Camera", an instruction is executed to move the virtual camera in the game space 20 so as to change the field of view toward the upper right direction indicated by the UI image.

  As described above, the UI program for displaying a UI image according to the embodiment of the present invention generates and displays a UI image according to a user's touch operation. When this UI program is applied particularly to a game program, the computer is caused to execute a character operation function corresponding to the UI program. Hereinafter, for the sake of simplicity, UI image generation and display processing according to a one-handed operation by a user will be described by way of example.

UI Image Display Processing
A UI program and a computer-implemented method for displaying a UI image according to an embodiment of the present invention will now be described with reference to FIGS. 5a to 8b. First, an outline of a series of operations for outputting a user interface (UI) image in conjunction with a touch operation on the touch panel will be described with reference to FIGS. 5a and 5b. FIG. 5a shows a case where the touch operation is a tap operation, whereas FIG. 5b shows a case where the touch operation is a slide operation.

As shown in the example of FIG. 5a, first, in (i), the user performs a tap operation with a finger at an arbitrary position on the touch panel 2. In response to the tap operation, a square area D1 (dotted-line area) is determined around the tap position in (ii). Subsequently, in (iii), a circular radial image I1 is generated and displayed on the area D1. Here, "circular radial" means a state having a circular shape and radiating from the center of the circle toward the outer periphery. In the example of FIG. 5b, on the other hand, the user performs a slide operation with a finger from a slide start point to a slide end point. For this slide operation, a rectangular region D2 (dotted-line region) is determined around the slide start point and the slide end point in (ii). The width of the region is constant before and after the slide operation, but the invention is not limited to this.

Subsequently, in (iii), the image I2 is generated and superimposed on the rectangular area D2 in conjunction with the slide operation. The image I2 is deformed as if the image I1 of FIG. 5a were stretched toward the slide end point. Specifically, the image I2 displayed on the region D2 in (iii) of FIG. 5b is formed by dividing the circular radial image I1 of (iii) of FIG. 5a into two in the slide operation direction and separating the halves, and the portion between the separated halves is further generated with a gradation so as to interpolate between them, based on the u value and/or the v value and the slide distance.

According to an embodiment of the present invention, when the region D2 in (ii) of FIG. 5b is determined, a coordinate system generally referred to as UV coordinates is applied to the region D2, as shown in FIG. 6(a) (the UV coordinate system may also be referred to as a texture coordinate system). The UV coordinate system is a two-dimensional coordinate system in which the u value represents the width and the v value represents the height. Each of the u value and the v value is expressed in decimal form in the range from 0 to 1, and together they constitute a position in texture coordinates. For example, a position in the UV coordinate system can be specified by a pair of u and v values such as the coordinate values (0.5, 0.5) or (0.125, 0.625). That is, the region D2 is a rectangular area whose u and v values each range from 0 to 1, composed of the four points (0,0), (1,0), (0,1), and (1,1), as shown in FIG. 6(a).

The image generated on the region D2, that is, the texture image I2 generated from the texture coordinates, can be formed from only two polygons having a triangular shape, or one polygon having a quadrangular shape. Since the texture image I2 can thus be displayed by storing only at least four vertices in memory, it differs greatly in processing load from an image composed of a large number of polygons. The present invention makes it possible to generate a texture image having a visually excellent gradation with a minimum amount of calculation. At the same time, the memory size required during processing can be saved, and the CPU calculation amount can be reduced without degrading image quality. Hereinafter, a texture image dynamically generated from the texture coordinates is referred to as a "procedural image"; in general, "procedural" means generated dynamically.
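
By way of illustration only, and not as part of the patent, the four-vertex quad described above might be represented as follows; the `Vertex` and `make_quad` names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Vertex:
    x: float  # screen-space position of the corner (pixels)
    y: float
    u: float  # texture coordinate in the range 0..1
    v: float

def make_quad(left: float, top: float, width: float, height: float):
    """Build a rectangular region such as D2 from four vertices / two triangles.

    Only these four vertices need to be stored; the gradation itself is
    computed per pixel from (u, v) rather than from additional polygons.
    """
    v00 = Vertex(left,         top,          0.0, 0.0)
    v10 = Vertex(left + width, top,          1.0, 0.0)
    v01 = Vertex(left,         top + height, 0.0, 1.0)
    v11 = Vertex(left + width, top + height, 1.0, 1.0)
    # Two triangles sharing the diagonal between v10 and v01.
    return [(v00, v10, v01), (v10, v11, v01)]
```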

As shown in FIG. 6(b), the generated procedural image I2 is associated with the rectangular region D2. That is, the procedural image I2, expressed as a gradation, is mapped so as to cover the rectangular region D2. The generation of the procedural image I2, and in particular its gradation expression, will be described below with reference to FIG. 9 and subsequent figures. In FIG. 6(b) and subsequent drawings, the background is shown in gray where a gradation procedural image is illustrated; this is done in the drawings of the present application in order to make white portions explicitly visible.

  FIG. 7 shows a set of main functions that are implemented as a UI program according to an embodiment of the present invention and that a mobile terminal having a touch panel is caused to perform. The function set constitutes a control unit 50 corresponding to the control unit 303 shown in FIG. 3. That is, input in the form of an operation signal is processed through the control unit 50 (303), and output in the form of a display signal is generated. The control unit 50 includes a user operation unit 100 related to user input operations and procedural image formation through the touch panel, a UI formation unit 200 for generating and displaying a UI image as shown in FIG. 5 in conjunction with the touch operation, and a game progress unit (character operation) 500 for operating the game and the characters in the virtual space in accordance with the user input operations.

  The user operation unit 100 includes a contact/non-contact determination unit 120 and a slide operation determination unit 140. The contact/non-contact determination unit 120 determines a touch operation or a detach operation on the touch panel. Further, the slide operation determination unit 140 determines whether the user operation, from touch to detach, is a slide operation or merely a tap operation.

  In response to a touch operation or a slide operation determined by the user operation unit 100, the UI formation unit 200 functions as a first forming unit and a second forming unit. The first forming unit generates and displays a procedural image having a circular radial gradation around the contact point, as described with reference to (iii) of FIG. 5a. The second forming unit, as described with reference to (iii) of FIG. 5b, generates and displays a procedural image in which that circular radial gradation is further gradated so as to be deformed toward the slide end point. The UI formation unit 200 includes an area determination unit 210, a UV coordinate definition unit 230, a feature amount determination unit 250, a UI image generation unit 270, and a UI image output unit 290, as described below.

  The area determination unit 210 defines an area corresponding to the touch operation described with reference to FIG. The UV coordinate defining unit 230 defines the UV coordinate system described with reference to FIG. 6 in association with the region so that the u value and the v value are predetermined values in the range from 0 to 1. The feature amount determination unit 250 determines the feature amount of each coordinate by applying a characterization rule based on each coordinate value to the range of the region in the UV coordinate system. The UI image generation unit 270 generates an image that is associated with the region and to which each feature amount is applied as a UI image. Then, the UI image display unit 290 displays the generated UI image in the region portion on the touch panel. UI image generation will be described later with reference to FIG.

  For example, in a game as shown in FIG. 4, the UI image generated by the UI formation unit 200 is superimposed on the virtual game space image, and the game progress unit 500 generates and executes a corresponding game progress command to advance the game.

  FIGS. 8a and 8b are flow diagrams illustrating a computer-implemented method for displaying a UI image according to an embodiment of the present invention. The details of step S107 shown in the flowchart of FIG. 8a correspond to the flowchart of FIG. 8b.

  In FIG. 8a, when the process is started in step S101, the contact/non-contact determination unit 120 determines a touch operation on the touch panel in step S103. In step S105, the slide operation determination unit 140 further determines whether the touch operation detected in step S103 is a tap operation or a slide operation. Here, a tap operation means a touch operation consisting of only one contact point on the touch panel, whereas a slide operation means a continuous touch operation involving two contact points on the touch panel (that is, a contact start point and a contact end point).

When the touch operation determination is made in step S105, a procedural image is generated and displayed by the UI formation unit 200 in step S107. If the touch operation is a tap operation, in step S1071, as also shown in FIG. 5a, a UI image is generated and displayed as the image I1 having a circular radial gradation formed in a square area around the contact point. On the other hand, if the touch operation is a slide operation, in step S1073, as also shown in FIG. 5b, an image I2 is generated and displayed in a rectangular area around the contact start point and end point, obtained by deforming the image I1 having the circular radial gradation in the slide operation direction. Specifically, the image I1 is divided and separated, and the UI image is generated and displayed as an image I2 gradated so as to interpolate the portion between the separated parts.

  In step S108, the game progress unit 500 generates and executes a game progress command corresponding to the UI image generated in step S107, and performs game progress control as illustrated in FIG. 4. Here, the game progress command preferably includes a character operation command for causing the character to perform an action such as movement, and a virtual camera operation command for controlling the operation of the virtual camera that photographs the game space.

  Next, step S107 will be described in detail with reference to FIG. 8b. As shown, step S107 includes steps S201 to S205. As shown in FIG. 8a, two variations of step S107 are assumed depending on the touch operation, namely steps S1071 and S1073; they differ in the characterization rule applied in step S203, described later.

In step S201, the region determination unit 210 determines a region corresponding to the user's touch operation on the touch panel; for example, the region D1 in FIG. 5a or the region D2 in FIG. 5b corresponds to this. In step S202, the UV coordinate defining unit 230 defines the u value and the v value of the UV coordinate system in the range from 0 to 1 for the region. In step S203, the feature amount determination unit 250 determines the feature amount of each coordinate by applying a characterization rule based on each coordinate value to that range from 0 to 1. Steps S1071 and S1073 differ greatly in the characterization rules applied in step S203.

  Next, in step S204, the UI image generation unit 270 generates, in association with the region, an image to which the feature amounts determined in step S203 are applied. Here, each position in the UV coordinate system is associated with a pixel included in the UI image, and the feature amount at each position is applied as a value (a luminance value or an opacity value, described later with reference to FIG. 9). Finally, in step S205, the UI image display unit displays the UI image in the corresponding area on the touch panel.
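
As an informal sketch of steps S201 to S205 (the function name and structure are illustrative assumptions, not the patent's implementation), the per-pixel flow could be written as:

```python
def generate_ui_image(characterize, width: int, height: int):
    """Map each pixel of the determined region to a (u, v) coordinate in 0..1
    (step S202), apply a characterization rule to obtain its feature amount
    (step S203), and return the resulting per-pixel feature amounts (step S204).
    Displaying the result over the region on the touch panel corresponds to S205.
    """
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            u = px / max(width - 1, 1)   # u value of this pixel
            v = py / max(height - 1, 1)  # v value of this pixel
            row.append(characterize(u, v))
        image.append(row)
    return image
```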

FIG. 9 is a schematic diagram illustrating the concept of applying the feature amount determined in the feature amount determination step S203 to the image displayed in step S205. According to the embodiment of the present invention, each position in the UV coordinate system is associated with a pixel included in the procedural image, and each feature amount is applied to that pixel. In FIG. 9, the feature amount is also expressed in the range from 0 to 1, corresponding to the u value and v value of the UV coordinate system ranging from 0 to 1 (FIG. 6(a)); however, those skilled in the art should understand that the feature amount is not limited to this range.

  FIG. 9 shows a series of transitions of how the corresponding pixel is displayed as the feature amount changes from 0 to 1. Note that each pixel is drawn with a black frame in the figure; this is for clarity of illustration only and is not reflected in the actual pixel. In FIG. 9, the feature amount is changed in increments of 0.25, from 0.0 to 0.25, 0.5, 0.75, and 1.0, and the figure shows how the display changes when (1) a "luminance value" is used as the feature amount and when (2) a "white-background-based opacity value" is used as the feature amount.

  Case (1) is a case where the feature amount is a "luminance value", with luminance value 0 corresponding to "black" and luminance value 1 to "white" in monochrome. As the luminance value approaches 1 from 0, the color gradually changes from black to white. In this example the "luminance value" is used as the feature amount, but the present invention is not limited to this; those skilled in the art will understand that a similar result can also be obtained with other color information such as "brightness". Further, the color information is not limited to monochrome as in this example; those skilled in the art will understand that any combination of RGB (Red, Green, Blue) values, generally expressed as integer values from 0 to 255, or of decimal values from 0 to 1 determined on the basis of them, can be used.

  On the other hand, in case (2) above, the feature amount is a "white-background-based opacity value", with opacity value 0 corresponding to "completely transparent" (transparency: 100%) and opacity value 1 to "completely opaque" (transparency: 0%). Here, the opacity value is conceptually different from the color information expressed in RGB in (1) above, and corresponds to an index of transparency generally referred to as an "alpha value" in this technical field. The alpha value, normally expressed in the integer range from 0 to 255, is here represented as an "opacity value" in the decimal range from 0 to 1. As shown in the figure, when the opacity value is 0, the pixel is completely transparent and is assimilated into the gray background color. As the opacity value approaches 1, the pixel transitions to white, which is the reference base color. In this example, the reference base color is set to "white" (that is, "R: 255, G: 255, B: 255"), but the present invention is not limited to this, and any color expressed by RGB values can be set. That is, a procedural image can also be formed as an image having a gradation of an arbitrary reference base color. By changing the reference base color, for example, the two UI images 30 and 40 in the game screen example of FIG. 4 can be generated as images having gradations of different colors.
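
The two cases of FIG. 9 can be sketched as simple conversions from a feature amount to a displayable pixel (an illustrative sketch, not taken from the patent; the white base color is the default assumed above):

```python
def clamp01(x: float) -> float:
    """Clamp a feature amount into the 0..1 range used in FIG. 9."""
    return max(0.0, min(1.0, x))

def pixel_from_luminance(feature: float):
    """Case (1): the feature amount is a monochrome luminance value.
    0 -> black, 1 -> white, returned as an (R, G, B) tuple in 0..255."""
    g = round(clamp01(feature) * 255)
    return (g, g, g)

def pixel_from_opacity(feature: float, base_rgb=(255, 255, 255)):
    """Case (2): the feature amount is an opacity (alpha) value over a
    reference base color (white by default). 0 -> fully transparent,
    1 -> fully opaque, returned as an (R, G, B, A) tuple."""
    a = round(clamp01(feature) * 255)
    return (*base_rgb, a)
```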

  FIG. 9 described the display of a single pixel based on the feature amount. Next, with reference to FIG. 10a, an image composed of a plurality of pixels in the UV coordinate system and having a gradation will be described. FIG. 10a shows the image with gradation, and FIG. 10b shows the processing flow for generating it. Here, it is assumed that the "luminance value" of case (1) in FIG. 9 is adopted as the feature amount.

The example image with gradation shown in FIG. 10a is generated by dividing the rectangular area formed by the four points (0,0), (1,0), (0,1), (1,1) in the UV coordinate system into eight small regions L(i,j) (that is, L(1,1) to L(4,2)) and applying a characterization rule described below to each small region, so that a gradation is applied to the rectangle as a whole. The image is set so that the luminance values of adjacent small regions are spaced by 0.25 in both the U-axis and V-axis directions. For example, the small region L(1,1) is the rectangular area defined by the four points (0,0), (0.25,0), (0,0.5), (0.25,0.5) in the UV coordinate system, and each of its pixels is set so that the luminance value over this area is 0.25. Similarly, the small region L(4,2) is the area defined by the four points (0.75,0.5), (1,0.5), (0.75,1), (1,1) in the UV coordinate system, and each of its pixels is set so that the luminance value over this area is 0.75.

  An image having a gradation as shown in FIG. 10a is generated using the luminance value calculation shown in the processing flow of FIG. 10b, that is, using a characterization rule based on each (u, v) coordinate value. Specifically, first, in steps S301 and S302, the u value and the v value are associated with the i value and the j value, respectively. In step S301, u value × 4 is calculated, and the integer value i is set by rounding up the fractional part. According to this calculation, i = 1 in the u-value range from 0 to 0.25, i = 2 in the range from 0.25 to 0.5, i = 3 in the range from 0.5 to 0.75, and i = 4 in the range from 0.75 to 1, which corresponds to the relation between the u value and the i value in FIG. 10a. Similarly, in step S302, v value × 2 is calculated, and the integer value j is set by rounding up the fractional part. According to this calculation, j = 1 when the v value is in the range from 0 to 0.5 and j = 2 when the v value is in the range from 0.5 to 1, which corresponds to the relation between the v value and the j value in FIG. 10a.

  In the next calculation of the luminance value of L(i,j), the calculation formula depends on whether the value of j is 1 or 2. If j = 1 (step S303), the luminance value is calculated in step S304 by the formula "luminance value of L(i,1) = 0.25 × i". On the other hand, if j = 2 (step S305), the luminance value is calculated in step S306 by the formula "luminance value of L(i,2) = 0.25 × (i − 1)".

  By calculating the luminance value of L(i,j) in this way, the luminance values to be set for the eight small regions shown in FIG. 10a are determined, and an image having a gradation over the whole rectangular area of the UV coordinate system can be generated. The generated image is formed from only two polygons having a triangular shape, or one polygon having a quadrangular shape. In this respect, this processing requires neither vertex calculation for a large number of polygons nor memory storage and memory access for each such vertex, so the processing load can be very light. In the example of FIGS. 10a and 10b, the luminance value is used as the feature amount to generate the image; alternatively, as described with reference to FIG. 9, those skilled in the art will appreciate that an opacity value (alpha value) may be adopted as the feature amount to generate an image together with RGB values, or that a combination of luminance value and opacity value may be used.
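
The FIG. 10b calculation can be condensed into a single characterization rule; the following is an assumed reading of the flow described above (rounding up corresponds to the ceil function):

```python
import math

def lum_fig10(u: float, v: float) -> float:
    """Luminance value of the small region L(i, j) containing (u, v),
    following steps S301 to S306 as described for FIG. 10b."""
    i = max(1, math.ceil(u * 4))  # S301: i = 1..4 from the u value
    j = max(1, math.ceil(v * 2))  # S302: j = 1..2 from the v value
    if j == 1:                    # S303 -> S304
        return 0.25 * i
    return 0.25 * (i - 1)         # S305 -> S306
```

Passing this function as the characterization rule in the earlier sketch would reproduce the eight-block gradation of FIG. 10a.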

  As described above, the gradation of a procedural image generated based on UV coordinate values can take many patterns. Various gradation aspects are therefore further described below along with some preferred embodiments. In the following embodiments, it is assumed that the touch operation by the user is a slide operation over a distance d. That is, the area determination unit 210 determines a rectangular area based on the slide direction and the slide distance d. Further, the width of the determined rectangular area is assumed to be constant (2 × r). Lum(u, v) denotes the luminance value corresponding to the coordinate value (u, v) in the UV coordinate system.

First Embodiment
In the first embodiment, which forms the series of images shown in FIGS. 11a to 11c, eight characterization rules (1-1) to (1-8) based on UV coordinate values are combined, and the gradation is determined while the feature amount of each UV coordinate is calculated based on each characterization rule.

1) In the characterization rule (1-1) shown in FIG. 11a, a luminance value is adopted as the feature amount, and the luminance value is continuously distributed from 0 to 1 in the U-axis direction. That is, for the range 0 ≦ u value ≦ 1 and 0 ≦ v value ≦ 1 in the UV coordinate system (hereinafter abbreviated as 0 ≦ uv value ≦ 1), the luminance value Lum(u, v) is calculated by the functional expression of the parameter u defined as
Lum(u, v) = u.
By applying the calculated luminance value as a procedural image, a procedural image having a gradation along the U-axis direction as illustrated can be generated.

2) In the characterization rule (1-2) shown in FIG. 11a, a luminance value is also adopted as the feature amount, and the luminance value is continuously distributed from 0 to 1 in the V-axis direction. That is, for the range 0 ≦ uv value ≦ 1 in the UV coordinate system, the luminance value Lum(u, v) is calculated by the functional expression of the parameter v defined as
Lum(u, v) = 1 − v.
By applying the calculated luminance value as a procedural image, a procedural image having a gradation along the V-axis direction as illustrated can be generated.

3) In the characterization rule (1-3) shown in FIG. 11a, a luminance value is adopted as the feature amount. In the U-axis direction, the luminance value is continuously distributed from 0 to 1 and from 1 to 0, with u value = 0.5 as the boundary. That is, for the range 0 ≦ uv value ≦ 1 in the UV coordinate system, the luminance value Lum(u, v) is calculated by the functional expression of the parameter u defined as
Lum(u, v) = abs{(u − 0.5) × 2},
where the abs function returns the absolute value of its input. By applying the calculated luminance value as a procedural image, a procedural image having a gradation along the U-axis direction as illustrated can be generated.

4) In the characterization rule (1-4) as well, a luminance value is adopted as the feature amount. In the U-axis direction, the luminance value is continuously distributed from 0 to 1 and from 1 to 0, with u value = 0.5 as the boundary. That is, for the range 0 ≦ uv value ≦ 1 in the UV coordinate system, the luminance value Lum(u, v) is calculated by the functional expression of the parameter u defined as
Lum(u, v) = {(u − 0.5) × 2}^2.
It differs from the characterization rule (1-3) in that the value is squared rather than taken as an absolute value. By applying the calculated luminance value as a procedural image, a procedural image having a gradation along the U-axis direction as illustrated can be generated.

5) In the characterization rule (1-5) shown in FIG. 11b, a luminance value is also adopted as the feature amount. In the V-axis direction, the luminance value is continuously distributed from 0 to 1 and from 1 to 0, with v value = 0.5 as the boundary. That is, for the range 0 ≦ uv value ≦ 1 in the UV coordinate system, the luminance value Lum(u, v) is calculated by the functional expression of the parameter v defined as
Lum(u, v) = {(v − 0.5) × 2}^2.
By applying the calculated luminance value as a procedural image, a procedural image having a gradation along the V-axis direction as illustrated can be generated.
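
For reference, the characterization rules (1-1) to (1-5) can be written as plain functions of (u, v) (an illustrative sketch; values outside 0..1 would be clamped when displayed, as described in connection with FIG. 9):

```python
rule_1_1 = lambda u, v: u                      # gradation from 0 to 1 along the U axis
rule_1_2 = lambda u, v: 1.0 - v                # gradation from 1 to 0 along the V axis
rule_1_3 = lambda u, v: abs((u - 0.5) * 2)     # V-shaped profile around u = 0.5
rule_1_4 = lambda u, v: ((u - 0.5) * 2) ** 2   # parabolic profile around u = 0.5
rule_1_5 = lambda u, v: ((v - 0.5) * 2) ** 2   # parabolic profile around v = 0.5
```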

6) In the characterization rule (1-6) shown in FIG. 11b, a luminance value is also adopted as the feature amount. In both the U-axis and V-axis directions, the luminance value is continuously distributed from 0 to 1 and from 1 to 0, with u value = 0.5 and v value = 0.5 as the respective boundaries. That is, for the range 0 ≦ uv value ≦ 1 in the UV coordinate system, the luminance value Lum(u, v) is calculated by the quadratic function expression of the parameters u and v defined as
Lum(u, v) = {(u − 0.5) × 2}^2 + {(v − 0.5) × 2}^2.
By applying the calculated luminance value as a procedural image, a procedural image having an elliptical radial gradation along the U-axis and V-axis directions as illustrated can be generated. When the rectangle expressed by the UV coordinates is a square, a circular radial procedural image is generated.

  In addition, although the feature amount has been described as being in the range from 0 to 1 in relation to FIG. 9, the present invention is not limited to this. That is, if the calculated luminance value (feature amount) is less than 0, it may be treated as the feature "0" (that is, black), and if it is greater than 1, as the feature "1" (that is, white). In the calculation of the quadratic function expression as well, there are regions around the four vertices of the rectangle where the luminance value is greater than 1; in the first embodiment, such regions are processed so as to be all white, as illustrated.

  7) In the characterization rule (1-7) shown in FIG. 11c, a luminance value is also adopted as the feature amount. Here, as preparation for the opacity processing by the next rule (1-8), inversion processing is performed on the outer white portion of the elliptical radial gradation generated by rule (1-6), that is, the region portion having a luminance value greater than 1, so that its luminance value becomes smaller than zero. By applying the inverted luminance value as a procedural image, a procedural image can be generated that has an elliptical radial gradation along the U-axis and V-axis directions as illustrated and is black on the outside.

  8) Unlike the preceding characterization rules (1-1) to (1-7), the characterization rule (1-8) shown in FIG. 11c adopts an opacity value (alpha value) as the feature amount. That is, for the distribution of feature amounts generated as luminance values by the characterization rule (1-7), the feature amounts are subsequently treated as opacity values in the characterization rule (1-8). In this case, as shown in FIG. 9, owing to the difference between (1) adopting the luminance value as the feature amount and (2) adopting the opacity value as the feature amount, the gradation of the procedural image generated by the characterization rule (1-7) changes as illustrated according to the value of the feature amount. That is, from the result of characterization rule (1-7), a procedural image having an elliptical radial gradation subjected to opacity processing in the U-axis and V-axis directions can be newly generated as illustrated. In this specification, processing that uses the opacity value as the feature amount as in (2) of FIG. 9 is referred to as opacity processing.
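
Rules (1-6) to (1-8) can likewise be sketched as follows; the exact inversion formula of rule (1-7) is not stated in the text, so 1 − Lum is used here as an assumption that yields a bright center fading to black outside, as the figure is described:

```python
def rule_1_6(u: float, v: float) -> float:
    """Rule (1-6): elliptical radial luminance, 0 at the center, above 1 near the corners."""
    return ((u - 0.5) * 2) ** 2 + ((v - 0.5) * 2) ** 2

def rule_1_7(u: float, v: float) -> float:
    """Rule (1-7): inversion so that the region with luminance above 1 falls below 0
    (assumed to be 1 - Lum), giving a bright center and black outside."""
    return 1.0 - rule_1_6(u, v)

def rule_1_8(u: float, v: float, base_rgb=(255, 255, 255)):
    """Rule (1-8): opacity processing; the inverted value is used as an alpha value
    over a white base, so the center is opaque and the outside fully transparent."""
    alpha = max(0.0, min(1.0, rule_1_7(u, v)))
    return (*base_rgb, round(alpha * 255))
```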

Second Embodiment
In the second embodiment shown in FIG. 12, three characterization rules (2-1) to (2-3) based on UV coordinate values are combined, and the gradation is determined while the feature amount of each UV coordinate is calculated based on each characterization rule. Here, the width of the rectangular area corresponding to the slide operation is denoted by 2r, and the slide operation distance by d. That is, the size of the rectangular area corresponding to the slide operation on the touch panel (for example, the area D2 in FIG. 5b) is horizontal (d + 2r) × vertical 2r.

  In the second embodiment, the rectangular area of the UV coordinate system is divided into three parts in the U-axis direction: the first part, in which the u value runs from 0 to r/(2r + d); the second part, in which the u value runs from r/(2r + d) to (r + d)/(2r + d); and the third part, in which the u value runs from (r + d)/(2r + d) to 1.

1) In the characterization rule (2-1) shown in FIG. 12, a luminance value is adopted as the feature amount, and the luminance value Lum(u, v) is distributed discretely from 0 to 1 along the U-axis direction. The distribution of the luminance value Lum(u, v) is as follows.
In the first part, Lum(u, v) = 0 (constant). That is, black.
In the second part, Lum(u, v) = 0.5 (constant). That is, gray.
In the third part, Lum(u, v) = 1 (constant). That is, white.
By applying the luminance value Lum(u, v) based on the above as a procedural image, a procedural image as illustrated can be generated.

2) Similarly, in the characterization rule (2-2) shown in FIG. 12, a luminance value is adopted as the feature amount, and the luminance values are distributed discretely from 0 to 1 along the U-axis direction. The distribution of the luminance value Lum(u, v) is as follows.
In the first part, Lum(u, v) = 1 (constant). That is, white.
In the second part, Lum(u, v) = 0 (constant). That is, black.
In the third part, Lum(u, v) = 1 (constant). That is, white.
By applying the luminance value Lum(u, v) based on the above as a procedural image, a procedural image as illustrated can be generated.

3) Also in the characterization rule (2-3) shown in FIG. 12, the luminance value is adopted as the feature amount. Then, luminance values are continuously distributed from 0 to 1 along the U-axis direction. Specifically, the distribution mode of the luminance value Lum (u, v) is as follows.
In the first part, Lum (0, v) = 0 and Lum (r / (2r + d), v) = 0.5, and in the meantime (0 <u <r / (2r + d)), Lum (u , V) is linearly interpolated from 0 (black) to 0.5 (gray). (Hereinafter, it is expressed by a linear interpolation function f (u).)
In the second part, Lum (u, v) = 0.5 (constant). That is, gray.
In the third part, Lum ((r + d) / (2r + d), v) = 0.5 and Lum (1, v) = 1, and in the meantime ((r + d) / (2r + d) <u <1) , Lum (u, v) is linearly interpolated from 0.5 (gray) to 1 (white) (hereinafter, expressed as a linear interpolation function g (u)).
By applying the luminance value Lum(u, v) defined above, a procedural image having a gradation as illustrated can be generated.
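Rule (2-3) can be sketched in the same way; the formulas chosen here for the linear interpolation functions f(u) and g(u) are one straightforward reading of the distribution described above, not formulas given in the patent:

    def f(u, r, d):
        """First part: interpolate luminance 0 -> 0.5 while u runs from 0 to r/(2r+d)."""
        return 0.5 * u * (2 * r + d) / r

    def g(u, r, d):
        """Third part: interpolate luminance 0.5 -> 1 while u runs from (r+d)/(2r+d) to 1."""
        u0 = (r + d) / (2 * r + d)
        return 0.5 + 0.5 * (u - u0) / (1 - u0)

    def lum_2_3(u, v, r, d):
        """Characterization rule (2-3): piecewise-linear luminance along the U axis."""
        part = part_of(u, r, d)
        if part == 1:
            return f(u, r, d)
        if part == 2:
            return 0.5
        return g(u, r, d)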

Third Embodiment In the third embodiment shown in FIG. 13, a more flexible gradation is determined by combining the characterization rules introduced in the first and second embodiments. More specifically, the characterization rules of the first embodiment are applied to the feature amount (that is, the luminance value) obtained by applying the characterization rule (2-3) of FIG. 12 in the second embodiment. Specifically, the characterization rule (1-6) of FIG. 11b through the characterization rule (1-8) of FIG. 11c of the first embodiment are applied in sequence. Thereby, the feature amount of each UV coordinate based on the characterization rules (2-3) and (1-6) to (1-8) can be calculated, and a corresponding procedural image can be generated.

1) First, using the luminance value (feature amount) Lum(u, v) calculated by the characterization rule (2-3) of FIG. 12 in the second embodiment, new u′ and v′ values are set as follows.
In the first part, u′ = f(u), v′ = v
In the second part, u′ = 0.5 (constant), v′ = v
In the third part, u′ = g(u), v′ = v
The coordinates (u′, v′) defined above are then applied in sequence to the characterization rules corresponding to (1-6) to (1-8), as follows.
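In the running sketch, this remapping simply reuses the rule (2-3) value as the new u′, while v is passed through unchanged:

    def remap_uv(u, v, r, d):
        """Step 1) of the third embodiment: u' = f(u) / 0.5 / g(u) per part, v' = v."""
        return lum_2_3(u, v, r, d), v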

2) In the characterization rule (3-1) shown in FIG. 13, a luminance value is adopted as the feature amount. Luminance values from 0 to 1 are distributed continuously along both the U-axis and V-axis directions, centered on the boundary u′ = 0.5, v′ = 0.5. That is, for the range 0 ≦ u, v ≦ 1 of the UV coordinate system, the luminance value Lum(u, v) is calculated by the following quadratic expression in the parameters u′ and v′:
Lum(u, v) = {(u′ − 0.5) × 2}² + {(v′ − 0.5) × 2}²
By applying the luminance value calculated in this way, a procedural image having a gradation along the U-axis and V-axis directions as illustrated can be generated.
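Continuing the sketch, rule (3-1) is the quadratic expression above evaluated on the remapped coordinates:

    def lum_3_1(u, v, r, d):
        """Characterization rule (3-1): squared distance from the centre (u', v') = (0.5, 0.5)."""
        up, vp = remap_uv(u, v, r, d)
        return ((up - 0.5) * 2) ** 2 + ((vp - 0.5) * 2) ** 2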

  3) In the characterization rule (3-2) shown in FIG. 13 as well, a luminance value is adopted as the feature amount. Here, inversion processing is applied to the outer white portion of the gradation generated by rule (3-1), that is, to the region portion where the luminance value Lum(u, v) exceeds 1, so that the luminance value there is brought back down. As a result, a procedural image that has a gradation in the U-axis and V-axis directions as illustrated, and that is black on the outside, can be generated.
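The patent does not give a formula for this inversion; one plausible reading, used in the sketch below, folds values above 1 back down as 2 − x, so the corners of the area (where the sum reaches 2) become black while the circle where the value equals 1 stays white:

    def lum_3_2(u, v, r, d):
        """Characterization rule (3-2), assuming 'inversion' means folding x > 1 back to 2 - x."""
        x = lum_3_1(u, v, r, d)
        return 2.0 - x if x > 1.0 else x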

  4) Finally, in the characterization rule (3-3) shown in FIG. 13, the opacity value (alpha value) is adopted as the feature amount. That is, the distribution of feature amounts generated as the luminance values Lum(u, v) by the characterization rule (3-2) is adopted as opacity values in the characterization rule (3-3), and opacity processing is performed. As in the application of the characterization rule (1-8) in FIG. 11c, the gradation of the procedural image generated by the characterization rule (3-3) changes as shown. This makes it possible to newly generate, with respect to the characterization rule (3-2), a procedural image whose elliptical radial gradation in the U-axis and V-axis directions has been subjected to opacity processing, as illustrated.
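The final step of the third embodiment can then be sketched as assembling an RGBA value whose alpha channel is the rule (3-2) luminance (clamping to [0, 1] is an implementation detail assumed here, not stated in the patent):

    def pixel_3_3(u, v, r, d, rgb=(1.0, 1.0, 1.0)):
        """Characterization rule (3-3): adopt the rule (3-2) luminance as the opacity value."""
        alpha = max(0.0, min(1.0, lum_3_2(u, v, r, d)))
        return (*rgb, alpha)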

  Finally, a further gradation image is generated by dividing and separating the procedural image shown for the characterization rule (3-3), that is, an image having a circular radial gradation, and interpolating the portion between the separated images.

Fourth Embodiment (further modification of the third embodiment)
The fourth embodiment shown in FIG. 14 further modifies the procedural image having the gradation shown for the characterization rule (3-3) of the third embodiment in FIG. 13. Each modification shown in FIGS. 14A to 14G can be realized by changing part of the characterization rules applied so far when generating the procedural image of the characterization rule (3-3), or by additionally applying some further characterization rule.

  1) In the modification of FIG. 14(a), opacity processing using an opacity value as the feature amount is performed. Specifically, with respect to the characterization rule (3-3) of the third embodiment of FIG. 13, a characterization rule may be applied in which the opacity value of a contour portion of predetermined thickness is rounded up to "1.0" and all other opacity values are rounded down to "0.0".
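One way to read this rounding rule is as a band test around the contour, as in the following sketch (the band width "thickness" is an assumed parameter; the patent only states the resulting 1.0 / 0.0 values):

    def alpha_14a(u, v, r, d, thickness=0.1):
        """FIG. 14(a): opacity 1.0 inside a band of predetermined thickness around the contour
        (where the rule (3-2) luminance is close to 1), and 0.0 everywhere else."""
        alpha = lum_3_2(u, v, r, d)
        return 1.0 if abs(alpha - 1.0) < thickness else 0.0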

2) In the modification of FIG. 14(b) as well, opacity processing using an opacity value as the feature amount is performed. Specifically, in addition to the characterization rule (3-3) of the third embodiment of FIG. 13, the following expression may be applied to the opacity value Tra(u, v):
Tra(u, v) = u

3) In the modification of FIG. 14(c), an alternative rule is applied in place of the characterization rule (3-1) of the third embodiment of FIG. 13. That is, a new function p(u) of the parameter u is introduced, and the luminance value Lum(u, v) may be calculated as
Lum(u, v) = {(u′ − p(u)) × 2}² + {(v′ − p(u)) × 2}²
Here, p(u) may be set, for example, as follows:
In the first part, p(u) = 0.25
In the third part, p(u) = 0.1
The characterization rules (3-2) and (3-3) may then be further applied.
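A sketch of this alternative rule follows; the second-part value of p(u) is not given in the text, so the value 0.5 used below is only an assumption to keep the function defined everywhere:

    def lum_14c(u, v, r, d):
        """FIG. 14(c): quadratic rule with a per-part centre offset p(u) instead of 0.5."""
        p = {1: 0.25, 2: 0.5, 3: 0.1}[part_of(u, r, d)]
        up, vp = remap_uv(u, v, r, d)
        return ((up - p) * 2) ** 2 + ((vp - p) * 2) ** 2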

4) In the modification of FIG. 14(d), an alternative rule is applied in place of the characterization rule (3-1) of the third embodiment of FIG. 13. That is, the luminance value Lum(u, v) may be calculated by the following expression:
Lum(u, v) = abs{(u′ − 0.5) × 2} + abs{(v′ − 0.5) × 2}
The characterization rules (3-2) and (3-3) may then be further applied.
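Replacing the squared terms with absolute values, as this modification does, turns the circular iso-luminance lines into diamond-shaped ones; in the running sketch:

    def lum_14d(u, v, r, d):
        """FIG. 14(d): Manhattan-distance variant of rule (3-1), giving a rhombic gradation."""
        up, vp = remap_uv(u, v, r, d)
        return abs((up - 0.5) * 2) + abs((vp - 0.5) * 2)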

  5) In the modification of FIG. 14(e), opacity processing using an opacity value as the feature amount is further performed with respect to the preceding modification. Specifically, a characterization rule that rounds the opacity value up to "1.0" may be applied.

  6) In the modification of FIG. 14(f), opacity processing using an opacity value as the feature amount is further performed with respect to the preceding modification. Specifically, a characterization rule may be applied in which the opacity value of the contour portion is rounded up to "1.0" and all other opacity values are rounded down to "0.0".

  7) In the modification of FIG. 14(g) as well, opacity processing using an opacity value as the feature amount is performed. Specifically, with respect to FIG. 14(f), a rhombic region of predetermined thickness with an opacity value of "1.0" is additionally provided in the center of the image, and a characterization rule may further be applied that inverts the horizontal contour portions so that their opacity value becomes "0.0".

  The procedural images formed in the various gradation modes described so far can be displayed on the touch panel as UI images. A procedural image has the advantage that, because its feature amounts are calculated each time it is drawn, it does not blur even when enlarged, for example. This point is further described below with reference to FIG. 15.

  FIG. 15 compares a UI image generated by the image-display UI program according to an embodiment of the present invention (FIG. 15A) with an example using another method (FIG. 15B), with respect to enlarged display of the UI image. As the example using another method, a UI image realized by a mesh composed of a large number of polygons is assumed here. In both FIGS. 15A and 15B, the right half of the figure shows the entire UI image, and the left half shows an enlarged view of a curved portion of the UI image.

  As shown in FIG. 15A, in the image-display UI program according to the embodiment of the present invention, when enlargement is performed, the procedural image is regenerated and displayed by applying the UV coordinate system anew each time, in conjunction with the enlargement process. Accordingly, with the UI program according to the embodiment of the present invention, no situation arises in which the enlargement degrades the image quality. On the other hand, when enlargement is performed on a UI image realized by a mesh as in the example of FIG. 15B, each polygon is enlarged according to the magnification, and as a result the image becomes rough, as shown in the figure.

  In the example of FIG. 15B, when the UI program is applied to a game application on a smartphone and the UI image is associated with a slide operation, the roughness of the image is particularly noticeable, which is not preferable. This is because the distance between the touch panel and the user's eyes is short during smartphone operation, and the roughness is unavoidable in situations where slide operations are performed frequently. This can impair the quality of the game as a whole, particularly in the case of smartphone games. As described above, the UI program according to the embodiment of the present invention has the advantage that the image quality is not affected even when enlargement is performed.

  The image-display UI program and the computer-implemented method according to embodiments of the present invention have been described above using several examples. The above-described embodiments are merely examples for facilitating understanding of the present invention and are not intended to limit it. The present invention can be changed and improved without departing from its gist, and it goes without saying that the present invention includes equivalents thereof.

Claims (19)

  1. A program for causing a computer comprising a touch panel to function as:
    an area determination unit that determines an area corresponding to a user's touch operation on the touch panel;
    a coordinate defining unit that defines a predetermined range of a predetermined coordinate system in association with the area;
    a feature amount determination unit that determines a feature amount of each coordinate by inputting the coordinate value of each coordinate to a characterization rule, and acquires a feature amount distribution corresponding to the predetermined range;
    an image generation unit that generates an image that is associated with the area and is based on the feature amount distribution; and
    an image display unit that displays the image on the area on the touch panel.
  2. The program according to claim 1, wherein the touch operation is a slide operation, and the area is determined based on a slide direction and a slide distance.
  3. The program according to claim 1, wherein the image includes at least a contour obtained based on the feature amount distribution.
  4. The program according to any one of claims 1 to 3, wherein the image generation unit employs a luminance value as the feature amount in order to obtain at least an outline of the image.
  5. The program according to any one of claims 1 to 3, wherein the image generation unit employs an alpha value as the feature amount in order to obtain at least an outline of the image.
  6. The program according to claim 5, wherein when the alpha value is adopted as the feature amount, a predetermined RGB value is further applied to set a reference color of the image.
  7. The program according to any one of claims 1 to 6, wherein the coordinate system is a UV coordinate system, each coordinate value in the predetermined range is defined by a u value and a v value in a range from 0 to 1, and the characterization rule includes a calculation rule based on a quadratic function of the u value and the v value.
  8. The program according to claim 7, wherein the characterization rule includes an interpolation rule for the u value and/or the v value based on the slide distance when the touch operation is a slide operation.
  9. The program according to any one of claims 1 to 8, wherein the image is an image formed from two polygons having a triangular shape.
  10. A program for displaying an elastic object on a touch panel while deforming its shape, the program causing a portable terminal comprising the touch panel to function as:
    a first forming unit that generates and displays a first elastic object image around a first contact point on the touch panel; and
    a second forming unit that, when a slide operation from the first contact point to a second contact point is determined on the touch panel, forms and displays a second elastic object image obtained by deforming the first elastic object image toward the second contact point in conjunction with the slide operation,
    wherein each of the first forming unit and the second forming unit is configured to:
    determine an area including the first contact point;
    define a predetermined range of a predetermined coordinate system in association with the area;
    determine a feature amount of each coordinate by inputting the coordinate value of each coordinate to a characterization rule, and acquire a feature amount distribution corresponding to the predetermined range; and
    generate and display an image that is associated with the area and is based on the feature amount distribution.
  11. 11. The program according to claim 10, wherein each of the first forming unit and the second forming unit employs an alpha value for the feature amount in order to obtain at least a contour of the image based on the feature amount distribution.
  12. The program according to claim 11, wherein when an alpha value is adopted as the feature amount, a reference color of the image is set by further applying a predetermined RGB value.
  13. The program according to any one of claims 10 to 12, wherein, in the first forming unit,
    the area is a square region around the first contact point, and
    the first elastic object image is an image having a circular radial gradation formed within the square region, and is an image formed from two polygons having a triangular shape.
  14. The program according to claim 13, wherein, in the second forming unit,
    the area is a rectangular region around the first contact point and the second contact point, and
    the second elastic object image is an image whose gradation is further formed by dividing and separating the circular radial image within the rectangular region and interpolating the portion between the separated images, and is an image formed from two polygons having a triangular shape.
  15. A method for displaying an image, implemented in a computer comprising a touch panel, the method comprising the steps of:
    determining an area corresponding to a user's touch operation on the touch panel;
    defining a predetermined range of a predetermined coordinate system in association with the area;
    determining a feature amount of each coordinate by inputting the coordinate value of each coordinate to a characterization rule, and acquiring a feature amount distribution corresponding to the predetermined range;
    generating an image that is associated with the area and is based on the feature amount distribution; and
    displaying the image on the area on the touch panel.
  16. The method according to claim 15, wherein the touch operation is a slide operation, and the area is determined based on a slide direction and a slide distance.
  17. The method according to claim 15 or 16 , wherein , in the step of generating the image , a luminance value is adopted as the feature amount in order to obtain at least an outline of the image .
  18. The method according to any one of claims 15 to 17, wherein, in the step of generating the image, an alpha value is adopted as the feature amount in order to obtain at least an outline of the image.
  19. The method according to claim 18, wherein when the alpha value is adopted as the feature amount, a predetermined RGB value is further applied to set a reference color of the image.
JP2015178708A 2015-09-10 2015-09-10 User interface program and computer mounting method Active JP5941207B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015178708A JP5941207B1 (en) 2015-09-10 2015-09-10 User interface program and computer mounting method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015178708A JP5941207B1 (en) 2015-09-10 2015-09-10 User interface program and computer mounting method
PCT/JP2016/074193 WO2017043287A1 (en) 2015-09-10 2016-08-19 User interface program and computer implementation method

Publications (2)

Publication Number Publication Date
JP5941207B1 true JP5941207B1 (en) 2016-06-29
JP2017051492A JP2017051492A (en) 2017-03-16

Family

ID=56244693

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015178708A Active JP5941207B1 (en) 2015-09-10 2015-09-10 User interface program and computer mounting method

Country Status (2)

Country Link
JP (1) JP5941207B1 (en)
WO (1) WO2017043287A1 (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0787302A (en) * 1993-09-16 1995-03-31 Fuji Xerox Co Ltd Document processor
JP2009187096A (en) * 2008-02-04 2009-08-20 Seiko Epson Corp Image processor, image processing method and its program
JP2009240620A (en) * 2008-03-31 2009-10-22 Sega Corp Object display control method, object display control device, recording medium, and program
JP2010033294A (en) * 2008-07-28 2010-02-12 Namco Bandai Games Inc Program, information storage medium, and image generation system
JP2012033060A (en) * 2010-07-30 2012-02-16 Sony Corp Information processing device, display control method and display control program
JP4932010B2 (en) * 2010-01-06 2012-05-16 株式会社スクウェア・エニックス User interface processing device, user interface processing method, and user interface processing program
JP2012113743A (en) * 2012-02-29 2012-06-14 Denso Corp Display controller for remote operation device
JP2012247673A (en) * 2011-05-30 2012-12-13 Dainippon Printing Co Ltd Two-dimensional color gradation display device
JP2014191612A (en) * 2013-03-27 2014-10-06 Ntt Docomo Inc Information terminal, information input auxiliary method, and information input auxiliary program
JP5711409B1 (en) * 2014-06-26 2015-04-30 ガンホー・オンライン・エンターテイメント株式会社 Terminal device
JP2015222595A (en) * 2014-04-04 2015-12-10 株式会社コロプラ User interface program and game program

Also Published As

Publication number Publication date
JP2017051492A (en) 2017-03-16
WO2017043287A1 (en) 2017-03-16

Legal Events

Date Code Title Description
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20160420

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20160519

R150 Certificate of patent or registration of utility model

Ref document number: 5941207

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250