CN109078325B - User interface processing method and recording medium - Google Patents


Info

Publication number
CN109078325B
Authority
CN
China
Prior art keywords
list
cursor
touch panel
display area
state
Prior art date
Legal status (the status listed is an assumption and is not a legal conclusion)
Active
Application number
CN201810563268.4A
Other languages
Chinese (zh)
Other versions
CN109078325A (en)
Inventor
姜夏聪
佐藤央
藤田健作
Current Assignee (the listed assignee may be inaccurate)
Koei Tecmo Games Co Ltd
Original Assignee
Koei Tecmo Games Co Ltd
Priority date (the date listed is an assumption and is not a legal conclusion)
Application filed by Koei Tecmo Games Co Ltd filed Critical Koei Tecmo Games Co Ltd
Publication of CN109078325A
Application granted
Publication of CN109078325B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The present invention relates to a user interface processing method and a recording medium for effectively utilizing a screen by increasing the number of items of a list that can be displayed on a touch panel at one time. The information processing device (1) is caused to function as a first display processing unit (5), a contact detection processing unit (7), and a relative movement processing unit (9), wherein the first display processing unit (5) displays a list (13) containing a plurality of items and a cursor (15) for selecting a specific item from the list (13) on the touch panel (3), the contact detection processing unit (7) detects a contact operation with respect to the touch panel (3), and the relative movement processing unit (9) causes the cursor (15) and the list (13) to move relatively based on the contact operation.

Description

User interface processing method and recording medium
Technical Field
The present invention relates to a user interface processing method and a recording medium in which a user interface processing program for executing the user interface processing method is recorded.
Background
Conventionally, on a touch panel mounted on a smartphone, tablet computer, or the like, when a specific item is to be selected from a displayed list (for example, a list in which a plurality of data items or objects are arranged), the user taps the item with a finger to select it.
As an example of such a technique, Japanese Patent Application Laid-Open No. 2015-130905 discloses a game system in which a list (candidate list) including a plurality of items (candidates) is scrollable and a detailed-display operation icon is provided for each item. The user selects an item by tapping its detailed-display operation icon with a finger, whereby the detailed contents of the corresponding player account are displayed.
Disclosure of Invention
Problems to be solved by the invention
In the game system described above, in order to allow the user to tap an item of the list to perform an operation, the width of each item must be made somewhat large in consideration of the size of a finger. Consequently, the number of items that can be displayed on the touch panel at one time is limited.
The present invention has been made in view of the above-described problems, and an object of the present invention is to provide a user interface processing method and a recording medium that can effectively use a screen by increasing the number of items of a list that can be displayed on a touch panel at one time.
Means for solving the problems
In order to achieve the above object, a user interface processing method according to the present invention is performed by an information processing apparatus including a touch panel, and includes: a step of displaying a list including a plurality of items and a cursor for selecting a specific item from the list on the touch panel; a step of detecting a contact operation with respect to the touch panel; and a step of moving the cursor relative to the list based on the contact operation.
In order to achieve the above object, a recording medium according to the present invention is a recording medium readable by an information processing apparatus, and a user interface processing program for executing the user interface processing method is recorded.
Effects of the invention
According to the user interface processing method and the like of the present invention, the screen can be effectively utilized by increasing the number of items of the list that can be displayed on the touch panel at one time.
Drawings
Fig. 1 is a diagram showing an example of an external configuration of an information processing apparatus according to an embodiment.
Fig. 2 is a block diagram showing an example of a functional configuration of the information processing apparatus.
Fig. 3 is a diagram showing an example of a selection screen of a city owner.
Fig. 4 is a diagram showing an example of cursor movement when a movement instruction made to the cursor based on a swipe is within the range of the list display area.
Fig. 5 is a diagram showing another example of cursor movement when a movement instruction made to the cursor based on a swipe is within the range of the list display area.
Fig. 6 is a diagram showing an example of cursor movement and list scrolling when a movement instruction to the cursor based on a swipe is out of the range of the list display area.
Fig. 7 is a diagram showing another example of list scrolling when a movement instruction made to the cursor based on a swipe is out of the range of the list display area.
Fig. 8 is a diagram showing an example of cursor movement when a movement instruction made to the cursor based on a flick is within the range of the list display area.
Fig. 9 is a diagram showing another example of cursor movement when a movement instruction to the cursor based on a flick is within the range of the list display area.
Fig. 10 is a diagram showing an example of cursor movement and list scrolling when a movement instruction to the cursor based on a flick is out of the range of the list display area.
Fig. 11 is a flowchart showing an example of processing steps executed by the CPU of the information processing apparatus.
Fig. 12 is a block diagram showing an example of a hardware configuration of the information processing apparatus.
Fig. 13 is a view showing an example of a city owner selection screen according to a modification of the enlarged display of the selected item.
Fig. 14 is a diagram showing an example of cursor movement when a movement instruction made to the cursor based on a swipe is within the range of the list display area, in a modification in which the cursor moves in the same direction as the swipe direction.
Fig. 15 is a diagram showing another example of cursor movement when a movement instruction made to the cursor based on a swipe is within the range of the list display area, in a modification in which the cursor moves in the same direction as the swipe direction.
Fig. 16 is a diagram showing an example of cursor movement and list scrolling when a movement instruction made to the cursor based on a swipe is out of the range of the list display area, in a modification in which the cursor moves in the same direction as the swipe direction.
Fig. 17 is a diagram showing an example of a display in which a city owner selection screen is moved.
Fig. 18 is a diagram showing an example of cursor movement and list scrolling when, with the selection screen of the city owner moved, a movement instruction made to the cursor based on a swipe is outside the display range of the touch panel.
Detailed Description
An embodiment of the present invention will be described below with reference to the drawings.
<1 > appearance structure of information processing apparatus
First, an example of the external configuration of the information processing apparatus 1 according to the present embodiment will be described with reference to fig. 1. As shown in fig. 1, the information processing apparatus 1 has a touch panel 3 that performs various kinds of display and receives various contact operations from the user. The user performs a desired input by performing a contact operation on the touch panel 3 with a finger or the like.
The detection method of the touch panel 3 is not particularly limited, and various methods such as a resistive-film method, a capacitive method, an optical method, and an electromagnetic-induction method can be employed. The information processing apparatus 1 may further include an operation input unit such as a button or a switch in addition to the touch panel 3.
The information processing apparatus 1 is, for example, a smartphone, a tablet computer, a portable game machine, or the like. However, the present invention is not limited to these; any device that includes a touch panel serving as both a display device and an input device may be used, including, for example, general-purpose computers such as server computers, desktop computers, and notebook computers, portable devices such as mobile phones and phablets, and stationary devices such as fixed telephones and stationary game machines.
In this embodiment, a case will be described in which the information processing apparatus 1 executes a game program as an example of a user interface processing program.
<2 > Structure of functionality of information processing apparatus
Next, an example of the functional configuration of the information processing apparatus 1 will be described with reference to fig. 2 and 3 to 10. The arrows shown in fig. 2 indicate an example of the flow of signals, and the flow direction of signals is not limited.
As shown in fig. 2, the information processing apparatus 1 includes the touch panel 3, the first display processing unit 5, the contact detection processing unit 7, the relative movement processing unit 9, and the second display processing unit 11.
The first display processing unit 5 causes the touch panel 3 to display a list including a plurality of items and a cursor for selecting a specific item from the list. Items include, for example, text-based data, pattern-based objects, and the like.
The second display processing unit 11 displays information (hereinafter referred to as "associated information") associated with the item selected by the cursor on the touch panel 3 together with the list. The association information includes, for example, detailed information related to the selected item, and the like. When the relative movement processing unit 9 moves the list and the cursor relative to each other, the second display processing unit 11 sequentially displays the associated information of the selection items that change in response to the relative movement, which will be described in detail later.
Fig. 3 shows an example of the display of the list, the cursor, and the related information. In the example shown in fig. 3, a selection screen for selecting the city owner of "××city" is displayed on the touch panel 3. On this screen, a list 13 of the plurality of officers held by the user and a cursor 15 for selecting a specific officer from the list 13 are displayed by the first display processing unit 5. In the list 13, the name, rank, affiliation, and the like of each officer are displayed, and the data of each officer constitutes one item of the list.
The cursor 15 is movable within the range of a list display area 17 (an example of the display range of the touch panel), which is the area where the list 13 is displayed; details will be described later. In this example, the cursor 15 takes the form of a rectangular frame, but it may take another form, such as an arrow object.
In addition, a related information display area 19 is provided in the vicinity of (in this example, adjacent to) the list display area 17. In the related information display area 19, detailed information 21 of the officer selected by the cursor 15 is displayed by the second display processing unit 11. In this example, the detailed information 21 includes a face image of the officer, the name, and various ability parameters (leadership, valor, intelligence, politics).
Returning to fig. 2, the contact detection processing section 7 detects contact operations with respect to the touch panel 3. In general, operations on a touch panel include: operations with no positional variation on the touch panel from the start to the end of contact detection (so-called tap operations), and operations with positional variation from the start to the end of contact detection (so-called swipe, flick, and drag operations). The contact detection processing unit 7 detects in particular the latter, that is, contact operations involving positional variation. In the following, among such operations, an operation in which the finger slides across the screen of the touch panel 3 is referred to as a "swipe", and an operation in which the finger quickly brushes the screen is referred to as a "flick".
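As an illustration, the distinction between these contact operations can be sketched roughly as follows. This is not part of the patent; the function name and the displacement and speed thresholds are hypothetical, chosen only to show one plausible way of separating taps, swipes, and flicks.

```python
def classify_touch(start_pos, end_pos, duration_s,
                   move_threshold_px=10.0, flick_speed_px_s=800.0):
    """Classify a touch by its displacement and release speed.

    A touch with almost no positional variation is a tap; one with
    positional variation is a swipe or, if released fast enough, a flick.
    Thresholds are illustrative, not taken from the patent.
    """
    dx = end_pos[0] - start_pos[0]
    dy = end_pos[1] - start_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance < move_threshold_px:
        return "tap"  # no meaningful positional variation
    speed = distance / duration_s if duration_s > 0 else float("inf")
    return "flick" if speed >= flick_speed_px_s else "swipe"
```

A real implementation would typically use the velocity at the moment of release rather than the average speed over the whole gesture.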
The relative movement processing unit 9 moves the cursor 15 and the list 13 relative to each other based on the contact operation detected by the contact detection processing unit 7. The relative movement of the cursor 15 and the list 13 includes a case where the cursor 15 moves in a state where the list 13 is stopped and a case where the list 13 moves in a state where the cursor 15 is stopped on the screen of the touch panel 3.
That is, when the cursor 15 is moved to an item inside the list display area 17 based on the contact operation, the relative movement processing unit 9 moves the cursor 15 while the list 13 remains stopped. When the cursor 15 is moved to an item outside the list display area 17 based on the contact operation, the relative movement processing unit 9 moves the cursor 15 to the end of the list display area 17 while the list 13 remains stopped, and then moves the list 13 while the cursor 15 remains stopped at that end.
The relative movement processing unit 9 moves the cursor 15 in the opposite direction to the direction corresponding to the contact operation in a state where the list 13 is stopped, and moves the list 13 in the direction corresponding to the contact operation in a state where the cursor 15 is stopped. Hereinafter, the operation of moving the list 13 will be appropriately referred to as "scrolling".
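The relative-movement rule described above can be sketched as follows, as a minimal model in which positions are measured in whole item units. The function name and parameters are hypothetical, and the sign convention (negative values mean an upward operation) is an assumption for illustration only.

```python
def apply_swipe(cursor_index, scroll_offset, swipe_items,
                visible_rows, total_items):
    """Move the cursor opposite to the swipe; once it reaches the edge of
    the display area, scroll the list in the swipe direction instead.

    cursor_index  -- cursor row within the display area (0-based)
    scroll_offset -- index of the first visible list item
    swipe_items   -- signed swipe amount in item units (negative = upward)
    Returns the new (cursor_index, scroll_offset) pair.
    """
    # Swiping up moves the cursor DOWN (opposite direction), and vice versa.
    target = cursor_index - swipe_items
    if 0 <= target < visible_rows:
        return target, scroll_offset          # cursor moves, list stays
    # Clamp the cursor to the end of the display area and scroll the rest.
    clamped = min(max(target, 0), visible_rows - 1)
    overshoot = target - clamped
    max_offset = max(total_items - visible_rows, 0)
    new_offset = min(max(scroll_offset + overshoot, 0), max_offset)
    return clamped, new_offset                # cursor pinned, list scrolls
```

Because all quantities are integers in item units, the cursor never rests between items, matching the item-unit movement described below.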
Fig. 4 to 10 show specific examples of the relative movement of the cursor 15 and the list 13. The balloon or the arrow shown in the diagrams of fig. 4 to 10 is an annotation description for explaining an operation or the like, and is not displayed on the touch panel 3. In fig. 4 to 10, the face image in the related information display area 19 is not shown.
First, a case will be described in which a movement instruction made to the cursor 15 based on a swipe is within the range of the list display area 17. For example, as shown in fig. 4, when the user swipes the finger 23 upward by a relatively small amount from the state shown in fig. 3, the cursor 15 moves downward in the list display area 17 in accordance with the swipe movement amount while the list 13 remains stopped. Likewise, as shown in fig. 5, when the user swipes the finger 23 downward by a relatively small amount from the state shown in fig. 4, the cursor 15 moves upward in the list display area 17 in accordance with the swipe movement amount while the list 13 remains stopped. In this way, when the movement instruction made to the cursor 15 based on a swipe is within the range of the list display area 17, the cursor 15 moves successively in the direction opposite to the swipe direction while the list 13 remains stopped.
At this time, detailed information of the selected object (in this example, the officer) that changes with the movement of the cursor 15 is displayed successively in the related information display area 19. For example, in the example shown in fig. 4, while the cursor 15 moves from officer A to officer E, detailed information of officers A, B, C, D, and E is displayed in turn in the related information display area 19. Likewise, in the example shown in fig. 5, while the cursor 15 moves from officer E to officer B, detailed information of officers E, D, C, and B is displayed in turn in the related information display area 19.
The cursor 15 moves in units of items in the list 13 and therefore never rests between items. The movement speed and movement amount of the cursor 15 vary according to the movement speed and movement amount of the swipe. The position at which the swipe is performed is not particularly limited as long as it is on the touch panel 3; it may be, for example, on the list display area 17, on the related information display area 19, or elsewhere.
Next, a case will be described in which a movement instruction made to the cursor 15 based on a swipe is out of the range of the list display area 17. For example, as shown in fig. 6, when the user swipes the finger 23 upward by a relatively large amount from the state shown in fig. 3, the cursor 15 first moves downward in the list display area 17 while the list 13 remains stopped, as shown in the upper part of fig. 6. When the cursor 15 reaches the lower end of the list display area 17 partway through the swipe, the list 13 is scrolled upward while the cursor 15 remains stopped at the lower end of the list display area 17, as shown in the lower part of fig. 6. That is, the operation switches from a state in which only the cursor 15 moves to a state in which only the list 13 scrolls.
Further, for example, as shown in fig. 7, when the user swipes the finger 23 downward from the state shown in fig. 3, the cursor 15 is already positioned at the upper end of the list display area 17, so the list 13 is scrolled downward while the cursor 15 remains stopped at the upper end.
Further, in the related information display area 19, detailed information of the selected object (in this example, the officer) that changes with the movement of the cursor 15 or the scrolling of the list 13 is displayed successively.
In this way, when the movement instruction made to the cursor 15 based on the swipe is out of the range of the list display area 17, the cursor 15 moves in the direction opposite to the swipe direction in a state where the list 13 is stopped before the cursor 15 reaches the end of the list display area 17, and after the cursor 15 reaches the end of the list display area 17, the list 13 scrolls in the swipe direction in a state where the cursor 15 is stopped at the end.
Since the list 13 moves in item units relative to the cursor 15, the cursor 15 never rests between items at the end of scrolling. The scroll speed and scroll amount of the list 13 vary according to the movement speed and movement amount of the swipe.
Next, a case will be described in which a movement instruction made to the cursor 15 based on a flick is within the range of the list display area 17. For example, as shown in fig. 8, when the user flicks upward relatively lightly with the finger 23 from the state shown in fig. 3, the cursor 15 moves downward by inertia in the list display area 17 in accordance with the flick speed while the list 13 remains stopped. Likewise, as shown in fig. 9, when the user flicks downward relatively lightly with the finger 23 from the state shown in fig. 8, the cursor 15 moves upward by inertia in the list display area 17 in accordance with the flick speed while the list 13 remains stopped. The movement speed and movement amount of the cursor 15 moved by inertia vary according to the flick speed and the like.
Next, a case will be described in which a movement instruction made to the cursor 15 based on a flick is out of the range of the list display area 17. For example, as shown in fig. 10, when the user flicks upward relatively strongly with the finger 23 from the state shown in fig. 3, the cursor 15 first moves downward by inertia in the list display area 17 while the list 13 remains stopped, as shown in the upper part of fig. 10. When the cursor 15 reaches the lower end of the list display area 17, the list 13 is scrolled upward by inertia while the cursor 15 remains stopped at the lower end, as shown in the lower part of fig. 10. That is, the operation switches from a state in which only the cursor 15 moves by inertia to a state in which only the list 13 scrolls by inertia. The scroll speed and scroll amount of the list 13 at this time vary according to the flick speed and the like.
Here, "moves by inertia" means that the cursor 15 or the list 13 continues to move (scroll) for a short period even after the finger 23 has left the touch panel 3 at the end of the flick motion.
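The inertial behavior can be modeled, for example, with a simple per-frame friction factor: movement continues after release while the remaining speed decays. The patent does not specify a decay law, so the model, names, and constants below are purely illustrative.

```python
def inertial_scroll(initial_speed, friction=0.95, min_speed=0.5):
    """Simulate inertial movement after a flick.

    Each frame moves by the current speed (in item units per frame), then
    reduces the speed by a friction factor; movement stops once the speed
    falls below a cutoff. Returns the per-frame movement amounts.
    (Illustrative exponential-decay model; not taken from the patent.)
    """
    speed = initial_speed
    frames = []
    while abs(speed) >= min_speed:
        frames.append(speed)   # movement emitted for this frame
        speed *= friction      # inertia decays over time
    return frames
```

A stronger flick (larger initial speed) yields more frames of movement, matching the description that the scroll amount varies according to the flick speed.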
The movement speed (movement amount) of the cursor 15 for a given swipe or flick operation speed (operation amount) may be set equal to or different from the scroll speed (scroll amount) of the list 13. If they are set equal, the speed variation when the operation switches from cursor movement to list scrolling is suppressed, which prevents the user from feeling a sense of incongruity. If they are set differently, for example with the scroll speed of the list 13 larger than the movement speed of the cursor 15, the user can make a coarse adjustment by scrolling the list 13 at high speed and a fine adjustment by moving the cursor 15 at low speed, which improves operability. This is particularly effective when the list contains a large number of items.
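One way to realize the different-speed setting described above is a gain factor applied only while the list is scrolling. The sketch below is a hypothetical illustration of that idea; the function name and the gain value are assumptions.

```python
def scaled_steps(swipe_items, scrolling, scroll_gain=3):
    """Scale an operation amount (in item units) by a gain factor.

    While the list is scrolling, the same swipe moves more items
    (coarse adjustment); while the cursor is moving, no gain is applied
    (fine adjustment). scroll_gain=1 reproduces the equal-speed setting.
    """
    return swipe_items * (scroll_gain if scrolling else 1)
```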
The allocation of processing to the processing units described above is not limited to this example; for instance, the processing may be performed by a smaller number of processing units (for example, one processing unit) or by more finely divided processing units. The functions of the processing units are implemented by a game program executed by the CPU 101 (see fig. 12 described later), but a part of them may instead be implemented by an application-specific integrated circuit such as an ASIC or FPGA, or by other physical circuitry.
<3 > processing steps performed by the information processing apparatus
Next, an example of processing steps (user interface processing method) executed by the CPU101 of the information processing apparatus 1 will be described with reference to fig. 11. The present flow is started when a predetermined operation input for list display is made by the user.
In step S10, the information processing apparatus 1 displays a list 13 including a plurality of items and a cursor 15 for selecting a specific item from the list 13 in the list display area 17 by the first display processing section 5.
In step S20, the information processing apparatus 1 displays the related information of the item selected by the cursor 15 in the related information display area 19 by the second display processing section 11.
In step S30, the information processing apparatus 1 determines, by the contact detection processing section 7, whether or not a contact operation (swipe, flick, or the like) on the touch panel 3 has been detected. When a contact operation is detected (step S30: yes), the process proceeds to step S40. On the other hand, when no contact operation is detected (step S30: no), the process proceeds to step S70 described later.
In step S40, the information processing apparatus 1 moves the cursor 15 in a direction opposite to the direction corresponding to the contact operation in a state where the list 13 is stopped based on the contact operation detected in the above step S30 by the relative movement processing unit 9.
In step S50, the information processing apparatus 1 determines whether or not the movement instruction to the cursor 15 by the contact operation (swipe, flick, etc.) exceeds the range of the list display area 17 based on the contact operation detected in step S30 described above by the relative movement processing unit 9. When the movement instruction to the cursor 15 exceeds the range of the list display area 17 (yes in step S50), the process proceeds to the next step S60. On the other hand, when the movement instruction to the cursor 15 is within the range of the list display area 17 (step S50: no), the process proceeds to step S70 described later.
In step S60, the information processing apparatus 1 moves the cursor 15 to the end portion of the list display area 17 (in the direction opposite to the direction corresponding to the contact operation) by the relative movement processing unit 9, and scrolls the list 13 in the direction corresponding to the contact operation in a state where the cursor 15 is stopped at the end portion of the list display area 17.
In step S70, the information processing apparatus 1 determines whether or not there is a predetermined operation input for ending the list display. The operation input for ending the list display is, for example, an operation that closes the selection screen described with reference to fig. 3. When there is no operation input for ending the list display (no in step S70), the process returns to step S20, and the subsequent steps are repeated. On the other hand, when an operation input for ending the list display occurs (yes in step S70), the present flow ends.
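The steps S10 to S70 above can be sketched as a simple loop. The `ui` object and its method names below are hypothetical placeholders for the display and input primitives described in this flow, not an API defined by the patent.

```python
def run_selection_screen(ui):
    """Main loop following steps S10-S70 of fig. 11.

    ui is assumed to provide the display and input primitives; every
    method name here is a hypothetical placeholder for illustration.
    """
    ui.display_list_and_cursor()                        # S10
    while True:
        ui.display_related_info()                       # S20
        touch = ui.poll_touch()                         # S30
        if touch is not None:
            ui.move_cursor_opposite(touch)              # S40: cursor moves
            if ui.instruction_exceeds_display_area(touch):  # S50
                ui.scroll_list(touch)                   # S60: list scrolls
        if ui.end_requested():                          # S70
            return
```

Note that both "no" branches (no contact in S30, instruction within range in S50) fall through to the end check in S70, exactly as in the flowchart.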
The above-described processing steps are examples, and at least a part of the above-described steps may be deleted or changed, or steps other than the above-described steps may be added. The order of at least a part of the steps may be changed, or a plurality of steps may be combined into a single step.
<4 > hardware configuration of information processing apparatus
Next, an example of a hardware configuration of the information processing apparatus 1 that realizes the processing units implemented by the program executed by the CPU 101 and the like described above will be described with reference to fig. 12.
As shown in fig. 12, the information processing apparatus 1 includes, for example, a CPU 101, a ROM 103, a RAM 105, a GPU 106, an application-specific integrated circuit 107 such as an ASIC or FPGA, an input device 113, an output device 115, a recording device 117, a drive 119, a connection port 121, and a communication device 123. These components are connected to one another via a bus 109, an input/output interface 111, and the like so as to be able to transmit signals.
The game program may be recorded in advance in, for example, the ROM103 or the RAM105, the recording device 117, or the like.
The game program may be recorded temporarily or permanently (non-transitorily) in advance on a removable recording medium 125 such as a magnetic disk (e.g., a floppy disk), an optical disk (e.g., various CDs, MO disks, and DVDs), or a semiconductor memory. Such a recording medium 125 may also be provided as so-called package software. In this case, the game program recorded on the recording medium 125 may be read by the drive 119 and recorded on the recording device 117 via the input/output interface 111, the bus 109, and the like.
The game program may be recorded in advance on, for example, a download site, another computer, another recording device, or the like (not shown). In this case, the game program is transmitted via a network NW such as a LAN or the internet, and the communication device 123 receives the program. The program received by the communication device 123 may be recorded in the recording device 117 via the input/output interface 111, the bus 109, or the like.
The game program may be recorded in advance in, for example, an appropriate external connection device 127. In this case, the game program may be transferred via an appropriate connection port 121 and recorded in the recording device 117 via the input/output interface 111, the bus 109, and the like.
The CPU 101 executes various processes in accordance with the program recorded in the recording device 117, thereby realizing the processes executed by the first display processing unit 5, the relative movement processing unit 9, and the like. In this case, the CPU 101 may read the program directly from the recording device 117, or may execute it after loading it into the RAM 105. Further, for example, when the CPU 101 receives the program via the communication device 123, the drive 119, or the connection port 121, it may execute the received program directly without recording it in the recording device 117.
The CPU 101 may perform various processes based on signals or information input from the input device 113 as necessary. The input device 113 includes the touch panel 3 and may further include, for example, a mouse, a keyboard, and a microphone (not shown).
The GPU106 performs processing for image display such as rendering processing in accordance with instructions from the CPU 101.
The CPU 101 and the GPU 106 output the results of the above-described processing from the output device 115, which includes the touch panel 3 and an audio output unit. The CPU 101 and the GPU 106 may also transmit the processing results via the communication device 123 or the connection port 121, or record them in the recording device 117 or on the recording medium 125, as necessary.
<5> Effects of the embodiment
The game program of the present embodiment causes the information processing apparatus 1 including the touch panel 3 to function as the first display processing unit 5, the contact detection processing unit 7, and the relative movement processing unit 9. The first display processing unit 5 displays, on the touch panel 3, a list 13 including a plurality of items and a cursor 15 for selecting a specific item from the list 13; the contact detection processing unit 7 detects a contact operation on the touch panel 3; and the relative movement processing unit 9 moves the cursor 15 and the list 13 relative to each other based on the contact operation. This provides the following effects.
That is, in the present embodiment, the user moves the cursor 15 and the list 13 relative to each other by a contact operation on the touch panel 3, and selects a desired item by moving the cursor 15 onto that item. Because the user does not need to directly touch (tap) an item in the list 13 to select it, the size of each item (for example, its width in the vertical direction) can be made relatively small without regard to the size of the finger. Therefore, the number of items of the list 13 that can be displayed on the touch panel at one time can be increased, and the screen can be used effectively.
On terminals without a touch panel (hereinafter referred to as "non-touch panel terminals" as appropriate), such as computers and stationary game machines, a user typically operates a cursor with an input device such as a mouse or a game controller and moves the cursor onto a selection object to select it. In contrast, on a terminal having a touch panel (hereinafter referred to as a "touch panel terminal" as appropriate), such as a smartphone, a tablet computer, or a portable game machine, the user taps a selection object to select it. The operation schemes of the two types of terminal are thus distinct.
In the present embodiment, the same operation scheme as that of a non-touch panel terminal is applied to a touch panel terminal, and the cursor 15 is operated by a contact operation on the touch panel 3. Since the operation scheme can thus be shared between non-touch panel terminals and touch panel terminals, development steps and development costs can be significantly reduced, for example, when a game is developed for both types of terminal simultaneously, or when a game for a non-touch panel terminal is ported to a touch panel terminal. In addition, even in the case of a game intended for both touch panel terminals and terminals other than touch panel terminals, the game system can be simplified by sharing the operation scheme, so development steps and development costs can be suppressed.
In the present embodiment, in particular, when the relative movement processing unit 9 moves the cursor 15 to an item within the display range of the touch panel 3 in the list 13 (in the present embodiment, within the range of the list display area 17), the cursor 15 is moved in a state where the list 13 is stopped.
In this way, when the cursor 15 is moved to an item within the range of the list display area 17 in the list 13, movement (scrolling) of the list 13 is not required. Therefore, the item can be selected simply by moving the cursor 15 while the list 13 remains stopped, which makes the selection operation easy and suppresses erroneous operation.
In the present embodiment, in particular, when the relative movement processing unit 9 moves the cursor 15 to an item out of the range of the list display area 17 in the list 13, the cursor 15 is moved to the end of the list display area 17 in a state where the list 13 is stopped, and the list 13 is scrolled in a state where the cursor 15 is stopped at the end of the list display area 17.
In this way, when the cursor 15 is moved to an item outside the range of the list display area 17 in the list 13, movement (scrolling) of the list 13 is required. Therefore, the item can be selected by scrolling the list 13 while the cursor 15 remains stopped, which makes the selection operation easy and suppresses erroneous operation. By first moving the cursor 15 to the end of the list display area 17 while the list 13 is stopped, and then scrolling the list 13 while the cursor 15 is stopped at that end, the operation can be switched smoothly from moving the cursor 15 to scrolling the list 13. Further, since the list 13 can be scrolled at high speed (swiping quickly scrolls the list 13 at high speed, and the list 13 continues to scroll by a predetermined amount due to inertia even after the finger leaves the touch panel 3), a quick selection can be made even in a list 13 having a large number of items, and operability can be improved.
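The two-phase movement rule described above (the cursor moves while the list is stopped; the list scrolls once the cursor rests at the end of the display area) can be sketched as simple selection logic. This is a minimal illustration only; the class, method, and attribute names below are assumptions for the sketch and do not appear in the embodiment:

```python
class CursorListController:
    """Minimal sketch of the relative-movement rule (names are hypothetical)."""

    def __init__(self, num_items, visible_rows):
        self.num_items = num_items        # total number of items in the list
        self.visible_rows = visible_rows  # rows shown in the list display area
        self.cursor_row = 0               # cursor position within the display area
        self.scroll_offset = 0            # index of the first visible item

    def move_selection(self, delta):
        """Move the selection by `delta` items (positive = downward)."""
        step = 1 if delta > 0 else -1
        for _ in range(abs(delta)):
            target = self.cursor_row + step
            if 0 <= target < self.visible_rows:
                # Target item is within the display area:
                # the cursor moves while the list stays stopped.
                self.cursor_row = target
            else:
                # Target item is outside the display area: the cursor stays
                # at the end of the display area and the list scrolls instead.
                new_offset = self.scroll_offset + step
                if 0 <= new_offset <= self.num_items - self.visible_rows:
                    self.scroll_offset = new_offset

    @property
    def selected_index(self):
        """Index of the item currently selected by the cursor."""
        return self.scroll_offset + self.cursor_row
```

For example, in a ten-item list with four visible rows, moving the selection five items downward first moves the cursor to the last visible row and then scrolls the list by two rows, reproducing the two-phase behavior described above.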
In the present embodiment, in particular, the relative movement processing unit 9 moves the cursor 15 in a direction opposite to the direction corresponding to the contact operation in a state where the list 13 is stopped, and moves the list 13 in a direction corresponding to the contact operation in a state where the cursor 15 is stopped. This can provide the following effects.
That is, in the present embodiment, for example, when the contact operation is performed downward, the cursor 15 is moved upward in a state where the list 13 is stopped, and when the cursor 15 reaches the upper end of the list display area 17, the list 13 is scrolled downward in a state where the cursor 15 is stopped at the upper end. Similarly, for example, when the contact operation is performed upward, the cursor 15 is first moved downward in a state where the list 13 is stopped, and when the cursor 15 reaches the lower end of the list display area 17, the list 13 is scrolled upward in a state where the cursor 15 is stopped at the lower end.
In this way, the direction of relative movement between the cursor 15 and the list 13 remains the same before and after the operation switches from moving the cursor 15 to scrolling the list 13. This allows the switch to occur smoothly, suppresses any sense of discomfort for the user, and maintains comfortable operability.
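The direction mapping of this embodiment (cursor opposite to the swipe while the list is stopped; list in the swipe direction once the cursor rests at the edge) can be written as a small lookup. The function and parameter names are illustrative assumptions:

```python
def swipe_to_motion(swipe_dir, cursor_at_edge):
    """Map a vertical swipe to the element that moves and its direction.

    swipe_dir: 'up' or 'down' -- direction of the contact operation.
    cursor_at_edge: True once the cursor rests at the relevant end of
    the list display area.
    Returns a (what_moves, direction) pair.
    """
    opposite = {'down': 'up', 'up': 'down'}
    if not cursor_at_edge:
        # List stopped: the cursor moves opposite to the swipe.
        return ('cursor', opposite[swipe_dir])
    # Cursor stopped at the edge: the list scrolls in the swipe direction.
    return ('list', swipe_dir)
```

In both phases the selection advances the same way relative to the list content, which is why the switch between the two phases feels seamless to the user.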
The game program according to the present embodiment also causes the information processing apparatus 1 to function as the second display processing unit 11, and the second display processing unit 11 displays the association information associated with the item selected by the cursor 15 on the touch panel 3 together with the list 13. This can provide the following effects.
That is, in a typical touch panel terminal, when the user wants to refer to related information (such as detailed information) on a specific item in a list, the user taps the item to switch to a separate screen that displays the related information, and then performs a predetermined return operation to return to the list display. In this case, when the related information of a plurality of items is to be referred to, the operations of switching to the separate screen and returning to the list display must be repeated many times, which makes the operation cumbersome and costs the user labor and time.
In the present embodiment, since the detailed information 21 serving as the related information is displayed in the related information display area 19 together with the list 13, the user merely moves the cursor 15 or scrolls the list 13 by a contact operation, and the related information of the item selected by the cursor 15 is automatically switched and displayed in turn in the related information display area 19. This makes the operation easy and can greatly reduce the user's labor and time. This is particularly effective when the user wants to refer to the related information of a plurality of items in succession.
<6> Modifications
The present invention is not limited to the above-described embodiments, and various modifications can be made without departing from the spirit and technical ideas. Hereinafter, such a modification will be described.
(6-1. Case of magnifying and displaying the selected item)
For example, as shown in fig. 13, the first display processing unit 5 can enlarge the item selected by the cursor 15 so that it is displayed larger than the other, unselected items. In the example shown in fig. 13, the item selected by the cursor 15 is displayed with a larger width in the vertical direction than the unselected items, and with a larger font size for the text representing the content of the item. In addition to or instead of the size of the item, the font type of the text, or the shape, color, pattern, and the like of the item may be displayed differently.
In the present modification, the selected item (in this example, a unit in the list) changes as the cursor 15 or the list 13 moves (scrolls), and the newly selected item is enlarged in turn.
This increases the number of items in the list 13 that can be displayed on the touch panel 3 at one time, and improves the visibility of the selected item. Therefore, erroneous operation can be suppressed and operability can be further improved.
(6-2. Case of moving the cursor in the same direction as the contact operation direction)
In the above embodiment, the case where the cursor 15 is moved in the direction opposite to the contact operation direction has been described, but the present invention is not limited to this, and the cursor 15 may be moved in the same direction as the contact operation direction.
For example, as shown in fig. 14, when the user swipes the finger 23 slightly downward from the state shown in fig. 3, the cursor 15 moves downward within the list display area 17 by an amount corresponding to the swipe while the list 13 remains stopped. Likewise, as shown in fig. 15, when the user swipes the finger 23 slightly upward from the state shown in fig. 14, the cursor 15 moves upward within the list display area 17 by an amount corresponding to the swipe while the list 13 remains stopped. In this modification, as long as the movement instructed for the cursor 15 by the swipe stays within the range of the list display area 17, the cursor 15 moves continuously in the direction corresponding to the swipe while the list 13 remains stopped.
In addition, for example, as shown in fig. 16, when the user swipes the finger 23 downward by a relatively large amount from the state shown in fig. 3, the cursor 15 first moves downward within the list display area 17 while the list 13 remains stopped, as shown in the upper part of fig. 16. If the cursor 15 reaches the lower end of the list display area 17 partway through the swipe, the list 13 is scrolled downward while the cursor 15 remains stopped at the lower end of the list display area 17, as shown in the lower part of fig. 16.
Although a detailed description is omitted, the same applies to a flick operation. In this way, in the present modification, the cursor 15 and the list 13 move in directions corresponding to the direction of the contact operation.
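A single step of this same-direction variant might look like the following sketch, where the function and parameter names are assumptions made for illustration only:

```python
def same_direction_step(cursor_row, scroll_offset, swipe_dir,
                        visible_rows, num_items):
    """Advance the selection one item in the direction of the swipe.

    cursor_row: cursor position within the list display area (0-based)
    scroll_offset: index of the first item shown in the display area
    swipe_dir: 'up' or 'down' -- direction of the contact operation
    Returns the updated (cursor_row, scroll_offset).
    """
    step = 1 if swipe_dir == 'down' else -1
    target = cursor_row + step
    if 0 <= target < visible_rows:
        # Within the display area: the cursor moves with the swipe
        # while the list stays stopped.
        return target, scroll_offset
    # Cursor is already at the end of the display area: keep it there
    # and scroll the list so the next item comes into view.
    new_offset = max(0, min(scroll_offset + step,
                            num_items - visible_rows))
    return cursor_row, new_offset
```

Apart from the sign of the cursor movement relative to the swipe, the structure is identical to the main embodiment, which is why the same effects can be obtained.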
In this modification, the same effects as those of the above embodiment can be obtained.
(6-3. Case where a part of the list display area is out of the display range)
In the above embodiment, the case where the entire list display area 17 is within the display range of the touch panel 3 has been described, but the present invention is not limited thereto. For example, when the list screen is movable rather than fixed, it is conceivable that the user views the list with the list screen moved toward the end of the display range of the touch panel 3, for example in order to view the list while referring to a map or the like in the background. In such a case, if the entire list display area 17 no longer fits within the display range of the touch panel 3, a part of the list display area 17 falls outside the display range and is not displayed. In this case, the cursor 15 may be made movable only within the portion of the list display area 17 that lies within the display range of the touch panel 3, with processing otherwise the same as in the above embodiment.
Fig. 17 shows a display example of the present modification. In the example shown in fig. 17, compared with the state shown in fig. 3 described above, the city main selection screen 25 has moved toward the lower end of the display range of the touch panel 3; as a result, a part of the list 13 in the list display area 17 (units E and F) is outside the display range of the touch panel 3 and is not displayed. Although not shown, an information screen that the user wants to refer to, such as a map, is displayed in the background area 27 on the upper end side of the display range. In the example shown in fig. 17, only a part of the detailed information 21 in the related information display area 19 is displayed, but the detailed information 21 may instead be displayed in its entirety.
In this state, the case where, for example, a swipe is performed will be described. When the movement instructed for the cursor 15 by the swipe is within the display range of the touch panel 3, the operation is the same as that shown in figs. 4 and 5. On the other hand, when the movement instructed for the cursor 15 by the swipe goes beyond the display range of the touch panel 3, the operation is as follows. For example, as shown in fig. 18, when the user swipes the finger 23 upward by a relatively large amount from the state shown in fig. 17, the cursor 15 first moves downward within the list display area 17 while the list 13 remains stopped, as shown in the upper part of fig. 18. If the cursor 15 reaches the lower end of the display range of the touch panel 3 (not the lower end of the list display area 17) partway through the swipe, the list 13 is scrolled upward while the cursor 15 remains stopped at the lower end of the display range of the touch panel 3, as shown in the lower part of fig. 18. Although not described in detail, the same applies to a flick operation.
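In this modification, the cursor's travel is bounded not by the list display area 17 alone but by the rows actually visible within the touch panel's display range. One way to sketch that bound, assuming a uniform row height and using illustrative names not taken from the embodiment:

```python
def visible_rows(list_top, row_height, total_rows, display_bottom):
    """Number of full list rows visible within the touch panel's display range.

    list_top: y-coordinate of the top of the list display area
    row_height: height of one item row (assumed uniform)
    total_rows: number of rows in the list display area
    display_bottom: y-coordinate of the bottom of the display range
    The cursor may only move within these rows; once it reaches the
    last visible row, the list scrolls instead.
    """
    full = (display_bottom - list_top) // row_height
    return max(0, min(total_rows, full))
```

For example, with an eight-row list area starting 100 pixels from the top of a 300-pixel display range and 50-pixel rows, only four rows remain within reach of the cursor.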
As described above, according to this modification, the user can view the list while referring to information such as a map, and thus, the interest of the game and the convenience of the user can be improved.
(6-4. Others)
Although the case where only one of the cursor 15 and the list 13 moves at a time has been described above, the present invention is not limited to this, and both the cursor 15 and the list 13 may be moved as necessary. For example, as shown in figs. 4, 5, 8, 9, and the like, when the movement instructed for the cursor 15 by a contact operation is within the range of the list display area 17, both the cursor 15 and the list 13 may be moved in accordance with the contact operation. In this case, the moving direction of the cursor 15 and the scrolling direction of the list 13 may be set to opposite directions.
In the above, the case where the user interface processing program of the present invention is a game program has been described as an example, but the present invention is not limited to game programs. For example, when the information processing apparatus 1 is a car navigation apparatus, an automatic ticket vending machine at a railway station, a vending machine, an ATM of a financial institution, a coffee machine in a restaurant or the like, or office automation (OA) equipment such as a fax machine, the program may be a user interface processing program applied to such equipment.
In addition to the above, the methods of the above embodiments and modifications may be appropriately combined. Although not illustrated, the above embodiments and modifications may be implemented with various modifications within a range not departing from the gist thereof.

Claims (5)

1. A user interface processing method performed by an information processing apparatus provided with a touch panel, the user interface processing method comprising:
a step of displaying a list including a plurality of items and a cursor for selecting a specific item from the list on the touch panel;
a step of detecting a contact operation with respect to the touch panel; and
a step of relatively moving the cursor and the list based on the contact operation,
in the step of moving the cursor relative to the list,
when the cursor is moved to the item within the display range of the touch panel in the list, the cursor is moved at a first speed corresponding to the operation speed of the contact operation in a state where the list is stopped,
when the cursor is moved to the item outside the display range of the touch panel in the list, the cursor is moved to the end of the display range in a state where the list is stopped, and the list is moved at a second speed corresponding to the operation speed of the contact operation in a state where the cursor is stopped at the end of the display range,
the second speed relative to one operation speed is set to be greater than the first speed relative to the one operation speed.
2. The user interface processing method of claim 1, wherein,
in the step of moving the cursor relative to the list,
the cursor is moved in a direction opposite to a direction corresponding to the contact operation in a state where the list is stopped, and the list is moved in a direction corresponding to the contact operation in a state where the cursor is stopped.
3. The user interface processing method of claim 1, wherein,
the user interface processing method further includes the step of displaying information associated with the item selected by the cursor on the touch panel together with the list.
4. The user interface processing method according to any one of claims 1 to 3, wherein,
in the step of displaying the list and the cursor on the touch panel, the item selected by the cursor is displayed in an enlarged manner so as to be larger than the other items not selected.
5. A recording medium readable by an information processing apparatus, in which a user interface processing program for executing the user interface processing method according to any one of claims 1 to 4 is recorded.
CN201810563268.4A 2017-06-06 2018-06-04 User interface processing method and recording medium Active CN109078325B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017111743A JP6941976B2 (en) 2017-06-06 2017-06-06 User interface processing program, recording medium, user interface processing method
JP2017-111743 2017-06-06

Publications (2)

Publication Number Publication Date
CN109078325A CN109078325A (en) 2018-12-25
CN109078325B true CN109078325B (en) 2024-03-15

Family

ID=64839349

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810563268.4A Active CN109078325B (en) 2017-06-06 2018-06-04 User interface processing method and recording medium

Country Status (2)

Country Link
JP (2) JP6941976B2 (en)
CN (1) CN109078325B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001117918A (en) * 1999-10-20 2001-04-27 Sharp Corp Document editing processor
JP2003330613A (en) * 2002-05-13 2003-11-21 Mobile Computing Technologies:Kk Portable information terminal equipment, display control information and display control method
JP2007240925A (en) * 2006-03-09 2007-09-20 Matsushita Electric Ind Co Ltd Display device
CN101809531A (en) * 2007-10-02 2010-08-18 株式会社爱可信 Terminal device, link selection method, and display program
CN101896867A (en) * 2007-11-07 2010-11-24 豪威科技有限公司 Apparatus and method for tracking a light pointer
JP2011118805A (en) * 2009-12-07 2011-06-16 Alpine Electronics Inc Scroll display device
JP2012120782A (en) * 2010-12-10 2012-06-28 Konami Digital Entertainment Co Ltd Game device and game control program
JP2012174249A (en) * 2011-02-24 2012-09-10 Kyocera Corp Electronic apparatus, display control method and display control program
CN103176716A (en) * 2011-11-22 2013-06-26 索尼电脑娱乐公司 Information processing apparatus and information processing method to achieve efficient screen scrolling
CN104093463A (en) * 2012-04-12 2014-10-08 舒佩塞尔公司 System and method for controlling technical processes

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100058240A1 (en) 2008-08-26 2010-03-04 Apple Inc. Dynamic Control of List Navigation Based on List Item Properties
JP5942978B2 (en) 2013-12-26 2016-06-29 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5869722B1 (en) * 2015-11-12 2016-02-24 京セラ株式会社 Electronic device, display control method, and display control program

Also Published As

Publication number Publication date
JP2021184269A (en) 2021-12-02
JP2018206130A (en) 2018-12-27
JP7196246B2 (en) 2022-12-26
CN109078325A (en) 2018-12-25
JP6941976B2 (en) 2021-09-29

Similar Documents

Publication Publication Date Title
EP3889747B1 (en) Systems, methods, and user interfaces for interacting with multiple application windows
KR102642883B1 (en) Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touch-sensitive display
US9898180B2 (en) Flexible touch-based scrolling
US9223411B2 (en) User interface with parallax animation
KR102027612B1 (en) Thumbnail-image selection of applications
CN108694012B (en) Method and system for displaying objects on screen
WO2012145366A1 (en) Improving usability of cross-device user interfaces
KR102228335B1 (en) Method of selection of a portion of a graphical user interface
CA2674663A1 (en) A method and handheld electronic device having dual mode touchscreen-based navigation
US20220276756A1 (en) Display device, display method, and program
CN109078325B (en) User interface processing method and recording medium
CN111309241B (en) Display device and computer-readable storage medium storing display control program
JP7163685B2 (en) Information processing device and information processing program
JP6872985B2 (en) Game program, recording medium, game processing method
JP6344355B2 (en) Electronic terminal, and control method and program thereof
KR20160027063A (en) Method of selection of a portion of a graphical user interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant