WO2009131089A1 - Portable information terminal, computer-readable program, and recording medium - Google Patents
- Publication number
- WO2009131089A1 (PCT/JP2009/057838)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- touch panel
- operation screen
- information terminal
- portable information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to a portable information terminal, and more particularly to a portable information terminal operated via a touch panel provided on a display unit, a computer-readable program, and a recording medium.
- various technologies are known in which a touch panel is provided on a display unit, an image corresponding to an operation unit is displayed on the display unit, and the user inputs operation information by operating the touch panel.
- Patent Document 1 Japanese Patent Laid-Open No. 2007-52795
- Patent Document 1 discloses a technique for displaying, with the touch position on the touch panel as a reference, an image including operation buttons such as a shutter button, a zoom-up button, and a zoom-down button.
- the area required for the operation screen is expected to increase.
- when the area of the operation screen becomes large, it is assumed that even if the operation screen is displayed at a position considered easy for the user to operate, as described in Patent Document 1, the user may not find it easy to operate. Specifically, even if the operation screen is displayed at such a position, buttons displayed in the corners of the operation screen may end up at positions away from the finger of the hand holding the terminal. In such a case, when the user wants to operate such a button, the user needs to change the grip on the terminal or operate the button with the other hand.
- the present invention has been conceived in view of this situation, and an object of the present invention is to reliably improve user convenience in a portable information terminal that displays an operation screen on a touch panel provided on a display unit.
- a portable information terminal includes a display unit, a touch panel provided on the display unit, an application execution unit that executes an application, and a control unit that executes processing related to the application in response to an operation on the touch panel.
- the control unit displays an operation screen on which information for use in processing related to the application is input on the display unit, and changes the display position of the operation screen on the display unit based on the first operation on the touch panel.
- a portable information terminal includes a display unit, a touch panel provided on the display unit, an application execution unit that executes an application, and a control unit that executes processing related to the application in response to an operation on the touch panel.
- the control unit displays on the display unit an operation screen for inputting information used in processing related to the application, changes the display position of the operation screen on the display unit based on the first operation on the touch panel, and can return the display position to the position before the change after it has been changed based on the first operation. When there is an operation on the touch panel, the control unit determines whether the content of the operation satisfies the condition of the first operation, and if it determines that the condition is satisfied, changes the display position of the operation screen on the display unit.
- a portable information terminal includes a display unit, a touch panel provided on the display unit, an application execution unit that executes an application, and a control unit that executes processing related to the application in response to an operation on the touch panel.
- when the operation screen of the application is larger than the size of the display unit, the control unit displays on the display unit a partial screen that is a part of the operation screen, and information used for processing related to the application is input via the partial screen.
- the control unit changes which part of the operation screen is displayed on the display unit as the partial screen, and displays the changed partial screen.
- the end of the operation screen is displayed at the end of the display unit.
- a computer-readable program is a program for controlling a portable information terminal including a display unit, a touch panel provided on the display unit, and an application execution unit that executes an application.
- the program causes the portable information terminal to execute a step of displaying on the display unit an operation screen for inputting information used in processing related to the application, a step of determining whether or not there is an operation on the touch panel, and a step of changing the display position of the operation screen on the display unit based on the operation on the touch panel.
- a recording medium is a medium that records a computer-readable program for controlling a portable information terminal including a display unit, a touch panel provided on the display unit, and an application execution unit that executes an application.
- the program causes the portable information terminal to execute a step of displaying on the display unit an operation screen for inputting information used in processing related to the application, a step of determining whether or not there is an operation on the touch panel, and a step of changing the display position of the operation screen on the display unit based on the operation on the touch panel.
- the display position of the operation screen displayed on the display unit can be moved based on the operation on the touch panel.
- even if a button to be operated on the operation screen displayed on the display unit is displayed at a position away from the finger of the user's hand holding the portable information terminal, the user can bring the display position of the button closer to the finger. Therefore, the user can operate a desired position on the operation screen, such as a desired button, without changing the grip on the portable information terminal.
- diagrams showing examples of the operation screen displayed on the screen of the mobile phone of FIG. 1A; diagrams schematically showing examples of states in which the display mode of the operation screen on the display unit of the mobile phone of FIG. 1A is changed; diagrams for explaining the processing content for changing the display mode of the operation screen in the mobile phone of FIG. 1A; and a diagram for explaining the processing content for displaying the operation screen in the mobile phone of FIG. 1A.
- FIG. 28 is a diagram for explaining a change in the display mode of the operation screen shown in FIGS. 27A to 27C.
- the portable information terminal of the present invention is not limited to a mobile phone. Any terminal having a touch panel may serve as the portable information terminal of the present invention; it is not required to have a function specific to a mobile phone, such as a call function.
- FIGS. 1A to 1E are diagrams schematically showing a side surface of a mobile phone which is an embodiment of the portable information terminal of the present invention.
- a display unit 30 made of a liquid crystal display or the like is provided on one side of the mobile phone 100.
- the display unit 30 can display various information such as a document on a network such as a web page, an address book stored in the mobile phone 100, and a mail creation screen by a mailer.
- a touch panel (a touch panel 40 described later) is provided on the front surface of the display unit 30.
- the mobile phone 100 displays an operation screen 31 for inputting information used for processing related to an application executed on the mobile phone 100.
- the operation screen 31 is displayed in the left region of the display unit 30 by touching the region corresponding to the left region of the display unit 30 on the touch panel.
- the operation screen 31 is displayed in the center of the display unit 30 by touching the area corresponding to the center of the display unit 30 on the touch panel.
- as shown in FIG. 1D, the operation screen 31 is displayed in the right region of the display unit 30 by touching the area on the touch panel corresponding to that region.
- a broken line H schematically shows a user's hand operating the touch panel of the mobile phone 100.
- the operation screen 31 includes a plurality of operation buttons 310 each corresponding to an individual function.
- the mobile phone 100 stores, for each of the plurality of operation buttons 310 on the operation screen 31, which function the button corresponds to and at which position on the touch panel it is displayed. The mobile phone 100 then determines the processing content to be executed from this information and the detected position of the operation on the touch panel.
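The button lookup described above can be sketched as follows. This is an illustrative sketch only; the class and function names, the coordinate convention, and the button sizes are assumptions, not part of the patent.

```python
# Hypothetical sketch: the terminal stores, per button, its function and its
# rectangle on the touch panel, then resolves a touch position to the
# processing content to execute.

class Button:
    def __init__(self, function, x, y, width, height):
        self.function = function  # processing content bound to this button
        self.x, self.y = x, y     # top-left corner on the touch panel
        self.width, self.height = width, height

    def contains(self, px, py):
        return (self.x <= px < self.x + self.width and
                self.y <= py < self.y + self.height)

def find_function(buttons, touch_x, touch_y):
    """Return the function of the button under the touch, or None."""
    for b in buttons:
        if b.contains(touch_x, touch_y):
            return b.function
    return None

# illustrative layout, not taken from the patent figures
buttons = [Button("shutter", 0, 0, 40, 20), Button("zoom_up", 40, 0, 40, 20)]
```

A touch at (10, 10) would resolve to the "shutter" function, while a touch outside every button resolves to nothing.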
- FIG. 2 is a diagram schematically illustrating a hardware configuration of the mobile phone 100.
- mobile phone 100 includes a control unit 50 that controls the overall operation of mobile phone 100, an antenna 81 for transmitting and receiving data, a communication control unit 80 that performs signal processing when data is transmitted and received by the antenna 81, a posture detection unit 90 that detects the posture of the mobile phone, a storage unit 60 that includes a flash memory or the like, a touch panel 40, a display unit 30, a display control unit 51 that controls display contents on the display unit 30, a receiver 56 and a microphone 58 mainly used for the call function, a speaker 57 that outputs an alarm sound, sound output control units 53 and 54 that control the sound output from the receiver 56 and the speaker 57, a voice input control unit 55 that processes the sound input to the microphone 58, and a camera 91.
- the control unit 50 includes a CPU.
- the control unit 50 has a built-in timer 50A.
- the posture detection unit 90 is for detecting the direction and moving direction of the mobile phone 100 and the acceleration given to the mobile phone 100, and includes, for example, a plurality of gyroscopes, acceleration sensors, and geomagnetic sensors.
- the orientation of the mobile phone 100 means, for example, whether it is held by the user in a horizontally long state as shown in FIG. 1A (and FIGS. 1B to 1D) or in a vertically long state as shown in FIG. 1E.
- as a technique for detecting the orientation, the moving direction, and the moving speed of the main body of the mobile phone 100 using the posture detection unit 90, a well-known technique can be adopted, and therefore description thereof will not be repeated here.
- the storage unit 60 includes a program storage unit 61 that stores programs executed by the CPU of the control unit 50, a setting content storage unit 62 that stores setting contents for the mobile phone 100 such as an address book, and a data storage unit 63 that stores various tables described later and various data necessary for executing the programs stored in the program storage unit 61.
- the program storage unit 61 may be fixed to the mobile phone 100 main body or may be detachable.
- FIG. 15 is a flowchart of an interrupt process related to display of the operation screen 31 executed by the CPU.
- the CPU executes the processing every certain time (for example, 200 ms).
- in step S1, the CPU checks the application activation state in mobile phone 100 and advances the process to step S2.
- the application activation state of the cellular phone 100 indicates whether no application is activated, which application (the TV function, the Web browser function, the mail function, or the like) is activated, or whether a call is in progress. Depending on this state, the contents displayed as a menu need to be switched, or in some cases the menu display needs to be suppressed. For this reason, in step S1, the application activation state (such as which application is activated) is checked.
- step S2 the CPU checks the state of the touch panel 40, and proceeds to step S3.
- an input function from a touch panel is often provided as an OS function rather than being performed by each application.
- the menu display and screen transition may continue to be shown for a while after the touch on the touch panel ends, but outside of that period the menu display processing is not performed until a touch operation occurs, which reduces power consumption.
- the check of the touch state on the touch panel is performed outside the menu control process, so that the touch state does not change during the execution of the algorithm of the menu control process.
- in step S3, the CPU determines whether it is necessary to call the menu control process of step S5, which will be described later. If it is determined that it is necessary, the CPU advances the process to step S5; otherwise, the process proceeds to step S4.
- step S4 the CPU returns to step S1 after waiting for the fixed time to elapse from the start of execution of the current main routine.
- step S5 after executing the menu control process, the CPU returns to step S1 after waiting for the lapse of the predetermined time from the execution of the current main routine.
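The fixed-interval loop of FIG. 15 (steps S1 to S5) can be sketched as below. This is a minimal sketch under stated assumptions: the four callables and the return values are placeholders for the processing the text describes, and only the 200 ms interval comes from the source. Note how the touch state is read once, outside the menu control process, so it cannot change while that process runs.

```python
import time

INTERVAL = 0.2  # the "certain time" of the text, e.g. 200 ms

def run_once(check_app_state, check_touch_panel, needs_menu_control, menu_control):
    start = time.monotonic()
    app_state = check_app_state()      # S1: which application is active
    touch_state = check_touch_panel()  # S2: snapshot the touch state once
    if needs_menu_control(app_state, touch_state):  # S3: is the call needed?
        menu_control(app_state, touch_state)        # S5: menu control process
    # S4: wait until the fixed interval has elapsed since this run started
    remaining = INTERVAL - (time.monotonic() - start)
    if remaining > 0:
        time.sleep(remaining)
```

Calling `run_once` in an endless loop reproduces the "every certain time" interrupt behavior; skipping the menu control call when S3 fails is what saves power.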
- step S5 various processes including the following five processes are executed in parallel or sequentially.
- the touch operation type identification process is a process for identifying, based on the content of the user's operation on the touch panel 40, the type of operation pattern performed on the touch panel 40.
- the first display mode change process is a process that is executed when the display of the operation screen 31 on the display unit 30 is started.
- the operation screen 31 is also referred to as a “menu” as appropriate.
- the menu drag process is a process for changing the display position of the operation screen 31 displayed on the display unit 30 by sliding the operation screen 31 according to the content of the user's operation on the touch panel 40.
- the menu position return process is a process for returning the display position of the operation screen whose display position has been changed by the above-described menu drag process to the position before the change.
- the second display mode change process is a process executed when the display of the operation screen 31 displayed on the display unit 30 is ended.
- touch operation types include single tap, double tap, and drag.
- the identification of a single tap and a double tap will be described with reference to the flowchart of the single tap / double tap identification process shown in FIG. Table 1 summarizes the internal state in the single tap / double tap identification process.
- the manner of identifying the type of touch operation shown below is merely an example; the portable information terminal may identify the type of touch operation by other commonly used methods instead of the method described in this specification.
- in step SA102, the CPU determines whether or not the user is performing a touch operation on touch panel 40. If it is determined that a touch operation is being performed, the process proceeds to step SA104; if it is determined that there is no touch operation, the process proceeds to step SA118.
- in step SA104, the CPU determines whether or not the value of the in-touch flag Q0, which indicates whether a touch operation was in progress when the touch operation type identification process was last executed, is 0. If it is 0, the process proceeds to step SA106; if it is not, that is, if the value of the in-touch flag Q0 is 1, the process proceeds to step SA112.
- the in-touch flag Q0 is a flag whose value is updated each time the touch operation type identification process is executed, as will be described later: its value is set to 1 when a touch operation is being performed on the touch panel 40 at that time, and to 0 when no touch operation is being performed.
- in step SA106, the CPU determines whether or not the difference between the current time and the touch start time T0 is below a predetermined threshold value Td. If so, the process proceeds to step SA108; if not, that is, if the time during which the touch operation on the touch panel 40 has continued is equal to or longer than Td, the process proceeds to step SA110.
- in step SA108, the CPU treats the operation on the touch panel 40 at that time as a double touch, sets the value of the double touch state flag DT, which indicates whether or not the mobile phone 100 is in a double touch operation state (double touch state), to 1, and advances the process to step SA116.
- in step SA110, the CPU treats the operation content on the touch panel 40 at that time as a temporary touch, sets the value of the temporary touch state flag ET to 1, records the current time counted by the timer 50A as the value of the touch start time T0, sets the values of the double touch state flag DT, single touch state flag ST, double tap state flag DU, and single tap state flag SU described later to 0, and advances the process to step SA116.
- the data storage unit 63 includes, for example, a touch information storage table as shown in Table 2 as a table for storing values used when various types of processing such as touch operation type identification processing are executed.
- the touch start position P0 is information indicating the position where the user started the touch operation on the touch panel 40, and is represented, for example, by coordinates defined for the touch panel 40; specifically, the coordinates of the position where the user started touching the touch panel 40.
- the value of each item in the touch information storage table is updated when a temporary touch is detected.
- the touch start time T0 is the time when the user starts touch operation on the touch panel 40 as described above.
- the touch position P1 is information indicating the touch position of the user on the touch panel 40 at the time when the CPU is executing various processes such as a touch operation type identification process.
- the touch time T1 is information indicating the time recorded when a user's touch is detected during the execution of the various processes executed by the CPU.
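The fields of the touch information storage table can be represented as a simple record. A minimal sketch: the field names follow Table 2, while the tuple coordinates, numeric times, and defaults are assumptions.

```python
from dataclasses import dataclass

# Sketch of the touch information storage table (Table 2).
@dataclass
class TouchInfo:
    p0: tuple = (0, 0)  # touch start position (touch panel coordinates)
    t0: float = 0.0     # touch start time
    p1: tuple = (0, 0)  # touch position while a process is executing
    t1: float = 0.0     # time recorded when a touch is detected
```

Per the text, all of these values would be updated when a temporary touch is detected.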
- in step SA112, the CPU determines, as in step SA106, whether the difference between the current time and the touch start time T0 is shorter than the time Td. If it is shorter, the process proceeds to step SA116. On the other hand, if the difference between the current time and the touch start time T0 is equal to or greater than the time Td, the CPU advances the process to step SA114.
- step SA114 the CPU sets the type of touch operation as single touch, sets the single touch state flag ST to 1, and advances the process to step SA116.
- step SA116 the CPU sets the value of the in-touch flag Q0 to 1 and ends the touch operation type identification process.
- step SA118 the CPU determines whether or not the value of the in-touch flag Q0 is 0. If the CPU determines that the value is 0, the process proceeds to step SA124. If it is determined that the value of the in-touch flag Q0 is 1, the process proceeds to step SA120.
- step SA120 the CPU determines whether the value of the double touch state flag DT is 1 or the value of the single touch state flag ST is 1. If so, the process proceeds to step SA122. If it is determined that it is not, that is, if both the double touch state flag DT and the single touch state flag ST are determined to be 0, the process proceeds to step SA126.
- in step SA122, if the value of the double touch state flag DT is 1 at that time, the CPU determines the operation type to be a double tap; if the value of the double touch state flag DT is 0 and the value of the single touch state flag ST is 1, the CPU determines the operation type to be a single tap. In the case of a double tap, the value of the double tap state flag DU is set to 1; in the case of a single tap, the value of the single tap state flag SU is set to 1. The values of the double touch state flag DT, the single touch state flag ST, the temporary touch state flag ET, and the temporary up state flag EU are all updated to 0.
- in step SA124, the CPU determines whether or not the value of the temporary touch state flag ET is 1. If it is, the process proceeds to step SA128; if not, the process proceeds to step SA132.
- in step SA126, the CPU treats the operation content at that time as a temporary up, sets the value of the temporary up state flag EU to 1, and advances the process to step SA132.
- in step SA128, the CPU determines, as in step SA106, whether or not the difference between the current time and the touch start time T0 is shorter than the time Td. If it is shorter, the process proceeds to step SA132; if it is equal to or greater than Td, the process proceeds to step SA130.
- in step SA130, the CPU treats the content of the touch operation at that time as a single tap and sets the value of the single tap state flag SU to 1. Further, the values of the single touch state flag ST, the double touch state flag DT, the temporary up state flag EU, and the temporary touch state flag ET are all updated to 0, and the process proceeds to step SA132.
- step SA132 the value of the in-touch flag Q0 is updated to 0, and the touch operation type identification process is terminated.
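The single-tap/double-tap identification of steps SA102 to SA132 can be sketched as a small state machine. This is an illustrative reconstruction that follows the flag names used in the text (Q0, ET, ST, DT, EU, SU, DU, T0, Td); the class name, the time units, and the convention that one `step()` call corresponds to one run of the identification process are assumptions.

```python
class TapIdentifier:
    def __init__(self, td):
        self.td = td          # threshold time Td
        self.q0 = 0           # in-touch flag Q0: touch present on previous run
        self.t0 = -10**9      # touch start time T0 (initially long ago)
        self.et = self.st = self.dt = 0   # temporary / single / double touch flags
        self.eu = self.su = self.du = 0   # temporary-up / single tap / double tap flags

    def step(self, touching, now):
        if touching:                         # SA102: touch operation present
            if self.q0 == 0:                 # SA104: touch has just started
                if now - self.t0 < self.td:  # SA106: soon after previous touch
                    self.dt = 1              # SA108: double touch
                else:                        # SA110: temporary touch
                    self.et, self.t0 = 1, now
                    self.dt = self.st = self.du = self.su = 0
            elif now - self.t0 >= self.td:   # SA112: touch held at least Td
                self.st = 1                  # SA114: single touch
            self.q0 = 1                      # SA116
        else:
            if self.q0 == 1:                 # SA118: touch has just ended
                if self.dt or self.st:       # SA120
                    # SA122: decide the tap type, then clear the touch flags
                    if self.dt:
                        self.du = 1          # double tap
                    else:
                        self.su = 1          # single tap
                    self.dt = self.st = self.et = self.eu = 0
                else:
                    self.eu = 1              # SA126: temporary up
            elif self.et:                    # SA124: a temporary touch is pending
                if now - self.t0 >= self.td:  # SA128: no second touch within Td
                    self.su = 1               # SA130: single tap
                    self.st = self.dt = self.eu = self.et = 0
            self.q0 = 0                      # SA132
```

A brief touch with no second touch within Td ends as a single tap; a brief touch followed by a second touch within Td ends as a double tap, matching the flowchart's two release paths.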
- the flag values used in each process such as the touch operation type identification process described above are stored in the data storage unit 63 as a table as shown in Table 3, for example.
- FIG. 17 is a flowchart of the first display mode change process.
- step S102 the CPU determines whether or not a touch operation is being performed on touch panel 40 at that time, that is, whether or not there is a touch input. If it is determined that there is a touch input, the process proceeds to step S104. If it is determined that there is no touch input, the first display mode changing process is terminated as it is.
- step S104 the CPU detects the operation position (touch position) with respect to the touch panel 40 at that time, and advances the process to step S106.
- step S106 the CPU checks the type of touch operation by referring to the touch type identification result storage table (Table 3), and proceeds to step S108.
- step S108 the CPU determines whether the menu display condition is satisfied based on the touch position detected in step S104 and the type of touch operation checked in step S106.
- in step S110, the CPU determines whether the menu display condition is satisfied as a result of the determination in step S108. If it is determined that the menu display condition is satisfied, the CPU proceeds to step S112; otherwise, the CPU terminates the first display mode change process.
- the menu display conditions are determined in advance according to the type of touch operation, and are stored in the setting content storage unit 62, for example.
- in step S112, the CPU determines the type of menu (operation screen) to be displayed on the display unit 30 based on the state of the application running on the mobile phone 100 and the orientation of the mobile phone (for example, whether the mobile phone 100 is in the horizontal orientation as shown in FIG. 1A or the vertical orientation as shown in FIG. 1E), and advances the process to step S114.
- data for displaying various operation screens is stored in the data storage unit 63 in accordance with the state of the running application.
- the data storage unit 63 stores data for displaying an operation screen whose design, such as the arrangement of buttons, is suited to display on the display unit 30 when the mobile phone 100 is in the horizontal orientation, and data for displaying a screen whose design is suited to display on the display unit 30 when the mobile phone 100 is in the vertical orientation.
- examples include data for displaying operation screens as shown in FIGS. 3A and 3B.
- the screen 351 has a horizontally long design and includes buttons 350A, 350B, and 350C for inputting information to the mobile phone 100.
- the screen 352 has a vertically long design and includes buttons that cause the mobile phone 100 to perform the same functions as the buttons 350A to 350C included in the screen 351.
- the screen 351 is displayed on the display unit 30 when the mobile phone 100 is in the landscape orientation as shown in FIG. 1A, and the screen 352 is displayed on the display unit 30 when the mobile phone 100 is in the portrait orientation as shown in FIG. 1E.
- in step S114, the CPU determines whether or not the mobile phone 100 is in the vertical orientation based on the detection output of the posture detection unit 90. If so, the process proceeds to step S120; if not, that is, if it is determined that the mobile phone is in the horizontal orientation, the process proceeds to step S116.
- the horizontal posture shown in FIG. 1A and the vertical posture shown in FIG. 1E are in a state in which a rotation of 90 degrees is applied to each other.
- in the mobile phone 100, it is determined that the phone is in the horizontal orientation within a rotation of 45 degrees clockwise or counterclockwise from the state shown in FIG. 1A, and that it is in the vertical orientation within a rotation of 45 degrees clockwise or counterclockwise from the state shown in FIG. 1E.
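The 45-degree rule above can be sketched as a small function. This is a sketch under stated assumptions: `angle` is taken to be the clockwise rotation in degrees from the FIG. 1A (landscape) position, and treating the upside-down ranges the same way as the upright ones is an assumption, since the patent only describes the two states shown in the figures.

```python
def orientation(angle):
    """Classify the phone posture from its rotation angle (degrees)."""
    a = angle % 360
    # within 45 degrees either way of the landscape position (or its inverse)
    if a <= 45 or a >= 315 or 135 <= a <= 225:
        return "landscape"
    return "portrait"  # within 45 degrees either way of the portrait position
```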
- step S120 the CPU executes a process for obtaining coordinates for displaying the operation screen 31 on the display unit 30, and advances the process to step S122.
- step S120 for example, the coordinates serving as the display center of the operation screen 31 are obtained.
- step S122 the CPU displays the screen for portrait orientation at the coordinates obtained in step S120, and proceeds to step S124.
- in step S116, the CPU obtains coordinates for displaying the operation screen 31, and in step S118 displays the operation screen for landscape orientation (for example, the screen 351 in FIG. 3A) on the display unit 30 at the coordinates obtained in step S116, and advances the process to step S124.
- how to obtain the coordinates in step S116 and step S120 will now be described.
- for example, the touch panel is divided into two areas A1 and A2 as indicated by the one-dot chain line in FIG. 9. If the touch position detected in step S104 is within the area A1, the operation screen 31 may be displayed in the area of the display unit 30 corresponding to A1, and if the touch position is within the area A2, the operation screen 31 may be displayed in the area of the display unit 30 corresponding to A2.
- the alternate long and short dash line is defined so that the display unit 30 and the touch panel 40 provided on the front surface of the display unit 30 are equally divided in the left-right direction.
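The two-area rule can be sketched as follows; the panel width and the area labels are illustrative, and only the equal left-right split comes from the text.

```python
PANEL_WIDTH = 480  # assumed panel width in panel coordinates

def menu_area(touch_x):
    """Return which equally divided area (A1 left, A2 right) was touched."""
    return "A1" if touch_x < PANEL_WIDTH / 2 else "A2"
```

The menu would then be drawn in the region of the display unit corresponding to the returned area.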
- alternatively, the operation screen 31 may be displayed with the touch position B on the touch panel 40 detected in step S104 as its center in the horizontal and vertical directions.
- when the operation screen 31 is displayed with the touch position as its center, depending on the touch position, part of the operation screen 31 may fall outside the display unit 30 and not be displayed. In such a case, it is preferable to correct the display position so that the operation screen 31 fits within the display unit 30, as shown in FIG. 11A.
- the operation screen 31 before correction is indicated by a broken line
- the operation screen 31 after correction is indicated by a solid line.
- the movement of the operation screen 31 due to the correction is indicated by an arrow.
- the touch position is indicated by 1P.
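The correction can be sketched as clamping the screen center so the whole operation screen stays inside the display. A minimal sketch, assuming rectangular screens and pixel coordinates; all sizes below are illustrative.

```python
def corrected_center(touch_x, touch_y, menu_w, menu_h, disp_w, disp_h):
    """Center the menu on the touch position, shifted to fit the display."""
    cx = min(max(touch_x, menu_w / 2), disp_w - menu_w / 2)
    cy = min(max(touch_y, menu_h / 2), disp_h - menu_h / 2)
    return cx, cy
```

A touch near a corner yields a corrected center pulled back toward the display interior, matching the arrowed shift between the broken-line and solid-line screens in the figure.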
- step S124 the CPU stores the current time as the display start time t, and proceeds to step S126.
- step S126 the CPU stores the center coordinates of the operation screen 31 as the display start position p (the corrected coordinates when correction is performed as described with reference to FIG. 11A, FIG. 11B, or FIG. 12). Then, the first display mode change process is terminated.
- the display start time t and the display start position p described above are stored in a display information storage table stored in the data storage unit 63, for example.
- An example of the storage contents of the display information storage table is shown in Table 4.
- The menu here is an operation screen for inputting information used for processing related to an application, and includes any screen object displayed on the display unit that is the target of the drag process.
- step S202 the CPU determines whether or not touch input has been made on the touch panel 40 at that time. If it is determined that the touch input has been made, the CPU proceeds to step S204. If not, the process proceeds to step S228.
- step S204 the CPU determines whether or not the cellular phone 100 is in the menu display mode. If it is determined that the mobile phone 100 is in the menu display mode, the process proceeds to step S206. If not, the process proceeds to step S212.
- the mobile phone 100 can take one of four modes: a menu display mode, a menu selection mode, a drag mode, and a menu non-display mode.
- the mode information storage table is stored in the data storage unit 63, for example. Further, in the mode information storage table, information is stored with a flag value (1 or 0), for example, to indicate that any one of the four modes shown in Table 5 is valid.
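The mode information storage table of Table 5 holds a flag value (1 or 0) per mode, with exactly one of the four modes valid at a time. A minimal sketch, assuming a plain dictionary for the table and illustrative key names for the modes:

```python
# The four modes of the mobile phone 100 (cf. Table 5), stored as flag
# values (1 or 0) with exactly one mode valid at a time.
MODES = ("menu_display", "menu_selection", "drag", "menu_hidden")

def set_mode(table, mode):
    """Write the mode flags into the mode information storage table so
    that only `mode` carries the flag value 1."""
    if mode not in MODES:
        raise ValueError("unknown mode: %s" % mode)
    for m in MODES:
        table[m] = 1 if m == mode else 0
    return table
```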
- When it is determined in step S204 that the mobile phone 100 is in the menu display mode, the CPU determines in step S206 whether or not the touch position is within the area corresponding to the menu (operation screen 31). If so, the process proceeds to step S208; if not, the menu drag process is terminated.
- step S208 the CPU stores the touch position at that time as the touch start position P0, and stores the current time at that time as the touch start time T0, and proceeds to step S210.
- the touch start position P0 and the touch start time T0 here correspond to the start position p and the start time t shown in Table 4.
- That the touch position is within the area corresponding to the operation screen 31 means that the touch position is within the area of the touch panel 40 that overlaps the area of the display unit 30 where the operation screen 31 is displayed.
- step S210 the CPU changes the mode of the mobile phone to the menu selection mode, and advances the process to step S216.
- step S212 the CPU stores the touch position at that time as the touch position P1, stores the current time as the touch time T1, and advances the process to step S214.
- In step S214, the CPU determines whether the current mode of the mobile phone 100 is the menu selection mode or the drag mode. If it is the menu selection mode, the process proceeds to step S216; if it is the drag mode, the process proceeds to step S224.
- In step S216, the CPU calculates the movement distance as the difference between the touch position P1 and the touch start position P0, and determines whether the movement distance exceeds a predetermined threshold. If so, the process proceeds to step S222; if not, that is, if the distance is equal to or less than the threshold, the process proceeds to step S218.
- step S222 the CPU changes the mode of the mobile phone 100 to the drag mode and advances the process to step S224.
- step S224 a new menu position is calculated based on the touch position P1, and the process proceeds to step S226.
- The new display position of the operation screen 31 is calculated using the touch position P1 as its center coordinates.
- step S226 the CPU changes the display position of the operation screen 31 on the display unit 30 to the new position calculated in step S224, and ends the menu drag process.
- It is desirable that the movement of the operation screen from the original position to the new position be displayed continuously on the display unit 30, so that to the user's eyes the operation screen 31 appears to move gradually rather than to disappear and reappear.
- That is, when the mobile phone 100 is in the menu display mode and the distance (P1 − P0) that the user's finger moves on the touch panel 40 during one continuous touch operation is larger (longer) than the specific threshold, the display position of the operation screen 31 is changed based on the destination operation position (touch position P1) on the touch panel 40.
- Here, continuously operating the touch panel 40 means a state in which no touch-up is detected from the start of the touch on the touch panel 40.
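The drag-mode condition described here (movement distance P1 − P0 exceeding the specific threshold during one continuous touch) can be sketched as follows; the Euclidean distance between the two touch positions is an assumption, since the embodiment also allows other distance definitions, discussed later.

```python
import math

def should_enter_drag_mode(p0, p1, threshold):
    """Return True when the finger has moved farther than the specific
    threshold since touch-down (cf. step S216), i.e. the operation is
    treated as a drag rather than a menu selection."""
    distance = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    return distance > threshold
```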
- When the operation screen 31 is displayed on the display unit 30 and the user's finger, indicated by a broken line, is slid on the touch panel 40 within the operation screen 31 in the direction indicated by arrow A31 by a distance longer than the specific threshold (that is, when a drag operation is performed), the display position of the operation screen 31 is changed in the dragged direction as shown in FIG. 4B.
- the operation screen 31 includes images of buttons 311A to 311C.
- a broken line H1 indicates the user's finger after the drag operation.
- When the user performs a touch-up and then performs a touch operation, for example at the position of the dotted line H2 in FIG. 4B, the upper button can be selected.
- a dotted line H2 indicates a user's finger that selects a button on the operation screen 31 after the position is changed.
- a broken line H1 indicates a finger that performs the first touch operation, and a dotted line H2 indicates a finger that performs the second touch operation.
- The operation screen 31 whose display position has been changed by the first touch operation remains in that position even after the first touch operation has ended, and the mobile phone 100 receives the second touch operation with the display position of the operation screen 31 still changed.
- In FIG. 5A, in the state where the operation screen 390 is displayed on the display unit 30, when the user's hand (finger), indicated by a broken line, is moved in the direction indicated by arrow A33, the display content on the display unit 30 changes as shown in FIG. 5B. That is, the display position of the operation screen 390 on the display unit 30 moves downward.
- the operation screen 390 is a screen on which an address book application is activated.
- The display unit 30 displays a cursor 381 indicating that "NA" is selected among the headings such as "A", "TA", and "NA" displayed in the display field 380, and the address book entries whose headings belong to the "NA" row are displayed.
- FIG. 5A shows a state in which the cursor 391 and the user's finger are selecting the sixth displayed name from the top of the operation screen 390, “Yuko Nagawa”.
- the position of the operation screen 390 in the display unit 30 moves downward as shown in FIG. 5B.
- The broken line in FIG. 5B indicates the user's finger after the drag operation.
- When the user performs a touch-up on the operation screen 390 in the state illustrated in FIG. 5B and then performs a touch operation, a name that the finger could not reach before the position change can be selected. FIG. 5B shows a state where the name displayed third from the top in the operation screen 390, "Nayama Hiromichi", is selected. That is, the display position of the operation screen 390 on the display unit 30 has been changed from the state illustrated in FIG. 5A to the state illustrated in FIG. 5B.
- By moving the operation screen 390 as shown in FIG. 5B, a name (selection target) displayed near the top of the operation screen 390 can be brought to a position that the user's hand (finger) can reach directly.
- Various patterns can be considered for the portion that can be seen as the background depending on the application activation state and the type of operation screen displayed on the display unit 30.
- For example, when the heading portion of the address book, that is, the operation screen at the highest level for operating the application, is displayed as the operation screen, the rest of the address book is seen as the background.
- Alternatively, a part of the heading of the "TA" row may be displayed so as to overlap the "NA" row. When the operation screen displayed on the display unit 30 is the operation screen of the address book application itself, a wallpaper or an application other than the address book appears as the background: when no other application is activated in parallel, the screen displayed as the wallpaper of the mobile phone 100 becomes visible, and when another application was activated in parallel before the operation screen 390 was displayed, the screen of that application becomes visible as the background in the state shown in FIG. 5B.
- In FIGS. 5A and 5B, screens displaying information related to the address book application are shown, but the screens that can be drawn toward the hand in this way are not limited to such screens.
- Other examples include screens for displaying various contents such as maps and Web contents, screens for playing back and browsing downloaded videos and music, e-mail creation and display screens, and screens for selecting an item from a list of a plurality of selection items.
- That is, the screens include any screen on which a click operation (an operation of touching the touch panel 40) causes the mobile phone 100 to perform some processing according to the operation.
- As another example of a screen that can be drawn toward the hand, a Web content browsing screen will be described.
- the Web content browsing screen includes content playback and display of items linked to URL (Uniform Resource Locator) addresses of other home pages. Then, by performing an operation such as a single tap on the item (a character string corresponding to the item), processing such as access to a link destination corresponding to the item is executed. From this point of view, it can be said that the Web content browsing screen is also an operation screen for inputting information for causing the mobile phone 100 to execute processing.
- FIGS. 27A to 27C are diagrams for explaining a mode in which a web content browsing screen is displayed on the display unit 30 and the browsing screen is dragged.
- the display unit 30 displays a screen including a web content browsing screen.
- The web browser screen includes a display field 30A for displaying information that specifies the application displaying the browsing screen (the character information "web browser" in FIG. 27A) and the like, a display field 30C showing the Web content browsing screen 361, and a display field 30B for displaying the URL address where the Web content displayed in the display field 30C exists (the URL address accessed by the mobile phone 100 through the web browser).
- a web browser is installed, and when the web browser is executed, a web content browsing screen as shown in FIG. 27A is displayed.
- the screen displayed in the display field 30C corresponds to an operation screen for inputting information used for processing related to the web browser by a touch operation or the like.
- a part of the Web content is displayed in the display column 30C.
- FIG. 28A schematically shows the relationship between the virtual screen for the entire Web content and the portion displayed in the display column 30C.
- The display field 30C displays an image of the area indicated by portion 1001 of the Web content 1000 indicated by a broken line.
- the relative position and size (display magnification in the display field 30C) of the part displayed in the display field 30C in the Web content 1000 are based on, for example, the content of the operation on the touch panel 40. Changed.
- The Web content displayed in the display field 30C includes a plurality of items having link information such as URL addresses.
- each of the four items (“Breaking News”, “Venture Person”, “Anchor Desk”, “Company / Industry Trend”) displayed in the “News” group above the left column of the display column 30C, Assume that the URL address is linked.
- When one of these items is operated, the web browser accesses the URL address of the item's link destination.
- “80%” in the display column 30A in FIG. 27A is information indicating how much display information remains above the portion displayed in the display column 30C in the entire Web content.
- R% = (L2 / L1) × 100 … (1)
- L1 is the vertical dimension of the entire Web content (the vertical dimension of the entire Web content 1000)
- L2 is the distance between the upper end of the portion displayed in the display field 30C and the upper end of the Web content. (Distance between the portion 1001 and the upper end of the Web content 1000).
- R% is the information displayed in the display field 30A. By displaying such information in the display field 30A, the user can easily recognize where the information displayed in the display field 30C is located within the entire Web content.
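Formula (1) can be written directly; the function name is an illustrative assumption.

```python
def remaining_ratio(l1, l2):
    """R% = (L2 / L1) * 100 (formula (1)): the fraction of the Web
    content that lies above the portion shown in display field 30C.
    l1: vertical dimension of the entire Web content 1000,
    l2: distance from the content's upper end to the upper end of the
    displayed portion 1001."""
    if l1 <= 0:
        raise ValueError("content height must be positive")
    return l2 / l1 * 100
```

With L1 = 1000 and L2 = 800 this yields the "80%" shown in the display field 30A of FIG. 27A, and L2 = 0 yields the "0%" of FIG. 27B.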
- The display for making the user recognize such a ratio is not limited to the percentage display shown in FIG. 27A. As long as it is information that allows the user to recognize the positional relationship of the screen displayed in the display field 30C with respect to the ends of the content, information indicating the positional relationship itself between the entire content and the displayed portion, as schematically shown in FIG. 28A, may be displayed on the display unit 30 separately from the screen 361.
- The display screen in the display field 30C is scrolled downward by an amount corresponding to the amount of the operation (the distance and the number of finger-sliding operations), after which the scrolling stops and the screen is displayed at the stationary position.
- this scroll display may be performed as if the screen has inertia. In this case, the scroll speed gradually increases from the start of scrolling, and then the scroll speed is decreased and the scroll is stopped.
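The inertia-like scroll described above (speed gradually rising from the start of the scroll, then decreasing until the scroll stops) might be sketched as a per-frame offset generator; the ramp length, friction factor, and stopping speed below are illustrative assumptions, not values from the embodiment.

```python
def inertial_scroll_offsets(peak_speed, ramp_frames=3, friction=0.8, min_speed=0.5):
    """Return per-frame scroll offsets for an inertia-like scroll: the
    speed grows toward `peak_speed` over `ramp_frames` frames, then is
    damped by `friction` each frame until it falls below `min_speed`
    and the scroll stops."""
    offsets = []
    for i in range(1, ramp_frames + 1):          # ramp-up phase
        offsets.append(peak_speed * i / ramp_frames)
    v = peak_speed * friction                    # decay (braking) phase
    while v >= min_speed:
        offsets.append(v)
        v *= friction
    return offsets
```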
- FIG. 28A shows an arrow 301A.
- The relationship between the direction of the arrow 301A and the Web content 1000 corresponds to the relationship between the direction of the arrow 301 and the screen 361 in FIG. 27A.
- When the finger is slid, the Web content 1000 is moved relative to the portion 1001 in the sliding direction (the direction of arrow 301A in FIG. 28A). As a result, as shown in FIG. 28B, the relative positional relationship between the Web content 1000 and the portion 1001 changes as if the portion 1001 had moved within the Web content 1000 in the direction opposite to arrow 301A (upward).
- FIG. 27B shows a screen displayed on the display unit 30 as a result of the operation of sliding the finger in the direction of the arrow 301.
- the screen displayed in the display column 30C is switched to the screen 362.
- As shown in FIG. 28B, the positional relationship between the upper ends of the Web content 1000 and the portion 1001 changes with respect to FIG. 28A, so the percentage displayed in the display field 30A changes accordingly. Specifically, in FIG. 27B the upper end of the Web content is displayed at the upper end of the display field 30C; as a result, L2 described above becomes 0, and "0%" is displayed in the display field 30A.
- When the user's finger is further slid downward on the touch panel 40 in the direction indicated by arrow 302 from the state shown in FIG. 27B, the display content of the display field 30C is changed so that the Web content screen 362 itself moves downward (arrow 302). The display content after the change is shown in FIG. 27C.
- the upper end portion of the web content screen 363 displayed in the display column 30C does not coincide with the upper end portion of the display column 30C, and is positioned below the upper end portion.
- That is, the portion of the Web content displayed in the display field 30C becomes shorter in the vertical dimension: the lower part of the portion 1001 in FIG. 28B is removed, as indicated by the hatched area.
- On the other hand, when an operation in a pattern different from sliding the finger on the touch panel 40 is performed on an item in the screen 363 (for example, on the items indicated as tabs "1st–5th", "6th–10th", and "11th–15th" in the "Keyword of Interest" menu), the web browser assumes that the information corresponding to the operated item has been entered, and processing such as switching the display contents of the screen 363 is executed.
- In this way, the selection items near the top of the Web content, for example "1st to 5th place" in the above "Featured keyword" menu and the items "6th to 10th" and "11th to 15th", can be displayed slightly below the central portion in the vertical direction of the display field 30C.
- the operation screen is continuously displayed at the display position after the movement.
- the user can perform an operation of selecting a selection item by an operation such as touch-up.
- If it is determined in step S216 that the movement distance does not exceed the specific threshold, the user's operation is recognized as the selection of a menu item (a selection target such as a button in the operation screen 31) (steps S218 to S220); if it is determined that the movement distance exceeds the specific threshold, it is recognized as an operation of dragging the operation screen 31 to change its display position (steps S224 to S226).
- The threshold relating to the movement distance can be determined automatically according to the size of the buttons on the operation screen and the spacing between them. A method for determining the threshold will be described later.
- the moving distance here may be the moving distance of the actual operation target position on the touch panel 40 or may be the moving distance in a certain direction.
- For example, the direct distance from point A to point B can be used as the movement distance, or the horizontal component (distance RX) of the movement from point A to point B can be used. As the threshold of the movement distance, the distance R described above can be used.
- When the threshold is the distance R between the center positions of adjacent buttons, the threshold can be determined according to the size and spacing of the buttons on the operation screen. By determining the threshold for each operation screen, the menu selection mode and the drag mode can be distinguished more accurately, without being influenced by the size and spacing of the buttons.
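Determining the threshold per operation screen from the distance R between the center positions of adjacent buttons could look like the following sketch, which takes the minimum center-to-center distance over all button pairs; the function name and the input format are assumptions.

```python
import math

def drag_threshold(button_centers):
    """Determine the drag threshold of an operation screen as the
    minimum distance R between the center positions of its buttons, so
    that a movement shorter than one button pitch still counts as a
    selection rather than a drag."""
    best = float("inf")
    for i, (x1, y1) in enumerate(button_centers):
        for x2, y2 in button_centers[i + 1:]:
            best = min(best, math.hypot(x2 - x1, y2 - y1))
    return best
```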
- FIG. 7 shows a state in which the button 314B on the operation screen 31 is highlighted, as an example of button highlighting.
- step S218 the CPU specifies an operation button located closest to the touch position on the operation screen 31, and advances the process to step S220.
- step S220 the CPU performs a process of highlighting and displaying the button specified in step S218, and finishes the menu drag process.
- the user can easily recognize the selected item, and the user can easily recognize that the mobile phone 100 is in the menu selection mode.
- Also, if a button is highlighted when the user touched intending to draw the operation screen 31 toward the hand, the user can recognize that the drag distance was too short, and by continuing the drag operation over a longer distance can have the mobile phone 100 change the display position of the operation screen 31 so that it is drawn closer.
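Steps S218 and S220 (specify the operation button closest to the touch position, then highlight it) can be sketched as follows; the table mapping button ids to center coordinates is an illustrative assumption.

```python
import math

def nearest_button(touch_pos, buttons):
    """Step S218: pick the operation button whose center is closest to
    the touch position. The caller then highlights the returned button
    (step S220). `buttons` maps a button id to its center coordinates."""
    return min(
        buttons,
        key=lambda b: math.hypot(buttons[b][0] - touch_pos[0],
                                 buttons[b][1] - touch_pos[1]),
    )
```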
- In step S228, the CPU determines whether or not the mode of the mobile phone is the menu selection mode. If so, the process proceeds to step S230; if not, the process proceeds to step S232.
- step S230 the CPU executes a process corresponding to the menu item selected in mobile phone 100 at that time, and then proceeds to step S234.
- In step S232, the CPU determines whether or not the mode of the mobile phone 100 is the drag mode. If so, the process proceeds to step S234; if not, the menu drag process is terminated.
- step S234 the CPU sets the mode of the mobile phone 100 to the menu display mode and ends the menu drag process.
- step S202 immediately after the user's finger touches the operation screen 31, the process proceeds from step S202 to S208, and the touch start position P0 and the touch start time T0 are stored.
- step S210 the mode of the cellular phone is changed to the menu selection mode, and the determination in step S216 is executed.
- If the finger is not moved, or if the movement distance is short enough not to exceed the specific threshold described above, NO is determined in step S216.
- the processing proceeds to step S218 (FIG. 18), and the button closest to the touch position at that time (button 311B in FIG. 4A) is selected. Note that after the button is selected, the button may be highlighted in step S220, or the highlighting in step S220 may be omitted. Thereafter, the process returns to step S202.
- When the user moves the finger from the state shown in FIG. 4A to the state shown by the broken line H1 in FIG. 4B within a single touch period, the movement distance of the user's finger on the touch panel exceeds the specific threshold, so YES is determined in step S216 and the process proceeds to step S222, where the mode of the mobile phone 100 is changed to the drag mode. Thereafter, the series of processes of steps S202, S204, S212, S214, S224, and S226 is repeated until the user's finger stops, and the display position of the menu is changed according to the movement of the finger.
- When the user releases the finger from the touch panel 40 at the position shown by H1 (FIG. 4B) (that is, when a touch-up is performed at that position), NO is determined in step S202 and the process proceeds to step S228.
- the operation mode of the mobile phone 100 is the drag mode. Therefore, the process proceeds to step S234 via step S232, where the operation mode of the mobile phone 100 is set to the menu display mode, and the process returns to step S202.
- the display position of the operation screen 31 remains the position changed by the execution of step S226 so far, that is, the state shown in FIG. 4B.
- After the moving operation, once the user touches up and then the user's finger touches the position indicated by H2 in FIG. 4B, the process starts from step S202, and the processes of steps S204, S206, S208, and S210 are executed in order. The operation mode of the mobile phone 100 is thereby set to the menu selection mode. Thereafter, in step S218, the button located near the touch position (button 311A in FIG. 4B) is selected, and the process returns to step S202.
- Thereafter, when the user's finger is released from the touch panel 40 at the position indicated by H2 (that is, when the movement distance remains smaller than the threshold in step S216 and the touch-up occurs in the menu selection mode), NO is determined in step S202 and the process proceeds to step S228. Since the operation mode of the mobile phone 100 has not been changed from the menu selection mode after the touch operation at the position H2, YES is determined in step S228, and the process proceeds to step S230.
- In step S230, the selected menu item is executed, that is, the processing corresponding to the button 311A in FIG. 4B is executed.
- step S234 the operation mode of the mobile phone 100 is set to the menu display mode, and the process returns to step S202.
- Note that the midpoint (point Pc) between the point P0 at which the drag operation started and the point P1 at which the drag operation ended (where the finger was released from the touch panel 40) may be used as the center of the display position of the operation screen 31 after the movement.
- When the operation screen 31 is to be displayed on the display unit 30 with the point Pc as its center, if the entire operation screen 31 would not fit within the display unit 30, it is preferable to correct the point Pc appropriately (for example, as described with reference to FIG. 11A and FIG. 11B).
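Using the midpoint Pc of the drag start point P0 and end point P1 as the new center can be sketched as follows; whether the result then needs the FIG. 11A-style correction depends on the screen size, and the function name is an assumption.

```python
def drag_midpoint(p0, p1):
    """Use the midpoint Pc of the drag start point P0 and the drag end
    point P1 as the center of the moved operation screen 31. The caller
    may still need to correct Pc so the screen fits the display."""
    return ((p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2)
```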
- In the above description, when the movement distance is equal to or greater than a specific distance in step S216, the operation screen 31 is moved based on the user's operation on the touch panel 40, that is, the mode shifts to the drag mode.
- the moving distance for the touch operation is used as the condition for shifting to the drag mode.
- Other operation patterns can also be considered as conditions for shifting to the drag mode. It is important to accurately determine, based on the user's operation, whether or not to shift to the drag mode: if there is an executable selection item such as a button at the touched position, it is necessary to decide whether to execute it or to move the operation screen without executing it.
- By equipping the mobile phone with a determination means for deciding, based on the user's operation, whether to shift to the drag mode or to select the menu item corresponding to the touch position, advantageous effects can be achieved in terms of screen design and ease of use.
- Instead of the movement distance of the operation position, the mobile phone 100 can also be configured to shift to the drag mode on condition that the user has operated the touch panel 40 for a certain length of time without touching up, or that the user has moved the finger on the operation screen 31 at or above a specific speed to perform the drag operation. An example of such processing is shown in FIG. 20.
- FIG. 20 is a flowchart of a modification of the flowchart of FIG. 18. In FIG. 20, step S216 of FIG. 18 is changed to step S216A.
- In step S216A, it is determined whether the difference between the touch start time T0 and the touch time T1 (T1 − T0), that is, the duration of the touch operation, has exceeded a predetermined threshold Tx, or whether the value obtained by dividing the movement distance (P1 − P0) by the movement time (T1 − T0) (the movement speed, here the initial speed of the movement) exceeds a predetermined threshold Vx.
- the mobile phone 100 is configured to shift to the drag mode when the duration of the touch operation exceeds a predetermined threshold Tx or the moving speed exceeds the threshold Vx.
- the continuation of the touch operation means that the touch operation is performed without being touched up after being touched.
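Step S216A's two alternative conditions (touch duration exceeding Tx, or initial movement speed exceeding Vx) can be sketched together; the parameter names follow the text, everything else is an assumption.

```python
import math

def should_enter_drag_mode_by_time_or_speed(p0, t0, p1, t1, tx, vx):
    """Step S216A: shift to the drag mode when the touch has continued
    longer than threshold Tx, or when the initial movement speed
    (distance / elapsed time) exceeds threshold Vx."""
    elapsed = t1 - t0
    if elapsed > tx:
        return True
    if elapsed <= 0:
        return False
    speed = math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / elapsed
    return speed > vx
```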
- the mobile phone 100 can be configured to shift to the above-mentioned drag mode on condition that the user's operation is an operation corresponding to a reciprocating motion.
- A flowchart of such a modification is shown in FIG. 21.
- In this modification, the touch positions (Pn) are stored continuously in the touch information storage table in step S212B until the menu display mode is entered, and in step S216B the trajectory of the operation position is analyzed from the series of touch positions Pn to determine whether or not the trajectory corresponds to a reciprocating motion.
- When the trajectory corresponds to a reciprocating motion, the mobile phone 100 shifts to the drag mode in step S222.
- The mobile phone 100 can also be configured to shift to the above-described drag mode depending on the position at which the user operates the touch panel 40.
- In this modification, when it is determined in step S204 that the mobile phone 100 is in the menu display mode, it is determined in step S205A whether or not the position touched by the user is near the end of the operation screen 31 (near the boundary of the menu).
- the vicinity of the end portion is a region between a broken line 330 and an alternate long and short dash line 331, for example, as shown in FIG.
- Buttons 314A to 314D are provided inside the operation screen 31, and this area can be set as the area outside those buttons.
- step S205A If it is determined in step S205A that the touch position is near the end, the mode of the mobile phone 100 is set to the drag mode in step S205B.
- Alternatively, in step S205A, when the touch position is in an area other than the area where the buttons (selection target items) on the operation screen are displayed, the process may be advanced to step S205B.
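The test of step S205A (is the touch within the band between lines 330 and 331, near the boundary of the operation screen) can be sketched as follows; representing the screen as a rectangle and the band by a single width are assumptions.

```python
def is_near_edge(touch, screen_rect, band):
    """Return True when the touch falls in the band of width `band`
    just inside the boundary of the operation screen (the region
    between lines 330 and 331 in FIG. 22), which triggers the drag
    mode in step S205B. `screen_rect` is (left, top, right, bottom)."""
    x, y = touch
    left, top, right, bottom = screen_rect
    inside = left <= x <= right and top <= y <= bottom
    inner = (left + band <= x <= right - band
             and top + band <= y <= bottom - band)
    return inside and not inner
```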
- The mobile phone 100 can also be configured to shift to the above-described drag mode on condition that an operation of a specific touch operation type has been performed.
- A flowchart of such a modification is shown in FIG. 23. Referring to FIG. 23, after the touch start position (P0) and the touch start time (T0) are stored in step S208, Table 3 is referred to in step S209A to determine whether or not the type of the touch operation currently performed on the touch panel 40 is a double touch. If it is determined that it is not, the mode of the mobile phone 100 is set to the drag mode in step S209B.
- In the mobile phone 100, after an operation is performed on a button in the operation screen 31, processing is executed to return the operation screen 31 displayed on the display unit 30 to the display position it had before the button operation.
- the display position of the operation screen is changed so that the operation screen is drawn closer to the user's hand once based on the user's operation on the touch panel 40.
- touch-up occurs at the position indicated by the broken line H11, and then the user's finger performs a selection operation on the button 311 in the operation screen 31 at the position indicated by the dotted line H12, and the process is executed.
- menu position return processing is executed as processing for returning the display position on display unit 30 of operation screen 31 to the original position.
- The menu position return process may also return the display position to the original position when a new touch operation is not performed for a certain period after the drag operation is completed. That is, the display position may be returned to the original position even if the process by a second operation related to the operation screen, such as a selection operation, is not executed. Further, if a new touch operation is not performed for a while, the operation screen may be hidden, as shown as the mobile phone 100C in FIG. 26.
- Specifically, as shown as the mobile phone 100A in FIG. 26, after the operation screen 31 has been drawn toward the user's hand (broken line H11) once based on the user's operation on the touch panel 40, in response to a selection operation performed on the button 311, the menu position return process is executed to return the display position of the operation screen 31 on the display unit 30, as shown as the mobile phone 100B in FIG. 26.
- FIG. 24 is a flowchart of the menu position return process.
- in step S302, the CPU first determines whether the mode of mobile phone 100 is the menu selection mode; if so, the process proceeds to step S306, and if not, to step S314.
- in step S306, the CPU executes the processing item selected by the user's operation of a button (e.g., button 311) on the operation screen 31, and advances the process to step S308.
- in step S308, the CPU determines whether the menu start position p (see Table 4) differs from the current menu display position (the center coordinates of the operation screen 31). If it determines that they differ, the process proceeds to step S310; if not, that is, if the center coordinates of the current operation screen 31 match the coordinates stored as the display start position p, the process proceeds to step S312.
- in step S310, the CPU changes (returns) the display position of the operation screen 31 to the coordinates stored as the display start position p, and proceeds to step S312.
- in step S312, the CPU changes the mode of the mobile phone 100 to the menu display mode and ends the menu position return process.
- in step S314, the CPU determines whether the mode of the mobile phone 100 is the drag mode; if so, it proceeds to step S316, and if not, to step S318.
- in step S316, the CPU changes the mode of the mobile phone 100 to the menu display mode and proceeds to step S318.
- in step S318, the CPU determines whether no touch input has occurred for a predetermined time Tx or longer; if so, it proceeds to step S320, and if not, it ends the menu position return process as it is.
- in step S320, the CPU changes the mode of the mobile phone 100 to the menu non-display mode and ends the menu position return process.
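The flow of steps S302-S320 can be summarized as a small state machine. The sketch below is an illustrative reconstruction, not the patent's implementation; the mode names, the `Phone` class, `display_start_position`, and the timeout value are assumptions drawn from the surrounding description.

```python
# Hypothetical sketch of the menu position return process (FIG. 24, steps S302-S320).
MENU_DISPLAY, MENU_SELECT, DRAG, MENU_HIDDEN = "display", "select", "drag", "hidden"

class Phone:
    def __init__(self, display_start_position, time_without_touch=0.0, tx=3.0):
        self.mode = MENU_DISPLAY
        self.display_start_position = display_start_position  # menu start position p (Table 4)
        self.menu_position = display_start_position           # current center of operation screen 31
        self.time_without_touch = time_without_touch
        self.tx = tx  # predetermined time Tx (assumed seconds)

    def execute_selected_item(self):
        pass  # S306: run the processing item chosen via the button (stub)

    def menu_position_return(self):
        if self.mode == MENU_SELECT:                               # S302
            self.execute_selected_item()                           # S306
            if self.menu_position != self.display_start_position:  # S308
                self.menu_position = self.display_start_position   # S310: return to p
            self.mode = MENU_DISPLAY                               # S312
            return
        if self.mode == DRAG:                                      # S314
            self.mode = MENU_DISPLAY                               # S316
        if self.time_without_touch >= self.tx:                     # S318
            self.mode = MENU_HIDDEN                                # S320
```

After a selection in menu selection mode, the screen snaps back to the stored start position and the phone re-enters menu display mode; with no touch input for time Tx, the menu is hidden.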
- FIG. 25 is a flowchart of the second display mode change process.
- in step S402, the CPU determines whether an input operation has been performed on touch panel 40. If it determines that no input operation has been performed, the second display mode change process is terminated; if one has, the CPU advances the process to step S404.
- in step S404, the CPU determines whether the mode of the mobile phone 100 is the menu selection mode; if so, the process proceeds to step S406, and if not, to step S408.
- in step S406, the CPU causes the control item selected at that time to be executed, and then proceeds to step S414.
- in step S408, the CPU determines whether the mobile phone 100 is in the drag mode; if so, it proceeds to step S410, and if not, to step S412.
- in step S410, the CPU changes the mode of the mobile phone 100 to the menu display mode and proceeds to step S412.
- in step S412, as in step S318, the CPU determines whether the time without touch input on the touch panel 40 has reached, for example, the time Tx stored in the setting content storage unit 62; if so, the process proceeds to step S414, and if not, the second display mode change process is terminated.
- in step S414, the CPU changes the mode of the mobile phone 100 to the menu non-display mode and ends the second display mode change process.
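Steps S402-S414 can be sketched in the same style. Again a hedged illustration, not the patent's code; the `Phone` class, function name, and timeout field are invented for the example.

```python
# Hypothetical sketch of the second display mode change process (FIG. 25, S402-S414).
MENU_DISPLAY, MENU_SELECT, DRAG, MENU_HIDDEN = "display", "select", "drag", "hidden"

class Phone:
    def __init__(self):
        self.mode = MENU_DISPLAY
        self.time_without_touch = 0.0
        self.tx = 3.0  # assumed time Tx from the setting content storage unit 62

    def execute_selected_item(self):
        pass  # S406: run the currently selected control item (stub)

def second_display_mode_change(phone, touch_input_received):
    if not touch_input_received:              # S402: no input operation on touch panel 40
        return                                # terminate the process as-is
    if phone.mode == MENU_SELECT:             # S404
        phone.execute_selected_item()         # S406
        phone.mode = MENU_HIDDEN              # S414
        return
    if phone.mode == DRAG:                    # S408
        phone.mode = MENU_DISPLAY             # S410
    if phone.time_without_touch >= phone.tx:  # S412: no-touch time reached Tx
        phone.mode = MENU_HIDDEN              # S414
```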
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
- Telephone Set Structure (AREA)
Abstract
Description
Referring to FIG. 2, mobile phone 100 includes a control unit 50 that controls the overall operation of the mobile phone 100; an antenna 81 for transmitting and receiving data; a communication control unit 80 that processes signals for data transmission and reception via the antenna 81; an attitude detection unit 90 that detects the attitude of the mobile phone; a storage unit 60 including flash memory or the like; a touch panel 40; a display unit 30; a display control unit 51 that controls display content on the display unit 30; a receiver 56 and a microphone 58 used mainly for the call function; a speaker 57 that outputs alarm sounds and the like; audio output control units 53 and 54 that control the audio output from the receiver and the speaker 57; an audio input control unit 55 that processes audio input to the microphone 58; and a camera 91. The control unit 50 includes a CPU and incorporates a timer 50A.
FIG. 15 is a flowchart of interrupt processing related to the display of the operation screen 31, executed by the CPU. The CPU executes this processing at fixed intervals (for example, every 200 ms).
- First display mode change process
- Menu hand-drag process
- Menu position return process
- Second display mode change process
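The periodic interrupt of FIG. 15 can be pictured as dispatching to the four processes listed above. A minimal sketch under the assumption that each process is a plain callable invoked once per tick; the function and list names are illustrative, and a real handler would be re-armed roughly every 200 ms.

```python
def display_interrupt(processes):
    """Run the operation-screen processes once, as one tick of the
    periodic interrupt (FIG. 15) would."""
    for proc in processes:
        proc()

calls = []
processes = [
    lambda: calls.append("first display mode change"),
    lambda: calls.append("menu hand-drag"),
    lambda: calls.append("menu position return"),
    lambda: calls.append("second display mode change"),
]

display_interrupt(processes)  # one 200 ms tick
```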
The touch operation type identification process identifies the type of operation pattern performed on the touch panel 40, based on the content of the user's operation on the touch panel 40.
The touch start time T0 is, as described above, the time at which the user started a touch operation on the touch panel 40.
The menu referred to here is an operation screen for inputting information to be used in processing related to an application, and includes a screen object being displayed on the display unit that is subject to the hand-drag process.
As shown in Table 5, mobile phone 100 can take any one of four modes: menu display mode, menu selection mode, hand-drag mode, and menu non-display mode.
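The four modes of Table 5 can be represented as an enumeration. The identifier names below are assumed English renderings, not names from the patent.

```python
from enum import Enum

class PhoneMode(Enum):
    MENU_DISPLAY = "menu display mode"     # operation screen 31 is shown
    MENU_SELECT = "menu selection mode"    # a button on screen 31 is being selected
    HAND_DRAG = "hand-drag mode"           # screen is being pulled toward the hand
    MENU_HIDDEN = "menu non-display mode"  # operation screen is hidden

# The phone is always in exactly one of these four modes.
```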
Here, L1 is the vertical dimension of the entire Web content (the vertical dimension of the whole of Web content 1000), and L2 is the distance between the top edge of the portion displayed in display field 30C and the top edge of the Web content (the distance between portion 1001 and the top edge of Web content 1000). R% is the information displayed in display field 30A. By displaying this information in display field 30A, the user can easily recognize where in the entire Web content the information displayed in display field 30C is located.
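A plausible reading of the indicator R% is the scrolled fraction: the distance L2 from the top of the Web content to the top of the displayed portion, over the total height L1. The formula below is an assumption, since the passage only names the two lengths and the percentage; the function name is invented for illustration.

```python
def scroll_indicator_percent(l1, l2):
    """Assumed position indicator R% for the portion shown in display field 30C.

    l1 -- vertical dimension of the entire Web content (L1)
    l2 -- distance from the content's top edge to the top of the
          displayed portion (L2)
    """
    if l1 <= 0:
        raise ValueError("content height L1 must be positive")
    return 100.0 * l2 / l1

# E.g. a 2000-px-tall page scrolled 500 px down would read 25%.
```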
... executes processing such as switching of the display content in 63.
FIG. 8 shows, as an example of highlighted display of a button, a state in which button 314B on the operation screen 31 is highlighted.
In FIG. 20, step S216 of FIG. 18 is changed to step S216A.
In FIG. 21, until the transition to the menu display mode, the touch position (Pn) is continuously stored in the touch information storage table in step S212B; then, in step S216B, the trajectory of the operation position is analyzed from the series of touch positions Pn, and it is determined whether the trajectory corresponds to a reciprocating motion. If it is a reciprocating motion, mobile phone 100 transitions to the hand-drag mode in step S222.
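The trajectory analysis of step S216B, deciding whether a series of touch positions Pn amounts to a reciprocating (back-and-forth) motion, could be approximated by counting direction reversals along one axis. This heuristic is an assumption for illustration; the patent does not disclose the exact test, and the function name and threshold are invented.

```python
def is_reciprocating(points, min_reversals=2):
    """Heuristic: a touch trajectory reciprocates if the vertical
    movement direction reverses at least `min_reversals` times.

    points -- sequence of (x, y) touch positions Pn in time order
    """
    # Vertical step between consecutive samples, ignoring stationary pairs.
    deltas = [b[1] - a[1] for a, b in zip(points, points[1:]) if b[1] != a[1]]
    # A reversal is a sign change between consecutive steps.
    reversals = sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)
    return reversals >= min_reversals

# Down-up-down strokes reverse direction twice, so they qualify;
# a straight swipe never reverses and does not.
```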
Referring to FIG. 22, when it is determined in step S204 that mobile phone 100 is in the menu display mode, it is determined in step S205A whether the position the user touched is near an edge of the operation screen 31 (near the menu boundary).
Referring to FIG. 23, after the touch start position (P0) and touch start time (T0) are stored in step S208, it is determined in step S209A, by referring to Table 3, whether the type of touch operation currently being performed on the touch panel 40 was a double touch. If it is determined that it was not, the mode of mobile phone 100 is set to the hand-drag mode in step S209B.
In mobile phone 100, as shown in FIG. 26, after an operation on a button on the operation screen 31 displayed on display unit 30, processing is performed to return the operation screen 31 to the display position it occupied before the button operation.
Referring to FIG. 24, in the menu position return process the CPU first determines in step S302 whether the mode of mobile phone 100 is the menu selection mode; if so, the process proceeds to step S306, and if not, to step S314.
Claims (25)
- A portable information terminal (100) comprising:
a display unit (30);
a touch panel (40) provided on the display unit (30);
an application execution unit (50) that executes an application; and
a control unit (50) that executes processing related to the application in response to an operation on the touch panel (40),
wherein the control unit (50)
displays, on the display unit (30), an operation screen (31) for inputting information to be used in the processing related to the application, and
changes the display position of the operation screen (31) on the display unit (30) based on a first operation on the touch panel (40).
- The portable information terminal (100) according to claim 1, wherein, when there is an operation on the touch panel (40), the control unit (50) determines whether the content of the operation satisfies a condition for the first operation and, when it determines that the condition is satisfied, changes the display position of the operation screen (31) on the display unit (30).
- The portable information terminal (100) according to claim 2, wherein the control unit (50)
displays, on the operation screen (31), items (314A-314D) for inputting information to be used in the processing related to the application, and,
when there is an operation on the touch panel (40) and it determines that the condition is not satisfied, highlights the item in the operation screen (31) closest to the operation position on the touch panel (40).
- The portable information terminal (100) according to claim 2, wherein the control unit (50) determines whether the condition is satisfied based on the movement distance of the operation position on the touch panel (40).
- The portable information terminal (100) according to claim 4, wherein the control unit (50) displays, on the operation screen (31), items (314A-314D) for inputting information to be used in the processing related to the application, and
the threshold of the movement distance for determining whether the condition is satisfied is determined based on the display size of the items and the display interval of the items on the operation screen (31).
- The portable information terminal (100) according to claim 2, wherein the control unit (50) determines whether the condition is satisfied based on the pattern of the operation on the touch panel (40).
- The portable information terminal (100) according to claim 2, wherein the control unit (50) determines whether the condition is satisfied based on the speed of movement of the operation position at the start of the operation on the touch panel (40).
- The portable information terminal (100) according to claim 2, wherein the control unit (50) determines whether the condition is satisfied based on the duration for which the operation on the touch panel (40) has continued.
- The portable information terminal (100) according to claim 2, wherein the control unit (50) determines whether the condition is satisfied based on the operation position on the touch panel (40).
- The portable information terminal (100) according to claim 2, wherein the portable information terminal (100) can be held in one hand, and
the operation screen (31) is displayed at a position operable by a finger of the hand holding the portable information terminal (100).
- The portable information terminal (100) according to claim 1, wherein the control unit (50) keeps the operation screen (31), whose display position on the display unit (30) was changed by the first operation, displayed at the changed position even after the first operation ends.
- The portable information terminal (100) according to claim 1, wherein, after changing the display position of the operation screen (31) on the display unit (30) through the first operation, when a second operation is performed on the touch panel (40) that differs from the first operation and selects an item (314A-314D) in the operation screen (31), the control unit (50) executes the processing related to the application that corresponds to the item selected by the second operation.
- The portable information terminal (100) according to claim 12, wherein, after executing the processing corresponding to the item selected by the second operation, the control unit (50) returns the display position of the operation screen (31) on the display unit (30), changed by the first operation, to the position before the change.
- The portable information terminal (100) according to claim 1, wherein the control unit (50) returns the display position of the operation screen (31) on the display unit (30), changed by the first operation, to the position before the change after the first operation ends.
- A portable information terminal (100) comprising:
a display unit (30);
a touch panel (40) provided on the display unit (30);
an application execution unit (50) that executes an application; and
a control unit (50) that executes processing related to the application in response to an operation on the touch panel (40),
wherein the control unit (50)
displays, on the display unit (30), an operation screen (31) for inputting information to be used in the processing related to the application,
changes the display position of the operation screen (31) on the display unit (30) based on a first operation on the touch panel (40),
can return the display position of the operation screen (31), after changing it based on the first operation, to the position before the change, and,
when there is an operation on the touch panel (40), determines whether the content of the operation satisfies a condition for the first operation and, when it determines that the condition is satisfied, changes the display position of the operation screen (31) on the display unit (30).
- The portable information terminal (100) according to claim 15, wherein the control unit (50) determines whether the condition is satisfied based on the movement distance of the operation position on the touch panel (40).
- The portable information terminal (100) according to claim 16, wherein the control unit (50) displays, on the operation screen (31), items (314A-314D) for inputting information to be used in the processing related to the application, and
the threshold of the movement distance for determining whether the condition is satisfied is determined based on the display size of the items and the display interval of the items on the operation screen (31).
- The portable information terminal (100) according to claim 15, wherein the control unit (50) determines whether the condition is satisfied based on the pattern of the operation on the touch panel (40).
- The portable information terminal (100) according to claim 15, wherein the control unit (50) determines whether the condition is satisfied based on the speed of movement of the operation position at the start of the operation on the touch panel (40).
- The portable information terminal (100) according to claim 15, wherein the control unit (50) determines whether the condition is satisfied based on the duration for which the operation on the touch panel (40) has continued.
- The portable information terminal (100) according to claim 15, wherein the control unit (50) determines whether the condition is satisfied based on the operation position on the touch panel (40).
- The portable information terminal (100) according to claim 15, wherein the portable information terminal (100) can be held in one hand, and
the operation screen (31) is displayed at a position operable by a finger of the hand holding the portable information terminal (100).
- A portable information terminal (100) comprising:
a display unit (30C);
a touch panel (40) provided on the display unit (30C);
an application execution unit (50) that executes an application; and
a control unit (50) that executes processing related to the application in response to an operation on the touch panel (40),
wherein the operation screen (1000) of the application is larger than the size of the display unit (30C),
the control unit (50) displays, on the display unit (30C), a partial screen (1001, 361-363) that is part of the operation screen (1000),
the partial screen (1001, 361-363) includes items for inputting information to be used in the processing related to the application, and
the control unit (50)
changes, in response to a first operation on the touch panel (40), the portion of the operation screen (1000) displayed on the display unit (30C) as the partial screen (1001, 361-363), and, when a second operation is performed on the changed partial screen (1001, 361-363), determines that information selecting an item has been input and executes the processing related to the application that corresponds to the selected item,
causes, when the first operation is performed while the partial screen (1001, 361-363) is located at an end of the operation screen (1000), the end of the operation screen (1000) to be displayed at a position on the display unit (30C) moved from the end of the display unit (30C) in the same direction as the operation direction of the first operation, and,
when the second operation is performed on the operation screen (1000) at the moved position, determines that information selecting an item has been input and executes the processing related to the application that corresponds to the selected item.
- A computer-readable program for controlling a portable information terminal (100) comprising a display unit (30), a touch panel (40) provided on the display unit (30), and an application execution unit (50) that executes an application, the program causing the portable information terminal (100) to execute:
a step of displaying, on the display unit (30), an operation screen (31) for inputting information to be used in processing related to the application;
a step of determining whether there is an operation on the touch panel (40); and
a step of changing the display position of the operation screen (31) on the display unit (30) based on an operation on the touch panel (40).
- A recording medium (61) recording a computer-readable program for controlling a portable information terminal (100) comprising a display unit (30), a touch panel (40) provided on the display unit (30), and an application execution unit (50) that executes an application, the program causing the portable information terminal (100) to execute:
a step of displaying, on the display unit (30), an operation screen (31) for inputting information to be used in processing related to the application;
a step of determining whether there is an operation on the touch panel (40); and
a step of changing the display position of the operation screen (31) on the display unit (30) based on an operation on the touch panel (40).
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/989,318 US20110037720A1 (en) | 2008-04-23 | 2009-04-20 | Mobile information terminal, computer-readable program, and recording medium |
CN2009801143851A CN102016779A (zh) | 2008-04-23 | 2009-04-20 | 便携式信息终端、计算机可读程序和记录介质 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-112849 | 2008-04-23 | ||
JP2008112849 | 2008-04-23 | ||
JP2009064587A JP2009284468A (ja) | 2008-04-23 | 2009-03-17 | 携帯情報端末、コンピュータ読取可能なプログラムおよび記録媒体 |
JP2009-064587 | 2009-03-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009131089A1 true WO2009131089A1 (ja) | 2009-10-29 |
Family
ID=41216822
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/057838 WO2009131089A1 (ja) | 2008-04-23 | 2009-04-20 | 携帯情報端末、コンピュータ読取可能なプログラムおよび記録媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20110037720A1 (ja) |
JP (1) | JP2009284468A (ja) |
CN (1) | CN102016779A (ja) |
WO (1) | WO2009131089A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011161892A1 (ja) * | 2010-06-23 | 2011-12-29 | パナソニック株式会社 | 操作制御装置、操作制御方法および入力装置 |
WO2012169188A1 (ja) * | 2011-06-06 | 2012-12-13 | パナソニック株式会社 | 情報機器及び表示制御方法 |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4364273B2 (ja) * | 2007-12-28 | 2009-11-11 | パナソニック株式会社 | 携帯端末装置及び表示制御方法並びに表示制御プログラム |
CN102763400B (zh) | 2010-02-12 | 2015-07-15 | 京瓷株式会社 | 便携式电子设备 |
JP5304848B2 (ja) * | 2010-10-14 | 2013-10-02 | 株式会社ニコン | プロジェクタ |
JP5751934B2 (ja) * | 2010-10-15 | 2015-07-22 | キヤノン株式会社 | 情報処理装置、情報処理方法、及びプログラム |
US8773473B2 (en) * | 2010-11-29 | 2014-07-08 | Microsoft Corporation | Instantaneous panning using a groove metaphor |
US10552032B2 (en) * | 2010-11-30 | 2020-02-04 | Ncr Corporation | System, method and apparatus for implementing an improved user interface on a terminal |
TWI448934B (zh) * | 2011-03-21 | 2014-08-11 | Au Optronics Corp | 觸碰點的判斷方法 |
CN102760025A (zh) * | 2011-04-26 | 2012-10-31 | 富泰华工业(深圳)有限公司 | 图片浏览系统及图片缩放方法和图片切换方法 |
JP5858641B2 (ja) * | 2011-05-10 | 2016-02-10 | キヤノン株式会社 | 情報処理装置、情報処理装置と外部装置とを含むシステム、システムの制御方法、及びプログラム |
WO2013012267A1 (en) * | 2011-07-19 | 2013-01-24 | Samsung Electronics Co., Ltd. | Electronic device and method for sensing input gesture and inputting selected symbol |
JP5806573B2 (ja) * | 2011-09-28 | 2015-11-10 | キヤノン株式会社 | 座標入力装置およびその制御方法、座標入力システム |
CN102346651A (zh) * | 2011-11-14 | 2012-02-08 | 华为终端有限公司 | 音乐文件的处理方法和装置 |
US8436827B1 (en) * | 2011-11-29 | 2013-05-07 | Google Inc. | Disambiguating touch-input based on variation in characteristic such as speed or pressure along a touch-trail |
US9524050B2 (en) | 2011-11-29 | 2016-12-20 | Google Inc. | Disambiguating touch-input based on variation in pressure along a touch-trail |
US9645733B2 (en) | 2011-12-06 | 2017-05-09 | Google Inc. | Mechanism for switching between document viewing windows |
JP5640035B2 (ja) * | 2012-03-28 | 2014-12-10 | エヌ・ティ・ティ・コムウェア株式会社 | 操作ログ収集方法、操作ログ収集装置、操作ログ収集プログラム |
JP2013214164A (ja) * | 2012-03-30 | 2013-10-17 | Fujitsu Ltd | 携帯電子機器、スクロール処理方法及びスクロール処理プログラム |
JP5998700B2 (ja) * | 2012-07-20 | 2016-09-28 | 日本電気株式会社 | 情報機器 |
KR102016975B1 (ko) * | 2012-07-27 | 2019-09-02 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
JP2014032506A (ja) * | 2012-08-02 | 2014-02-20 | Sharp Corp | 情報処理装置、選択操作検出方法およびプログラム |
US9696879B2 (en) * | 2012-09-07 | 2017-07-04 | Google Inc. | Tab scrubbing using navigation gestures |
JP2014071732A (ja) * | 2012-09-28 | 2014-04-21 | Toshiba Corp | 電子機器、表示制御方法及びプログラム |
EP2904480A4 (en) * | 2012-10-04 | 2016-06-22 | Intel Corp | METHOD, DEVICE AND SYSTEM FOR MANAGING A USER-INTERFACE INTERFACE |
CN102929535B (zh) * | 2012-10-09 | 2018-05-01 | 中兴通讯股份有限公司 | 一种悬浮窗位置控制的方法及终端 |
KR102008512B1 (ko) * | 2012-12-10 | 2019-08-07 | 엘지디스플레이 주식회사 | 터치 센싱 시스템의 에지부 좌표 보상 방법 |
JP5862587B2 (ja) * | 2013-03-25 | 2016-02-16 | コニカミノルタ株式会社 | ジェスチャ判別装置、ジェスチャ判別方法、およびコンピュータプログラム |
JP2014211720A (ja) * | 2013-04-17 | 2014-11-13 | 富士通株式会社 | 表示装置および表示制御プログラム |
US9575649B2 (en) * | 2013-04-25 | 2017-02-21 | Vmware, Inc. | Virtual touchpad with two-mode buttons for remote desktop client |
JP6155869B2 (ja) | 2013-06-11 | 2017-07-05 | ソニー株式会社 | 表示制御装置、表示制御方法およびプログラム |
KR101345847B1 (ko) | 2013-06-13 | 2013-12-30 | 김기두 | 모바일 그래픽 유저 인터페이스 제공방법 |
CN103294346B (zh) * | 2013-06-20 | 2018-03-06 | 锤子科技(北京)有限公司 | 一种移动设备的窗口移动方法及其装置 |
US9612736B2 (en) * | 2013-07-17 | 2017-04-04 | Korea Advanced Institute Of Science And Technology | User interface method and apparatus using successive touches |
US20160246434A1 (en) * | 2013-09-02 | 2016-08-25 | Sony Corporation | Information processing apparatus, information processing method, and program |
WO2015068271A1 (ja) * | 2013-11-08 | 2015-05-14 | 三菱電機株式会社 | アニメーション装置及びアニメーション方法 |
CN104794376B (zh) * | 2014-01-17 | 2018-12-14 | 联想(北京)有限公司 | 终端设备以及信息处理方法 |
CN106068209B (zh) * | 2014-04-03 | 2019-03-15 | 歌乐株式会社 | 车载信息装置 |
JP5711409B1 (ja) * | 2014-06-26 | 2015-04-30 | ガンホー・オンライン・エンターテイメント株式会社 | 端末装置 |
US9678656B2 (en) * | 2014-12-19 | 2017-06-13 | International Business Machines Corporation | Preventing accidental selection events on a touch screen |
JP6131982B2 (ja) * | 2015-04-06 | 2017-05-24 | コニカミノルタ株式会社 | ジェスチャ判別装置 |
JP6014711B2 (ja) * | 2015-04-20 | 2016-10-25 | アルプス電気株式会社 | 携帯機器と自律航法演算法 |
JP6812639B2 (ja) * | 2016-02-03 | 2021-01-13 | セイコーエプソン株式会社 | 電子機器、電子機器の制御プログラム |
US10747337B2 (en) * | 2016-04-26 | 2020-08-18 | Bragi GmbH | Mechanical detection of a touch movement using a sensor and a special surface pattern system and method |
CN108304760B (zh) * | 2017-01-11 | 2021-10-29 | 神盾股份有限公司 | 检测手指上手和离手之方法和电子装置 |
US10755066B2 (en) * | 2017-01-11 | 2020-08-25 | Egis Technology Inc. | Method and electronic device for detecting finger-on or finger-off |
KR20190050485A (ko) * | 2017-11-03 | 2019-05-13 | 현대자동차주식회사 | Ui 관리 서버 및 ui 관리 서버의 제어 방법 |
JP6981326B2 (ja) * | 2018-03-22 | 2021-12-15 | 富士通株式会社 | 情報処理装置、表示システム及びウィンドウ配置プログラム |
CN110647286A (zh) * | 2019-10-09 | 2020-01-03 | 北京字节跳动网络技术有限公司 | 屏幕元素控制方法、装置、设备、存储介质 |
CN111294637A (zh) * | 2020-02-11 | 2020-06-16 | 北京字节跳动网络技术有限公司 | 视频播放方法、装置、电子设备和计算机可读介质 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10254619A (ja) * | 1997-03-07 | 1998-09-25 | Nec Corp | 候補選択用ユーザインタフェース装置 |
JP2006302263A (ja) * | 2005-03-18 | 2006-11-02 | Microsoft Corp | 電子インクまたは手書きインターフェースを呼び出すためのシステム、方法およびコンピュータ読取り可能媒体 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0693852A3 (en) * | 1994-07-22 | 1997-05-28 | Eastman Kodak Co | Method and apparatus for applying a function to a localized domain of a digital image using a window |
US6714214B1 (en) * | 1999-12-07 | 2004-03-30 | Microsoft Corporation | System method and user interface for active reading of electronic content |
TWI238348B (en) * | 2002-05-13 | 2005-08-21 | Kyocera Corp | Portable information terminal, display control device, display control method, and recording media |
US8042044B2 (en) * | 2002-11-29 | 2011-10-18 | Koninklijke Philips Electronics N.V. | User interface with displaced representation of touch area |
TWI240208B (en) * | 2004-02-17 | 2005-09-21 | Elan Microelectronics Corp | Capacitance touch panel with simplified scanning lines and the detection method thereof |
US20060007178A1 (en) * | 2004-07-07 | 2006-01-12 | Scott Davis | Electronic device having an imporoved user interface |
KR100686165B1 (ko) * | 2006-04-18 | 2007-02-26 | 엘지전자 주식회사 | 오에스디 기능 아이콘을 갖는 휴대용 단말기 및 이를이용한 오에스디 기능 아이콘의 디스플레이 방법 |
-
2009
- 2009-03-17 JP JP2009064587A patent/JP2009284468A/ja active Pending
- 2009-04-20 CN CN2009801143851A patent/CN102016779A/zh active Pending
- 2009-04-20 US US12/989,318 patent/US20110037720A1/en not_active Abandoned
- 2009-04-20 WO PCT/JP2009/057838 patent/WO2009131089A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP2009284468A (ja) | 2009-12-03 |
CN102016779A (zh) | 2011-04-13 |
US20110037720A1 (en) | 2011-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009131089A1 (ja) | 携帯情報端末、コンピュータ読取可能なプログラムおよび記録媒体 | |
JP5371002B2 (ja) | 携帯情報端末、コンピュータ読取可能なプログラムおよび記録媒体 | |
CN106227344B (zh) | 电子设备及其控制方法 | |
KR102016975B1 (ko) | 디스플레이 장치 및 그 제어 방법 | |
US20080297485A1 (en) | Device and method for executing a menu in a mobile terminal | |
US9110582B2 (en) | Mobile terminal and screen change control method based on input signals for the same | |
US7834861B2 (en) | Mobile communication terminal and method of selecting menu and item | |
JP4955505B2 (ja) | 携帯端末機及びその画面表示方法 | |
US20200371685A1 (en) | Graphical User Interface Display Method And Electronic Device | |
US9600153B2 (en) | Mobile terminal for displaying a webpage and method of controlling the same | |
US20160320891A1 (en) | Electronic Display with a Virtual Bezel | |
US9891805B2 (en) | Mobile terminal, and user interface control program and method | |
US20120162112A1 (en) | Method and apparatus for displaying menu of portable terminal | |
US20110087983A1 (en) | Mobile communication terminal having touch interface and touch interface method | |
EP3575939A1 (en) | Information processing device, information processing method, and program | |
USRE44294E1 (en) | Apparatus and method for display control in a mobile communication terminal | |
KR20100037973A (ko) | 휴대 단말기 및 그 휴대 단말기에서 기능 수행 방법 | |
KR102107469B1 (ko) | 사용자 단말 장치 및 이의 디스플레이 방법 | |
KR20150094484A (ko) | 사용자 단말 장치 및 이의 디스플레이 방법 | |
KR20150094477A (ko) | 사용자 단말 장치 및 이의 디스플레이 방법 | |
EP3531258A1 (en) | Method for searching for icon, and terminal | |
JP5814821B2 (ja) | 携帯端末装置、プログラムおよび画面制御方法 | |
KR20150007048A (ko) | 전자 장치의 디스플레이 방법 | |
WO2015016214A1 (ja) | 携帯端末ならびに表示方向制御方法 | |
JP2011252970A (ja) | 画像表示装置、画像表示方法、及び画像表示プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980114385.1 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09735772 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12989318 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09735772 Country of ref document: EP Kind code of ref document: A1 |