WO2013161170A1 - Input device, input support method, and program - Google Patents

Input device, input support method, and program

Info

Publication number
WO2013161170A1
Authority
WO
WIPO (PCT)
Prior art keywords
finger
screen
touch
hover
proximity
Prior art date
Application number
PCT/JP2013/001799
Other languages
French (fr)
Japanese (ja)
Inventor
Masatoshi Nakao (中尾 雅俊)
Original Assignee
Panasonic Corporation (パナソニック株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation (パナソニック株式会社)
Publication of WO2013161170A1 publication Critical patent/WO2013161170A1/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — … for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 — Scrolling or panning
    • G06F 3/0487 — … using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — … using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 — … for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886 — … by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • The present invention relates to an input device that accepts input operations via a touch panel, to an input support method, and to a program.
  • Touch panels, which users can operate intuitively, are widely used as devices for receiving input operations in electronic devices such as mobile phones.
  • A touch panel accepts input operations on the screen of a display unit (for example, an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence) display) provided in the electronic device, and displays the processing results of the electronic device on the same screen.
  • A touch panel that can detect the proximity of a finger is also known (see, for example, Patent Document 1).
  • Such a touch panel can detect a state in which a finger is held at a position separated from the panel by a predetermined height, that is, a proximity state between the finger and the touch panel. Based on the capacitance, which is determined by the distance between the finger and the touch panel, it can also detect that the finger has been slid substantially parallel to the panel, in the same manner as when the finger is slid directly on the panel surface. For this reason, a touch panel capable of detecting finger proximity is expected to become established as a new user interface.
  • The non-contact user input device of Patent Document 1 includes a plurality of linear transmission electrodes, a transmitter that supplies an alternating current for transmission to each transmission electrode, and a plurality of linear reception electrodes arranged so as not to contact the transmission electrodes.
  • A capacitor is formed at each intersection of a transmission electrode and a reception electrode, and its capacitance changes according to the proximity of the user's fingertip.
  • The non-contact user input device can therefore recognize the distance between the touch panel and the finger based on the change in capacitance.
  • However, the conventional input device has the following problems. To detect a slide operation in which a finger moves substantially parallel to, but separated from, a touch panel capable of detecting finger proximity, the conventional input device first had to enter a hover mode and then switch to a mode that detects finger movement while the proximity state continues.
  • In other words, it seems difficult for the conventional input device to give the user the same operability in a hover operation as in a touch operation.
  • In a hover operation there is nothing to support the finger during the operation, and the finger is less stable than in a touch operation, so there is a high possibility of erroneous operation.
  • When the finger is slid continuously at a position separated from the touch panel as a hover operation, the finger must be returned to its original position in order to perform a second slide operation after the first. If the finger is not separated sufficiently from the touch panel on the way back, the return movement may itself be detected as a slide operation, making it difficult to detect continuous slide operations.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide an input device, an input support method, and a program that efficiently select the control content for content according to a user's input operation on a touch panel and provide comfortable operability.
  • The present invention provides an input device including: a display unit that displays data on a screen; a proximity detection unit that detects the proximity of a first finger to the screen; a contact detection unit that detects contact of a second finger on the screen after the proximity of the first finger is detected; and an operation execution unit that, in response to an operation combining the proximity of the first finger and the contact of the second finger, executes an operation corresponding to the combined operation.
  • The present invention also provides an input support method for an input device including a display unit that displays data on a screen, the method including the steps of: detecting the proximity of a first finger to the screen; detecting contact of a second finger on the screen after the proximity of the first finger is detected; accepting an operation combining the proximity of the first finger and the contact of the second finger; and executing an operation corresponding to the combined operation.
  • The present invention further provides a program for causing a computer, which is an input device including a display unit that displays data on a screen, to realize the steps of: detecting the proximity of a first finger to the screen; detecting contact of a second finger on the screen after the proximity of the first finger is detected; accepting an operation combining the proximity of the first finger and the contact of the second finger; and executing an operation corresponding to the combined operation.
  • According to the present invention, it is possible to efficiently select the control content for content according to the user's input operation on the touch panel and to provide comfortable operability.
  • FIG. 1 is a block diagram showing the hardware configuration of the mobile terminal in each embodiment. FIG. 2 is a block diagram showing the functional configuration of the mobile terminal in each of the first to fifth embodiments. FIG. 3(A) shows a hover slide operation performed by itself, and FIG. 3(B) shows a combination operation of a hover slide operation and a touch slide operation.
  • Figures for the third embodiment show the combination operation of a hover slide operation and a touch slide operation: (A) the state at the start of the combination operation, and (B) the state in which the hover slide operation is performed after the start of the combination operation.
  • A further figure shows the touch receivable range 45a. Figures for the fourth embodiment show the combination operation of a hover slide operation and a touch hold operation: (A) the hover slide operation performed by itself, and (B) the combination operation of the hover slide operation and the touch hold operation, together with a flowchart explaining the operation.
  • Figures for the fifth embodiment show the combination operation of a hover slide operation and a touch hold operation: (A) the combination operation being performed, and (B) the state after the combination operation.
  • The input device of each embodiment is an electronic device including a display unit that displays data on a screen, such as a mobile phone, a smartphone, a tablet terminal, a digital still camera, a PDA (personal digital assistant), or an electronic book terminal.
  • In the following, a mobile terminal (for example, a smartphone) is described as an example of the input device of each embodiment.
  • The present invention can also be expressed as the input device as an apparatus, or as a program for operating a computer as the input device. Furthermore, the present invention can be expressed as an input support method including each operation (step) executed by the input device. That is, the present invention can be expressed in any of the categories of apparatus, method, and program.
  • The predetermined process is a process that executes content related to the content currently displayed in the application (for example, a process of reproducing video data).
  • The "button" may be a hyperlinked character string (for example, a news headline), an image that prompts the user to perform a selection operation (for example, an icon or a software key of a keyboard), or a combination of a character string and an image.
  • In accordance with the user's input operation, the input device can accept, for example, the selection of the "news headline" corresponding to a button as an operation on that button, and can display the details of the news corresponding to the selected button.
  • The "button" is determined according to the application running on the input device.
  • In the following description, the two axes representing the horizontal plane of the touch panel are the x-axis and the y-axis, and the axis representing the vertical direction (height direction) of the touch panel is the z-axis.
  • A "touch coordinate" is a position on the horizontal plane of the touch panel, that is, a coordinate (x, y) determined by a combination of an x coordinate and a y coordinate. A "proximity coordinate" is a coordinate (x, y, z) that additionally uses the vertical distance between the touch panel and the finger, that is, the height z of the finger above the touch panel.
  • In each embodiment, the instruction medium for the touch panel is described using the user's finger as an example, but the medium is not limited to a finger; a conductive stylus held in the user's hand may also be used.
  • The instruction medium is not particularly limited as long as its proximity to and touch on the touch panel can be detected according to the structure and detection method of the touch panel.
  • An operation of holding a finger at a position in space separated from the surface of the touch panel is defined as a "hover operation", and an operation of sliding (moving) the finger substantially parallel to the touch panel surface from the position held in the hover operation is defined as a "hover slide operation". Accordingly, an operation in which the finger directly touches the surface of the touch panel is a "touch operation", not a "hover operation".
  • An operation of sliding (moving) the finger while it is in contact with the surface of the touch panel is defined as a "touch slide operation".
  • An operation of maintaining the touch state at a position on the touch panel surface without sliding the finger is defined as a "touch hold operation".
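The four operations defined above can be sketched as a simple classifier over sampled finger coordinates. This is an illustration only, not the patent's method: the slide threshold, function names, and the convention that z = 0 means contact are all assumptions.

```python
# Illustrative sketch only: labels a sampled gesture using the four operation
# definitions above (hover, hover slide, touch hold, touch slide) from start
# and end proximity coordinates (x, y, z). Threshold and names are assumptions.

SLIDE_THRESHOLD_PX = 10.0  # assumed minimum movement to count as a slide

def classify_operation(start, end):
    """start/end: (x, y, z) samples; z == 0 means the finger touches the panel."""
    (x0, y0, z0), (x1, y1, z1) = start, end
    moved = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 >= SLIDE_THRESHOLD_PX
    touching = z0 == 0 and z1 == 0
    if touching:
        return "touch slide" if moved else "touch hold"
    return "hover slide" if moved else "hover"

print(classify_operation((0, 0, 5), (0, 0, 5)))    # → hover
print(classify_operation((0, 0, 5), (50, 0, 5)))   # → hover slide
print(classify_operation((0, 0, 0), (0, 0, 0)))    # → touch hold
print(classify_operation((0, 0, 0), (50, 0, 0)))   # → touch slide
```

A real driver would classify from a continuous stream of samples rather than two endpoints, but the distinction between the four operations is the same.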
  • Since the distance between the finger and the surface of the touch panel is inversely proportional to the capacitance detected by the touch panel, the height range in which a hover operation can be detected preferably corresponds to the range of capacitance that the touch panel can detect.
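As a rough illustration of this inverse relationship, a height estimate could be derived from the sensed capacitance as follows. This is a sketch under an idealized model C = k / z; the constant and function names are assumptions, not values from the patent.

```python
# Illustrative sketch only: the text states that finger height is inversely
# proportional to the sensed capacitance, i.e. roughly C = k / z for some
# panel-specific constant k. Neither k nor the names below come from the
# patent; they are assumptions for illustration.

K_PANEL = 50.0  # assumed proportionality constant (pF * mm)

def height_from_capacitance(capacitance_pf: float) -> float:
    """Estimate finger height z (mm) from sensed capacitance (pF)."""
    if capacitance_pf <= 0:
        raise ValueError("capacitance must be positive")
    return K_PANEL / capacitance_pf

# A larger capacitance means the finger is closer to the panel surface.
print(height_from_capacitance(10.0))  # → 5.0 (finger relatively far)
print(height_from_capacitance(50.0))  # → 1.0 (finger close to the surface)
```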
  • FIG. 1 is a block diagram showing a hardware configuration of the mobile terminals 1 and 1A in each embodiment.
  • The mobile terminal 1 includes a processor 11, a display unit 13, a touch panel driver 14, a touch panel 15, a power supply control unit 16, a communication control unit 17 to which an antenna 17a is connected, a ROM (Read Only Memory) 21, a RAM (Random Access Memory) 22, and a storage unit 23.
  • the processor 11, the display unit 13, the touch panel driver 14, the power supply control unit 16, the communication control unit 17, the ROM 21, the RAM 22, and the storage unit 23 are connected to each other via a bus 19 so as to be able to input and output data.
  • The processor 11 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor), and performs overall control of the mobile terminals 1, 1A, 1B, and 1C as well as various other arithmetic and control processes.
  • The processor 11 reads the programs and data stored in the ROM 21 and performs the various processes in each embodiment described later.
  • The ROM 21 stores an application 65 (see FIG. 2) installed in the mobile terminal 1, as well as programs and data for the processor 11 to execute the processing of each unit shown in FIG. 2.
  • The RAM 22 operates as a work memory for the operation of the processor 11, the touch panel driver 14, or the communication control unit 17.
  • The storage unit 23 is configured using a hard disk or flash memory built into the mobile terminal 1 and stores data acquired or generated by the mobile terminals 1, 1A, 1B, and 1C.
  • The application 65 (see FIG. 2) is stored in the storage unit 23.
  • The storage unit 23 may also be configured using, for example, an external storage medium (for example, a USB memory) connected via a USB (Universal Serial Bus) terminal instead of a hard disk or flash memory.
  • The display unit 13 has a function of displaying a screen; it is configured using, for example, an LCD or organic EL display, and displays data output from the processor 11 or the touch panel driver 14 on the screen.
  • The touch panel driver 14 controls the operation of the touch panel 15 and monitors the user's input operations on the touch panel 15. For example, when the touch panel driver 14 detects contact by a touch operation or touch slide operation of the user's finger 68 (see FIG. 3A), or proximity by a hover operation or hover slide operation, it calculates the contact coordinates (x, y) or proximity coordinates (x, y, z) and outputs information on those coordinates to the processor 11, the RAM 22, or the storage unit 23.
  • Hereinafter, the contact coordinates (x, y) are referred to as "touch coordinates (x, y)".
  • The touch panel 15 is mounted on the screen 45 (see FIG. 3A) of the display unit 13 and detects that the user's finger 68 has performed a touch operation or touch slide operation on its horizontal surface. The touch panel 15 also detects that the user's finger 68 has approached it by a hover operation or hover slide operation.
  • The touch panel 15 detects that the finger 68 is in proximity when the height z of the finger in a hover operation is equal to or less than a predetermined value zth, or when the capacitance determined according to the height z is equal to or greater than a predetermined value.
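The proximity criterion just described (height at or below zth, or contact at the surface) can be sketched as a small classifier. The threshold value and function names are assumptions for illustration; the patent only says that such a threshold zth exists.

```python
# Illustrative sketch only: classifies the sensed state of a finger as the text
# describes -- touch when the finger contacts the surface (z == 0), proximity
# (hover) when the height z is at or below the threshold zth. The numeric
# threshold and names are assumptions, not values from the patent.

Z_TH_MM = 15.0  # assumed hover detection threshold zth (mm)

def classify_finger_state(z_mm: float, zth_mm: float = Z_TH_MM) -> str:
    """Return 'touch', 'hover', or 'none' for a finger at height z (mm)."""
    if z_mm < 0:
        raise ValueError("height cannot be negative")
    if z_mm == 0:
        return "touch"    # finger contacts the panel surface
    if z_mm <= zth_mm:
        return "hover"    # proximity state: finger within zth of the surface
    return "none"         # too far away to detect

print(classify_finger_state(0.0))   # → touch
print(classify_finger_state(10.0))  # → hover
print(classify_finger_state(40.0))  # → none
```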
  • The power supply control unit 16 is configured using a power supply source (for example, a battery) of the mobile terminal 1 and switches the power state of the mobile terminal 1 on or off according to input operations on the touch panel 15. While the power is on, the power supply control unit 16 supplies power from the power supply source to each unit shown in FIG. 1 so that the mobile terminal 1 can operate.
  • The communication control unit 17 is configured using a wireless communication circuit; it transmits data resulting from processing by the processor 11 via the transmission/reception antenna 17a, and receives data transmitted from a base station (not shown) or another communication terminal.
  • FIG. 1 illustrates only the configuration necessary for the description of each embodiment, including the present embodiment; the mobile terminals 1, 1A, 1B, and 1C in each embodiment may further include, for example, a voice control unit for controlling call voice.
  • FIG. 2 is a block diagram showing a functional configuration of the mobile terminal 1 in each of the first to fifth embodiments.
  • The mobile terminal 1 includes a proximity detection unit 5, a touch detection unit 10, a screen display unit 30, a memory 40, a proximity coordinate extraction unit 51, a touch coordinate extraction unit 52, a state management unit 54, an image button management unit 55, an operation determination unit 56, a display image generation unit 58, an application screen generation unit 59, an image composition unit 60, and an application 65.
  • Application 65 includes a control extension unit 64.
  • The control extension unit 64 includes a state change amount changing unit 61, a change target changing unit 62, and a state change continuation unit 63.
  • The proximity detection unit 5 detects a state in which the user's finger is close to the touch panel 15 through a hover operation or hover slide operation.
  • The proximity detection unit 5 outputs a proximity notification, indicating that the finger has approached the touch panel 15, to the proximity coordinate extraction unit 51.
  • The touch detection unit 10, as a contact detection unit, detects an operation in which a finger touches the touch panel 15 through a touch operation or touch slide operation.
  • The touch detection unit 10 outputs a contact notification, indicating that a finger has touched the touch panel 15, to the touch coordinate extraction unit 52.
  • The proximity detection unit 5 and the touch detection unit 10 can both be configured using the touch panel 15; although they are shown separately in FIG. 2, they may be configured as a single touch panel 15.
  • The screen display unit 30 corresponds to the display unit 13 shown in FIG. 1; it has a function of displaying data on the screen 45, and displays the composite image data output from the image composition unit 60 (described later) on the screen 45.
  • The composite image data is the data obtained when the image composition unit 60 combines the screen data of the application 65 (hereinafter simply called the "application screen") with the image data generated by the display image generation unit 58.
  • The memory 40 corresponds to the RAM 22 or the storage unit 23 shown in FIG. 1 and includes at least an image button database 55a.
  • The image button database 55a stores, for example, screen data and image data used in the application 65, image data generated by the application 65, image data received from a base station (not shown) or another communication terminal, coordinate information of the buttons used in the application 65, and the operation information of the application 65 assigned to each button.
  • The memory 40 may temporarily store information on the proximity coordinates (x, y, z) extracted by the proximity coordinate extraction unit 51 or the touch coordinates (x, y) extracted by the touch coordinate extraction unit 52. In FIG. 2, the arrows from the proximity coordinate extraction unit 51 and the touch coordinate extraction unit 52 to the memory 40 are omitted to avoid cluttering the drawing.
  • The proximity coordinate extraction unit 51 calculates and extracts the proximity coordinates (x, y, z) of the finger with respect to the touch panel 15 based on the proximity notification output from the proximity detection unit 5.
  • The x and y components are coordinate values representing the position on the touch panel 15, and the z component is a coordinate value representing the vertical distance between the finger and the touch panel 15, that is, the height z of the finger above the touch panel 15.
  • The proximity coordinate extraction unit 51 outputs information on the extracted proximity coordinates (x, y, z) to the operation determination unit 56.
  • The touch coordinate extraction unit 52 calculates and extracts the touch coordinates (x, y) at which the finger touched the touch panel 15 based on the contact notification output from the touch detection unit 10.
  • The touch coordinate extraction unit 52 outputs information on the extracted touch coordinates (x, y) to the operation determination unit 56.
  • The user input operations determined by the operation determination unit 56 include a hover operation, a hover slide operation, a touch operation, a touch slide operation, and combinations of these operations, but are not limited to them.
  • The state management unit 54 receives operation determination result information (described later) output from the operation determination unit 56. Based on this information, that is, on whether the content of the user's finger input operation is a hover operation, a hover slide operation, a touch operation, or a touch slide operation, the state management unit 54 determines whether the mobile terminal 1 is in the proximity state, the contact state, or a state in which the proximity state and the contact state coexist.
  • Suppose that, while a hover slide operation is performed with the user's first finger (for example, the index finger 68a; the same applies below), a touch slide operation is performed with a second finger (for example, the thumb 68b; the same applies below), and the state management unit 54 acquires from the operation determination unit 56 operation determination result information indicating that a combination operation of a hover slide operation and a touch slide operation has been performed.
  • The operation of the state management unit 54 when a combination of a hover slide operation and a touch slide operation is used as the user input operation is described below.
  • In this case, the state management unit 54 determines that the state of the mobile terminal 1 has shifted from the proximity state caused by the hover slide operation to a state in which the proximity state caused by the hover slide operation and the contact state caused by the touch slide operation coexist. The state management unit 54 also outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation has been performed.
  • Together with the information on the slide amount (movement amount) of the combination operation, the state management unit 54 outputs a display image generation instruction for generating the display image to be displayed on the screen 45 to the display image generation unit 58.
  • The slide amount (movement amount) of the combination operation is calculated by the operation determination unit 56 and included in the operation determination result information.
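The state transition that the state management unit performs (proximity only → proximity and contact coexisting) can be sketched with a small state machine. The enum, function names, and the idle/contact-only states are assumptions added for completeness; the patent text only names the proximity, contact, and coexistence states.

```python
# Illustrative sketch only: the state transition performed by the state
# management unit -- from a proximity-only state to a state where proximity
# (hover) and contact (touch) coexist. The enum and names are assumptions.
from enum import Enum

class TerminalState(Enum):
    IDLE = "idle"
    PROXIMITY = "proximity"              # hover / hover slide in progress
    CONTACT = "contact"                  # touch / touch slide in progress
    PROXIMITY_AND_CONTACT = "coexist"    # combination operation in progress

def next_state(current: TerminalState, hover_active: bool,
               touch_active: bool) -> TerminalState:
    """Derive the terminal state from which operations are currently sensed."""
    if hover_active and touch_active:
        return TerminalState.PROXIMITY_AND_CONTACT
    if hover_active:
        return TerminalState.PROXIMITY
    if touch_active:
        return TerminalState.CONTACT
    return TerminalState.IDLE

# A hover slide is in progress, then a second finger touches the panel:
s = next_state(TerminalState.IDLE, hover_active=True, touch_active=False)
print(s.value)  # → proximity
s = next_state(s, hover_active=True, touch_active=True)
print(s.value)  # → coexist
```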
  • The image button management unit 55 reads from or writes to the memory 40 information indicating the display coordinates, on the screen 45, of each button constituting the application screen used in the application 65, as well as the image data used in the application 65.
  • The operation determination unit 56 receives the proximity coordinate (x, y, z) information output from the proximity coordinate extraction unit 51 or the touch coordinate (x, y) information output from the touch coordinate extraction unit 52.
  • Based on the input proximity coordinate (x, y, z) information or touch coordinate (x, y) information, the operation determination unit 56 determines the content of the user's finger input operation, that is, which of the hover operation, hover slide operation, touch operation, and touch slide operation was performed, or which operations were combined.
  • For example, the operation determination unit 56 determines whether a combination operation was performed in which a hover slide operation with the user's first finger (index finger 68a) and a touch slide operation with the user's second finger (thumb 68b) move in the same direction and over the same distance. Furthermore, the operation determination unit 56 calculates the movement amount (slide amount) of the first finger and the second finger in the combination operation. The operation determination unit 56 also calculates the finger movement amount (slide amount) when a hover slide operation or touch slide operation is performed by itself.
  • When the hover slide operation by the user's first finger (index finger 68a) and the touch slide operation by the same user's second finger (thumb 68b) are performed on the same screen 45 in the same direction, the operation determination unit 56 outputs to the state management unit 54, as operation determination result information, information indicating that a combination operation of a hover slide operation and a touch slide operation has been performed in the same direction, together with the slide amount (movement amount).
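One way the "same direction, same distance" judgment above could be realized is to compare the two slide vectors. This is a sketch only; the angle and distance tolerances, and the function names, are assumptions not taken from the patent.

```python
# Illustrative sketch only: checks whether a hover slide and a touch slide
# moved in roughly the same direction over roughly the same distance, as the
# operation determination unit is described as doing. Tolerances are assumed.
import math

def is_combined_slide(hover_path, touch_path,
                      angle_tol_rad=0.3, dist_tol_ratio=0.2):
    """hover_path / touch_path: (start, end) pairs of (x, y) coordinates."""
    def vector(path):
        (x0, y0), (x1, y1) = path
        return (x1 - x0, y1 - y0)

    hv, tv = vector(hover_path), vector(touch_path)
    h_len, t_len = math.hypot(*hv), math.hypot(*tv)
    if h_len == 0 or t_len == 0:
        return False
    # Same direction: small angle between the two slide vectors.
    cos_angle = (hv[0] * tv[0] + hv[1] * tv[1]) / (h_len * t_len)
    cos_angle = max(-1.0, min(1.0, cos_angle))
    same_direction = math.acos(cos_angle) <= angle_tol_rad
    # Same distance: the slide lengths agree within a relative tolerance.
    same_distance = abs(h_len - t_len) / max(h_len, t_len) <= dist_tol_ratio
    return same_direction and same_distance

# Two parallel rightward slides of similar length qualify as a combination:
print(is_combined_slide(((0, 0), (100, 0)), ((10, 30), (110, 32))))  # → True
# A perpendicular touch slide does not:
print(is_combined_slide(((0, 0), (100, 0)), ((0, 0), (0, 100))))     # → False
```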
  • Based on the display image generation instruction output from the state management unit 54 and the information on the display range of the screen 45 output from the application screen generation unit 59, the display image generation unit 58 acquires the image data of the application 65 via the image button management unit 55. The display image generation unit 58 generates the display image to be displayed on the screen 45 by cutting out, from the acquired image data, the image in the range corresponding to the display image generation instruction, and outputs the generated display image to the image composition unit 60.
  • The application screen generation unit 59 acquires from the memory 40, via the image button management unit 55, the various data necessary for generating a screen in the application 65, based on the screen generation instruction output from the application 65.
  • The application screen generation unit 59 generates the screen data of the application screen in the application 65 using the acquired data.
  • The application screen generation unit 59 outputs the generated screen data to the image composition unit 60.
  • Although the application screen generation unit 59 and the application 65 are shown as separate components, the application 65 may incorporate the function of the application screen generation unit 59, so that the two are configured as a single application 65.
  • The image composition unit 60 combines the display image data output from the display image generation unit 58 with the screen data of the application screen output from the application screen generation unit 59, and causes the screen display unit 30 to display the result.
  • When the state change amount changing unit 61 acquires from the state management unit 54 the information that a combination operation of a hover slide operation and a touch slide operation has been performed in the same direction and over the same distance, it changes the state change amount of the operation that was being executed in the application 65 immediately before the combination operation was performed.
  • For example, as a change of the operation's state change amount, the state change amount changing unit 61 increases or decreases the movement amount (slide amount) or movement speed (slide speed) of the map image 47 displayed on the screen 45.
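The effect described here (the same finger slide scrolls the map farther once the combination operation is active) can be sketched as a scroll factor that switches with the operation state. The factor values and names below are assumptions for illustration, not values specified by the patent.

```python
# Illustrative sketch only: the combination operation boosts the state change
# amount of the operation in progress -- here, how far a map image scrolls per
# pixel of finger slide. Factor values and names are assumptions.

HOVER_ONLY_FACTOR = 1.0  # scroll factor for a hover slide by itself
COMBINED_FACTOR = 2.0    # assumed boosted factor during the combination operation

def map_scroll_amount(slide_px: float, combined: bool) -> float:
    """Return how far the map image should move for a given finger slide."""
    factor = COMBINED_FACTOR if combined else HOVER_ONLY_FACTOR
    return slide_px * factor

print(map_scroll_amount(120.0, combined=False))  # → 120.0
print(map_scroll_amount(120.0, combined=True))   # → 240.0
```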
  • When the change target changing unit 62 acquires from the state management unit 54 the information that a combination operation of a hover slide operation and a touch slide operation has been performed in the same direction and over the same distance, it changes the change target of the operation that was being executed in the application 65 immediately before the combination operation (for example, the output sound of a video) to another change target.
  • For example, as a change of the operation's change target, the change target changing unit 62 switches from the output sound of the video to the playback speed of the video.
	• When the state change continuation unit 63 acquires, from the state management unit 54, information indicating that the combination operation of the hover slide operation and the touch hold operation has been performed, it maintains the change state of the operation executed in the application 65 immediately before the combination operation was performed, and continues the maintained state.
	• As an example of continuing the state in which the change state of the operation is maintained, when the map image 47 is being moved within the screen 45 by, for example, twice the slide amount of the finger by a hover slide operation, the hover slide operation changes to a hover operation in the combination operation of the hover slide operation and the touch hold operation; nevertheless, by continuing the touch hold operation, the state change continuation unit 63 continues the same operation according to the hover slide operation, that is, the movement of the map image 47 within the screen 45 by twice the slide amount of the finger.
  • FIG. 3A is a diagram illustrating a state where the hover slide operation is performed independently.
  • FIG. 3B is a diagram illustrating a state in which a combination operation of a hover slide operation and a touch slide operation is performed.
	• After the user's finger 68 shown in FIG. 3A starts a hover slide operation on the touch panel 15 (screen 45), as shown in FIG. 3B, another finger (thumb 68b) touches the touch panel 15 and performs a touch slide operation in the same direction and over the same distance as the hover slide operation of the index finger 68a.
  • the mobile terminal 1 changes the state change amount of the operation of the application 65 according to the hover slide operation immediately before the combination operation of the hover slide operation and the touch slide operation is performed.
	• The finger 68 shown in FIG. 3A is the same finger as the index finger 68a shown in FIG. 3B, and is different from the thumb 68b shown in FIG. 3B. In the following description, the finger that performs a hover operation or a hover slide operation is described as the index finger 68a, and the finger that performs a touch operation or a touch slide operation is described as the thumb 68b; however, the present invention is not particularly limited to this assignment.
	• When the hover slide operation is performed independently with the finger 68, the map image 47 displayed on the screen 45 moves (this movement is also referred to as "scrolling") within the screen 45. By this movement, the content of the image displayed in the screen 45 is scrolled and switched.
	• By the hover slide operation of the finger 68, the map image 47 displayed on the screen 45 is moved (scrolled) within the screen 45 in the same direction as the hover slide operation (the direction of the arrow b) by a distance (corresponding to the length of the arrow b) that is, for example, twice the movement amount of the finger 68 (the hover slide amount, the length of the arrow a).
	• By the combination operation of the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b, the map image 47 displayed in the screen 45 is moved in the direction of the arrow b by the same distance as the hover slide amount of the index finger 68a.
  • the position touched by the thumb 68b is represented by a circle 50 (see dotted line), and the same applies to the following embodiments.
  • FIG. 4 is a flowchart for explaining the operation procedure of the mobile terminal 1 in accordance with the combination operation of the hover slide operation and the touch slide operation in the first embodiment.
	• The flowchart shown in FIG. 4 represents the operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 3A and 3B.
  • the state management unit 54 determines whether or not a hover slide operation is being performed with the user's index finger 68 based on the operation determination result information from the operation determination unit 56 (S1). When it is determined that the hover slide operation is being performed, the operation of the mobile terminal 1 proceeds to step S2.
	• When the state management unit 54 determines that the hover slide operation is being performed with the user's index finger 68a (S1, YES), it determines, based on the operation determination result information from the operation determination unit 56, whether a touch slide operation is being performed with the user's thumb 68b (S2).
	• In step S2, the state management unit 54 also determines, based on the operation determination result information of the operation determination unit 56, whether the hover slide operation by the user's index finger 68a and the touch slide operation by the thumb 68b are in the same direction and are performed over the same distance.
	• When the state management unit 54 determines that the touch slide operation is not performed with the user's thumb 68b (S2, NO), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation is not performed.
  • the state change amount changing unit 61 initializes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined to be executed in step S1 (S3).
	• For example, when the map image 47 is displayed in the screen 45 of the display unit 13 of the mobile terminal 1, the state change amount in step S3 is the movement amount (slide amount or scroll amount) of the map image 47, and its initial value is double the hover slide amount.
	• That is, the state change amount changing unit 61 initializes the movement amount (state change amount) by which the map image 47 moves within the screen 45 according to the movement amount (hover slide amount) of the finger 68 when the hover slide operation (see FIG. 3A) is performed alone.
	• When the state management unit 54 determines that the touch slide operation is being performed with the user's thumb 68b (S2, YES), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation is performed.
	• The state change amount changing unit 61 changes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined to be executed in step S1 to, for example, the same size (one time) (S4).
	• Thereby, the movement amount of the index finger 68a becomes equal to the movement amount of the map image 47 displayed in the screen 45. While the map image 47 is scrolled by twice the movement amount of the finger when the hover slide operation is performed alone, by the combination operation of the hover slide operation and the touch slide operation the map image can be scrolled by the same amount as the movement of the index finger 68a performing the hover slide operation, so that fine adjustment of the scrolling of the map image 47 can be performed easily.
	• After step S3 or S4, the operation of the mobile terminal 1 proceeds to step S5.
  • the operation determination unit 56 calculates the slide amount of the hover slide operation determined to be executed in step S1 based on the operation determination instruction output from the state management unit 54 (S5).
  • the operation determination unit 56 outputs information on the slide amount as a calculation result to the state management unit 54.
	• The state management unit 54 multiplies the slide amount output from the operation determination unit 56 by the state change amount obtained in step S3 or S4, and thereby calculates the movement amount (scroll amount) of the map image 47 within the screen 45 corresponding to the slide amount calculated in step S5 (S6).
  • the state management unit 54 outputs a display image generation instruction including information on the calculated movement amount (scroll amount) of the map image to the display image generation unit 58.
	• Based on the display image generation instruction output from the state management unit 54, the display image generation unit 58 generates display image data of the map image 47 after it has moved within the screen 45 by the movement amount (scroll amount) calculated in step S6 (S7).
	• The image composition unit 60 synthesizes the display image data output from the display image generation unit 58 and the screen data of the application screen output from the application screen generation unit 59, and causes the screen display unit 30 to display them (S8). After step S8, the operation of the portable terminal 1 returns to the process of step S1.
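The scroll calculation of steps S1 to S8 above can be illustrated with a minimal sketch; the function name, the factor values, and the use of a simple multiplier are assumptions for illustration only, not part of the claimed configuration.

```python
# Hypothetical sketch of steps S1-S8: the scroll amount applied to the map
# image is the hover slide amount multiplied by a factor that depends on
# whether a same-direction, same-distance touch slide accompanies the hover
# slide operation.

HOVER_ONLY_FACTOR = 2.0   # hover slide alone: map scrolls twice the slide amount (S3)
COMBINED_FACTOR = 1.0     # hover slide + touch slide: same amount, for fine adjustment (S4)

def scroll_amount(hover_slide_amount: float, touch_slide_active: bool) -> float:
    """Return the map scroll amount for one hover slide (cf. steps S5-S6)."""
    factor = COMBINED_FACTOR if touch_slide_active else HOVER_ONLY_FACTOR
    return hover_slide_amount * factor

# Hover slide alone moves the map twice as far; adding the touch slide
# reduces the scroll to the finger's own movement distance.
print(scroll_amount(30.0, touch_slide_active=False))  # 60.0
print(scroll_amount(30.0, touch_slide_active=True))   # 30.0
```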
	• In this way, in the mobile terminal 1 of the present embodiment, when the hover slide operation performed independently moves the map image 47 within the screen 45 by twice the slide amount of the hover slide operation, the combination operation using two fingers makes the movement (scrolling) of the map image 47 within the screen 45 equivalent to the slide amount of the hover slide operation.
	• Thereby, the mobile terminal 1 can finely adjust the movement (scrolling) of the map image 47 in the screen 45 from twice the hover slide amount down to the same amount and, according to the user's input operation on the touch panel, can efficiently select the control content for the content (map image) and give comfortable operability.
	• In the present embodiment, the movement amount of the map image 47 within the screen 45 when the hover slide operation is performed alone has been described as twice the hover slide amount, but it is not particularly limited to twice.
	• As the state change amount, the movement amount of the map image within the screen 45 has been described, but other state change amounts can be applied similarly.
	• In the second embodiment, when the combination operation of the hover slide operation and the touch slide operation is performed, an operation change target in the application 65 is changed to another change target.
	• Examples of changing the operation change target in the application 65 to another change target include changing the luminance, which has been changed according to the hover slide amount of the hover slide operation, to saturation, or, during reproduction of video data, changing the volume (output sound), which has been changed according to the hover slide amount of the hover slide operation, to the playback speed.
	• Switching of the operation change target in the application 65 is not limited to these examples.
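The change-target switching described above can be sketched as a simple mapping; the table entries and identifiers below are illustrative assumptions drawn from the examples in this section, not an exhaustive or claimed set.

```python
# Hypothetical sketch: the quantity that a hover slide adjusts is swapped
# for another target when a same-direction, same-distance touch slide joins
# it. Target names are illustrative only.
SINGLE_TO_COMBINED = {
    "luminance": "saturation",        # image adjustment example
    "volume": "playback_speed",       # video reproduction example
}

def change_target(current: str, touch_slide_active: bool) -> str:
    """Return the change target that the hover slide operation controls."""
    if touch_slide_active:
        # Combination operation: switch to the alternate target, if defined.
        return SINGLE_TO_COMBINED.get(current, current)
    return current  # hover slide alone keeps the current target

print(change_target("volume", touch_slide_active=True))   # playback_speed
print(change_target("volume", touch_slide_active=False))  # volume
```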
	• Since the portable terminal of the second embodiment has a configuration similar to that of the portable terminal 1 of the first embodiment, the same reference numerals are used for the same components as those of the first embodiment, description thereof is omitted, and only the different contents will be described.
  • FIG. 5 is a flowchart for explaining the operation procedure of the mobile terminal 1 according to the combination operation of the hover slide operation and the touch slide operation in the second embodiment.
	• Description of operations that are the same as in the first embodiment is omitted.
	• When the state management unit 54 determines that the touch slide operation is not performed with the user's thumb 68b (S2, NO), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation is not performed.
  • the change target changing unit 62 initializes the change target of the operation of the application 65 corresponding to the hover slide operation determined to be executed in step S1 (S3A).
	• In step S3A, the change target changing unit 62 initializes the change target according to the hover slide operation of the finger 68 when the hover slide operation (see FIG. 3A) is performed alone.
	• For example, the change target changing unit 62 initializes the change target according to the hover slide operation of the finger 68 from the current "selection processing of the video data list to be played back" to "volume adjustment processing during playback". Note that the initialized processing content is preferably determined according to the application 65; "volume adjustment processing during playback" is merely an example.
	• When the state management unit 54 determines that the touch slide operation is being performed with the user's thumb 68b (S2, YES), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation is performed.
  • the change target changing unit 62 changes the change target of the operation of the application 65 corresponding to the hover slide operation determined to be executed in step S1 to another change target (S4A).
	• In step S4A, the change target changing unit 62 changes the change target according to the hover slide operation of the finger 68 to another change target when the combination operation of the hover slide operation and the touch slide operation is performed.
	• For example, the change target changing unit 62 changes the change target according to the hover slide operation of the finger 68 from the current "selection processing of the video data to be played back" to "double-speed playback processing of the video data".
	• After step S3A or S4A, the operation of the mobile terminal 1 proceeds to step S5.
  • the operation determination unit 56 calculates the slide amount of the hover slide operation determined to be executed in step S1 based on the operation determination instruction output from the state management unit 54 (S5).
  • the operation determination unit 56 outputs information on the slide amount as a calculation result to the state management unit 54.
	• Based on the information on the slide amount output from the operation determination unit 56, the state management unit 54 outputs to the application 65 a change target reflection instruction for reflecting, in the operation of the application 65, the change according to the slide amount for the change target obtained in step S3A or S4A.
	• Based on the change target reflection instruction output from the state management unit 54, the change target changing unit 62 reflects the change according to the slide amount for the change target obtained in step S3A or S4A in the operation of the application 65 (S8A). After step S8A, the operation of the mobile terminal 1 returns to step S1.
	• In this way, when, for example, the list of video data to be reproduced is being selected by the hover slide operation performed alone, the mobile terminal 1, by the combination operation of the hover slide operation and the touch slide operation using two fingers, changes the selection processing of the video data list to the volume adjustment processing during reproduction of the currently reproduced video data.
	• Thereby, the portable terminal 1 can easily change the operation change target in the application 65 by the combination operation of the hover slide operation and the touch slide operation, and can give comfortable operability to the user.
	• Since the portable terminal of the third embodiment has the same configuration as that of the first embodiment, the same reference numerals are used for the same components as those of the first embodiment, description thereof is omitted, and only the different contents will be described.
  • FIG. 6 is a diagram illustrating a combination operation of a hover slide operation and a touch slide operation in the third embodiment.
  • FIG. 6A is a diagram illustrating a state at the start of the combination operation.
	• FIG. 6B is a diagram illustrating a state in which the inter-finger spacing between the finger performing the hover slide operation (index finger 68a) and the finger performing the touch slide operation (thumb 68b) is shortened after the combination operation is started.
  • FIG. 6C is a diagram illustrating a state where the locus of the combination operation is a circle. Also in this embodiment, the state change amount of the operation in the application 65 is the amount of movement in the screen 45 of the map image 47.
	• Suppose that, as shown in FIG. 6A, the index finger 68a is in proximity to the touch panel 15 and the thumb 68b is in touch (contact) with it.
	• Assume that the combination operation of the hover slide operation and the touch slide operation using the index finger 68a and the thumb 68b is performed in the direction of the arrow a1 while the distance m between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is maintained.
	• In this case, the movement amount (slide amount or scroll amount) by which the map image 47 moves within the screen 45 is maintained at a constant movement amount (see arrow b1). That is, the movement amount of the map image 47 within the screen 45 is equivalent to the hover slide amount of the hover slide operation of the index finger 68a.
	• On the other hand, when the inter-finger spacing is shortened as shown in FIG. 6B, the movement amount (slide amount) by which the map image 47 moves within the screen 45 increases, unlike in the first embodiment (see arrow b2). That is, the movement amount of the map image 47 in the screen 45 becomes, for example, twice the hover slide amount of the hover slide operation of the index finger 68a.
	• The combination operation of the hover slide operation and the touch slide operation is not limited to operations drawing a linear locus (see FIGS. 6A and 6B); it may be an operation drawing a circular locus continuously (see FIG. 6C).
	• Also in this case, the distance between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is maintained, widened, or narrowed, and the state change amount of the operation in the application 65 is changed according to the interval.
	• Thereby, by the combination operation drawing a circular locus shown in FIG. 6C, the mobile terminal 1 can adjust the volume at the time of reproducing video data according to the rotation amount, or can cyclically switch and display a plurality of thumbnails displayed in the screen 45.
  • FIG. 7 is a flowchart for explaining the operation procedure of the mobile terminal 1 according to the combination operation of the hover slide operation and the touch slide operation in the third embodiment.
	• The flowchart shown in FIG. 7 represents the operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 6A and 6B, for example.
	• Description of operations that are the same as in the first embodiment is omitted.
	• When the state management unit 54 determines that the touch slide operation is not performed with the user's thumb 68b (S2, NO), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation is not performed.
  • the state change amount changing unit 61 initializes the state change amount of the operation of the application 65 according to the hover slide operation determined to be executed in step S1 (S3).
	• When the state management unit 54 determines that the touch slide operation is being performed with the user's thumb 68b (S2, YES), it determines whether the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b are in the same direction, and whether the interval between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is within a predetermined specified value (S2A, S2B). In FIGS. 4 and 5, illustration of the operations of steps S2A and S2B is omitted, but the contents of the operations of steps S2A and S2B shown in FIG. 7 are included in step S2.
	• When the state management unit 54 determines that the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b are in the same direction, and that the interval between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is within the specified value (S2A, YES; S2B, YES), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation is in the same direction and that the inter-finger interval is within the specified value.
	• The state change amount changing unit 61 changes the state change amount of the operation of the application 65 according to the hover slide operation determined to be executed in step S1 to a state change amount according to the inter-finger interval (S4B; see FIG. 6B).
	• The specified value used in the determination in step S2B is set to a range in which the positions of the index finger 68a and the thumb 68b are considered appropriate (touch acceptable range), and is determined by relative coordinates starting from the position on the touch panel 15 (screen 45) vertically below the index finger 68a (see FIG. 8).
	• FIG. 8 is a diagram showing the touch acceptable range 45a set in accordance with the position of the index finger 68a on the screen 45.
	• The touch receivable range 45a is a range (for example, circular or elliptical) in which the distance (interval) between the position on the touch panel 15 (screen 45) vertically below the hover-operating index finger 68a and the touch position of the thumb 68b on the touch panel 15 (screen 45) is within the specified value.
	• When the position of the index finger 68a, that is, the position on the touch panel 15 (screen 45) vertically below the hover-operating index finger 68a, is the position of the circle 50A shown in FIG. 8, and the touch position of the thumb 68b is included in the touch receivable range 45a (see the hatched portion in FIG. 8), it is determined that the inter-finger interval between the index finger 68a and the thumb 68b is within the specified value.
	• In step S2B, if the inter-finger spacing exceeds the specified value, the operation of the mobile terminal 1 proceeds to step S3 on the assumption that an erroneous operation by the user is possible. Since the processing after step S5 is the same as that of the first embodiment, description thereof is omitted.
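The step-S2B check against the touch acceptable range can be sketched as follows; the circular range, the 100 px specified value, and the function name are illustrative assumptions (the range may equally be elliptical, as noted above).

```python
import math

# Hypothetical sketch of the step-S2B check: the touch is accepted only if
# the thumb's touch position lies within a circular range centered on the
# point of the screen vertically below the hovering index finger.

def within_touch_range(hover_xy, touch_xy, limit: float) -> bool:
    """True if the finger-to-finger interval is within the specified value."""
    dx = touch_xy[0] - hover_xy[0]
    dy = touch_xy[1] - hover_xy[1]
    return math.hypot(dx, dy) <= limit

# With an assumed specified value of 100 px, a touch 60 px away is accepted,
# while one 150 px away is treated as a possible erroneous operation.
print(within_touch_range((200, 300), (260, 300), limit=100))  # True
print(within_touch_range((200, 300), (350, 300), limit=100))  # False
```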
	• In this way, the mobile terminal 1 can increase the movement amount (state change amount) of the map image within the screen 45 when the inter-finger interval between the hover-operating index finger 68a and the touch-slide-operating thumb 68b is reduced, and can reduce the movement amount (state change amount) when the inter-finger interval is increased, so that fine adjustment of the state change amount can be performed easily.
	• Note that the inter-finger interval and the magnitude of the state change amount may be in a directly proportional or inversely proportional relationship.
	• For example, when the inter-finger interval is reduced, the movement amount (state change amount) of the map image in the screen 45 may instead be decreased.
	• Alternatively, the movement amount (state change amount) of the map image in the screen 45 may be further changed from the time when the inter-finger interval is changed.
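An inversely proportional mapping from inter-finger interval to state change amount, one of the relationships mentioned above, might look like this; the reference interval and the clamping behavior are assumptions for illustration.

```python
# Hypothetical sketch: the scroll multiplier grows as the fingers move
# closer together (as in FIG. 6B) and shrinks as they move apart.

REFERENCE_INTERVAL = 100.0  # px; interval at which the multiplier is 1.0 (assumed)

def scroll_multiplier(inter_finger_interval: float) -> float:
    """Inversely proportional mapping from finger interval to multiplier."""
    interval = max(inter_finger_interval, 1.0)  # clamp to avoid division by zero
    return REFERENCE_INTERVAL / interval

print(scroll_multiplier(100.0))  # 1.0 (reference spacing)
print(scroll_multiplier(50.0))   # 2.0 (closer fingers scroll farther)
```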
	• By requiring that the sliding directions of the index finger 68a performing the hover slide operation and the thumb 68b performing the touch slide operation be the same, or by providing the touch acceptable range 45a, the mobile terminal 1 can eliminate erroneous operations by the user and improve user operability.
	• The mobile terminal 1 may determine the touch receivable range 45a of the touch operation after determining the user's dominant hand. For example, when the mobile terminal 1 obtains touch coordinates (x1, y1) and proximity coordinates (x2, y2, z2), the user's dominant hand may be determined according to whether the slope of the straight line connecting the touch coordinates (x1, y1) and the projection (x2, y2) of the proximity coordinates onto the xy plane is positive or negative. In this case, the mobile terminal 1 determines right-handedness if the slope of the straight line is in the rightward monotonically increasing direction (positive direction), and left-handedness if it is in the leftward monotonically increasing direction (negative direction).
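The dominant-hand heuristic above can be sketched as follows; the coordinate values and the handling of a vertical (undefined-slope) line are assumptions for illustration.

```python
# Hypothetical sketch: the sign of the slope of the line through the touch
# point (x1, y1) and the xy-projection (x2, y2) of the proximity point
# selects right- or left-handed.

def dominant_hand(touch, proximity):
    """Return 'right' for a positive slope, 'left' for a negative slope."""
    (x1, y1), (x2, y2) = touch, proximity[:2]
    if x2 == x1:
        return "unknown"  # vertical line: slope undefined (assumed handling)
    slope = (y2 - y1) / (x2 - x1)
    return "right" if slope > 0 else "left"

print(dominant_hand((100, 100), (160, 140, 20)))  # right (positive slope)
print(dominant_hand((100, 100), (40, 140, 20)))   # left (negative slope)
```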
  • the setting of the touch receivable range 45a can be applied to the first and second embodiments described above. In each of the following embodiments, the touch acceptable range 45a may be applied.
	• Since the portable terminal of the fourth embodiment has a configuration similar to that of the portable terminal 1 of the first embodiment, the same reference numerals are used for the same components as those of the first embodiment, description thereof is omitted, and only the different contents will be described.
  • FIG. 9 is a diagram illustrating a combination operation of a hover slide operation and a touch hold operation in the fourth embodiment.
  • FIG. 9A is a diagram illustrating a state where the hover slide operation is performed independently.
  • FIG. 9B is a diagram illustrating a state where a combination operation of a hover slide operation and a touch hold operation is performed.
	• After the index finger 68a shown in FIG. 9A starts a hover slide operation on the touch panel 15 (screen 45), the thumb 68b touches the touch panel 15 as shown in FIG. 9B and the touch hold operation is continued.
	• Thereby, even though the index finger 68a is in the state of the hover operation, the map image 47 displayed in the screen 45 continues to move (scroll) in the direction in which the hover slide operation was performed (see reference sign d).
  • FIG. 10 is a flowchart for explaining the operation procedure of the mobile terminal 1 according to the combination operation of the hover slide operation and the touch hold operation in the fourth embodiment.
	• The flowchart shown in FIG. 10 represents the operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 9A and 9B.
  • the state management unit 54 determines whether or not a hover slide operation is being performed with the user's index finger 68 based on the operation determination result information from the operation determination unit 56 (S11). When it is determined that the hover slide operation is performed, the operation of the mobile terminal 1 proceeds to step S12.
  • the state management unit 54 outputs (notifies) that the hover slide operation is being performed to the application 65.
  • the state change amount changing unit 61 executes a change process in the operation of the application 65 according to the hover slide operation without changing the state change amount of the operation of the application 65 according to the hover slide operation (S12).
	• In step S12, as an example of the operation of the application 65 in response to the hover slide operation, the mobile terminal 1 scrolls the map image 47 displayed in the screen 45 based on the hover slide amount of the hover slide operation.
  • the state change amount changing unit 61 holds the state change amount (change condition) of the operation of the application 65 according to the hover slide operation in the memory 40 (S13).
  • the change condition may be a state change amount of the operation of the application 65 according to the hover slide operation or a change target of the operation of the application 65 according to the hover slide operation.
  • the state management unit 54 determines whether or not the hover slide operation has stopped and the state of the hover operation has been entered based on the operation determination result information from the operation determination unit 56 (S14). When it is determined that the hover slide operation is not stopped, the operation of the mobile terminal 1 returns to the operation of step S12.
  • the state management unit 54 determines whether the touch hold operation is performed simultaneously with the stop of the hover slide operation (S15).
	• Note that the term "simultaneously" does not strictly limit the timing between the stop of the hover slide operation and the detection of the touch hold operation; it may include, for example, a case where the touch hold operation is detected immediately before the hover slide operation stops.
	• When the state management unit 54 determines that the touch hold operation is not performed simultaneously (S15, NO), it stops the change process of the operation of the application 65 according to the hover slide operation (S16). For example, the mobile terminal 1 stops the scrolling of the map image 47 in the screen 45 executed in step S12 (S16). After step S16, the operation of the portable terminal 1 returns to the operation of step S11.
	• When the state management unit 54 determines that the touch hold operation has been performed simultaneously (S15, YES), the change process of the operation of the application 65 according to the hover slide operation is continued (S17).
	• Thereby, even when the hover slide operation is stopped and becomes a hover operation, if the touch hold operation is performed simultaneously with the stop of the hover slide operation, the mobile terminal 1 can automatically continue the screen scrolling that was performed while the hover slide operation was performed independently.
  • the state management unit 54 determines whether or not the touch hold operation is continued based on the information on the touch coordinates (x, y) output from the touch coordinate extraction unit 52 (S18). When it is determined that the touch hold operation is continued (S18, YES), the operation of the mobile terminal 1 returns to the operation of step S17.
	• When the state management unit 54 determines that the touch hold operation is not continued (S18, NO), it stops the change process of the operation of the application 65 corresponding to the hover slide operation (S19). For example, the mobile terminal 1 stops the scrolling of the map image 47 in the screen 45 executed in step S12 (S19).
	• In this way, the mobile terminal 1 can easily scroll the map image 47 in the screen 45 also by using the combination operation of the hover slide operation and the touch hold operation. Further, since the mobile terminal 1 does not require the hover slide operation to continue during the touch hold operation and only requires the hover operation, no malfunction occurs due to detection of the finger's return motion, which would arise if the hover slide operation were performed continuously.
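The continuation behavior of steps S12 and S14 to S19 can be condensed into a minimal sketch; the function name is an assumption, and the simultaneity condition of step S15 is abstracted into the `touch_holding` input for brevity.

```python
# Hypothetical sketch of steps S14-S19: scrolling started by a hover slide
# keeps running while a touch hold replaces the slide, and stops once both
# the slide and the hold have ended.

def scrolling_active(hover_sliding: bool, touch_holding: bool) -> bool:
    """Map the two finger states onto the scroll state of the map image."""
    if hover_sliding:
        return True       # S12: the hover slide drives the scroll directly
    return touch_holding  # S17: a touch hold continues it; S19: else stop

print(scrolling_active(True, False))   # True (hover slide alone)
print(scrolling_active(False, True))   # True (continued by touch hold)
print(scrolling_active(False, False))  # False (both ended: scroll stops)
```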
	• Note that the mobile terminal 1 of the present embodiment is not limited to continuing the movement state (screen scroll state) of the map image 47 within the screen 45; for example, the movement amount (state change amount, e.g., the screen scroll amount) of the map image 47 within the screen 45 may be continued (maintained). For example, if a fast hover slide operation is performed with the index finger 68a, the portable terminal 1 continues the change corresponding to the fast hover slide operation while the thumb 68b is touch-held, and if a slow hover slide operation is performed with the index finger 68a, it continues the change corresponding to the slow hover slide operation during the touch hold by the thumb 68b.
	• Since the mobile terminal of the fifth embodiment has the same configuration as the mobile terminal 1 of the fourth embodiment, the same reference numerals are used for the same components as those of the mobile terminal 1 of the fourth embodiment, description thereof is omitted, and only the different contents will be described.
  • FIG. 11 is a diagram illustrating a combination operation of a hover slide operation and a touch hold operation according to the fifth embodiment.
  • FIG. 11A is a diagram illustrating a state where a combination operation of a hover slide operation and a touch hold operation is performed.
  • FIG. 11B is a diagram illustrating a state in which the inter-finger interval between the finger performing the hover operation and the finger performing the touch hold operation after the start of the combination operation is shortened.
  • assume that the thumb 68b touches the screen 45, so that the movement (scrolling) of the map image 47 within the screen 45 continues.
  • the inter-finger spacing between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the touch position of the thumb 68b on the touch panel 15 (screen 45) is changed.
  • in accordance with the change in the inter-finger spacing, the state change amount of the operation of the application 65 increases or decreases (see FIG. 11B).
  • as an example of the state change amount, the amount of movement of the map image 47 within the screen 45 increases or decreases.
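The relation between the inter-finger spacing and the movement amount can be sketched as follows. The proportional scaling rule and all names are assumptions for illustration; the patent only states that the state change amount increases or decreases with the spacing.

```python
import math

# Hypothetical sketch: scale the scroll amount by the change in the spacing
# between the hovering index finger and the touch-holding thumb.
def spacing(hover_xy, touch_xy):
    # Distance on the panel between the point vertically below the hover
    # finger and the thumb's touch point.
    return math.hypot(hover_xy[0] - touch_xy[0], hover_xy[1] - touch_xy[1])

def scaled_scroll(base_dx, base_dy, initial_spacing, current_spacing):
    if initial_spacing <= 0:
        return (base_dx, base_dy)
    factor = current_spacing / initial_spacing  # <1 shrinks, >1 grows
    return (base_dx * factor, base_dy * factor)
```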
  • FIG. 12 is a flowchart illustrating an operation procedure of the mobile terminal 1 according to a combination operation of the hover slide operation and the touch hold operation in the fifth embodiment.
  • FIG. 12A shows the overall operation.
  • FIG. 12B is a diagram showing details of the change continuation process.
  • the flowchart shown in FIG. 12 represents an operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 11 (A) and 11 (B).
  • the description of operations identical to those of the preceding embodiment is omitted.
  • in step S17A, change continuation processing is performed by the state management unit 54 and the control extension unit 64 (S17A). Details of the change continuation processing will be described with reference to FIG. 12B.
  • in FIG. 12B, the state management unit 54 determines whether or not the touch position of the thumb 68b has moved, based on the operation determination result information from the operation determination unit 56 (S21).
  • when the touch position has moved in the direction that shortens the inter-finger spacing, the state management unit 54 outputs, to the application 65, information indicating that the state change amount of the operation of the application 65 in response to the hover slide operation is to be reduced.
  • the state change amount changing unit 61 reduces the state change amount of the operation of the application 65 in response to the hover slide operation based on the information output from the state management unit 54 (S22). After step S22, the operation of the mobile terminal 1 proceeds to step S24.
  • when the touch position has moved in the direction that widens the inter-finger spacing, the state management unit 54 outputs, to the application 65, information indicating that the state change amount of the operation of the application 65 in response to the hover slide operation is to be increased.
  • the state change amount changing unit 61 increases the state change amount of the operation of the application 65 in response to the hover slide operation based on the information output from the state management unit 54 (S23). After step S23, the operation of the mobile terminal 1 proceeds to step S24.
  • when there is no movement of the touch position, that is, when the touch hold operation is continued at the position touched by the thumb 68b (S21, no movement), the operation of the mobile terminal 1 proceeds to step S24.
  • step S24 the state management unit 54 continues the process of changing the operation of the application 65 according to the hover slide operation according to the change condition held in step S13 (S24).
  • thereby, even when the hover slide operation is stopped and replaced by a hover operation while a touch hold operation is performed at the same time, the mobile terminal 1 can automatically continue the screen scrolling that was started by the hover slide operation performed alone.
  • the state management unit 54 determines whether or not the touch hold operation is continued based on the operation determination result information from the operation determination unit 56 (S25). When it is determined that the touch hold operation is continued (S25, YES), the operation of the mobile terminal 1 returns to the operation of step S21. When it is determined that the touch hold operation is not continued (S25, NO), the change continuation process of the mobile terminal 1 is finished, and the operation of the mobile terminal 1 proceeds to step S19.
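One iteration of the change continuation loop (steps S21 to S25) described above might look like the following sketch. The halving/doubling factors are invented placeholders for "reduce" and "increase"; the patent does not specify the exact amounts.

```python
# Hypothetical sketch of one pass through the change continuation loop:
# S21 checks the thumb's movement, S22/S23 adjust the state change
# amount, S24 continues the change, S25 checks the touch hold.
def change_continuation(amount, spacing_delta, touch_held):
    """Return (adjusted amount, whether the loop continues)."""
    if spacing_delta < 0:      # thumb moved closer: reduce (S22)
        amount *= 0.5
    elif spacing_delta > 0:    # thumb moved away: increase (S23)
        amount *= 2.0
    # spacing_delta == 0: touch hold continued in place, keep amount (S24)
    return amount, touch_held  # loop back to S21 only while held (S25)
```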
  • the state management unit 54 stops the change process of the operation of the application 65 according to the hover slide operation (S19).
  • for example, the mobile terminal 1 stops the scrolling of the map image 47 in the screen 45 that was started in step S12 (S19).
  • as described above, compared with the mobile terminal 1 of the fourth embodiment, the mobile terminal 1 of the present embodiment allows the user to adjust the state change amount of the operation of the application 65 simply by moving the thumb 68b, on which the touch hold operation has been continued, toward or away from the index finger 68a performing the hover slide operation. Thus, the state change amount can be easily adjusted, and the operability for the user can be improved.
  • the mobile terminal of the present embodiment designates a selection-target candidate by a hover operation of the index finger 68a, and confirms the designated item as the selection target by a touch operation of the thumb 68b.
  • FIG. 13 is a block diagram illustrating a functional configuration of the mobile terminal 1A according to the sixth embodiment.
  • the same components as those of the mobile terminal 1 shown in FIG. 2 are denoted by the same reference numerals, description thereof is omitted, and different contents are described.
  • a mobile terminal 1A shown in FIG. 13 includes a proximity detection unit 5, a touch detection unit 10, a screen display unit 30, a memory 40, a proximity coordinate extraction unit 51, a touch coordinate extraction unit 52, a state management unit 54, an image button management unit 55, An operation determination unit 56, an indicator state management unit 67, an application screen generation unit 59, and an application 65 are included.
  • the state management unit 54 determines whether or not the user's index finger 68a is in the proximity state by a hover operation, and whether or not the user's thumb 68b is in the contact state by a touch operation. Further, it determines whether or not the proximity state and the contact state coexist, that is, whether the thumb 68b touches the screen while the index finger 68a is in the proximity state by the hover operation.
  • when the state management unit 54 determines that the user's index finger 68a is hovering, it outputs information indicating that the index finger 68a is hovering to the application 65 and the indicator state management unit 67.
  • the indicator state management unit 67 outputs, to the application screen generation unit 59, an indicator generation instruction for displaying a pointer as an indicator that indicates the user's selection-target candidate at the position on the touch panel 15 (screen 45) vertically below the finger (for example, the index finger 68a) performing the hover operation.
  • the indicator generation instruction includes a position (x coordinate and y coordinate) on the touch panel 15 where the indicator is displayed, and the type of the indicator.
  • as the type of indicator included in the indicator generation instruction, the indicator state management unit 67 may use a cursor in addition to the pointer, or, when a plurality of items are displayed on the screen 45, a focus that explicitly identifies the selection target.
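A possible shape for the indicator generation instruction (display position plus indicator type: pointer, cursor, or focus) is sketched below. The class and field names are hypothetical; the patent only specifies that the instruction carries the panel position and the indicator type.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical sketch of the indicator generation instruction.
class IndicatorType(Enum):
    POINTER = "pointer"
    CURSOR = "cursor"
    FOCUS = "focus"

@dataclass
class IndicatorInstruction:
    x: int               # panel position where the indicator is displayed
    y: int
    kind: IndicatorType  # which indicator to draw

def make_instruction(proximity_xyz, kind=IndicatorType.POINTER):
    # The indicator is displayed at the panel position vertically below
    # the hovering finger, so the z coordinate (height) is dropped.
    x, y, _z = proximity_xyz
    return IndicatorInstruction(x, y, kind)
```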
  • based on the indicator generation instruction output from the indicator state management unit 67, the application screen generation unit 59 generates an application screen in which the indicator is displayed at the position on the touch panel 15 (screen 45) vertically below the finger (for example, the index finger 68a) performing the hover operation, and displays it on the screen display unit 30. Note that the application screen generation unit 59 acquires the indicator via the image button management unit 55 when, for example, the pointer shape is stored as an indicator in the image button database 55a.
  • although the configuration of the control extension unit 64 is not illustrated in FIG. 13, the application 65 may or may not include the control extension unit 64.
  • FIG. 14 is a diagram illustrating a combination operation of a hover operation and a touch operation in the sixth embodiment.
  • FIG. 14A is a diagram illustrating a state where the hover operation is performed independently.
  • FIG. 14B is a diagram illustrating a state where a combination operation of a hover operation and a touch operation is performed.
  • a plurality of buttons 81 as selectable items are displayed on the screen 45 (touch panel 15) shown in FIG. 14A.
  • when a specific button (for example, the letter “c”) among the plurality of buttons 81 is designated by the hover operation of the index finger 68a, a pointer 85 that makes it possible to identify the specific button is displayed around the specific button or over a part or all of the specific button.
  • in FIG. 14B, the specific button (for example, the letter “c”) designated as the selection target and surrounded by the pointer 85 is selected.
  • alternatively, the selection of a plurality of buttons 81 designated in advance may be confirmed by the touch operation of the thumb 68b on the screen 45.
  • FIG. 15 is a flowchart illustrating an operation procedure of the mobile terminal 1A according to a combination operation of a hover operation and a touch operation according to the sixth embodiment.
  • the state management unit 54 determines whether or not a hover operation is being performed with the user's index finger 68a, based on the operation determination result information from the operation determination unit 56 (S31). When it is determined that the hover operation is being performed, the operation of the mobile terminal 1A proceeds to step S32.
  • when the state management unit 54 determines that the hover operation is being performed with the user's index finger 68a (S31, YES), it determines, based on the operation determination result information from the operation determination unit 56, whether or not a touch operation has been performed with the user's thumb 68b (S32).
  • in step S32, when the determination that the touch operation has not been performed is made for the first time, the operation of the mobile terminal 1A returns to step S31; when that determination is made for the second or subsequent time, the operation of the mobile terminal 1A returns to step S32.
  • the number of times it has been determined that the touch operation has not been performed is temporarily stored in the RAM 22, and based on this count, the operation determination unit 56 determines whether to perform the hover operation determination of step S31 or the touch operation determination of step S32.
  • when the state management unit 54 determines that the touch operation has been performed with the user's thumb 68b (S32, YES), it determines whether or not a button exists at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33).
  • the state management unit 54 searches the image button database 55a via the image button management unit 55 and determines whether or not a button exists at the proximity position.
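The hit test of step S33 amounts to checking whether the panel point vertically below the proximity coordinates falls inside any button's bounds. The sketch below assumes a simple rectangle record per button; the actual layout of the image button database 55a is not specified by the patent.

```python
# Hypothetical sketch of the step-S33 hit test.
def button_at(proximity_xyz, buttons):
    """buttons: list of dicts with 'label', 'x', 'y', 'w', 'h' (panel px)."""
    x, y, _z = proximity_xyz       # drop the height above the panel
    for b in buttons:
        if b["x"] <= x < b["x"] + b["w"] and b["y"] <= y < b["y"] + b["h"]:
            return b               # the button at the proximity position
    return None                    # S33, NO: no button under the finger
```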
  • when it is determined that no button exists at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33, NO), the operation of the mobile terminal 1A proceeds to step S35.
  • when the state management unit 54 determines that a button exists at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33, YES), it outputs information indicating that the type (information) of the button at the position designated by the hover operation is retained to the indicator state management unit 67 (S34). As an example of the operation of step S34, the state management unit 54 stores the type (information) of the button at the position designated by the hover operation in the RAM 22 or the memory 40, and further displays a pointer (indicator) so that the user can explicitly recognize that the button is designated as the selection target.
  • the state management unit 54 outputs, to the indicator state management unit 67, information indicating that the type of the button at the position designated by the hover operation is retained. Based on this information, the indicator state management unit 67 outputs, to the application screen generation unit 59, an indicator generation instruction for displaying a pointer as an indicator that indicates the user's selection-target candidate at the position on the touch panel 15 (screen 45) vertically below the finger (index finger 68a) performing the hover operation.
  • the application screen generation unit 59 generates an application screen in which the indicator corresponding to the button is displayed at the position on the touch panel 15 (screen 45) vertically below the finger (index finger 68a) performing the hover operation, and displays it on the screen display unit 30.
  • next, the state management unit 54 determines whether or not the hover operation is continued with the user's index finger 68a, based on the information of the proximity coordinates (x, y, z) output from the proximity coordinate extraction unit 51 (S35). When it is determined that the hover operation is continued with the user's index finger 68a (S35, YES), the operation of the mobile terminal 1A returns to step S32.
  • when the state management unit 54 determines that the hover operation is not being performed with the user's index finger 68a (S35, NO), it causes the operation corresponding to the operation after the hover operation to be executed (S36). For example, when a touch operation is performed after the hover operation, the operation corresponding to the touched position is executed. When it is determined in step S33 that a button exists at the proximity position, as an example of the operation of step S36, the state management unit 54 executes the operation corresponding to the type (information) or function of the button designated by the hover operation (S36).
  • as described above, when a plurality of items (buttons) are displayed in the screen 45, the mobile terminal 1A designates the button to be selected by the hover operation of the index finger 68a and confirms the selection of the designated button by the touch operation of the thumb 68b. Thereby, the mobile terminal 1A can correctly select even a small button displayed in the screen 45 and start the process corresponding to the button.
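The designate-then-confirm flow of FIG. 15 can be summarized as a two-step controller: the hover operation retains a candidate (steps S33/S34) and the touch of the other finger confirms it (step S36). All names in the sketch are illustrative.

```python
# Hypothetical sketch of the hover-designate / touch-confirm flow.
class SelectionController:
    def __init__(self):
        self.candidate = None   # button designated by the hover operation

    def on_hover(self, button_or_none):
        # S33/S34: retain the button under the hovering finger (if any).
        self.candidate = button_or_none

    def on_touch(self):
        # S36: the touch of the other finger confirms the retained candidate.
        confirmed, self.candidate = self.candidate, None
        return confirmed
```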
  • the mobile terminal 1A is not limited to designating a button by a hover operation and confirming its selection by a touch operation; by the combination of a hover operation and a touch operation, the operation state of the application 65 may be changed or initialized, or the change amount or the change target may be changed as in the above-described embodiments.
  • the mobile terminal 1A may determine that the combination of a hover operation and a touch operation has been performed only when the touch operation is performed within the illustrated touch acceptable range 45a. Thereby, the mobile terminal 1A can eliminate erroneous detection of the combination of a hover operation and a touch operation.
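The touch acceptable range check can be sketched as a simple bounds test. Representing the range 45a as an axis-aligned rectangle is an assumption for illustration.

```python
# Hypothetical sketch: accept the thumb's touch as part of a combination
# operation only when it falls inside the touch acceptable range (45a).
def is_combination_touch(touch_xy, accept_range):
    """accept_range: (x, y, w, h) of the touch acceptable range in panel px."""
    x, y, w, h = accept_range
    tx, ty = touch_xy
    return x <= tx < x + w and y <= ty < y + h
```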
  • the movement (scrolling) of the screen 45 on which the map image 47 is displayed is shown as an example of the operation for the hover slide operation, but a photograph or a handwritten image may be used instead of the map image 47.
  • as the operation corresponding to the hover slide operation, scrolling (selection) of a list item of video data or audio data, adjustment of the volume (output audio), or content reproduction, stop, fast forward, rewind, and the like may also be used.
  • examples of the change target changed by the combination of a hover slide operation and a touch slide operation include volume, luminance, saturation, and transparency; in content reproduction (for example, of video data), double-speed reproduction is another example.
  • by the combination of a hover slide operation and a touch hold operation, the intermediate state of the change in the operation of the application 65 in response to the hover slide operation may be stored, and the operation may later be reproduced from the stored state.
  • for example, during reproduction of video data or audio data, the time (timing) at which the touch hold operation is performed may be stored as a “bookmark” by the combination of a hover slide operation and a touch hold operation.
  • from the stored state, the operation of the application 65 in response to the hover slide operation may be resumed.
  • alternatively, the operation state of the application 65 in response to the hover slide operation may be reset.
  • the present invention is useful as an input device, an input support method, and a program that, when an image is displayed on a screen, efficiently select the control content for the content in accordance with the user's input operation on the touch panel and provide comfortable operability.

Abstract

The content of an image (47) displayed on a screen (45) is switched when a finger (68) kept in proximity to the screen (45) performs a hover slide: the image (47) displayed on the screen (45) moves in the same direction as the hover slide direction of the finger (68), by twice the distance the finger (68) has moved. When, however, a touch slide operation is added to the hover slide operation, that is, the thumb (68b) touches the screen (45) while the index finger (68a) kept in proximity to the screen (45) performs the hover slide, the image (47) displayed on the screen (45) is switched in the same direction as the slide direction of the index finger (68a) and by the same distance, that is, at equal magnification.

Description

Input device, input support method, and program
 The present invention relates to an input device that accepts input operations via a touch panel, an input support method, and a program.
 In recent years, the number of electronic devices equipped with touch panels has been increasing. Touch panels, which allow intuitive operation by the user, are widely used as devices that accept input operations for electronic devices, including mobile phones. A touch panel makes it possible to accept input operations on the screen of a display unit (for example, an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence) display) provided in the electronic device and to display the processing results of the electronic device on the same screen.
 A touch panel capable of detecting the proximity of a finger is also known (see, for example, Patent Document 1). Such a touch panel can detect a state in which a finger is held at a position separated from the touch panel by a predetermined height, that is, a proximity state between the finger and the touch panel. Based on the capacitance determined by the distance between the finger and the touch panel, it can detect a slide operation performed with the finger held substantially parallel to the touch panel in the same way as a slide operation performed with the finger directly touching the touch panel. For this reason, touch panels capable of detecting finger proximity are expected to become established as a new user interface.
 The non-contact user input device of Patent Document 1 includes a plurality of linear transmission electrodes, a transmitter that supplies an alternating transmission current to each transmission electrode, a plurality of linear reception electrodes arranged so as not to contact the transmission electrodes, and a receiver that receives the alternating current flowing through the reception electrodes. A capacitor is formed at each intersection of a transmission electrode and a reception electrode, and an additional capacitance is formed according to the proximity of the user's fingertip, so the capacitance changes according to the degree of proximity of the fingertip. The non-contact user input device can recognize the distance between the touch panel and the finger based on this change in capacitance.
Japanese Laid-Open Patent Publication No. 2002-342033
 However, conventional input devices have the following problem. Specifically, in order for a touch panel capable of detecting finger proximity to detect a slide operation performed with the finger held substantially parallel to and separated from the touch panel, a conventional input device first had to switch to a hover mode, that is, a mode for detecting the movement of the finger while the proximity state is maintained.
 In the hover mode, when only hover operations are used, that is, operations in which the finger is continuously held over or slid at a position separated from the touch panel, without using touch operations, it is considered difficult for a conventional input device to give the user operability comparable to that of touch operations.
 Specifically, in a hover operation there is nothing to support the finger during the operation, and since the finger is less stable than in a touch operation, erroneous operations are more likely to occur. For example, when performing continuous slide operations with the finger separated from the touch panel as hover operations, after the first slide operation the finger must be returned to its original position in order to perform the second slide operation; if the finger is not moved sufficiently far from the touch panel, the return slide operation may be detected, making it difficult to detect continuous slide operations. Thus, when only hover operations are used, erroneous operations are considered more likely than with touch operations, regardless of the user's intended input.
 The present invention has been made in view of the conventional circumstances described above, and an object thereof is to provide an input device, an input support method, and a program that efficiently select the control content for content in accordance with the user's input operation on a touch panel and provide comfortable operability.
 The present invention is an input device including: a display unit that displays data on a screen; a proximity detection unit that detects the proximity of a first finger to the screen; a contact detection unit that detects the contact of a second finger with the screen after the proximity of the first finger has been detected; and an operation execution unit that, in response to an operation combining the proximity of the first finger and the contact of the second finger, executes the action corresponding to the combined operation.
 The present invention is also an input support method for an input device including a display unit that displays data on a screen, the method comprising: a step of detecting the proximity of a first finger to the screen; a step of detecting, after the proximity of the first finger has been detected, the contact of a second finger with the screen; a step of accepting an operation combining the proximity of the first finger and the contact of the second finger; and a step of executing the action corresponding to the combined operation.
 The present invention is also a program for causing a computer, which is an input device including a display unit that displays data on a screen, to realize: a step of detecting the proximity of a first finger to the screen; a step of detecting, after the proximity of the first finger has been detected, the contact of a second finger with the screen; a step of accepting an operation combining the proximity of the first finger and the contact of the second finger; and a step of executing the action corresponding to the combined operation.
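The claimed sequence (detect the first finger's proximity, then the second finger's contact, then run the action bound to the combined operation) might be sketched as follows. All names are hypothetical illustrations of the claim structure, not an actual implementation.

```python
# Hypothetical sketch of the claimed flow: the contact only forms a
# combined operation if the first finger's proximity was detected first.
class CombinedInputDevice:
    def __init__(self, actions):
        self.actions = actions       # e.g. {"hover+touch": callable}
        self.first_finger_near = False

    def on_proximity(self, near):
        # Proximity detection unit: track the first finger's proximity.
        self.first_finger_near = near

    def on_contact(self):
        # Contact detection unit + operation execution unit.
        if self.first_finger_near:
            return self.actions["hover+touch"]()
        return None
```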
 According to this configuration, it is possible to efficiently select the control content for content in accordance with the user's input operation on the touch panel and to provide comfortable operability.
 According to the present invention, it is possible to efficiently select the control content for content in accordance with the user's input operation on the touch panel and to provide comfortable operability.
Block diagram showing the hardware configuration of the mobile terminal in each embodiment
Block diagram showing the functional configuration of the mobile terminal in the first to fifth embodiments
(A) Diagram showing a hover slide operation performed alone; (B) diagram showing a combination of a hover slide operation and a touch slide operation
Flowchart explaining the operation procedure of the mobile terminal in response to the combination of a hover slide operation and a touch slide operation in the first embodiment
Flowchart explaining the operation procedure of the mobile terminal in response to the combination of a hover slide operation and a touch slide operation in the second embodiment
Diagrams showing the combination of a hover slide operation and a touch slide operation in the third embodiment: (A) at the start of the combination operation; (B) after the start of the combination operation, with the inter-finger spacing between the finger performing the hover slide operation and the finger performing the touch slide operation shortened; (C) with the locus of the combination operation being a circle
Flowchart explaining the operation procedure of the mobile terminal in response to the combination of a hover slide operation and a touch slide operation in the third embodiment
Diagram showing the touch acceptable range 45a
Diagrams showing the combination of a hover slide operation and a touch hold operation in the fourth embodiment: (A) hover slide operation performed alone; (B) combination of a hover slide operation and a touch hold operation performed
Flowchart explaining the operation procedure of the mobile terminal in response to the combination of a hover slide operation and a touch hold operation in the fourth embodiment
Diagrams showing the combination of a hover slide operation and a touch hold operation in the fifth embodiment: (A) combination of a hover slide operation and a touch hold operation performed; (B) after the start of the combination operation, with the inter-finger spacing between the finger performing the hover operation and the finger performing the touch hold operation shortened
Flowchart showing the operation procedure of the mobile terminal in response to the combination of a hover slide operation and a touch hold operation in the fifth embodiment: (A) overall operation; (B) details of the change continuation process
Block diagram showing the functional configuration of the mobile terminal 1A of the sixth embodiment
Diagrams showing the combination of a hover operation and a touch operation: (A) hover operation performed alone; (B) combination of a hover operation and a touch operation performed
Flowchart explaining the operation procedure of the mobile terminal in response to the combination of a hover operation and a touch operation in the sixth embodiment
 Hereinafter, embodiments of an input device, an input support method, and a program according to the present invention will be described with reference to the drawings. The input device of each embodiment is an electronic device having a display unit that displays data on a screen, such as a mobile phone, a smartphone, a tablet terminal, a digital still camera, a PDA (personal digital assistant), or an electronic book reader. In the following, a mobile terminal (for example, a smartphone) is used as an example of the input device of each embodiment.
 The present invention can also be expressed as an input device as an apparatus, or as a program for causing a computer to operate as the input device. Furthermore, the present invention can be expressed as an input support method including the operations (steps) executed by the input device. That is, the present invention can be expressed in any of the categories of apparatus, method, and program.
 In the following description, an item that can accept a touch operation from the user and that allows part of the content of each application displayed on the screen of the display unit (for example, an LCD or organic EL display) of the input device to be selected, or that, when selected, starts a predetermined process on the content, is defined as a "button". The predetermined process is a process that executes an action related to the content currently displayed by the application (for example, a process of playing back video data).
 For example, when a news headline is displayed as application content, a "button" may be a hyperlinked character string (that is, the news headline itself), an image for prompting the user's selection operation (for example, an icon or a software key of a keyboard), or a combination of a character string and an image. In accordance with the user's input operation, the input device can accept, as an operation on a button, for example the selection of the "news headline" corresponding to the button, and can display the details of the news corresponding to the selected button. A "button" is determined according to the application running on the input device.
 The two axes representing the horizontal plane of the touch panel are the x-axis and the y-axis, and the axis representing the vertical (height) direction of the touch panel is the z-axis. Furthermore, in the following description, "coordinates" includes both coordinates (x, y) determined by the combination of an x coordinate and a y coordinate representing a position on the horizontal plane of the touch panel, and coordinates (x, y, z) that additionally use the distance in the vertical direction between that position and the finger, that is, the height z of the finger above the touch panel.
 In the following description, the user's finger is used as an example of the pointing medium for the touch panel, but the pointing medium is not limited to a finger; a conductive stylus held in the user's hand may also be used. The pointing medium is not particularly limited as long as its proximity to and touch on the touch panel can be detected according to the structure and detection method of the touch panel.
 Further, in the following description, an operation of holding a finger at a position in space separated from the surface of the touch panel is defined as a "hover operation", and an operation of sliding (moving) the finger from that position substantially in parallel with the surface of the touch panel is defined as a "hover slide operation". Accordingly, an operation in which the finger directly touches the surface of the touch panel is a "touch operation", not a "hover operation". An operation of sliding (moving) the finger while it is in contact with the surface of the touch panel is defined as a "touch slide operation". An operation of maintaining the touch at the position where the finger contacted the touch panel surface, without sliding from that position, is defined as a "touch hold operation".
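The four operation types defined above can be distinguished from reported coordinates alone: the z component separates the hover family from the touch family, and the displacement of the (x, y) position separates a slide from a stationary operation. The following is a minimal illustrative sketch of such a classification; it is not part of the patent disclosure, and the function name and thresholds are hypothetical.

```python
def classify_operation(samples, z_touch=0.0, move_eps=1.0):
    """Classify a sequence of (x, y, z) samples for one finger.

    z <= z_touch for all samples -> the finger contacts the panel (touch family)
    otherwise                    -> the finger hovers above the panel (hover family)
    (x, y) displacement beyond move_eps -> a slide; otherwise stationary.
    All threshold values are illustrative assumptions, not values from the patent.
    """
    x0, y0, _ = samples[0]
    xn, yn, _ = samples[-1]
    touching = all(z <= z_touch for _, _, z in samples)
    moved = abs(xn - x0) > move_eps or abs(yn - y0) > move_eps
    if touching:
        return "touch slide" if moved else "touch hold"
    return "hover slide" if moved else "hover"
```

A real panel would additionally debounce the samples and handle transitions between the two families, which this sketch omits.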
 For a hover operation or a hover slide operation to be detected, the distance between the finger and the touch panel surface is preferably within the range corresponding to the capacitance detectable by the touch panel, because this distance is inversely proportional to the capacitance detected by the touch panel.
(First embodiment)
(Hardware configuration of the mobile terminal common to each embodiment)
 FIG. 1 is a block diagram showing the hardware configuration of the mobile terminals 1 and 1A in each embodiment. The mobile terminals 1 and 1A shown in FIG. 1 include a processor 11, a display unit 13, a touch panel driver 14, a touch panel 15, a power supply control unit 16, a communication control unit 17 to which an antenna 17a is connected, a ROM (Read Only Memory) 21, a RAM (Random Access Memory) 22, and a storage unit 23.
 The processor 11, the display unit 13, the touch panel driver 14, the power supply control unit 16, the communication control unit 17, the ROM 21, the RAM 22, and the storage unit 23 are connected to one another via a bus 19 so that data can be input to and output from each of them.
 The processor 11 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor); it performs overall control of the mobile terminal 1, 1A, 1B, or 1C, and performs various other arithmetic or control processes. The processor 11 reads programs and data stored in the ROM 21 and performs the various processes of each embodiment described later.
 The ROM 21 stores an application 65 (see FIG. 2) installed in the mobile terminal 1, as well as the programs and data with which the processor 11 executes the processing of each unit shown in FIG. 2.
 The RAM 22 operates as a work memory for the operation of the processor 11, the touch panel driver 14, or the communication control unit 17.
 The storage unit 23 is configured using a hard disk or flash memory built into the mobile terminal 1, and stores data acquired or generated by the mobile terminal 1, 1A, 1B, or 1C. The application 65 (see FIG. 2) is stored in the storage unit 23. Instead of a hard disk or flash memory, the storage unit 23 may be configured using an external storage medium (for example, a USB memory) connected via a USB (Universal Serial Bus) terminal.
 The display unit 13 has a function of displaying a screen; it is configured using, for example, an LCD or an organic EL display, and displays on its screen the data output from the processor 11 or the touch panel driver 14.
 The touch panel driver 14 controls the operation of the touch panel 15 and monitors the user's input operations on the touch panel 15. For example, when the touch panel 15 detects contact by a touch operation or touch slide operation of the user's finger 68 (see FIG. 3(A)), or proximity by a hover operation or hover slide operation, the touch panel driver 14 acquires the contact coordinates (x, y) or the proximity coordinates (x, y, z), and outputs the information of the contact coordinates (x, y) or the proximity coordinates (x, y, z) to the processor 11, the RAM 22, or the storage unit 23. Hereinafter, the contact coordinates (x, y) are referred to as the "touch coordinates (x, y)".
 The touch panel 15 is mounted on the screen 45 (see FIG. 3(A)) of the display unit 13, and detects that the user's finger 68 has performed a touch operation or a touch slide operation on the horizontal surface of the touch panel 15. The touch panel 15 also detects that the user's finger 68 has come close to the touch panel 15 by a hover operation or a hover slide operation.
 The specific configuration of the touch panel 15 is disclosed in detail in, for example, Patent Document 1 described above, and its description is therefore omitted here. The touch panel 15 detects that the finger 68 is in proximity to the touch panel 15 when the value of the finger height z in a hover operation is equal to or less than a predetermined value zth, or when the capacitance determined according to the value of the finger height z is equal to or greater than a predetermined value.
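The detection condition just described (finger height z at or below a threshold zth, or equivalently a capacitance at or above a threshold) can be expressed as a simple predicate. The sketch below is illustrative only; the function name and both threshold values are assumptions, since the patent does not give concrete numbers.

```python
def is_proximity_detected(z, capacitance, z_th=15.0, cap_th=50.0):
    """Return True when the finger counts as 'in proximity' to the panel.

    Mirrors the two equivalent conditions in the text: height z <= z_th,
    or the capacitance determined by z is >= cap_th. Threshold values are
    hypothetical; a real panel derives them from its sensing hardware.
    """
    return z <= z_th or capacitance >= cap_th
```

Because capacitance is inversely related to height, the two conditions are two views of the same physical threshold; a driver would typically use only the capacitance reading.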
 The power supply control unit 16 is configured using a power supply source (for example, a battery) of the mobile terminal 1, and switches the power supply of the mobile terminal 1 between the on state and the off state according to an input operation on the touch panel 15. When the power supply is on, the power supply control unit 16 supplies power from the power supply source to each unit shown in FIG. 1 to make the mobile terminal 1 operable.
 The communication control unit 17 is configured using a wireless communication circuit; it transmits data resulting from processing by the processor 11 via the transmission/reception antenna 17a, and receives data transmitted from a base station (not shown) or another communication terminal. FIG. 1 shows the configuration necessary for the description of each embodiment including the present embodiment, but the mobile terminals 1, 1A, 1B, and 1C in each embodiment may further include a voice control unit that controls call voice, a microphone that picks up the user's voice, and a speaker that outputs the voice data of the other party to a call.
(Functional configuration of the mobile terminal 1)
 Next, the functional configuration of the mobile terminal 1 that is common to the first to fifth embodiments will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the functional configuration of the mobile terminal 1 in each of the first to fifth embodiments.
 The mobile terminal 1 shown in FIG. 2 includes a proximity detection unit 5, a touch detection unit 10, a screen display unit 30, a memory 40, a proximity coordinate extraction unit 51, a touch coordinate extraction unit 52, a state management unit 54, an image button management unit 55, an operation determination unit 56, a display image generation unit 58, an application screen generation unit 59, an image composition unit 60, and an application 65.
 The application 65 includes a control extension unit 64. The control extension unit 64 has a state change amount changing unit 61, a change target changing unit 62, and a state change continuation unit 63.
 The proximity detection unit 5 detects the state in which the user's finger has come close to the touch panel 15 by a hover operation or a hover slide operation. The proximity detection unit 5 outputs to the proximity coordinate extraction unit 51 a proximity notification indicating that the finger has come close to the touch panel 15.
 The touch detection unit 10, serving as a contact detection unit, detects the action of a finger touching the touch panel 15 by a touch operation or a touch slide operation. The touch detection unit 10 outputs to the touch coordinate extraction unit 52 a contact notification indicating that the finger has touched the touch panel 15. The proximity detection unit 5 and the touch detection unit 10 can both be configured using the touch panel 15; although they are shown as separate components in FIG. 2, they may be combined into a single touch panel 15.
 The screen display unit 30 corresponds to the display unit 13 shown in FIG. 1; it has a function of displaying data on the screen 45, and displays on the screen 45 the composite image data output from the image composition unit 60 described later. The composite image data is data in which the data of the screen of the application 65 (hereinafter simply referred to as the "application screen") and the image data generated by the display image generation unit 58 are combined by the image composition unit 60.
 The memory 40 corresponds to the RAM 22 or the storage unit 23 shown in FIG. 1, and is configured as at least an image button database 55a. The image button database 55a stores, for example, screen data and image data used in the application 65, image data generated by the application 65, image data received from a base station (not shown) or another communication terminal, coordinate information of the buttons used in the application 65, and operation information of the application 65 assigned to those buttons.
 The memory 40 may also temporarily store the information of the proximity coordinates (x, y, z) extracted by the proximity coordinate extraction unit 51 or of the touch coordinates (x, y) extracted by the touch coordinate extraction unit 52. In FIG. 2, to avoid complicating the drawing, the arrows from the proximity coordinate extraction unit 51 and the touch coordinate extraction unit 52 to the memory 40 are omitted.
 The proximity coordinate extraction unit 51 calculates and extracts the proximity coordinates (x, y, z) of the finger with respect to the touch panel 15 based on the proximity notification output from the proximity detection unit 5. Of the proximity coordinates (x, y, z), the x and y components are coordinate values representing a position on the touch panel 15, and the z component is a coordinate value representing the distance in the vertical direction between the finger and the touch panel 15, that is, the height z of the finger above the touch panel 15. The proximity coordinate extraction unit 51 outputs the information of the extracted proximity coordinates (x, y, z) to the operation determination unit 56.
 The touch coordinate extraction unit 52 calculates and extracts the touch coordinates (x, y) of the point at which the finger touched the touch panel 15, based on the contact notification output from the touch detection unit 10. The touch coordinate extraction unit 52 outputs the information of the extracted touch coordinates (x, y) to the operation determination unit 56.
 In the following description, the user's input operation determined by the operation determination unit 56 is a hover operation, a hover slide operation, a touch operation, a touch slide operation, or a combination of these operations, but it is not limited to these operations.
 The state management unit 54 receives the operation determination result information (described later) output from the operation determination unit 56. Based on this operation determination result information, that is, based on which of the hover operation, hover slide operation, touch operation, and touch slide operation is being performed by the user's finger, the state management unit 54 determines which of the following states the mobile terminal 1 is in: a proximity state, a contact state, or a state in which the proximity state and the contact state coexist.
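The three terminal-level states follow directly from which operation families are currently active. The sketch below illustrates this mapping; it is not part of the patent disclosure, and the state labels and function name are hypothetical.

```python
# Operation labels as defined earlier in the text.
HOVER_OPS = {"hover", "hover slide"}
TOUCH_OPS = {"touch", "touch slide", "touch hold"}

def terminal_state(active_ops):
    """Map the set of currently active operations to the terminal state."""
    near = any(op in HOVER_OPS for op in active_ops)
    contact = any(op in TOUCH_OPS for op in active_ops)
    if near and contact:
        return "proximity+contact"   # coexisting state (e.g. two fingers)
    if contact:
        return "contact"
    if near:
        return "proximity"
    return "idle"
```

The coexisting state is what distinguishes the two-finger combination operations of the embodiments from their single-finger counterparts.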
 Assume, for example, that the state management unit 54 acquires from the operation determination unit 56 operation determination result information indicating that, while a hover slide operation is being performed with the user's first finger (for example, the index finger 68a; the same applies below), a touch slide operation is performed with a second finger (for example, the thumb 68b; the same applies below), so that a combination of the hover slide operation and the touch slide operation has been performed. Here, the operation of the state management unit 54 is described using a combination of a hover slide operation and a touch slide operation as an example of the user's input operation.
 In this case, based on the operation determination result information output from the operation determination unit 56, the state management unit 54 determines that the state of the mobile terminal 1 has shifted from the proximity state due to the hover slide operation to a state in which the proximity state due to the hover slide operation and the contact state due to the touch slide operation coexist. Furthermore, the state management unit 54 outputs (notifies) to the application 65 that the combination of the hover slide operation and the touch operation has been performed.
 When the operation determination result information from the operation determination unit 56 indicates that one of the operations described above (for example, a hover slide operation) has been performed, the state management unit 54 outputs to the display image generation unit 58 the information on the slide amount (movement amount) of the combination operation, together with a display image generation instruction to generate a display image to be displayed on the screen 45. As described later, the slide amount (movement amount) of the combination operation is calculated by the operation determination unit 56 and is included in the operation determination result information.
 The image button management unit 55 reads from and writes to the memory 40 the information indicating the display coordinates on the screen 45 of each button constituting the application screen used in the application 65, and the image data used in the application 65.
 The operation determination unit 56 receives the information of the proximity coordinates (x, y, z) output from the proximity coordinate extraction unit 51, or the information of the touch coordinates (x, y) output from the touch coordinate extraction unit 52.
 Based on the input information of the proximity coordinates (x, y, z) or the touch coordinates (x, y), the operation determination unit 56 determines the content of the input operation of the user's finger, that is, which of the hover operation, hover slide operation, touch operation, and touch slide operation, or which combination of these operations, is being performed.
 For example, the operation determination unit 56 determines whether, on the same screen 45, a combination of a hover slide operation with the user's first finger (index finger 68a) and a touch slide operation with the user's second finger (thumb 68b) has been performed in the same direction and over the same distance. Furthermore, the operation determination unit 56 calculates the value of the movement amount (slide amount) of the first finger and the second finger in the combination operation. The operation determination unit 56 also calculates the value of the movement amount (slide amount) of the finger when a hover slide operation or a touch slide operation is performed alone.
 When the operation determination unit 56 determines that, on the same screen 45, the hover slide operation with the user's first finger (index finger 68a) and the touch slide operation with the same user's second finger (thumb 68b) have been performed in the same direction and over the same distance, it outputs to the state management unit 54, as operation determination result information, the fact that the combination of the hover slide operation and the touch slide operation has been performed in the same direction and over the same distance, together with the information on the slide amount (movement amount).
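Judging that two slides occurred "in the same direction and over the same distance" necessarily involves tolerances, since two fingers never move identically. The following sketch compares the displacement vectors of the hover-sliding and touch-sliding fingers; it is illustrative only, and the function name and both tolerance values are assumptions not taken from the patent.

```python
import math

def is_parallel_combination(hover_start, hover_end, touch_start, touch_end,
                            angle_tol_deg=20.0, dist_tol_ratio=0.2):
    """Return True when two slides roughly share direction and distance."""
    hvx, hvy = hover_end[0] - hover_start[0], hover_end[1] - hover_start[1]
    tvx, tvy = touch_end[0] - touch_start[0], touch_end[1] - touch_start[1]
    h_len = math.hypot(hvx, hvy)
    t_len = math.hypot(tvx, tvy)
    if h_len == 0 or t_len == 0:
        return False  # one of the fingers did not actually slide
    # Same direction: angle between the two displacement vectors is small.
    cos_a = (hvx * tvx + hvy * tvy) / (h_len * t_len)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    # Same distance: the lengths agree within a relative tolerance.
    same_dist = abs(h_len - t_len) <= dist_tol_ratio * max(h_len, t_len)
    return angle <= angle_tol_deg and same_dist
```

In an actual implementation this check would run continuously on the coordinate streams from both extraction units rather than once on start and end points.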
 Based on the display image generation instruction output from the state management unit 54 and the information on the display range of the screen 45 output from the application screen generation unit 59, the display image generation unit 58 acquires the image data of the application 65 via the image button management unit 55. The display image generation unit 58 generates a display image to be displayed on the screen 45 by cutting out, from the acquired image data, the image of the range corresponding to the display image generation instruction, and outputs the generated display image to the image composition unit 60.
 Based on the screen generation instruction output from the application 65, the application screen generation unit 59 acquires from the memory 40, via the image button management unit 55, the various data necessary for generating the screen of the application 65. Using the acquired data, the application screen generation unit 59 generates the screen data of the application screen of the application 65, and outputs the generated data to the image composition unit 60.
 In FIG. 2, the application screen generation unit 59 and the application 65 are shown as separate components; however, by giving the application 65 the function of the application screen generation unit 59, the two may be combined into a single application 65.
 The image composition unit 60 combines the display image data output from the display image generation unit 58 with the screen data of the application screen output from the application screen generation unit 59, and causes the screen display unit 30 to display the result.
 When the state change amount changing unit 61 acquires from the state management unit 54, for example, the information that a combination of a hover slide operation and a touch slide operation has been performed in the same direction and over the same distance, it changes the state change amount of the operation that was being executed in the application 65 immediately before that combination operation was performed. As an example of changing the state change amount of an operation, the state change amount changing unit 61 increases or decreases the movement amount (slide amount) or movement speed (slide speed) of the map image 47 displayed on the screen 45.
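As a concrete illustration of changing a state change amount, the scroll distance applied to the displayed image can be computed from the finger's slide amount and a multiplier that the combination operation switches. This is a hypothetical sketch, not the patent's implementation; the class and method names are invented.

```python
class ScrollController:
    """Illustrative sketch: the combination operation changes the ratio of
    image movement to finger movement (the 'state change amount')."""

    def __init__(self, base_factor=1.0):
        self.factor = base_factor

    def on_combination_detected(self, new_factor):
        # e.g. a parallel touch slide added to a hover slide doubles scrolling
        self.factor = new_factor

    def scroll_amount(self, finger_slide):
        return finger_slide * self.factor

ctrl = ScrollController()
before = ctrl.scroll_amount(50)      # hover slide alone
ctrl.on_combination_detected(2.0)    # combination operation performed
after = ctrl.scroll_amount(50)       # same finger slide, doubled movement
```

The same structure covers decreasing the amount: the combination would simply install a factor below the base value.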
 When the change target changing unit 62 acquires from the state management unit 54, for example, the information that a combination of a hover slide operation and a touch slide operation has been performed in the same direction and over the same distance, it changes the change target of the operation that was being executed in the application 65 immediately before that combination operation was performed (for example, the output sound of a video) to another change target. As an example of changing the change target of an operation, the change target changing unit 62 changes the change target from the output sound of a video to the playback speed of the video.
 When the state change continuation unit 63 acquires from the state management unit 54, for example, the information that a combination of a hover slide operation and a touch hold operation has been performed, it maintains the change state of the operation that was being executed in the application 65 immediately before the combination operation was performed, and continues that maintained state.
 As an example of continuing a maintained change state, suppose that, through a hover slide operation, the map image 47 had been moving within the screen 45 by twice the slide amount of the finger. When the combination of the hover slide operation and a touch hold operation is performed, the hover slide operation becomes a mere hover operation; however, as long as the touch hold operation continues, the state change continuation unit 63 continues the same behavior as during the hover slide operation, that is, it continues to move the map image 47 within the screen 45 by twice the slide amount of the finger.
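The continuation behavior described above can be sketched as: record the movement that the hover slide was producing, then keep applying it on each update tick while the touch hold persists. The following is an illustrative sketch with invented names, not the patent's implementation.

```python
class ContinuationScroller:
    """Keeps applying the last hover-slide movement while a touch hold lasts."""

    def __init__(self):
        self.position = 0.0
        self.last_step = 0.0

    def on_hover_slide(self, finger_step, factor=2.0):
        # The map image moves by twice the finger slide amount (the text's example).
        self.last_step = finger_step * factor
        self.position += self.last_step

    def on_tick(self, touch_held):
        # While the touch hold continues, repeat the maintained movement.
        if touch_held:
            self.position += self.last_step

s = ContinuationScroller()
s.on_hover_slide(10)         # hover slide: image moves by 20
s.on_tick(touch_held=True)   # hover slide ended, hold continues: +20
s.on_tick(touch_held=True)   # +20
s.on_tick(touch_held=False)  # hold released: movement stops
```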
(Outline of the operation of the first embodiment)
 Next, an outline of the operation of the mobile terminal 1 of the first embodiment will be described with reference to FIGS. 3(A) and 3(B). FIG. 3(A) is a diagram showing a state in which a hover slide operation is performed alone. FIG. 3(B) is a diagram showing a state in which a combination of a hover slide operation and a touch slide operation is performed.
 In this embodiment, assume that after the user's finger 68 shown in FIG. 3(A) starts a hover slide operation on the touch panel 15 (screen 45), another finger (the thumb 68b) touches (contacts) the touch panel 15 and performs a touch slide operation in the same direction and over the same distance as the hover slide operation of the index finger 68a, as shown in FIG. 3(B).
 In this case, the mobile terminal 1 changes the state change amount of the operation of the application 65 corresponding to the hover slide operation, i.e., the amount in effect immediately before the combination of the hover slide operation and the touch slide operation is performed. Note that the index finger 68 shown in FIG. 3(A) is the same as the index finger 68a shown in FIG. 3(B), and is not the same as the thumb 68b shown in FIG. 3(B). In the following description, the finger performing a hover operation or hover slide operation is the index finger 68a and the finger performing a touch operation or touch slide operation is the thumb 68b, although this assignment is not limiting.
 First, the case where a hover slide operation is performed by itself with the finger 68 will be described. As shown in FIG. 3(A), in the case of a hover slide operation with the finger 68 alone, that is, when the finger 68 is hover-slid in the direction of arrow a while in proximity to the screen 45 (touch panel 15), the map image 47 displayed on the screen 45 moves (also referred to as "scrolls") within the screen 45. Through this movement, the content of the image displayed in the screen 45 is scrolled and switched.
 Through the hover slide operation of the finger 68, the map image 47 displayed on the screen 45 moves (scrolls) within the screen 45 in the same direction as the hover slide operation (the direction of arrow b) by, for example, twice the movement amount of the finger 68 (the hover slide amount, corresponding to the length of arrow a), this doubled distance corresponding to the length of arrow b.
 Next, the case where a combination of a hover slide operation and a touch slide operation is performed with the index finger 68a and the thumb 68b will be described. Starting from the state in which the hover slide operation shown in FIG. 3(A) was being performed alone, as shown in FIG. 3(B), the thumb 68b touches the screen 45 (touch panel 15), the index finger 68a hover-slides in the direction of arrow a, and at the same time the thumb 68b touch-slides in the direction of arrow a′ from the touched position.
 In this case, through the combination of the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b, the map image 47 displayed in the screen 45 moves (scrolls) within the screen 45 in the direction of arrow b by the same distance as the hover slide amount of the index finger 68a, that is, at unity magnification. The position touched by the thumb 68b is represented by a circle 50 (see the dotted line); the same notation applies in the subsequent embodiments.
(Operation of the mobile terminal 1 of the first embodiment)
 FIG. 4 is a flowchart explaining the operation procedure of the mobile terminal 1 in response to the combination of a hover slide operation and a touch slide operation in the first embodiment. The flowchart shown in FIG. 4 represents the operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 3(A) and 3(B).
 In FIG. 4, the state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether a hover slide operation is being performed with the user's index finger 68 (S1). When it is determined that a hover slide operation is being performed, the operation of the mobile terminal 1 proceeds to step S2.
 Next, when the state management unit 54 determines that a hover slide operation is being performed with the user's index finger 68a (S1: YES), it determines, based on the operation determination result information from the operation determination unit 56, whether a touch slide operation is being performed with the user's thumb 68b (S2).
 In step S2, the state management unit 54 determines, based on the operation determination result information of the operation determination unit 56, whether the hover slide operation by the user's index finger 68a and the touch slide operation by the thumb 68b have been performed in the same direction and over the same distance; the same applies in the second embodiment. Accordingly, when it is determined in step S2 that the hover slide operation by the user's index finger 68a and the touch slide operation by the thumb 68b have not been performed in the same direction and over the same distance, the operation of the mobile terminal 1 proceeds to step S3.
 When the state management unit 54 determines that a touch slide operation is not being performed with the user's thumb 68b (S2: NO), it outputs (notifies) to the application 65 that a combination of a hover slide operation and a touch slide operation is not being performed. The state change amount changing unit 61 initializes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined in step S1 to be in progress (S3).
 For example, suppose the map image 47 is displayed in the screen 45 of the display unit 13 of the mobile terminal 1, the state change amount in step S3 is the movement amount (slide amount or scroll amount) of the map image 47, and the initial value of the state change amount is a factor of two. In step S3, the state change amount changing unit 61 initializes the movement amount (state change amount) by which the map image 47 is moved within the screen 45 in accordance with the movement amount (hover slide amount) of the finger 68 when a hover slide operation (see FIG. 3(A)) is performed alone.
 On the other hand, when the state management unit 54 determines that a touch slide operation is being performed with the user's thumb 68b (S2: YES), it outputs (notifies) to the application 65 that a combination of a hover slide operation and a touch slide operation is being performed. The state change amount changing unit 61 changes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined in step S1 to be in progress to, for example, unity (a factor of one) (S4).
 As a result, as shown in FIG. 3(B), the movement amount of the index finger 68a and the movement amount of the map image 47 displayed in the screen 45 become equal. Whereas the user had been scrolling the map image 47 by twice the movement amount of the finger 68 performing the hover slide operation, the combination of the hover slide operation and the touch slide operation lets the user scroll the map image by a movement amount equal to that of the index finger 68a performing the hover slide operation, so fine adjustment of the scrolling of the map image 47 can be performed easily.
 After step S3 or S4, the operation of the mobile terminal 1 proceeds to step S5. Based on the operation determination instruction output from the state management unit 54, the operation determination unit 56 calculates the slide amount of the hover slide operation determined in step S1 to be in progress (S5). The operation determination unit 56 outputs information on the calculated slide amount to the state management unit 54.
 The state management unit 54 multiplies the slide amount output from the operation determination unit 56 by the change amount obtained in step S3 or S4, and thereby calculates the movement amount (scroll amount) of the map image 47 within the screen 45 corresponding to the slide amount calculated in step S5 (S6). The state management unit 54 outputs a display image generation instruction including information on the calculated movement amount (scroll amount) of the map image to the display image generation unit 58.
 Based on the display image generation instruction output from the state management unit 54, the display image generation unit 58 generates, as a display image, the map image 47 after it has been moved within the screen 45 by the movement amount (scroll amount) calculated in step S6, and outputs it to the image composition unit 60 (S7).
 The image composition unit 60 composes the display image data output from the display image generation unit 58 with the screen data of the application screen output from the application screen generation unit 59, and displays the result on the screen display unit 30 (S8). After step S8, the operation of the mobile terminal 1 returns to the processing of step S1.
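The flow of steps S1 through S8 can be sketched as follows. This is a hypothetical illustration only, not the patent's actual implementation; all function and parameter names are invented, and the rendering steps S7/S8 are elided.

```python
def process_input_frame(hover_slide_active, touch_slide_active, hover_amount,
                        initial_factor=2.0, combo_factor=1.0):
    """One pass through the FIG. 4 flow (illustrative sketch).

    S1: a hover slide must be in progress, otherwise nothing happens.
    S2: check for an accompanying touch slide (same direction, same distance).
    S3/S4: initialize the state change amount (2x) or change it to unity.
    S5/S6: multiply the hover slide amount by the factor to get the scroll
    amount of the map image within the screen.
    """
    if not hover_slide_active:           # S1: NO -> wait for the next frame
        return None
    # S4 when the combination operation is detected, S3 otherwise.
    factor = combo_factor if touch_slide_active else initial_factor
    scroll_amount = hover_amount * factor  # S5 and S6
    return scroll_amount                   # S7/S8 would render this amount

# Solo hover slide: the map scrolls twice the finger movement.
assert process_input_frame(True, False, 5) == 10.0
# Hover slide + touch slide: the map scrolls exactly the finger movement.
assert process_input_frame(True, True, 5) == 5.0
# No hover slide in progress: nothing to do.
assert process_input_frame(False, False, 5) is None
```

After S8 the sketch would be invoked again for the next frame, matching the flowchart's return to step S1.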
 As described above, in a case where, for example, a hover slide operation performed alone moves the map image 47 within the screen 45 by twice the slide amount of the hover slide operation, the mobile terminal 1 of this embodiment changes the state change amount through a two-finger combination of a hover slide operation and a touch slide operation, so that the movement (scrolling) of the map image 47 within the screen 45 becomes a movement (scrolling) equal to the slide amount of the hover slide operation.
 Thus, the mobile terminal 1 can finely adjust the movement (scrolling) of the map image 47 within the screen 45 from twice the hover slide amount down to an equal amount, and can efficiently select the control content applied to the content (the map image) in accordance with the user's input operation on the touch panel, giving the user comfortable operability.
 In this embodiment, the movement amount of the map image 47 within the screen 45 when a hover slide operation is performed alone has been described as twice the hover slide amount, but it is not limited to a factor of two. Furthermore, although the movement amount of the map image within the screen 45 has been used as an example of the state change amount, other state change amounts are equally applicable.
(Second Embodiment)
 In the first embodiment, the movement amount of the map image within the screen 45 was described as an example of the change target in the application 65 when a combination of a hover slide operation and a touch slide operation is performed while a hover slide operation is in progress.
 In the second embodiment, an example of changing the change target of the operation in the application 65 to another change target will be described. As examples of such a change, the luminance that had been adjusted according to the hover slide amount of the hover slide operation may be switched to saturation, or, when video data is being played back, the volume (output audio) that had been adjusted according to the hover slide amount may be switched to the playback speed. However, the switching of the change target of the operation in the application 65 is not limited to these examples.
 Since the mobile terminal of the second embodiment has the same configuration as the mobile terminal 1 of the first embodiment, the same reference numerals are used for the components that are the same as those of the mobile terminal 1 of the first embodiment, their description is omitted, and only the differences are described.
 FIG. 5 is a flowchart explaining the operation procedure of the mobile terminal 1 in response to the combination of a hover slide operation and a touch slide operation in the second embodiment. The same operations as in the first embodiment are given the same step numbers, and their description is omitted.
 In FIG. 5, when the state management unit 54 determines that a hover slide operation is being performed with the user's index finger 68a (S1: YES), it determines, based on the operation determination result information from the operation determination unit 56, whether a touch slide operation is being performed with the user's thumb 68b (S2).
 When the state management unit 54 determines that a touch slide operation is not being performed with the user's thumb 68b (S2: NO), it outputs (notifies) to the application 65 that a combination of a hover slide operation and a touch slide operation is not being performed. The change target changing unit 62 initializes the change target of the operation of the application 65 corresponding to the hover slide operation determined in step S1 to be in progress (S3A).
 For example, assume that video data is being played back in the application 65 of the mobile terminal 1 and that a list of video data to be played back is being selected by a hover slide operation. In step S3A, the change target changing unit 62 initializes the change target corresponding to the hover slide operation of the finger 68 when the hover slide operation (see FIG. 3(A)) is performed alone.
 Specifically, by initializing the change target, the change target changing unit 62 resets the change target corresponding to the hover slide operation of the finger 68 from the current "selection of the list of video data to be played back" to "adjustment of the volume during playback". Note that the initialized processing content is preferably determined according to the application 65; "adjustment of the volume during playback" is one example.
 On the other hand, when the state management unit 54 determines that a touch slide operation is being performed with the user's thumb 68b (S2: YES), it outputs (notifies) to the application 65 that a combination of a hover slide operation and a touch slide operation is being performed. The change target changing unit 62 changes the change target of the operation of the application 65 corresponding to the hover slide operation determined in step S1 to be in progress to another change target (S4A).
 For example, assume again that video data is being played back in the application 65 of the mobile terminal 1 and that a list of video data to be played back is being selected by a hover slide operation. In step S4A, when a combination of a hover slide operation and a touch slide operation is performed, the change target changing unit 62 changes the change target corresponding to the hover slide operation of the finger 68 to another change target, switching it from, for example, the current "selection of the list of video data to be played back" to "double-speed playback of video data".
 After step S3A or S4A, the operation of the mobile terminal 1 proceeds to step S5. Based on the operation determination instruction output from the state management unit 54, the operation determination unit 56 calculates the slide amount of the hover slide operation determined in step S1 to be in progress (S5). The operation determination unit 56 outputs information on the calculated slide amount to the state management unit 54.
 Based on the slide amount information output from the operation determination unit 56, the state management unit 54 outputs to the application 65 a change target reflection instruction indicating that a change corresponding to the slide amount is to be applied, for the change target obtained in step S3A or S4A, to the operation in the application 65. Based on this change target reflection instruction, the change target changing unit 62 applies the change corresponding to the slide amount for the change target obtained in step S3A or S4A and reflects it in the operation of the application 65 (S8A). After step S8A, the operation of the mobile terminal 1 returns to step S1.
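The change-target switching of steps S3A/S4A/S8A can be sketched as follows. This is a hypothetical illustration with invented names; the patent does not prescribe an implementation, and the concrete targets ("volume", "playback_speed") are the examples given in the text.

```python
def select_change_target(combo_active):
    """S3A / S4A: pick which property the hover slide amount will control.

    Solo hover slide (S3A): the target is initialized, e.g. to volume
    adjustment during playback.  Hover slide + touch slide (S4A): the
    target is switched to another one, e.g. double-speed playback.
    """
    return "playback_speed" if combo_active else "volume"

def apply_hover_slide(state, slide_amount, combo_active):
    """S5 / S8A: reflect the slide amount in the selected change target."""
    target = select_change_target(combo_active)
    state[target] = state.get(target, 0) + slide_amount
    return state

state = {}
apply_hover_slide(state, 3, combo_active=False)   # solo: adjusts the volume
apply_hover_slide(state, 2, combo_active=True)    # combo: adjusts the speed
assert state == {"volume": 3, "playback_speed": 2}
```

The same slide amount thus drives a different property of the application depending on whether the touch slide accompanies the hover slide.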
 As described above, when, for example, a hover slide operation performed alone is selecting the list of video data to be played back, the mobile terminal 1 of this embodiment changes the selection processing of the video data list to the volume adjustment processing during playback of the currently playing video data through a two-finger combination of a hover slide operation and a touch slide operation. Thus, the mobile terminal 1 can easily change the change target of the operation in the application 65 through the combination of a hover slide operation and a touch slide operation, and can give the user comfortable operability.
(Third Embodiment)
 In the third embodiment, as a modification of the first embodiment, an example will be described in which, as in the first embodiment, the state change amount of the operation in the application 65 corresponding to a hover slide operation is changed when a combination of a hover slide operation and a touch slide operation is performed.
 Since the mobile terminal of the third embodiment has the same configuration as that of the first embodiment, the same reference numerals are used for the components that are the same as those of the first embodiment, their description is omitted, and only the differences are described.
(Outline of operation of the third embodiment)
 FIG. 6 shows a combination of a hover slide operation and a touch slide operation in the third embodiment. FIG. 6(A) shows the state at the start of the combination operation. FIG. 6(B) shows a state in which, after the start of the combination operation, the inter-finger spacing between the finger performing the hover slide operation (index finger 68a) and the finger performing the touch slide operation (thumb 68b) has become shorter. FIG. 6(C) shows a case in which the trajectory of the combination operation is a circle. In this embodiment as well, the state change amount of the operation in the application 65 is the movement amount of the map image 47 within the screen 45.
 In this embodiment, as shown in FIG. 6(A), at the start of the combination of the hover slide operation and the touch slide operation, the index finger 68a is in proximity to the touch panel 15 and the thumb 68b is touching (in contact with) the touch panel 15.
 Suppose that, while maintaining the spacing m between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45), a combination of a hover slide operation and a touch slide operation is performed with the index finger 68a and the thumb 68b in the direction of arrow a1.
 In this case, the movement amount (slide amount or scroll amount) by which the map image 47 moves within the screen 45 is maintained at a constant value (see arrow b1). That is, the map image 47 moves within the screen 45 by an amount equal to the hover slide amount of the hover slide operation of the index finger 68a.
 On the other hand, as shown in FIG. 6(B), suppose that, while the combination of the hover slide operation and the touch slide operation is performed, the spacing between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is narrowed to a spacing m′ narrower than the spacing m shown in FIG. 6(A), and the combination of the hover slide operation and the touch slide operation is performed with the index finger 68a and the thumb 68b in the direction of arrow a2.
 In this case, the movement amount (slide amount) by which the map image 47 moves within the screen 45 increases, unlike in the first embodiment (see arrow b2). That is, the map image 47 moves within the screen 45 by, for example, twice the hover slide amount of the hover slide operation of the index finger 68a.
 The hover slide operation, and the combination of a hover slide operation and a touch slide operation, are not limited to operations that draw a linear trajectory (see FIGS. 6(A) and 6(B)); for example, a continuous operation whose trajectory is a circle is also possible (see FIG. 6(C)).
 For example, in FIG. 6(C), when a combination operation is performed that draws the circular trajectory indicated by arrow c while the spacing between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is maintained, widened, or narrowed, the state change amount of the operation in the application 65 is changed according to the spacing.
 Through a combination operation that draws the circular trajectory of FIG. 6(C), the mobile terminal 1 can, for example, adjust the volume during playback of video data according to the amount of rotation, or cyclically switch and display the plurality of thumbnails shown in the screen 45.
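The spacing-dependent scaling described above can be sketched as follows. This is a hypothetical illustration: the text states only that the scroll factor changes with the inter-finger spacing (1x when the spacing m is kept, a larger factor such as 2x when it is narrowed to m′), so the threshold rule below is one plausible mapping, not the patent's method.

```python
def spacing_factor(spacing, reference_spacing):
    """Map the inter-finger spacing to a scroll factor (invented rule).

    FIG. 6(A): spacing m kept constant -> unity factor (1x scroll).
    FIG. 6(B): spacing narrowed below m  -> larger factor (e.g. 2x).
    """
    return 1.0 if spacing >= reference_spacing else 2.0

def map_scroll(hover_amount, spacing, reference_spacing):
    """Scroll amount of the map image 47 for a given hover slide amount."""
    return hover_amount * spacing_factor(spacing, reference_spacing)

# Spacing m maintained: the map scrolls by the hover slide amount (arrow b1).
assert map_scroll(10, spacing=40, reference_spacing=40) == 10.0
# Spacing narrowed to m' < m: the scroll amount increases (arrow b2).
assert map_scroll(10, spacing=20, reference_spacing=40) == 20.0
```

A continuous mapping (factor growing smoothly as the fingers approach) would fit the circular-trajectory case of FIG. 6(C) equally well; the choice is left open by the text.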
(Operation of the mobile terminal 1 of the third embodiment)
 FIG. 7 is a flowchart explaining the operation procedure of the mobile terminal 1 in response to the combination of a hover slide operation and a touch slide operation in the third embodiment. The flowchart shown in FIG. 7 represents the operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in, for example, FIGS. 6(A) and 6(B). The same operations as in the first embodiment are given the same step numbers, and their description is omitted.
 In FIG. 7, when the state management unit 54 determines that a hover slide operation is being performed with the user's index finger 68a (S1: YES), it determines, based on the operation determination result information from the operation determination unit 56, whether a touch slide operation is being performed with the user's thumb 68b (S2).
 When the state management unit 54 determines that a touch slide operation is not being performed with the user's thumb 68b (S2: NO), it outputs (notifies) to the application 65 that a combination of a hover slide operation and a touch slide operation is not being performed. The state change amount changing unit 61 initializes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined in step S1 to be in progress (S3).
 When the state management unit 54 determines that a touch slide operation is being performed with the user's thumb 68b (S2: YES), it determines whether the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b are in the same direction, and whether the spacing between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is within a predetermined specified value (S2A, S2B). Although the operations of steps S2A and S2B are not illustrated in FIGS. 4 and 5, the content of steps S2A and S2B shown in FIG. 7 is included in step S2.
 When the state management unit 54 determines that the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b are in the same direction and that the spacing between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is within the specified value (S2A: YES, S2B: YES), it outputs (notifies) to the application 65 that a combination of a hover slide operation and a touch slide operation is being performed in the same direction with an inter-finger spacing within the specified value. The state change amount changing unit 61 changes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined in step S1 to be in progress to a state change amount corresponding to the inter-finger spacing (S4B; see FIG. 6(B)).
 The specified value used in the determination in step S2B defines the range in which the positions of the index finger 68a and the thumb 68b are considered appropriate (the touch acceptable range), and is given by relative coordinates whose origin is the position on the touch panel 15 (screen 45) vertically below the index finger 68a (see FIG. 8). FIG. 8 is a diagram showing the touch acceptable range 45a set according to the position of the index finger 68a on the screen 45.
 The touch acceptable range 45a is a predetermined area (for example, a circular or elliptical area) within which the distance between the position on the touch panel 15 (screen 45) vertically below the hovering index finger 68a and the position of the touching thumb 68b on the touch panel 15 (screen 45) is within the specified value.
 When a combination of a hover slide operation and a touch slide operation is performed, the position of the touch acceptable range 45a changes as the position of the index finger 68a changes. Here, the position of the index finger 68a, that is, the position on the touch panel 15 (screen 45) vertically below the hovering index finger 68a, is the position of the circle 50A shown in FIG. 8. When the touch position of the thumb 68b falls within the touch acceptable range 45a (see the hatched portion in FIG. 8), it is determined that the inter-finger distance between the index finger 68a and the thumb 68b is within the specified value.
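The determinations of steps S2A and S2B can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the ellipse radii, the angular tolerance, and the function names are assumptions for illustration only.

```python
import math

def within_touch_acceptable_range(hover_xy, touch_xy, rx=80.0, ry=120.0):
    """Return True if the touch position lies inside an elliptical
    touch acceptable range centered on the point vertically below the
    hovering finger (cf. range 45a in FIG. 8). Radii are assumed."""
    dx = touch_xy[0] - hover_xy[0]
    dy = touch_xy[1] - hover_xy[1]
    # Normalized elliptical distance: <= 1.0 means inside the range.
    return (dx / rx) ** 2 + (dy / ry) ** 2 <= 1.0

def same_direction(hover_vec, touch_vec, tol_deg=30.0):
    """Return True if the hover-slide and touch-slide directions agree
    within an assumed angular tolerance (cf. step S2A)."""
    a1 = math.atan2(hover_vec[1], hover_vec[0])
    a2 = math.atan2(touch_vec[1], touch_vec[0])
    diff = abs(math.degrees(a1 - a2)) % 360.0
    return min(diff, 360.0 - diff) <= tol_deg
```

Because the range is expressed in coordinates relative to the hover position, the same check works wherever the index finger moves on the screen.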
 In step S2B, when the inter-finger distance exceeds the specified value, the operation of the mobile terminal 1 proceeds to step S3 on the assumption that the user may have performed an erroneous operation. Since the processing from step S5 onward is the same as in the first embodiment, its description is omitted.
 As described above, the mobile terminal 1 of the present embodiment can increase the amount of movement (state change amount) of the map image within the screen 45 when the inter-finger distance between the hovering index finger 68a and the touch-sliding thumb 68b is narrowed, and can decrease the amount of movement (state change amount) of the map image within the screen 45 when the inter-finger distance is widened, so that fine adjustment of the state change amount can be performed easily.
 In the present embodiment, the inter-finger distance and the magnitude of the state change amount may be related either directly or inversely proportionally; for example, narrowing the inter-finger distance may instead decrease the amount of movement (state change amount) of the map image within the screen 45. Further, by changing the inter-finger distance in the middle of a circular or linear combination of a hover slide operation and a touch slide operation, the amount of movement (state change amount) of the map image within the screen 45 may be changed again from the point in time at which the inter-finger distance was changed.
 In addition, by requiring that the slide directions of the index finger 68a performing the hover slide operation and the thumb 68b performing the touch slide operation be the same, or by providing the touch acceptable range 45a, the mobile terminal 1 can exclude operations that may be erroneous operations by the user, thereby improving user operability.
 The mobile terminal 1 may determine the touch acceptable range 45a of the touch operation after determining the user's dominant hand. For example, given touch coordinates (x1, y1) and proximity coordinates (x2, y2, z2), the mobile terminal 1 may determine the user's dominant hand according to whether the slope of the straight line connecting the touch coordinates (x1, y1) and the projection (x2, y2) of the proximity coordinates onto the xy plane is positive or negative. In this case, the mobile terminal 1 determines that the user is right-handed if the slope rises monotonically to the right (positive direction), and left-handed if it rises monotonically to the left (negative direction).
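A minimal sketch of this dominant-hand heuristic follows; the coordinate convention and the handling of a vertical (undefined-slope) line are assumptions, since the text does not specify them.

```python
def dominant_hand(touch_xy, proximity_xyz):
    """Guess the dominant hand from the slope of the line connecting
    the touch point (x1, y1) and the xy-plane projection (x2, y2) of
    the proximity point, as described for the touch acceptable range."""
    x1, y1 = touch_xy
    x2, y2, _z2 = proximity_xyz
    if x2 == x1:
        return "unknown"  # vertical line: slope undefined (assumed case)
    slope = (y2 - y1) / (x2 - x1)
    # Positive slope (rising to the right) -> right-handed,
    # negative slope -> left-handed, per the heuristic above.
    return "right" if slope > 0 else "left"
```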
 The setting of the touch acceptable range 45a is also applicable to the first and second embodiments described above, and the touch acceptable range 45a may also be applied in each of the following embodiments.
(Fourth embodiment)
 In each of the first to third embodiments, the operation of the mobile terminal 1 when a combination of a hover slide operation and a touch slide operation is performed was described as an example of a combination operation using two fingers.
 In the fourth embodiment, an example will be described in which, while a hover slide operation is being performed, a "combination of a hover slide operation and a touch hold operation" is performed as an example of a combination operation using two fingers, whereby the change in the operation of the application 65 corresponding to the hover slide operation is continued.
 Since the mobile terminal of the fourth embodiment has the same configuration as the mobile terminal 1 of the first embodiment, the same reference numerals are used for the same components as those of the mobile terminal 1 of the first embodiment, their description is omitted, and only the differences are described.
(Outline of operation of the fourth embodiment)
 FIG. 9 is a diagram illustrating a combination of a hover slide operation and a touch hold operation in the fourth embodiment. FIG. 9(A) is a diagram illustrating a state in which a hover slide operation is performed alone. FIG. 9(B) is a diagram illustrating a state in which a combination of a hover slide operation and a touch hold operation is performed.
 In the present embodiment, after the index finger 68a shown in FIG. 9(A) starts a hover slide operation on the touch panel 15 (screen 45), the thumb 68b touches the touch panel 15 and continues a touch hold operation, as shown in FIG. 9(B). In this case, the index finger 68a remains in the hover operation state, the movement (scrolling) of the map image 47 displayed in the screen 45 continues, and the map image 47 scrolls in the direction in which the hover slide operation was being performed (see reference sign d).
 Accordingly, as long as the touch hold operation with the thumb 68b continues, the movement (scrolling) of the map image 47 within the screen 45 continues as long as the index finger 68a maintains the hover operation, even without the index finger 68a performing a hover slide operation.
(Operation of the mobile terminal 1 of the fourth embodiment)
 FIG. 10 is a flowchart explaining the operation procedure of the mobile terminal 1 in response to a combination of a hover slide operation and a touch hold operation in the fourth embodiment. The flowchart shown in FIG. 10 represents the operation procedure of the mobile terminal 1 in response to input operations on the screen 45 (touch panel 15) shown in FIGS. 9(A) and 9(B).
 In FIG. 10, the state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether a hover slide operation is being performed with the user's index finger 68a (S11). When it is determined that a hover slide operation is being performed, the operation of the mobile terminal 1 proceeds to step S12.
 Next, when the state management unit 54 determines that a hover slide operation is being performed with the user's index finger 68a (S11, YES), it outputs (notifies) to the application 65 that a hover slide operation is being performed. The state change amount changing unit 61 executes the change processing in the operation of the application 65 corresponding to the hover slide operation, without changing the state change amount of the operation of the application 65 corresponding to the hover slide operation (S12).
 For example, in step S12, as an example of the operation of the application 65 corresponding to the hover slide operation, the mobile terminal 1 scrolls the map image 47 displayed in the screen 45 based on the hover slide amount of the hover slide operation.
 The state change amount changing unit 61 also holds the state change amount (change condition) of the operation of the application 65 corresponding to the hover slide operation in the memory 40 (S13). In the description of FIG. 10, the change condition may be the state change amount of the operation of the application 65 corresponding to the hover slide operation, or may be the target of change of the operation of the application 65 corresponding to the hover slide operation.
 The state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether the hover slide operation has stopped and transitioned to a hover operation state (S14). When it is determined that the hover slide operation has not stopped, the operation of the mobile terminal 1 returns to step S12.
 When the state management unit 54 determines that the hover slide operation has stopped (S14, YES), it determines whether a touch hold operation was performed simultaneously with the stop of the hover slide operation (S15). Here, "simultaneously" is not limited to the stop of the hover slide operation and the detection of the touch hold operation occurring at the same moment, and may include, for example, the case where the touch hold operation is detected immediately before the hover slide operation stops.
 When the state management unit 54 determines that no simultaneous touch hold operation has been performed (S15, NO), it stops the change processing of the operation of the application 65 corresponding to the hover slide operation (S16). For example, the mobile terminal 1 stops the scrolling of the map image 47 within the screen 45 executed in step S12 (S16). After step S16, the operation of the mobile terminal 1 returns to step S11.
 On the other hand, when the state management unit 54 determines that a simultaneous touch hold operation has been performed (S15, YES), it continues the change processing of the operation of the application 65 corresponding to the hover slide operation according to the change condition held in step S13 (S17). For example, even when the hover slide operation stops and becomes a hover operation, and a touch hold operation is performed simultaneously with the stop of the hover slide operation, the mobile terminal 1 can automatically continue the screen scrolling that was being performed while the hover slide operation was performed alone.
 The state management unit 54 determines, based on the information on the touch coordinates (x, y) output from the touch coordinate extraction unit 52, whether the touch hold operation is continuing (S18). When it is determined that the touch hold operation is continuing (S18, YES), the operation of the mobile terminal 1 returns to step S17.
 On the other hand, when the state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, that the touch hold operation is no longer continuing (S18, NO), it stops the change processing of the operation of the application 65 corresponding to the hover slide operation (S19). For example, the mobile terminal 1 stops the scrolling of the map image 47 within the screen 45 executed in step S12 (S19).
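The flow of steps S11 to S19 can be sketched as a small state machine. This is an illustrative reconstruction under stated assumptions: the event names, the scroll callback, and the tick-driven continuation are not part of the patent text.

```python
class HoverScrollController:
    """Illustrative sketch of FIG. 10: a hover slide scrolls the map;
    if a touch hold starts as the hover slide stops, scrolling continues
    with the held change condition until the touch is released."""

    def __init__(self, scroll):
        self.scroll = scroll            # callback: scroll(dx, dy)
        self.held_condition = None      # change condition saved in S13

    def on_hover_slide(self, dx, dy):
        self.scroll(dx, dy)             # S12: scroll by the hover slide amount
        self.held_condition = (dx, dy)  # S13: hold the change condition

    def on_hover_slide_stop(self, touch_held):
        if not touch_held:              # S15, NO
            self.held_condition = None  # S16: stop the change processing

    def on_tick_while_touch_held(self):
        if self.held_condition:         # S17: continue per the held condition
            self.scroll(*self.held_condition)

    def on_touch_release(self):
        self.held_condition = None      # S18, NO -> S19: stop scrolling
```

Releasing the touch clears the held condition, so subsequent ticks produce no further scrolling, matching step S19.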
 As described above, the mobile terminal 1 of the present embodiment can easily scroll the map image 47 within the screen 45 even by using a combination of a hover slide operation and a touch hold operation. In addition, since the mobile terminal 1 need not perform a hover slide operation while the touch hold operation is being performed and only needs to maintain the hover operation, malfunctions caused by detecting the return stroke when a hover slide operation is performed repeatedly are also eliminated.
 Further, the mobile terminal 1 of the present embodiment is not limited to continuing the movement state (screen scroll state) of the map image 47 within the screen 45; for example, it may continue (maintain) the movement amount of the map image 47 within the screen 45 (the screen scroll amount as the state change amount). For example, if a fast hover slide operation is being performed with the index finger 68a, the mobile terminal 1 can continue the change corresponding to the fast hover slide operation during the touch hold with the thumb 68b, and if a slow hover slide operation is being performed with the index finger 68a, it can continue the change corresponding to the slow hover slide operation during the touch hold with the thumb 68b.
(Fifth embodiment)
 In the fifth embodiment, an example will be described in which the state change amount of the operation of the application 65 corresponding to the hover slide operation is increased or decreased by the combination of a hover slide operation and a touch hold operation described in the fourth embodiment.
 Since the mobile terminal of the fifth embodiment has the same configuration as the mobile terminal 1 of the fourth embodiment, the same reference numerals are used for the same components as those of the mobile terminal 1 of the fourth embodiment, their description is omitted, and only the differences are described.
(Operation outline of the fifth embodiment)
 FIG. 11 is a diagram illustrating a combination of a hover slide operation and a touch hold operation in the fifth embodiment. FIG. 11(A) is a diagram illustrating a state in which a combination of a hover slide operation and a touch hold operation is performed. FIG. 11(B) is a diagram illustrating a state in which the inter-finger distance between the finger performing the hover operation and the finger performing the touch hold operation has been shortened after the start of the combination operation.
 As shown in FIG. 11(A), assume a state in which, while the index finger 68a is performing a hover slide operation on the screen 45 (touch panel 15), the thumb 68b has touched the screen 45 and the movement (scrolling) of the map image 47 within the screen 45 is continuing.
 In the present embodiment, in the state shown in FIG. 11(A), widening or narrowing the inter-finger distance between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) increases or decreases the state change amount of the operation of the application 65 (see FIG. 11(B)).
 That is, when the thumb 68b that was performing the touch hold operation performs a touch slide operation in the direction of arrow e shown in FIG. 11(B), that is, in the same direction as the hover slide operation (forward direction), the state change amount of the operation of the application 65 (for example, the amount of movement of the map image 47 within the screen 45) increases.
 Conversely, when the thumb 68b that was performing the touch hold operation performs a touch slide operation in the direction of arrow f shown in FIG. 11(B), that is, in the direction opposite to the hover slide operation (reverse direction), the state change amount of the operation of the application 65 (for example, the amount of movement of the map image 47 within the screen 45) decreases.
(Operation of the mobile terminal 1 of the fifth embodiment)
 FIG. 12 is a flowchart showing the operation procedure of the mobile terminal 1 in response to a combination of a hover slide operation and a touch hold operation in the fifth embodiment. FIG. 12(A) shows the overall operation. FIG. 12(B) is a diagram showing the details of the change continuation processing. The flowchart shown in FIG. 12 represents the operation procedure of the mobile terminal 1 in response to input operations on the screen 45 (touch panel 15) shown in FIGS. 11(A) and 11(B). The same operations as in the fourth embodiment are given the same step numbers, and their description is omitted.
 In FIG. 12(A), when the state management unit 54 determines that a simultaneous touch hold operation has been performed (S15, YES), it holds the current position of the thumb 68b on the touch panel 15 (screen 45) and the hover slide direction of the hover slide operation, and outputs (notifies) this information to the application 65 (S16A).
 In step S17A, change continuation processing is performed by the state management unit 54 and the control extension unit 64 (S17A). The details of this change continuation processing will be described with reference to FIG. 12(B).
 In FIG. 12(B), the state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether the touch position of the thumb 68b shown in FIG. 11(B) has moved (S21).
 When the movement direction of the touch position is opposite to the direction of the hover slide operation of the index finger 68a (S21, reverse direction), the state management unit 54 outputs to the application 65 information indicating that the state change amount of the operation of the application 65 corresponding to the hover slide operation is to be decreased. Based on the information output from the state management unit 54, the state change amount changing unit 61 decreases the state change amount of the operation of the application 65 corresponding to the hover slide operation (S22). After step S22, the operation of the mobile terminal 1 proceeds to step S24.
 On the other hand, when the movement direction of the touch position is the same as the direction of the hover slide operation of the index finger 68a (S21, forward direction), the state management unit 54 outputs to the application 65 information indicating that the state change amount of the operation of the application 65 corresponding to the hover slide operation is to be increased. Based on the information output from the state management unit 54, the state change amount changing unit 61 increases the state change amount of the operation of the application 65 corresponding to the hover slide operation (S23). After step S23, the operation of the mobile terminal 1 proceeds to step S24.
 Meanwhile, when the touch position has not moved, that is, when the touch hold operation continues at the position touched by the thumb 68b (S21, none), the operation of the mobile terminal 1 proceeds to step S24.
 In step S24, the state management unit 54 continues the change processing of the operation of the application 65 corresponding to the hover slide operation according to the change condition held in step S13 (S24). For example, even when the hover slide operation stops and becomes a hover operation, and a touch hold operation is performed simultaneously with the stop of the hover slide operation, the mobile terminal 1 can automatically continue the screen scrolling that was being performed while the hover slide operation was performed alone.
 The state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether the touch hold operation is continuing (S25). When it is determined that the touch hold operation is continuing (S25, YES), the operation of the mobile terminal 1 returns to step S21. When it is determined that the touch hold operation is not continuing (S25, NO), the change continuation processing of the mobile terminal 1 ends, and the operation of the mobile terminal 1 proceeds to step S19.
 That is, the state management unit 54 stops the change processing of the operation of the application 65 corresponding to the hover slide operation (S19). For example, the mobile terminal 1 stops the scrolling of the map image 47 within the screen 45 executed in step S12 (S19).
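The branch of step S21 can be sketched as follows. This is an illustrative reconstruction: the text does not specify by how much the state change amount is increased or decreased per forward or reverse slide, so the multiplicative factor here is an assumption.

```python
def adjust_state_change(amount, hover_dir, touch_move, factor=1.25):
    """Step S21 sketch: compare the thumb's touch-slide movement with
    the hover slide direction and scale the state change amount.
    hover_dir and touch_move are 2D vectors; a zero touch_move means
    the touch hold continues without movement (S21, none)."""
    if touch_move == (0, 0):
        return amount           # S21, none -> keep the amount (S24)
    dot = hover_dir[0] * touch_move[0] + hover_dir[1] * touch_move[1]
    if dot > 0:
        return amount * factor  # S21, forward -> increase (S23)
    return amount / factor      # S21, reverse -> decrease (S22)
```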
 As described above, compared with the mobile terminal 1 of the fourth embodiment, the mobile terminal 1 of the present embodiment can easily adjust the state change amount of the operation of the application 65 by further sliding the thumb 68b, which had been continuing the touch hold operation, in the forward or reverse direction relative to the direction of the hover slide operation of the index finger 68a to widen or narrow the inter-finger distance, thereby improving user operability.
(Sixth embodiment)
 In the sixth embodiment, as an example of a combination operation using two fingers, the operation of the mobile terminal when a "combination of a hover operation and a touch operation" is performed will be described. For example, the mobile terminal of the present embodiment designates an item to be selected by a hover operation of the index finger 68a, and confirms the designated item as the selection target by a touch operation of the thumb 68b.
 FIG. 13 is a block diagram showing the functional configuration of the mobile terminal 1A of the sixth embodiment. The same components as those of the mobile terminal 1 shown in FIG. 2 are given the same reference numerals and their description is omitted; only the differences are described.
 The mobile terminal 1A shown in FIG. 13 includes a proximity detection unit 5, a touch detection unit 10, a screen display unit 30, a memory 40, a proximity coordinate extraction unit 51, a touch coordinate extraction unit 52, a state management unit 54, an image button management unit 55, an operation determination unit 56, an indicator state management unit 67, an application screen generation unit 59, and an application 65.
 Based on the operation determination result information from the operation determination unit 56, the state management unit 54 determines whether the user's index finger 68a is in the proximity state through a hover operation, whether the user's thumb 68b has entered the contact state through a touch operation, and further, when the index finger 68a is in the proximity state through a hover operation, whether the thumb 68b has performed a touch operation so that the proximity state and the contact state coexist.
 When the state management unit 54 determines that the user's index finger 68a is performing a hover operation, it outputs information indicating that the index finger 68a is performing a hover operation to the application 65 and the indicator state management unit 67.
 インジケータ状態管理部67は、状態管理部54から出力された情報を基に、ホバー操作している指(例えば人差し指68a)の鉛直下方向におけるタッチパネル15(画面45)上の位置に、ユーザの選択対象の候補を示すインジケータとしてのポインタを表示する旨のインジケータ生成指示をアプリ画面生成部59に出力する。インジケータ生成指示には、インジケータを表示するタッチパネル15上の位置(x座標とy座標)と、インジケータの種別とを含む。 Based on the information output from the state management unit 54, the indicator state management unit 67 selects the user at a position on the touch panel 15 (screen 45) in the vertical downward direction of the finger (for example, the index finger 68a) that is operating the hover. An indicator generation instruction for displaying a pointer as an indicator indicating a target candidate is output to the application screen generation unit 59. The indicator generation instruction includes a position (x coordinate and y coordinate) on the touch panel 15 where the indicator is displayed, and the type of the indicator.
 また、インジケータ状態管理部67は、インジケータ生成指示に含まれるインジケータの種別として、ポインタの他にカーソルを用いても良いし、又は、複数の項目が画面45に表示されている場合に選択されている対象を明示的に識別させるためのフォーカスを用いても良い。 In addition, the indicator state management unit 67 may use a cursor in addition to the pointer as the type of indicator included in the indicator generation instruction, or selected when a plurality of items are displayed on the screen 45. A focus may be used to explicitly identify the target.
 アプリ画面生成部59は、インジケータ状態管理部67から出力されたインジケータ生成指示を基に、ホバー操作している指(例えば人差し指68a)の鉛直下方向におけるタッチパネル15(画面45)の位置に、インジケータを表示したアプリ画面を生成して画面表示部30に表示させる。なお、アプリ画面生成部59は、インジケータとして例えばポインタの形状が画像ボタンデータベース55aに記憶されている場合には、画像ボタン管理部55を介して取得する。 Based on the indicator generation instruction output from the indicator state management unit 67, the application screen generation unit 59 displays the indicator at the position of the touch panel 15 (screen 45) in the vertical downward direction of the finger (for example, the index finger 68a) that is operating the hover. Is generated and displayed on the screen display unit 30. Note that the application screen generation unit 59 acquires the indicator via the image button management unit 55 when the pointer shape is stored in the image button database 55a as an indicator, for example.
 Although FIG. 13 does not illustrate the configuration of the control extension unit 64 within the application 65, the application 65 may or may not include the control extension unit 64.
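The indicator generation instruction described above carries two pieces of information: the (x, y) display position obtained by projecting the hovering finger straight down onto the screen, and the indicator type. A minimal sketch in Python follows; the class and function names are illustrative, not part of the disclosed implementation:

```python
from dataclasses import dataclass
from enum import Enum

class IndicatorKind(Enum):
    """The indicator types named in the description: pointer, cursor, or focus."""
    POINTER = "pointer"
    CURSOR = "cursor"
    FOCUS = "focus"

@dataclass(frozen=True)
class IndicatorInstruction:
    """Indicator generation instruction: screen position plus indicator type."""
    x: int
    y: int
    kind: IndicatorKind

def instruction_for_hover(proximity, kind=IndicatorKind.POINTER):
    """Project the proximity coordinates (x, y, z) vertically down onto
    the screen by discarding z, and request an indicator at that point."""
    x, y, _z = proximity
    return IndicatorInstruction(x, y, kind)
```

For example, a finger hovering at proximity coordinates (120, 48, 15) yields an instruction to draw a pointer at screen position (120, 48).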
(Overview of the operation of the sixth embodiment)
 FIG. 14 is a diagram illustrating a combination of a hover operation and a touch operation in the sixth embodiment. FIG. 14(A) illustrates a state in which a hover operation is performed alone. FIG. 14(B) illustrates a state in which a combination of a hover operation and a touch operation is performed.
 In the present embodiment, when buttons 81 are displayed as a plurality of selectable items on the screen 45 (touch panel 15) shown in FIG. 14(A) and a particular button (for example, the letter "c") is designated by a hover operation of the index finger 68a, a pointer 85 is displayed around the particular button, or over part or all of it, so that the user can explicitly recognize that the particular button has been designated.
 While the pointer 85 is displayed, when the thumb 68b touches an arbitrary point on the screen 45 as shown in FIG. 14(B), the selection of the particular button (for example, the letter "c") designated as the selection target and surrounded by the pointer 85 is confirmed. In the present embodiment, a plurality of buttons 81 may first be pre-selected by hover operations of the index finger 68a, and the selection of all of the pre-selected buttons 81 may then be confirmed at once by a touch operation of the thumb 68b on the screen 45.
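The interaction above, in which hover operations pre-select one or more candidates and a single touch anywhere confirms them all at once, can be sketched as a small model. This is a simplified illustration under assumed names, not the disclosed implementation:

```python
class HoverSelection:
    """Hover designates candidate items; a touch anywhere confirms them all."""

    def __init__(self):
        self.candidates = []

    def hover_over(self, item):
        """Designate an item as a selection candidate (shown with a pointer)."""
        if item not in self.candidates:
            self.candidates.append(item)

    def touch_anywhere(self):
        """Confirm every pre-selected candidate in one batch and clear the set."""
        confirmed, self.candidates = self.candidates, []
        return confirmed
```

Hovering over "c" and "a" and then touching any point of the screen would confirm both in one operation, matching the batch confirmation described above.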
(Operation of the mobile terminal 1A of the sixth embodiment)
 FIG. 15 is a flowchart illustrating the operation procedure of the mobile terminal 1A in response to a combination of a hover operation and a touch operation in the sixth embodiment.
 In FIG. 15, the state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether a hover operation is being performed with the user's index finger 68a (S31). When it is determined that a hover operation is being performed, the operation of the mobile terminal 1A proceeds to step S32.
 Next, when the state management unit 54 determines that a hover operation is being performed with the user's index finger 68a (S31, YES), it determines, based on the operation determination result information from the operation determination unit 56, whether a touch operation has been performed with the user's thumb 68b (S32).
 Here, in step S32, if the determination that no touch operation has been performed is the first such determination, the operation of the mobile terminal 1A returns to step S31; if it is not the first, the operation returns to step S32. The number of times it has been determined that no touch operation has been performed is temporarily stored in the RAM 22, and based on this count the operation determination unit 56 performs either the hover operation determination of step S31 or the touch operation determination of step S32.
 When the state management unit 54 determines that a touch operation has been performed with the user's thumb 68b (S32, YES), it determines whether a button exists at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33). Specifically, the state management unit 54 searches the image button database 55a via the image button management unit 55 and determines whether a button exists at the proximity position.
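The hit test of step S33 reduces to checking whether the point vertically below the hovering finger falls inside any button rectangle known to the button database. A minimal sketch, with an assumed rectangle representation that is not specified in the disclosure:

```python
def find_button_at(proximity, buttons):
    """Return the label of the button whose rectangle contains the point
    vertically below the hovering finger, or None if no button is there.
    `buttons` maps a label to a (left, top, width, height) rectangle in
    screen coordinates; `proximity` is the (x, y, z) hover coordinate."""
    x, y, _z = proximity  # drop z: only the projection onto the screen matters
    for label, (left, top, w, h) in buttons.items():
        if left <= x < left + w and top <= y < top + h:
            return label
    return None
```

With a button "c" occupying the rectangle (100, 200, 40, 40), a finger hovering at (110, 210, 12) designates "c", while a hover over empty screen area yields None, corresponding to the S33 NO branch.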
 When it is determined that no button exists at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33, NO), the operation of the mobile terminal 1A proceeds to step S35.
 When the state management unit 54 determines that a button exists at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33, YES), it outputs to the indicator state management unit 67 information indicating that the type (information) of the button at the position designated by the hover operation is to be held (S34). As an example of the operation of step S34, the state management unit 54 holds (stores) the type (information) of the button at the position designated by the hover operation in the RAM 22 or the memory 40, and further displays a pointer (indicator) so that the user can explicitly recognize that the button has been designated as the selection target.
 Specifically, when the state management unit 54 determines that the user's index finger 68a is performing a hover operation, it outputs to the indicator state management unit 67 information indicating that the type of the button at the position designated by the hover operation is to be held. Based on the information output from the state management unit 54, the indicator state management unit 67 outputs to the application screen generation unit 59 an indicator generation instruction to display a pointer, serving as an indicator of the user's selection candidate, at the position on the touch panel 15 (screen 45) vertically below the hovering finger (index finger 68a).
 Based on the indicator generation instruction output from the indicator state management unit 67, the application screen generation unit 59 generates an application screen in which an indicator corresponding to the button is displayed at the position on the touch panel 15 (screen 45) vertically below the hovering finger (index finger 68a), and causes the screen display unit 30 to display it.
 As in step S31, the state management unit 54 determines, based on the proximity coordinate (x, y, z) information output from the proximity coordinate extraction unit 51, whether the hover operation is being continued with the user's index finger 68a (S35). When it is determined that the hover operation is being continued with the user's index finger 68a (S35, YES), the operation of the mobile terminal 1A returns to step S32.
 When the state management unit 54 determines that a hover operation is no longer being performed with the user's index finger 68a (S35, NO), it executes the action corresponding to the operation performed after the hover operation (S36). For example, when a touch operation is performed after the hover operation, an action corresponding to the touched position is executed. When it was determined in step S33 that a button exists at the proximity position, as an example of step S36, the state management unit 54 executes an action corresponding to the type (information) or function of the button designated by the hover operation (S36).
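The flow of steps S31 to S36 can be sketched as a simple loop over input events. This is a simplified model under assumed event names ("hover", "touch", "hover_end") and the rectangle hit test above; it is not the disclosed implementation, which distributes this logic across the state management, operation determination, and indicator management units:

```python
def run_combination(events, buttons):
    """Simplified sketch of the S31-S36 flow: wait for a hover (S31),
    watch for a touch while the hover continues (S32), hit-test the
    proximity position (S33), hold the designated button type (S34),
    and when the hover ends (S35, NO) act on what was held (S36).
    `events` is a sequence of ("hover", (x, y, z)), ("touch",), or
    ("hover_end",) tuples; `buttons` maps labels to (left, top, w, h)."""
    held = None       # button type held in step S34
    hovering = None   # latest proximity coordinates, None when not hovering
    for event in events:
        if event[0] == "hover":                 # S31/S35: hover continues
            hovering = event[1]
        elif event[0] == "touch" and hovering:  # S32: touch during hover
            x, y, _z = hovering
            for label, (left, top, w, h) in buttons.items():  # S33: hit test
                if left <= x < left + w and top <= y < top + h:
                    held = label                # S34: hold the button type
                    break
        elif event[0] == "hover_end":           # S35 NO: hover finished
            return held                         # S36: act on held button
    return held
```

Feeding this loop a hover over button "c" followed by a confirming touch and the end of the hover returns "c"; the same hover without a touch returns None, since no selection was confirmed.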
 As described above, when, for example, a plurality of items (buttons) are displayed on the screen 45, the mobile terminal 1A of the present embodiment can designate the button to be selected by a hover operation of the index finger 68a and confirm the selection of the designated button by a touch operation of the thumb 68b. This allows the mobile terminal 1A to accurately select a small button displayed on the screen 45 and to start the process corresponding to that button.
 The mobile terminal 1A is not limited to designating a button by a hover operation and confirming its selection by a touch operation; the combination of a hover operation and a touch operation may instead initialize the operation state of the application 65, or change the amount of change or the change target as in the embodiments described above.
 Further, the mobile terminal 1A may determine that a combination of a hover operation and a touch operation has been performed only when the touch operation falls within the touch acceptable range 45a shown in FIG. 8. This allows the mobile terminal 1A to eliminate erroneous combination operations of a hover operation and a touch operation.
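Gating the confirming touch on the acceptable range 45a is a plain containment check. A minimal sketch, assuming the range is represented as a rectangle (its actual shape and placement are defined by FIG. 8, not here):

```python
def touch_in_acceptable_range(touch, region):
    """Accept the confirming touch only when it lands inside the touch
    acceptable range 45a; touches outside are ignored, which filters out
    accidental contact elsewhere on the screen. `touch` is an (x, y)
    screen coordinate; `region` is a (left, top, width, height) rectangle."""
    x, y = touch
    left, top, w, h = region
    return left <= x < left + w and top <= y < top + h
```

With the acceptable range covering the lower part of the screen, a thumb touch there would count as part of the combination operation, while a touch near the top would be rejected.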
 Although various embodiments have been described above with reference to the drawings, it goes without saying that the present invention is not limited to these examples. It is obvious that those skilled in the art can conceive of variations and modifications of the various embodiments, as well as combinations of the various embodiments, within the scope of the claims, and it is understood that these naturally belong to the technical scope of the present invention.
 In the embodiments described above, the movement (scrolling) of the screen 45 on which the map image 47 is displayed was shown as an example of an action responding to a hover slide operation, but a photograph or a handwritten image may be used instead of the map image 47. Other examples of actions responding to a hover slide operation include scrolling through (selecting) list items of video data or audio data, adjusting the volume (output audio), and playing, stopping, fast-forwarding, or rewinding content. Examples of change targets modified by a combination of a hover slide operation and a touch slide operation include volume, brightness, saturation, and transparency. For playback of content (for example, video data), double-speed playback is a further example.
 In the embodiments described above, an intermediate state of the change in the operation of the application 65 responding to a hover slide operation may be stored by a combination of a hover slide operation and a touch hold operation, and playback may later resume from the stored state. For example, during playback of video data or audio data, the point in time (timing) at which the touch hold operation was performed may be stored, like a "bookmark", by the combination of the hover slide operation and the touch hold operation.
 When a hover slide operation starts again while a combination of a hover slide operation and a touch hold operation is being performed, the operation of the application 65 responding to the hover slide operation may be resumed.
 When a hover slide operation is being performed and a combination of the hover slide operation and a touch operation is then performed, the operation state of the application 65 responding to the hover slide operation may be reset.
 This application is based on a Japanese patent application filed on April 27, 2012 (Japanese Patent Application No. 2012-104123), the contents of which are incorporated herein by reference.
 INDUSTRIAL APPLICABILITY: The present invention is useful as an input device, an input support method, and a program that, when displaying an image on a screen, efficiently select the control applied to content according to the user's input operation on the touch panel and provide comfortable operability.
DESCRIPTION OF REFERENCE NUMERALS
 1, 1A Mobile terminal
 5 Proximity detection unit
 10 Touch detection unit
 30 Screen display unit
 40 Memory
 51 Proximity coordinate extraction unit
 52 Touch coordinate extraction unit
 54 State management unit
 55 Image button management unit
 56 Operation determination unit
 58 Display image generation unit
 59 Application screen generation unit
 60 Image composition unit
 61 State change amount change unit
 62 Change target change unit
 63 State change continuation unit
 64 Control extension unit
 65 Application
 66 Input operation determination unit
 67 Indicator state management unit

Claims (10)

  1.  An input device comprising:
     a display unit that displays data on a screen;
     a proximity detection unit that detects proximity of a first finger to the screen;
     a contact detection unit that detects contact of a second finger with the screen after the proximity of the first finger is detected; and
     an operation execution unit that, in response to an operation combining the proximity of the first finger and the contact of the second finger, executes an action corresponding to the combined operation.
  2.  The input device according to claim 1, wherein
     the operation execution unit executes an action responding to the operation of the first finger whose proximity is continued, and changes the amount of change of the action responding to the operation of the first finger in accordance with a movement operation of the first finger whose proximity is continued and a movement operation of the second finger whose contact is continued.
  3.  The input device according to claim 1, wherein
     the operation execution unit executes an action responding to the operation of the first finger whose proximity is continued, and changes the change target of the action responding to the operation of the first finger in accordance with a movement operation of the first finger whose proximity is continued and a movement operation of the second finger whose contact is continued.
  4.  The input device according to claim 2, wherein
     the operation execution unit further increases or decreases the amount of change of the action responding to the operation of the first finger in accordance with the distance between the position on the screen vertically below the first finger and the position of the second finger on the screen.
  5.  The input device according to claim 1, wherein
     the operation execution unit executes an action responding to the operation of the first finger whose proximity is continued, and continues the action responding to the operation of the first finger in accordance with a movement operation of the first finger whose proximity is continued and a touch operation on the screen by the second finger whose contact is continued.
  6.  The input device according to claim 5, wherein
     the operation execution unit changes the amount of change of the action corresponding to the operation of the first finger when the touching second finger moves in the same direction as the movement operation of the first finger.
  7.  The input device according to claim 1, wherein
     the display unit displays one or more selectable items on the screen, and
     the operation execution unit confirms, in response to a touch operation on the screen by the second finger, the selection of the item designated by a designation operation of the first finger whose proximity is continued.
  8.  The input device according to claim 7, wherein
     the operation execution unit confirms the selection of the item designated by the designation operation of the first finger whose proximity is continued, in response to a touch operation of the second finger within a predetermined acceptable range of the screen.
  9.  An input support method for an input device including a display unit that displays data on a screen, the method comprising:
     detecting proximity of a first finger to the screen;
     detecting contact of a second finger with the screen after the proximity of the first finger is detected;
     receiving an operation combining the proximity of the first finger and the contact of the second finger; and
     executing an action corresponding to the combined operation.
  10.  A program for causing a computer, which is an input device including a display unit that displays data on a screen, to execute the steps of:
     detecting proximity of a first finger to the screen;
     detecting contact of a second finger with the screen after the proximity of the first finger is detected;
     receiving an operation combining the proximity of the first finger and the contact of the second finger; and
     executing an action corresponding to the combined operation.
PCT/JP2013/001799 2012-04-27 2013-03-15 Input device, input support method, and program WO2013161170A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-104123 2012-04-27
JP2012104123A JP2013232119A (en) 2012-04-27 2012-04-27 Input device, input supporting method, and program

Publications (1)

Publication Number Publication Date
WO2013161170A1 true WO2013161170A1 (en) 2013-10-31

Family

ID=49482528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/001799 WO2013161170A1 (en) 2012-04-27 2013-03-15 Input device, input support method, and program

Country Status (2)

Country Link
JP (1) JP2013232119A (en)
WO (1) WO2013161170A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106134186A (en) * 2014-02-26 2016-11-16 微软技术许可有限责任公司 Distant existing experience

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015099526A (en) * 2013-11-20 2015-05-28 富士通株式会社 Information processing apparatus and information processing program
JP6410537B2 (en) * 2014-09-16 2018-10-24 キヤノン株式会社 Information processing apparatus, control method therefor, program, and storage medium
KR102339839B1 (en) 2014-12-26 2021-12-15 삼성전자주식회사 Method and apparatus for processing gesture input
JP2017021449A (en) * 2015-07-07 2017-01-26 富士通株式会社 Information processing apparatus, display control method, and display control program
CN110062921B (en) * 2016-12-27 2024-03-19 松下知识产权经营株式会社 Electronic device, tablet terminal, input control method, and storage medium
JP2019144955A (en) * 2018-02-22 2019-08-29 京セラ株式会社 Electronic device, control method and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009525538A (en) * 2006-01-30 2009-07-09 アップル インコーポレイテッド Gesture using multi-point sensing device
WO2011005977A2 (en) * 2009-07-10 2011-01-13 Apple Inc. Touch and hover sensing
WO2011056387A1 (en) * 2009-11-03 2011-05-12 Qualcomm Incorporated Methods for implementing multi-touch gestures on a single-touch touch surface
JP2012133729A (en) * 2010-12-24 2012-07-12 Sony Corp Information processing device, information processing method and program


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106134186A (en) * 2014-02-26 2016-11-16 微软技术许可有限责任公司 Distant existing experience
CN106134186B (en) * 2014-02-26 2021-02-26 微软技术许可有限责任公司 Telepresence experience

Also Published As

Publication number Publication date
JP2013232119A (en) 2013-11-14

Similar Documents

Publication Publication Date Title
US10216407B2 (en) Display control apparatus, display control method and display control program
JP5721662B2 (en) Input receiving method, input receiving program, and input device
US9772762B2 (en) Variable scale scrolling and resizing of displayed images based upon gesture speed
TWI585673B (en) Input device and user interface interactions
US10114494B2 (en) Information processing apparatus, information processing method, and program
KR101224588B1 (en) Method for providing UI to detect a multi-point stroke and multimedia apparatus thereof
US9035883B2 (en) Systems and methods for modifying virtual keyboards on a user interface
US9013422B2 (en) Device, method, and storage medium storing program
WO2013161170A1 (en) Input device, input support method, and program
US10073493B2 (en) Device and method for controlling a display panel
EP2876538A1 (en) Mobile terminal and method for controlling the same
US20100138782A1 (en) Item and view specific options
JP5828800B2 (en) Display device, display control method, and program
KR102168648B1 (en) User terminal apparatus and control method thereof
EP2613247B1 (en) Method and apparatus for displaying a keypad on a terminal having a touch screen
JP2010205050A (en) User interface device with touch panel, method, and program for controlling user interface
US9298364B2 (en) Mobile electronic device, screen control method, and storage medium strong screen control program
WO2014112029A1 (en) Information processing device, information processing method, and program
WO2010060502A1 (en) Item and view specific options
TW200928916A (en) Method for operating software input panel
JP5854928B2 (en) Electronic device having touch detection function, program, and control method of electronic device having touch detection function
JP2012174247A (en) Mobile electronic device, contact operation control method, and contact operation control program
WO2013047023A1 (en) Display apparatus, display method, and program
KR20090089707A (en) Method and apparatus for zoom in/out using touch-screen
JP5969320B2 (en) Mobile terminal device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13781732

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13781732

Country of ref document: EP

Kind code of ref document: A1