WO2018167860A1 - Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device


Info

Publication number
WO2018167860A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
determination
information
touch gesture
gesture
Application number
PCT/JP2017/010299
Other languages
English (en)
Japanese (ja)
Inventor
琴由 笹山
佐々木 雄一
前川 拓也
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2017/010299 (WO2018167860A1)
Priority to JP2017540658A (JP6253861B1)
Publication of WO2018167860A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program that receive operation information corresponding to a touch gesture operation and output a signal based on the received operation information, and also relates to a touch panel input device that outputs a signal based on a touch gesture operation.
  • Patent Document 1 discloses a portable information terminal and a character input method that display virtual input keys for entering information based on the position of an input reference point set by the user on the touch panel. The user first sets the input reference point by touching an arbitrary position on the touch panel with the thumb, and then enters characters, symbols, and the like by touching, with the forefinger or another finger, the virtual input keys arranged based on the position of the input reference point.
  • JP 2012-185565 A (for example, paragraphs 0027 to 0048)
  • In Patent Document 1, when the user does not remember the virtual input key arrangement, the user must look at the touch panel after the virtual input keys are displayed; there was therefore a problem that it is difficult to input information without looking at the touch panel at all.
  • The present invention has been made to solve the above-described problem, and an object of the present invention is to make it possible to easily recognize, as a touch gesture operation, an operation performed by a user without looking at the touch panel.
  • A touch gesture determination device according to the present invention receives operation information from a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the touch gesture operation; the device generates output information based on the operation information.
  • The touch gesture determination device includes an operation determination unit that generates the output information for display on the touch panel based on the operation information, and a display control unit that receives the output information and causes the touch panel to display an image corresponding to the output information as the operation image.
  • The operation determination unit includes an entire screen input determination unit that determines the output information from operation information corresponding to a touch gesture operation performed on the entire screen of the touch panel, and the entire screen input determination unit includes a gesture input information determination unit that determines the content of the touch gesture operation from the operation information.
  • The gesture input information determination unit evaluates, based on the operation information, a determination condition for determining whether the touch gesture operation is a first operation performed by the user without looking at the touch panel, and determines that the touch gesture operation is the first operation when the determination condition is satisfied.
  • A touch gesture determination method according to the present invention receives operation information from a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the touch gesture operation; the method generates output information based on the operation information.
  • The touch gesture determination method includes an operation determination step of generating the output information for display on the touch panel based on the operation information, and a display control step of receiving the output information and causing the touch panel to display an image corresponding to the output information as the operation image.
  • The operation determination step includes an entire screen input determination step of determining the output information from operation information corresponding to a touch gesture operation performed in the entire screen input mode, in which input content is identified from touch gesture operations performed on the entire screen of the touch panel; the entire screen input determination step includes a gesture input information determination step of determining the content of the touch gesture operation from the operation information.
  • In the gesture input information determination step, a determination condition for determining whether the touch gesture operation is a first operation performed by the user without looking at the touch panel is evaluated based on the operation information, and the touch gesture operation is determined to be the first operation when the determination condition is satisfied.
  • A touch panel input device according to the present invention includes a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the touch gesture operation, and a touch gesture determination device that receives the operation information and controls the display of the touch panel based on the operation information.
  • The touch gesture determination device includes an operation determination unit that generates output information for display on the touch panel based on the operation information, and a display control unit that receives the output information and causes the touch panel to display an image corresponding to the output information as the operation image.
  • The operation determination unit includes an entire screen input determination unit that determines the output information from operation information corresponding to a touch gesture operation performed in the entire screen input mode, in which input content is identified from touch gesture operations performed on the entire screen of the touch panel; the entire screen input determination unit includes a gesture input information determination unit that determines the content of the touch gesture operation from the operation information.
  • The gesture input information determination unit evaluates, based on the operation information, a determination condition for determining whether the touch gesture operation is a first operation performed by the user without looking at the touch panel, and determines that the touch gesture operation is the first operation when the determination condition is satisfied.
  • According to the present invention, a touch gesture operation performed by the user without looking at the touch panel can easily be recognized by the touch panel as a touch gesture operation.
  • the touch panel input device includes a touch panel having a touch operation screen (operation screen) and a touch gesture determination device that receives operation information on the touch panel.
  • The touch panel input device is mounted on a target device or connected so as to be communicable with the target device. The present invention can be applied, for example, to the operation screen of an electric appliance, a camera, or factory equipment as the target device, to an operation screen mounted on a car, a ship, an aircraft, or the like as the target device, and to the operation screen of a portable information terminal such as a smartphone or a tablet terminal as the target device.
  • The touch panel input device can provide a signal (for example, a selection value) based on operation information input by a touch gesture operation (also referred to as a "touch operation") on the operation screen of the touch panel to the target device in which the touch panel input device is mounted or to a target device that can communicate with the touch panel input device.
  • the touch panel is a touch gesture input unit that accepts a touch gesture operation performed by the user.
  • A touch gesture operation is an information input operation performed by a specific movement of the user's finger (or the user's palm, or the user's finger and palm).
  • A touch gesture operation can include a tap, which is an operation of tapping the operation screen of the touch panel with a finger; a flick, which is an operation of flicking the operation screen with a finger; and a swipe, which is an operation of sliding a finger across the operation screen.
  • Touch gesture operations can also include a drag, which is an operation of moving a display component on the touch panel with a finger; a pinch-in, which is an operation of narrowing the interval between fingers while touching the operation screen with multiple fingers; and a pinch-out, which is an operation of widening the interval between fingers on the operation screen.
  • Touch gesture operations can further include a dial gesture, in which a virtual dial is rotated with two or more touch points, and a slider gesture, in which the fingers slide up, down, left, or right while touching the touch panel at two or more points.
  • the touch gesture operation can include an operation using a touch pen which is a pen-type input auxiliary tool.
  • In the following description, a gesture performed while the user touches one point on the touch panel with one finger is referred to as a one-point touch operation; a gesture performed while the user touches two points with two fingers (for example, a dial gesture) is referred to as a two-point touch operation; and a gesture performed while the user touches three points with three fingers is referred to as a three-point touch operation.
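  • As an illustrative sketch (the function name and returned strings are not from the patent), the classification of gestures by the number of simultaneous touch points could be expressed as:

```python
def classify_touch_operation(num_points: int) -> str:
    """Classify a gesture by the number of simultaneous touch points,
    following the naming used in this description."""
    names = {
        1: "one-point touch operation",
        2: "two-point touch operation",
        3: "three-point touch operation",
    }
    return names.get(num_points, "unsupported")
```

For example, a dial gesture performed with two fingers would be classified as a two-point touch operation.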
  • FIG. 1 is a functional block diagram showing a schematic configuration of a touch panel input device 100 according to an embodiment of the present invention.
  • FIG. 2 is a functional block diagram illustrating a schematic configuration of the entire screen input determination unit 12b according to the embodiment.
  • the touch panel input device 100 includes a touch gesture determination device 110 and a touch panel 130.
  • the touch gesture determination device 110 is a device that can execute the touch gesture determination method according to the embodiment and the touch gesture determination program according to the embodiment.
  • the touch panel 130 receives a touch gesture operation performed by a user, and outputs operation information (hereinafter, also referred to as “touch information”) A0 corresponding to the touch gesture operation.
  • the display panel unit 131 is arranged so as to overlap with the operation panel unit 132 and can display an operation image such as a GUI (Graphical User Interface) screen.
  • the display panel unit 131 is, for example, a liquid crystal display.
  • the touch gesture determination device 110 includes an operation information input unit 11, an operation determination unit 12, and a display control unit 14.
  • the touch gesture determination device 110 may include a notification unit 13.
  • the operation information input unit 11 receives operation information (operation signal) A0 output from the operation panel unit 132.
  • the operation information input unit 11 outputs input information A1 corresponding to the received operation information A0 to the operation determination unit 12.
  • the input information A1 is information corresponding to the operation information A0, and may be the same information as the operation information A0.
  • the operation determination unit 12 receives the input information A1 from the operation information input unit 11, and outputs a selection value A2 as output information as a result of the operation determination to the notification unit 13 and the display control unit 14.
  • The selection value A2 output from the operation determination unit 12 is determined by the operation determination unit 12 based on the input information A1; based on the selection value A2, an application program or the like of the device in which the touch panel input device 100 is mounted performs device control and the like.
  • the operation determination unit 12 determines the type and content of the touch gesture operation by the user from the received input information A1. As shown in FIG. 1, the operation determination unit 12 includes a display component input determination unit 12a, a whole screen input determination unit 12b, and an operation mode switching determination unit 12c.
  • The display component input determination unit 12a identifies input content from touch information on a display component displayed on the screen, accepts the operation, and determines the selection value.
  • The entire screen input determination unit 12b identifies input content from touch information on the entire screen in the entire screen operation mode, in which the entire screen is set as the input range, and is responsible for accepting operations and determining selection values. As illustrated in FIG. 2, the entire screen input determination unit 12b includes a parameter adjustment unit 121 and a gesture input information determination unit 122.
  • the input content of touch information is identified by the parameter adjustment unit 121 and the gesture input information determination unit 122.
  • the parameter adjustment unit 121 adjusts the parameter according to the operation mode before the input is started, and passes the adjusted parameter to the gesture input information determination unit 122.
  • The gesture input information determination unit 122 determines a gesture input of an arbitrary pattern from the input information A1 received from the operation information input unit 11 and the parameter received from the parameter adjustment unit 121, and determines the corresponding selection value A2. For example, the gesture input information determination unit 122 determines whether the touch gesture operation corresponding to the input information A1 is a one-point touch operation or a two-point touch operation, and determines the selection value A2 corresponding to the determined touch gesture operation.
  • Conventionally, when the coordinates of the two touched points do not move by more than a predetermined distance (B pixels) within a predetermined time (A msec), the operation has been determined to be a two-point touch operation, without any further determination condition.
  • In a typical two-point touch operation, the touch panel 130 is first touched at one point, then at two points, and when the operation ends the touch returns to one point before the hand is released.
  • When the operation is performed without looking at the touch panel 130, the user tends to press the touch panel 130 without being aware of how much force is applied.
  • the following first to third determination conditions are set as examples of the determination conditions for determining that the two-point touch operation is performed without looking at the touch panel 130.
  • The first to third determination conditions are conditions determined in advance based on a change in the number of touch points of the touch gesture operation or the amount of movement of the touch position within a predetermined time.
  • the first to third determination conditions can be stored in the parameter adjustment unit 121 or the gesture input information determination unit 122.
  • First determination condition: the touch position during the two-point touch operation hardly moves (for example, the total amount of movement of the touch positions within a predetermined time (C0 msec: first time) is equal to or less than a predetermined value (D0 pixels: first value)).
  • Second determination condition: the touch position of the one-point touch operation immediately before the two-point touch operation hardly moves (for example, the amount of movement of the touch position within a predetermined time (C1 msec: second time) is equal to or less than a predetermined value (D1 pixels: second value)).
  • Third determination condition: the touch position of the one-point touch operation immediately after the two-point touch operation hardly moves (for example, the amount of movement of the touch position within a predetermined time (C2 msec: third time) is equal to or less than a predetermined value (D2 pixels: third value)).
  • When all of the first to third determination conditions are satisfied, the gesture input information determination unit 122 determines that the touch gesture operation is a two-point touch operation performed without looking at the touch panel (the first operation). If any one of the first to third determination conditions is not satisfied, the gesture input information determination unit 122 determines that the touch gesture operation is not a two-point touch operation performed without looking at the touch panel.
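  • As a minimal sketch of the first to third determination conditions, assuming the movement amounts within C0, C1, and C2 msec have already been measured, and using illustrative default thresholds for D0 to D2 (the patent leaves the concrete values open):

```python
def is_blind_two_point_touch(move_during_two_point: float,
                             move_before_one_point: float,
                             move_after_one_point: float,
                             d0: float = 10.0,
                             d1: float = 10.0,
                             d2: float = 10.0) -> bool:
    """Return True only when all three movement-based determination
    conditions hold: the touch positions hardly move during the
    two-point touch (first condition, <= D0 pixels within C0 msec),
    immediately before it (second condition, <= D1 pixels within C1 msec),
    and immediately after it (third condition, <= D2 pixels within C2 msec)."""
    cond1 = move_during_two_point <= d0
    cond2 = move_before_one_point <= d1
    cond3 = move_after_one_point <= d2
    return cond1 and cond2 and cond3
```

If any single condition fails, the operation is treated as an ordinary touch gesture rather than the first operation.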
  • Alternatively, the gesture input information determination unit 122 determines that the touch gesture operation is a two-point touch operation performed without looking at the touch panel when the following fourth determination condition is satisfied, and determines that it is not such an operation when the fourth determination condition is not satisfied.
  • Fourth determination condition: the operation is continued with a two-point touch for a long time (for example, the duration of the two-point touch operation is equal to or longer than a predetermined value (C4 msec: fourth time)).
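  • As an illustrative sketch of the fourth determination condition (the C4 default below is an assumption, not a value from the patent):

```python
def is_blind_two_point_touch_by_duration(duration_msec: float,
                                         c4_msec: float = 1500.0) -> bool:
    """Fourth determination condition: the two-point touch has been
    held continuously for at least C4 msec."""
    return duration_msec >= c4_msec
```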
  • the operation mode switching determination unit 12c determines from the input information A1 received from the operation information input unit 11 whether the input information A1 includes information related to switching of the operation mode of the screen.
  • The screen operation modes include a display component operation mode, in which the display components to be operated are displayed on the screen, and an entire screen operation mode, in which the entire screen is set as the input range. These operation modes can be switched.
  • the operation mode switching determination unit 12c switches the operation mode from the determination result regarding the switching of the operation mode.
  • When the operation mode switching determination unit 12c switches the operation mode, the display content of the operation screen, the input method, and the notification method are switched.
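  • A hypothetical sketch of the two operation modes and the toggling performed by the operation mode switching determination unit 12c (the enum and function names are illustrative):

```python
from enum import Enum, auto

class OperationMode(Enum):
    DISPLAY_COMPONENT = auto()  # display components are the input targets
    ENTIRE_SCREEN = auto()      # the entire screen is the input range

def switch_mode(current: OperationMode) -> OperationMode:
    """Toggle between the two screen operation modes; in the device,
    switching also changes the display content, the input method, and
    the notification method."""
    if current is OperationMode.DISPLAY_COMPONENT:
        return OperationMode.ENTIRE_SCREEN
    return OperationMode.DISPLAY_COMPONENT
```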
  • The notification unit 13 switches the notification method according to the determination result of the operation mode switching determination unit 12c, receives the information of the selection value A2 determined by the operation determination unit 12, and notifies the user of the operation status.
  • the notification unit 13 issues a notification of notification contents according to the selection value A2 or outputs a notification signal.
  • the notification unit 13 notifies the status of the user's touch gesture operation by, for example, sound, screen display, vibration by a vibrator, or lamp lighting.
  • When the notification by the notification unit 13 is by sound, the notification unit 13 outputs a notification signal to a speaker serving as an audio output unit.
  • The speaker is shown in FIG. 3.
  • When the notification by the notification unit 13 is an image display, the notification unit 13 sends notification information A3 to the display control unit 14, and the display control unit 14 transmits an image signal A4 based on the notification information A3 to the display panel unit 131 of the touch panel 130.
  • the display control unit 14 switches the display content of the operation screen according to the determination result of the operation mode switching determination unit 12c, receives the information of the selection value A2 determined by the operation determination unit 12, and reflects the operation result on the screen. As shown in FIG. 1, the display control unit 14 outputs an image signal A4 of an operation image displayed on the display panel unit 131 of the touch panel 130 to the display panel unit 131.
  • FIG. 3 is a diagram illustrating an example of a hardware (H / W) configuration of the touch panel input device 100 according to the embodiment.
  • the touch panel input device 100 includes a touch panel 130, a processor 301, a memory 302, and a speaker 303.
  • The touch gesture determination device 110 shown in FIG. 1 can be realized (for example, by a computer) using the memory 302 as a storage device that stores the touch gesture determination program as software and the processor 301 as an information processing unit that executes the touch gesture determination program stored in the memory 302.
  • The components 11 to 14 in FIG. 1 correspond to the processor 301 that executes the touch gesture determination program in FIG. 3.
  • A part of the touch gesture determination device 110 shown in FIG. 1 can also be realized by the memory 302 shown in FIG. 3 and the processor 301 that executes the touch gesture determination program.
  • the touch panel 130 detects contact of a plurality of fingers, and transmits touch information (an identification number, coordinates, and a contact state of each finger) to the processor 301.
  • the processor 301 stores the touch information acquired from the touch panel 130 in the memory 302, and switches the operation screen, the input method, and the notification method based on the touch history information accumulated in the memory 302.
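  • The touch information and its accumulation as history can be sketched as follows (the class and field names are illustrative; the description above specifies only that an identification number, coordinates, and a contact state are transmitted):

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class TouchInfo:
    """One touch report from the touch panel: an identification number,
    coordinates, and a contact state."""
    finger_id: int
    x: int
    y: int
    touching: bool

class TouchHistory:
    """Accumulates touch reports (as the memory 302 does) so that later
    determinations can evaluate how far each touch position has moved."""

    def __init__(self) -> None:
        self._events: List[TouchInfo] = []

    def store(self, info: TouchInfo) -> None:
        self._events.append(info)

    def total_movement(self, finger_id: int) -> float:
        """Sum of point-to-point distances over one finger's reports."""
        pts = [(e.x, e.y) for e in self._events if e.finger_id == finger_id]
        return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))
```

A movement amount computed this way is exactly the kind of quantity compared against the D0 to D2 thresholds in the determination conditions.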
  • the processor 301 determines from the touch information stored in the memory 302 whether the operation is for the entire screen or an operation for the display component, and determines a selection value in each operation mode.
  • the speaker 303 is a sound output unit used when, for example, a touch gesture operation status is notified by sound such as an announcement.
  • The touch panel input device 100 may include, instead of or in addition to the speaker 303, another notification device such as a vibrator, a lamp, or a transmission device that wirelessly transmits a notification signal.
  • FIGS. 4A and 4B are diagrams illustrating an example of the screen of the touch panel 130 and the touch gesture operation in the touch panel input device 100 according to the embodiment.
  • FIG. 4A shows a screen example 401 displayed in the display component operation mode. As shown in FIG. 4A, the screen example 401 displays various components to be operated, such as a button 402 for directly selecting a selection value. The target component can be operated by touch input in the area where the display component is displayed.
  • FIG. 4B shows an example screen 403 displayed in the entire screen operation mode.
  • The screen example 403 includes a display 404 of the currently selected value and a display prompting an operation instruction for switching the screen (for example, a cancel area 405 described later).
  • In the entire screen operation mode, information is input by, for example, the number of tap operations or a gesture that rotates a dial with two or more touch points (dial gesture).
  • FIG. 5 is a flowchart illustrating the operation (touch gesture determination method) of the touch gesture determination device 110 according to the embodiment. As shown in FIG. 5, in step ST401, the user activates the system in the display component operation mode and initializes each processing unit.
  • In step ST402, the operation mode switching determination unit 12c determines whether there is a switching input to the entire screen operation mode based on the touch information input from the operation information input unit 11.
  • the operation mode is changed, and the display contents, input method, and notification method in the entire screen operation mode are switched.
  • In step ST404, the gesture input information determination unit 122 identifies information related to the gesture input from the touch information input by the user, and thereby determines the selection value A2. For example, the user inputs information using a gesture that rotates a dial with two or more touch points (dial gesture).
  • In step ST405, the selection value A2 determined in step ST404 is passed to the display control unit 14, so that the display content is switched and a screen reflecting the operation content is displayed.
  • In step ST406, the selection value A2 determined in step ST404 is passed to the notification unit 13, so that the notification content is switched and a notification of the post-operation status or an announcement prompting the next operation is issued.
  • the operation mode switching determination unit 12c determines a switching input to the display component operation mode from the touch information input from the operation information input unit 11 and the touch history information.
  • When the switching input is detected, the mode returns to the display component operation mode, and the display component operation screen displayed at initial activation is shown.
  • the touch input to the operation panel unit 132 is received until the switching input is received, and the display content and the notification content are switched accordingly.
  • In the input determination regarding screen switching in step ST402, if the input content is not a switch to the entire screen operation mode (NO in ST402), input determination for the display component is performed in the next step ST409.
  • Input to a display component in the display component operation mode is determined based on whether the touch input coordinates are included in the operation range of the display component.
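  • The coordinate check described above amounts to a rectangular hit test; a sketch under the assumption that each display component's operation range is an axis-aligned rectangle (names are illustrative):

```python
def hit_test(x: int, y: int,
             comp_x: int, comp_y: int,
             comp_w: int, comp_h: int) -> bool:
    """Return True when the touch coordinates (x, y) fall inside the
    operation range of a display component whose top-left corner is
    (comp_x, comp_y) and whose size is comp_w x comp_h."""
    return (comp_x <= x < comp_x + comp_w
            and comp_y <= y < comp_y + comp_h)
```

A touch selects the component only when this test returns True; otherwise touch input is accepted again.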
  • When there is an input to a display component, the selection value A2 is determined in the next step ST410.
  • When there is no input to a display component, touch input is accepted again in step ST402.
  • In step ST410, the display component input determination unit 12a determines the selection value A2 based on the touch information input from the operation information input unit 11. For example, as in the screen example of FIG. 4A, display components to which specific selection values are assigned, such as buttons, are arranged, and the selection value is determined by touch input to a display component.
  • In step ST411, the selection value A2 determined in step ST410 is passed to the display control unit 14, so that the display content of the display panel unit 131 is switched and a screen corresponding to the operation content is displayed.
  • In step ST412, the selection value A2 determined in step ST410 is passed to the notification unit 13, so that the notification content is switched and a notification of the post-operation status or an announcement prompting the next operation is issued.
  • FIG. 6 is a flowchart illustrating an example of the operation of the two-point touch operation determination process by the gesture input information determination unit 122 according to the embodiment.
  • An example in which the gesture input information determination unit 122 determines whether the user's touch gesture operation is a two-point touch operation performed without looking at the touch panel will be described.
  • In step ST501, the gesture input information determination unit 122 determines whether the total amount of movement within C0 msec of the two touch positions during the two-point touch operation is equal to or less than D0 pixels (first determination condition: the touch position during the two-point touch operation hardly moves).
  • When the movement amount within C0 msec is equal to or less than D0 pixels (YES in step ST501), the process proceeds to the next step ST502. When the movement amount within C0 msec is larger than D0 pixels (NO in step ST501), the process proceeds to step ST505, and the gesture input information determination unit 122 determines that the touch gesture operation is not a two-point touch operation performed without looking at the touch panel.
  • In step ST502, the gesture input information determination unit 122 determines whether the movement amount within C1 msec of the touch position of the one-point touch operation immediately before the two-point touch operation is equal to or less than D1 pixels (second determination condition: the touch position of the one-point touch operation immediately before the two-point touch operation hardly moves).
  • step ST502 When the movement amount in C1 msec of the touch position of the one-point touch operation immediately before the two-point touch operation is equal to or less than D1 pixel (YES in step ST502), the process proceeds to the next step ST503. On the other hand, when the movement amount in C1 msec of the touch position of the one-point touch operation immediately before the two-point touch operation is larger than D1 pixel (NO in step ST502), the process proceeds to step ST505, and the touch gesture operation is performed without looking at the touch panel. It is determined that this is not a two-point touch operation.
  • In step ST503, the gesture input information determination unit 122 determines whether the movement amount within C2 msec of the touch position of the one-point touch operation immediately after the two-point touch operation is equal to or less than D2 pixels (third determination condition: the touch position of the one-point touch operation immediately after the two-point touch operation hardly moves).
  • When that movement amount is equal to or less than D2 pixels (YES in step ST503), the process proceeds to step ST504, and the gesture input information determination unit 122 determines that the touch gesture operation is a two-point touch operation performed without looking at the touch panel.
  • When that movement amount is larger than D2 pixels (NO in step ST503), the process proceeds to step ST505, and the gesture input information determination unit 122 determines that the touch gesture operation is not a two-point touch operation performed without looking at the touch panel.
  • In step ST504, the touch gesture operation is determined to be a two-point touch operation performed without looking at the touch panel, and a selection value A2 corresponding to the determined touch gesture operation is determined.
  • In step ST505, the gesture input information determination unit 122 determines that the touch gesture operation is not a two-point touch operation performed without looking at the touch panel, and determines a selection value A2 corresponding to the determined touch gesture operation.
  • In the above example, the gesture input information determination unit 122 determines that the touch gesture operation is a two-point touch operation performed without looking at the touch panel when all of the first to third determination conditions are satisfied. Alternatively, the touch gesture operation may be determined to be such an operation when one or more of the first to third determination conditions are satisfied. Further, another condition may be added to the determination conditions for the two-point touch operation performed without looking at the touch panel. The first to third determination conditions may also be used for a two-point touch operation performed while looking at the touch panel.
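The three-condition check of steps ST501 to ST503 can be sketched in code. The following is a minimal illustration only: the function names, the data layout of the touch traces, and the default threshold values standing in for C0 to C2 (msec windows) and D0 to D2 (pixels) are assumptions, since the patent leaves these as adjustable parameters.

```python
# Sketch of the two-point touch determination in FIG. 6 (steps ST501-ST505).
# Threshold defaults standing in for D0..D2 are placeholders; the traces are
# assumed to already be windowed to C0, C1, and C2 msec respectively.

def movement_amount(positions):
    """Total distance travelled by a sequence of (x, y) touch positions."""
    return sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    )

def is_eyes_free_two_point_touch(two_point_trace, before_trace, after_trace,
                                 d0=10.0, d1=10.0, d2=10.0):
    """Return True when all of the first to third determination conditions hold.

    two_point_trace: pairs of (x, y) positions of the two touch points
                     sampled during the two-point touch (within C0 msec)
    before_trace:    one-point touch positions within C1 msec before it
    after_trace:     one-point touch positions within C2 msec after it
    """
    # First condition (ST501): the two touch positions hardly move.
    total = (movement_amount([p for p, _ in two_point_trace])
             + movement_amount([q for _, q in two_point_trace]))
    if total > d0:
        return False  # ST505: not an eyes-free two-point touch
    # Second condition (ST502): the preceding one-point touch hardly moves.
    if movement_amount(before_trace) > d1:
        return False  # ST505
    # Third condition (ST503): the following one-point touch hardly moves.
    return movement_amount(after_trace) <= d2  # ST504 when True
```

The variant mentioned above, in which one or more satisfied conditions suffice, would replace the early returns with a count of satisfied conditions.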
  • FIG. 7 is a flowchart illustrating another example of the operation of the two-point touch operation determination process performed by the gesture input information determination unit 122 according to the embodiment.
  • A process in which the gesture input information determination unit 122 determines whether the touch gesture operation is a two-point touch operation performed without the user looking at the touch panel will be described.
  • In step ST601, the gesture input information determination unit 122 determines whether the duration of the two-point touch operation is C3 msec or more (fourth determination condition: an operation with a long-duration two-point touch).
  • When the duration of the two-point touch operation is C3 msec or longer (YES in step ST601), the process proceeds to step ST602, and the gesture input information determination unit 122 determines that the touch gesture operation is a two-point touch operation.
  • When the duration of the two-point touch operation is less than C3 msec (NO in step ST601), the process proceeds to step ST603, and the gesture input information determination unit 122 determines that the touch gesture operation is not a two-point touch operation.
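The alternative determination of FIG. 7 reduces to a single duration check; a minimal sketch follows, where the function name and the default value standing in for the C3 threshold are illustrative assumptions.

```python
# Sketch of the duration-based determination in FIG. 7 (steps ST601-ST603).
# C3 is an adjustable threshold in msec; 500 here is only a placeholder.

def is_long_two_point_touch(duration_msec, c3=500):
    """Fourth determination condition: a two-point touch held for C3 msec
    or longer is treated as a two-point touch operation (ST602); shorter
    touches are rejected (ST603)."""
    return duration_msec >= c3
```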
  • As described above, the first to third determination conditions (or the fourth determination condition) for determining whether a two-point touch operation is performed without looking at the touch panel are set based on the tendencies of touch gesture operations performed without looking at the touch panel, and the gesture input information determination unit 122 uses the first to third determination conditions (or the fourth determination condition) to determine whether the touch gesture operation is a two-point touch operation performed without looking at the touch panel. This makes it possible to distinguish a touch gesture operation performed while looking at the touch panel from one performed without looking at it, so that the touch panel can easily recognize a touch gesture operation performed without looking at it.
  • In the above description, the two-point touch operation performed without looking at the touch panel is determined based on the first to third determination conditions (or the fourth determination condition); however, the touch gesture operation determined by the touch gesture determination device 110 according to the embodiment is not limited to the two-point touch operation.
  • For example, a determination condition may be newly set for a three-point touch operation (an operation performed by touching with three fingers) performed without looking at the touch panel, and the three-point touch operation may be determined.
  • 11 Operation information input unit, 12 Operation determination unit, 12a Display component input determination unit, 12b Whole screen input determination unit, 12c Operation mode switching determination unit, 13 Notification unit, 14 Display control unit, 100 Touch panel input device, 110 Touch gesture determination device, 121 Parameter adjustment unit, 122 Gesture input information determination unit, 130 Touch panel, 131 Display panel unit, 132 Operation panel unit, 301 Processor, 302 Memory, 303 Speaker, A0 Operation information, 401, 403 Screen examples, 405 Cancel area, A1 Input information, A2 Selection value, A3 Notification information, A4 Operation determination information.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A touch gesture determination device (110) includes an operation information input unit (11), an operation determination unit (12), a notification unit (13), and a display control unit (14). The operation determination unit (12) includes a display component input determination unit (12a), a whole-screen input determination unit (12b), and an operation mode switching determination unit (12c). The whole-screen input determination unit (12b) includes a gesture input information determination unit (122) that determines whether a determination condition is satisfied for determining whether a touch gesture operation is an operation that the user performed without looking at the touch panel.
PCT/JP2017/010299 2017-03-15 2017-03-15 Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device WO2018167860A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/010299 WO2018167860A1 (fr) 2017-03-15 2017-03-15 Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device
JP2017540658A JP6253861B1 (ja) 2017-03-15 2017-03-15 Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/010299 WO2018167860A1 (fr) 2017-03-15 2017-03-15 Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device

Publications (1)

Publication Number Publication Date
WO2018167860A1 true WO2018167860A1 (fr) 2018-09-20

Family

ID=60860136

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010299 WO2018167860A1 (fr) 2017-03-15 2017-03-15 Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device

Country Status (2)

Country Link
JP (1) JP6253861B1 (fr)
WO (1) WO2018167860A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220253207A1 (en) * 2020-01-19 2022-08-11 Huawei Technologies Co., Ltd. Display method and electronic device
JP7549492B2 (ja) 2020-09-08 2024-09-11 元路 小下 Information processing device, information processing method, and program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016533575A (ja) * 2013-11-01 2016-10-27 Intel Corporation Gaze-assisted touchscreen input


Also Published As

Publication number Publication date
JP6253861B1 (ja) 2017-12-27
JPWO2018167860A1 (ja) 2019-04-11

Similar Documents

Publication Publication Date Title
JP7342208B2 (ja) Image processing apparatus, control method for image processing apparatus, and program
US9612736B2 User interface method and apparatus using successive touches
JP5718042B2 (ja) Touch input processing device, information processing device, and touch input control method
US20120174044A1 Information processing apparatus, information processing method, and computer program
JP5522755B2 (ja) Input display control device, thin client system, input display control method, and program
JP2013250761A (ja) Information processing apparatus, control method for information processing apparatus, and program
EP2829967A2 Method for processing input and electronic device thereof
JP6041742B2 (ja) Touch panel display control device
US10353540B2 Display control device
JP6253861B1 (ja) Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device
WO2013077359A1 (fr) Electronic device, method of operating electronic device, and program
JP6737239B2 (ja) Display device and display control program
KR20150040825A User interface method and apparatus using successive touches
JP2019070990A (ja) Display control device
JP6327834B2 (ja) Operation display device, operation display method, and program
JP6227213B1 (ja) Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device
JP2016038619A (ja) Portable terminal device and operation method thereof
JP2015102946A (ja) Information processing apparatus, control method for information processing apparatus, and program
KR101397907B1 System, control method, and recording medium for multi-touch recognition
JP2020013472A (ja) Image output device, control method, and program
JP6210664B2 (ja) Information processing apparatus, control method therefor, program, and storage medium
JP7030529B2 (ja) Electronic device, information processing method, program, and storage medium
JP2016200918A (ja) Information processing apparatus, control method therefor, and program
JP6475050B2 (ja) Information input/output system and information processing system
JP2015200979A (ja) Information processing apparatus and computer program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017540658

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900961

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900961

Country of ref document: EP

Kind code of ref document: A1