WO2018173180A1 - Touch input determination device, touch input determination method, and touch input determination program - Google Patents


Info

Publication number
WO2018173180A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch gesture
knob
operation mode
touch panel
Prior art date
Application number
PCT/JP2017/011615
Other languages
English (en)
Japanese (ja)
Inventor
大介 木皿
佐々木 雄一
萩原 利幸
前川 拓也
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2017/011615 (WO2018173180A1)
Priority to CN201780088528.0A (CN110431525A)
Priority to JP2017535719A (JP6207804B1)
Priority to DE112017007110.0T (DE112017007110T5)
Priority to KR1020197027062A (KR20190112160A)
Publication of WO2018173180A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886 Interaction techniques based on GUIs using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to a touch input determination device, a touch input determination method, and a touch input determination program that change a selection value using a slide bar displayed on a touch panel.
  • In a slide bar, a value corresponding to the amount of movement is output by moving (sliding) a knob arranged on an elongated bar along the bar.
  • Electronic devices using such operations have also been proposed (see, for example, Patent Document 2).
  • The present invention has been made to solve the above-described problems, and its object is to provide a touch input determination device, a touch input determination method, and a touch input determination program capable of appropriately performing a touch gesture operation on a slide bar without viewing the touch panel screen.
  • According to one aspect of the present invention, a touch input determination device receives operation information from a touch panel that outputs operation information corresponding to a touch gesture operation, and generates a selection value based on the operation information.
  • The device includes an operation mode switching unit that switches the user operation mode between a first operation mode for operating display components displayed on the screen of the touch panel and a second operation mode in which the entire screen of the touch panel is used as a reception area for touch gesture operations, and an operation determination unit.
  • After the user operation mode is switched from the first operation mode to the second operation mode, when a predetermined first touch gesture operation is performed at an arbitrary position on the screen of the touch panel, the operation determination unit displays a slide bar including a bar and a knob at the position where the first touch gesture operation was performed; when a second touch gesture operation that moves the knob along the bar is performed, the operation determination unit changes the selection value according to the movement amount of the knob.
  • According to another aspect of the present invention, a touch input determination method receives operation information from a touch panel that outputs operation information corresponding to a touch gesture operation, and generates a selection value based on the operation information.
  • The method includes a display step of displaying, after the user operation mode on the touch panel is switched from a first operation mode in which display components displayed on the touch panel screen are operated to a second operation mode in which the entire touch panel screen is used as a reception area for touch gesture operations, a slide bar including a bar and a knob at the position where a predetermined first touch gesture operation is performed at an arbitrary position on the screen, and a step of changing the selection value according to the amount of movement of the knob when a second touch gesture operation that moves the knob is performed.
  • According to the present invention, the slide bar is displayed at the position where the first touch gesture operation is performed, and the selection value is changed by performing the second touch gesture operation from that position. Therefore, even when the screen of the touch panel is not visually observed, the touch gesture operation on the slide bar can be performed appropriately.
  • FIG. 3 is a flowchart illustrating an operation example of the touch input determination device according to Embodiment 1.
  • FIG. 4 is a diagram showing a display example of the screen of the touch panel in step S1 of FIG. 3.
  • FIG. 5 is a diagram showing the touch gesture operation (two-point touch) in step S3 of FIG. 3.
  • FIG. 6 is a diagram showing a display example of the screen of the touch panel in step S4 of FIG. 3.
  • FIG. 7 is a diagram showing the first touch gesture operation (pinch-in) in step S5 of FIG. 3.
  • FIG. 8 is a diagram showing the second touch gesture operation (up/down operation) in steps S6 to S8 of FIG. 3.
  • FIG. 9 is a diagram showing the third touch gesture operation (pinch-out) in step S9 of FIG. 3.
  • FIG. 10 is a diagram showing a display example of the screen of the touch panel in step S11 of FIG. 3.
  • FIG. 11 is a diagram showing the fourth touch gesture operation in steps S9 and S12 to S13 of FIG. 3.
  • FIG. 12 is a diagram showing the touch gesture operation (single-point touch) in steps S16 to S18 of FIG. 3.
  • FIG. 13 is a flowchart illustrating an operation example of the touch input determination device according to Embodiments 2 and 3.
  • FIG. 14 is a flowchart showing left/right operation processing in the touch input determination device according to Embodiment 2.
  • FIG. 15 is a diagram showing the touch gesture operation (left/right operation) in FIG. 14.
  • FIG. 16 is a flowchart showing left/right operation processing in the touch input determination device according to Embodiment 3.
  • FIG. 17 is a diagram showing the touch gesture operation in FIG. 16.
  • The touch input determination device receives operation information from a touch panel that outputs operation information corresponding to a touch gesture operation, generates a selection value (output information) based on the operation information, and outputs a selection value signal.
  • The touch panel input device is mounted on facilities or equipment, can change and determine a selection value, and can provide the determined selection value to the facilities or equipment.
  • In the embodiments, the touch input determination device is applied to a registration device for a stop floor (destination floor) in an elevator system.
  • However, the present invention is not limited to application to elevator systems.
  • The present invention is also applicable to other systems such as audio equipment and factory equipment.
  • FIG. 1 is a functional block diagram schematically showing a configuration of a touch input determination device 10 according to Embodiment 1 of the present invention.
  • the touch input determination device 10 is a device that can perform the touch input determination method according to the first embodiment. Further, the touch input determination device 10 may be a device (computer) that can execute the touch input determination program according to the first embodiment.
  • the touch input determination device 10 is connected to the touch panel 20.
  • The touch panel 20 includes a display panel unit (display device) 22 that provides an operation image of display components (for example, button displays) used for user operation, and an operation panel unit (input device) 21 that receives touch gesture operations by the user.
  • The display panel unit 22 and the operation panel unit 21 are arranged to constitute the touch-gesture-operation screen of the touch panel 20.
  • the touch panel 20 may be a part of the touch input determination device 10.
  • The touch input determination device 10 includes an operation information input unit 11 that receives operation information provided from the operation panel unit 21, a control unit 12 that controls the operation of the entire device, a storage unit 17, such as a semiconductor memory, that stores information, a notification unit 18 as a sound output unit that outputs an audio signal to a sound output device (for example, a speaker), and a display control unit 19 that controls the display contents of the display panel unit 22.
  • The control unit 12 includes an operation mode switching unit 13 that switches the user operation mode on the touch panel 20, and an operation determination unit 14 that performs processing based on operation information received from the touch panel 20 via the operation information input unit 11.
  • The operation determination unit 14 includes a first touch gesture determination unit 15 that determines a touch gesture operation in the normal operation mode (in which display components are displayed), which is the first operation mode on the touch panel 20, and a second touch gesture determination unit 16 that determines a touch gesture operation in the non-visual operation mode, which is the second operation mode (in which the entire screen of the touch panel 20 is used as the operation area).
  • The non-visual operation mode means a mode in which a touch gesture operation is possible without viewing the touch panel 20, and is unrelated to whether the user is actually viewing the screen of the touch panel 20.
  • the operation information input unit 11 receives the operation information output from the operation panel unit 21 of the touch panel 20 and provides it to the control unit 12.
  • The operation mode switching unit 13 switches the user operation mode on the touch panel 20 between the first operation mode for operating display components displayed on the screen of the touch panel 20 and the second operation mode in which the entire screen of the touch panel 20 is used as a reception area for touch gesture operations. The switching is performed, for example, in response to a user operation on the touch panel 20.
  • When the touch coordinate information that is the operation information received from the touch panel 20 indicates a two-point touch in which two fingers contact the screen of the touch panel 20 (shown in FIG. 5 described later), the operation mode switching unit 13 switches the user operation mode from the first operation mode to the second operation mode.
  • When a pinch-in (first touch gesture operation), which is an operation of narrowing the interval between two fingers, is then performed, a slide bar (shown in FIG. 7 described later) is displayed at the position on the screen of the touch panel 20 where the pinch-in was performed.
  • The first touch gesture determination unit 15 of the operation determination unit 14 performs a first process based on operation information provided from the touch panel 20 when the first operation mode is set by the operation mode switching unit 13. Specifically, the first touch gesture determination unit 15 determines, from the operation information based on the touch gesture operation provided from the touch panel 20, the contents of a button touch gesture operation (a touch gesture operation on a button as a display component, shown in FIG. 12 described later), which is a touch gesture operation in the first operation mode (normal operation mode), and outputs a signal corresponding to the determination result.
  • The second touch gesture determination unit 16 of the operation determination unit 14 performs a second process based on the operation information provided from the touch panel 20 when the second operation mode is set by the operation mode switching unit 13. Specifically, after the user operation mode is switched from the first operation mode to the second operation mode, when a predetermined first touch gesture operation (for example, pinch-in) is performed at an arbitrary position on the screen of the touch panel 20, a slide bar is displayed at the position where the first touch gesture operation was performed.
  • When a second touch gesture operation (also referred to as a "slide bar gesture operation", shown in FIG. 8 described later) that moves (slides) the knob of the slide bar is performed, the second touch gesture determination unit 16 changes the selection value (operation result information) according to the amount of movement.
  • As described above, the touch input determination device 10 displays the slide bar including the bar and the knob at the position where the first touch gesture operation (for example, pinch-in) is performed (shown in FIG. 7 described later), and the user can change (set) the selection value (for example, the stop floor of an elevator) by performing the second touch gesture operation (an up/down operation that moves the knob upward or downward, shown in FIG. 8 described later) from that position. The user can therefore easily and appropriately perform the touch gesture operation on the slide bar without looking at the screen of the touch panel 20.
  • FIG. 2 is a diagram illustrating a hardware configuration of the touch input determination device 10 according to the first embodiment.
  • As shown in FIG. 2, the touch input determination device 10 includes a memory 42 that stores a program (the touch input determination program) as software, and a CPU (Central Processing Unit) 41 as a processor, which is an information processing device that executes the program stored in the memory 42.
  • the memory 42 shown in FIG. 2 corresponds to the storage unit 17 in FIG.
  • The touch input determination device 10 also includes a touch interface 43 (corresponding to the operation information input unit 11 in FIG. 1) that mediates communication between the operation panel unit 21 of the touch panel 20 and the CPU 41, and a display interface 44 (corresponding to the display control unit 19 in FIG. 1) that mediates communication between the display panel unit 22 of the touch panel 20 and the CPU 41.
  • The CPU 41 in FIG. 2 can implement each component of the control unit 12 by executing the touch input determination program stored in the memory 42.
  • FIG. 3 is a flowchart showing an operation example of the touch input determination apparatus 10 according to the first embodiment.
  • FIGS. 4 to 12 are diagrams showing the screen of the touch panel 20 and touch gesture operations performed by the user.
  • In step S1 of FIG. 3, when the system in which the touch input determination device 10 is mounted is started, the control unit 12 is in the first operation mode (normal operation mode) in which display components are operated, and buttons are displayed on the touch panel 20 as display components, as shown in FIG. 4. In the example of FIG. 4, buttons to which numerical values (selection value candidates) indicating the 1st floor to the 30th floor are assigned are displayed, and the 7th floor and the 15th floor are already selected as stop floors (floors at which the elevator is scheduled to stop) and are highlighted (for example, by brightness or color-change display).
  • In step S2, the control unit 12 waits for input of a touch gesture operation on the touch panel 20; when a touch gesture operation is performed, the process proceeds to step S3.
  • In step S3, the control unit 12 determines, based on the touch position coordinates acquired from the touch panel 20, whether there has been a two-point touch, a touch gesture operation for switching the operation mode in which two fingers of the hand 51 contact the screen, as shown in FIG. 5. Any fingers may be used for the two-point touch, and the touch positions 61 and 62 are not limited to the example of FIG. 5. This operation may also be another touch gesture operation, such as a three-point touch or a rotation operation, as long as it can be identified as a touch gesture operation for switching the operation mode.
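The mode-switch decision in step S3 depends only on the number of simultaneous touch points reported by the panel. A minimal sketch, assuming a hypothetical helper name and a list of (x, y) coordinates as the operation information (the patent does not specify a software interface):

```python
def is_mode_switch_gesture(touch_points):
    """Return True when the operation information indicates a two-point touch.

    `touch_points` is a list of (x, y) coordinates reported by the touch
    panel for one sampling instant. Any two fingers at any position
    qualify, matching step S3 of the flowchart.
    """
    return len(touch_points) == 2

# Two fingers anywhere on the screen trigger the switch; one does not.
assert is_mode_switch_gesture([(120, 300), (180, 310)])
assert not is_mode_switch_gesture([(120, 300)])
```

A three-point touch or rotation variant, as the text allows, would only change this predicate, not the rest of the flow.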
  • If it is determined that there was a two-point touch (YES in step S3), the control unit 12 advances the process to step S4, hides the buttons on the screen of the touch panel 20, and displays the bar 71 of the slide bar, as shown in FIG. 6.
  • The bar 71 extends in a first direction, which is a predetermined direction of the screen of the touch panel 20, here the vertical direction.
  • The timing at which the bar 71 is displayed may be later than step S4 (for example, step S6).
  • The extending direction of the bar 71 may instead be set to another direction, such as the horizontal direction.
  • In step S5, the control unit 12 determines whether a pinch-in as the first touch gesture operation has been performed with two fingers at an arbitrary position on the screen of the touch panel 20, as shown in FIG. 7. The first touch gesture operation may be another touch gesture operation as long as it can specify a position on the screen of the touch panel 20 (for example, a double tap with one finger or a long press with one finger).
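A pinch-in can be recognized by comparing the distance between the two touch points at the start and end of the gesture, and the midpoint of the fingers gives the position where the slide bar is placed. The threshold and helper names below are assumptions for illustration, not taken from the patent:

```python
import math

PINCH_IN_RATIO = 0.7  # assumed tuning: spacing must shrink to 70% or less


def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])


def is_pinch_in(start_points, end_points):
    """Detect a two-finger pinch-in (step S5): the finger spacing shrinks."""
    if len(start_points) != 2 or len(end_points) != 2:
        return False
    d0 = distance(*start_points)
    d1 = distance(*end_points)
    return d0 > 0 and d1 / d0 < PINCH_IN_RATIO


def pinch_center(points):
    """Where to display the slide bar: the midpoint of the two fingers."""
    (x1, y1), (x2, y2) = points
    return ((x1 + x2) / 2, (y1 + y2) / 2)


start = [(100, 200), (200, 200)]   # fingers 100 px apart
end = [(140, 200), (160, 200)]     # fingers 20 px apart -> pinch-in
assert is_pinch_in(start, end)
assert pinch_center(end) == (150.0, 200.0)
```

A pinch-out (step S9) would use the inverse condition, with the spacing growing instead of shrinking.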
  • When the pinch-in is performed (YES in step S5), the control unit 12 advances the process to step S6 and displays the slide bar including the bar 71 and the knob 72 on the screen of the touch panel 20, as shown in FIG. 7. As shown in FIG. 7, the current floor display ("1st floor" in FIG. 7) or the like may also be displayed.
  • In step S7, the control unit 12 determines whether an up/down operation of the knob 72 has been performed as the second touch gesture operation for moving the knob 72 in the first direction, that is, a drag operation that moves the knob 72 upward or downward.
  • When the up/down operation of the knob 72 is not performed (NO in step S7), the control unit 12 returns the process to step S6, displays the bar 71 and the knob 72 on the screen of the touch panel 20 as shown in FIG. 7, and waits for the up/down operation.
  • When an up/down operation is performed in which the finger gripping the knob 72 moves upward or downward (YES in step S7), the control unit 12 advances the process to step S8 and performs a calculation that changes the current selection value (stop floor) by a change amount corresponding to the amount of movement of the knob 72.
  • When the knob 72 is moved upward, the selection value increases by a value corresponding to the amount of movement; when the knob 72 is moved downward, the selection value decreases by a value corresponding to the amount of movement.
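The calculation in step S8 maps the knob's displacement along the bar to a change in the selection value. A sketch for the elevator example, assuming an illustrative scale of pixels per floor and clamping to the 1st-to-30th floor range shown in FIG. 4 (both constants are assumptions, not from the patent):

```python
PIXELS_PER_STEP = 10          # assumed resolution: 10 px of drag = 1 floor
MIN_FLOOR, MAX_FLOOR = 1, 30  # floor range from the FIG. 4 example


def apply_knob_movement(selected_floor, dy_pixels):
    """Change the selection value by the knob's vertical movement (step S8).

    Screen coordinates usually grow downward, so an upward drag gives a
    negative dy; here an upward movement increases the selected floor.
    """
    steps = -dy_pixels // PIXELS_PER_STEP
    return max(MIN_FLOOR, min(MAX_FLOOR, selected_floor + steps))


assert apply_knob_movement(7, -50) == 12    # drag up 50 px  -> +5 floors
assert apply_knob_movement(7, 30) == 4      # drag down 30 px -> -3 floors
assert apply_knob_movement(29, -100) == 30  # clamped at the top floor
```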
  • In step S9, the control unit 12 determines whether a pinch-out as the third touch gesture operation has been performed at the position where the movement of the knob 72 was completed, as shown in FIG. 9.
  • The third touch gesture operation may be another touch gesture operation as long as it can indicate the completion of the movement of the knob 72 on the screen of the touch panel 20 (for example, a double tap with one finger or a long press with one finger).
  • When the control unit 12 determines in step S3 that there was no two-point touch, the process proceeds to step S16, and it is determined whether there was a one-point touch, shown as touch position 63 in FIG. 12.
  • When a one-point touch is not performed (NO in step S16), the control unit 12 returns the process to step S2 and enters a standby state waiting for a touch gesture operation.
  • When a one-point touch is performed (YES in step S16), the control unit 12 determines in step S17 whether a touch release has been performed. If a touch release has been performed (YES in step S17), the control unit 12 advances the process to step S18 and determines the changed selection value (stop floor) as the new selection value. When a touch release is not performed (NO in step S17), the control unit 12 returns the process to step S2.
  • As described above, according to Embodiment 1, when the pinch-in as the first touch gesture operation shown in step S5 of FIG. 3 and in FIG. 7 is performed, a slide bar including the bar 71 and the knob 72 is displayed at an arbitrary position on the screen of the touch panel 20, and the selection value can be increased or decreased according to the movement amount and movement direction of the knob 72 by the simple operation of moving the knob 72 up or down. For this reason, the touch gesture operation on the slide bar displayed on the screen of the touch panel 20 can be performed appropriately even without looking at the screen.
  • Further, since the touch gesture operations from selection start to selection end on the touch panel 20 (steps S5 to S9) are operations performed by moving the knob, the user can perform the operations explicitly and reduce selection errors.
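The flow of FIG. 3 can be condensed into a small state machine. The sketch below summarizes steps S1 to S9 under assumed event and attribute names (the patent defines the gestures, not a software interface), with the same illustrative drag scale and floor range as the elevator example:

```python
class SlideBarInput:
    """Condensed state machine for the FIG. 3 flow (names are illustrative)."""

    def __init__(self, initial_value=1):
        self.mode = "normal"        # first operation mode (buttons shown)
        self.slide_bar_shown = False
        self.bar_position = None
        self.value = initial_value  # current selection value (stop floor)

    def on_two_point_touch(self):
        # Step S3: two-point touch switches to the non-visual mode.
        self.mode = "non_visual"

    def on_pinch_in(self, position):
        # Step S5: the slide bar appears where the pinch-in occurred.
        if self.mode == "non_visual":
            self.slide_bar_shown = True
            self.bar_position = position

    def on_knob_drag(self, dy_pixels):
        # Steps S6-S8: change the selection value by the knob movement
        # (assumed scale: 10 px per floor, clamped to floors 1..30).
        if self.slide_bar_shown:
            self.value = max(1, min(30, self.value - dy_pixels // 10))

    def on_pinch_out(self):
        # Step S9: pinch-out marks the completion of the knob movement.
        self.slide_bar_shown = False


ui = SlideBarInput(initial_value=1)
ui.on_two_point_touch()
ui.on_pinch_in((150, 200))
ui.on_knob_drag(-40)   # drag up 40 px -> +4 floors
ui.on_pinch_out()
assert ui.value == 5
```

Dragging before the slide bar is shown has no effect, mirroring the flowchart's wait states.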
  • When the selection value is large and has a plurality of digits, changing the selection value only by moving the knob 72 on a single bar 71 takes a long time.
  • In Embodiment 2, the slide bar therefore has a plurality of bars extending in the vertical direction (first direction), and the plurality of bars are used to change the selection value digit by digit (for example, the ones digit, the tens digit, and the hundreds digit).
  • The user selects one of the plurality of bars by moving the knob in a second direction that intersects (for example, is orthogonal to) the first direction, and changes the digit corresponding to the selected bar according to the movement of the knob in the first direction.
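Embodiment 2's per-digit bars can be sketched as follows: the knob's horizontal position picks the digit's bar, and vertical movement changes only that digit. The bar layout geometry and helper names are assumptions for illustration:

```python
def select_bar(knob_x, bar_xs):
    """Pick the bar (digit position) nearest the knob's x coordinate."""
    return min(range(len(bar_xs)), key=lambda i: abs(bar_xs[i] - knob_x))


def change_digit(value, digit_index, delta):
    """Change only the digit at 10**digit_index by `delta`, wrapping 0-9."""
    digit = (value // 10 ** digit_index) % 10
    new_digit = (digit + delta) % 10
    return value + (new_digit - digit) * 10 ** digit_index


# Three bars for the ones, tens, and hundreds digits at assumed x positions.
bar_xs = [100, 200, 300]  # index 0 = ones, 1 = tens, 2 = hundreds
k = select_bar(210, bar_xs)
assert k == 1                            # knob is nearest the tens bar
assert change_digit(345, k, 2) == 365    # tens digit 4 -> 6
assert change_digit(345, 0, -7) == 348   # ones digit 5 -> 8 (wraps around)
```

Wrapping within a digit is one plausible behavior; the patent only specifies that the selected bar's digit is changed by the knob's vertical movement.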
  • FIG. 13 shows an operation example of the touch input determination device according to Embodiment 2; steps that are the same as those in FIG. 3 are given the same step numbers.
  • The operation of Embodiment 2 shown in FIG. 13 differs from the operation of Embodiment 1 shown in FIG. 3 in that a knob left/right operation and left/right operation processing are added in steps S14 and S15. Except for this difference, the operation shown in FIG. 13 is the same as that of FIG. 3.
  • FIG. 14 is a flowchart showing a left / right operation process in the touch input determination device according to the second embodiment.
  • FIG. 15 is a diagram showing the touch gesture operation in step S151 of FIG.
  • The selection value is an n-digit number composed of the number of the first digit to the number of the n-th digit.
  • When n = 3, the selection value is composed of the ones digit (the number of the first digit), the tens digit (the number of the second digit), and the hundreds digit (the number of the third digit).
  • In step S14 in FIG. 13, the operation determination unit 14 of the control unit 12 selects the k-th bar portion of the first to n-th bar portions (k is an arbitrary integer not smaller than 1 and not larger than n) in response to the movement of the knob 82 in the second direction intersecting the first direction.
  • In step S15 in FIG. 13 (step S151 in FIG. 14), the operation determination unit 14 changes the number of the k-th digit corresponding to the selected k-th bar portion according to the movement amount of the knob 82 in the first direction.
  • In this embodiment, the second direction is orthogonal to the first direction, but it need not be orthogonal as long as it differs from the first direction.
  • the second embodiment is the same as the first embodiment except for the points described above.
  • Embodiment 3: Configuration
  • In Embodiment 3, the slide bar has a plurality of bars extending in the vertical direction (first direction), and different values are set for the bars as the rate of change (resolution), that is, the ratio of the amount of change in the selection value to the amount of movement of the knob in the first direction.
  • The configuration of the touch input determination device according to Embodiment 3 is the same as that of Embodiments 1 and 2 except for the control contents of the control unit 12; FIGS. 1 and 13 are therefore referred to in the description of Embodiment 3.
  • FIG. 16 is a flowchart showing left / right operation processing in the touch input determination apparatus according to the third embodiment.
  • FIG. 17 is a diagram showing the touch gesture operation in step S152 of FIG. 16.
  • The bar 91 of the slide bar includes first to n-th bar portions extending in the first direction.
  • The bar 91 can be placed at any position on the screen of the touch panel 20.
  • A left/right operation that moves the knob 92 in the horizontal direction is performed to select the bar portion used for changing the selection value.
  • When the selection value is to be adjusted finely, the up/down operation is performed at the leftmost bar portion; when the selection value is to be increased or decreased at a medium resolution, the up/down operation is performed at the central bar portion; and when the selection value is to be changed greatly, the up/down operation can be performed at the rightmost bar portion.
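The per-bar resolutions described above can be sketched as a lookup from the selected bar portion to a rate of change. The concrete resolution values and the leftmost-is-finest ordering are assumptions for illustration; the patent only states that a different resolution is set for each bar portion.

```python
# Assumed resolutions: ratio of selection-value change to knob movement,
# one per bar portion (leftmost = finest, rightmost = coarsest).
RESOLUTIONS = [1, 10, 100]

def apply_up_down(value: int, bar_index: int, movement: int) -> int:
    """Apply an up/down knob movement on the bar_index-th bar portion."""
    return value + RESOLUTIONS[bar_index] * movement

print(apply_up_down(500, 0, 3))   # 503: fine adjustment on the leftmost bar
print(apply_up_down(500, 2, -2))  # 300: coarse change on the rightmost bar
```

The same vertical knob movement thus produces a small or a large change in the selection value depending only on which bar portion was selected by the preceding left/right operation.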
  • 10 touch input determination device, 11 operation information input unit, 12 control unit, 13 operation mode switching unit, 14 operation determination unit, 15 first touch gesture determination unit, 16 second touch gesture determination unit, 17 storage unit, 18 notification unit, 19 display control unit, 20 touch panel, 21 operation panel unit, 22 display panel unit, 41 CPU, 42 memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention concerns a touch input determination device (10) comprising: an operation mode switching unit (13) that switches the user operation mode of a touch panel between a first operation mode, in which a display component is operated, and a second operation mode, in which the entire screen is used as a touch gesture operation receiving region; and an operation determination unit (14) that performs a first process based on operation information if the user operation mode is set to the first operation mode, or performs a second process based on operation information if the user operation mode is set to the second operation mode. After the user operation mode has been switched from the first operation mode to the second operation mode, if a predetermined first touch gesture operation is performed at a location on the screen of the touch panel, the operation determination unit (14) causes a slide bar, which includes a bar extending in a first direction and a knob portion, to be displayed at the location where the first touch gesture operation was performed. Then, if a second touch gesture operation that moves the knob portion in the first direction is performed, the operation determination unit (14) changes a selected value according to the movement amount of the knob portion in the first direction.
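The mode switching and slide-bar behavior summarized in the abstract can be sketched as a small state machine. The class and method names below are assumptions for illustration; the patent names the functional units (operation mode switching unit, operation determination unit) but does not define an API.

```python
class TouchInputSketch:
    """Minimal sketch of the two operation modes described in the abstract."""
    FIRST_MODE = "first"    # display components are operated directly
    SECOND_MODE = "second"  # the whole screen receives touch gestures

    def __init__(self):
        self.mode = self.FIRST_MODE
        self.slider_origin = None   # where the slide bar is displayed
        self.selected_value = 0

    def switch_mode(self, mode):
        self.mode = mode

    def first_gesture(self, x, y):
        # In the second mode, a predetermined first touch gesture
        # displays the slide bar at the touched location.
        if self.mode == self.SECOND_MODE:
            self.slider_origin = (x, y)

    def move_knob(self, amount):
        # A second touch gesture moving the knob in the first direction
        # changes the selected value by the movement amount.
        if self.slider_origin is not None:
            self.selected_value += amount

sketch = TouchInputSketch()
sketch.switch_mode(TouchInputSketch.SECOND_MODE)
sketch.first_gesture(120, 80)     # slide bar appears at (120, 80)
sketch.move_knob(7)
print(sketch.selected_value)      # 7
```

Note that in the first operation mode the first gesture has no effect in this sketch, mirroring the abstract's condition that the slide bar is displayed only after switching to the second operation mode.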
PCT/JP2017/011615 2017-03-23 2017-03-23 Dispositif de détermination d'entrée tactile, procédé de détermination d'entrée tactile et programme de détermination d'entrée tactile WO2018173180A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
PCT/JP2017/011615 WO2018173180A1 (fr) 2017-03-23 2017-03-23 Dispositif de détermination d'entrée tactile, procédé de détermination d'entrée tactile et programme de détermination d'entrée tactile
CN201780088528.0A CN110431525A (zh) 2017-03-23 2017-03-23 触摸输入判定装置、触摸输入判定方法和触摸输入判定程序
JP2017535719A JP6207804B1 (ja) 2017-03-23 2017-03-23 タッチ入力判定装置、タッチ入力判定方法、及びタッチ入力判定プログラム
DE112017007110.0T DE112017007110T5 (de) 2017-03-23 2017-03-23 Berührungseingabebeurteilungseinrichtung, Berührungseingabebeurteilungsverfahren und Berührungseingabebeurteilungsprogramm
KR1020197027062A KR20190112160A (ko) 2017-03-23 2017-03-23 터치 입력 판정 장치, 터치 입력 판정 방법, 및 터치 입력 판정 프로그램

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/011615 WO2018173180A1 (fr) 2017-03-23 2017-03-23 Dispositif de détermination d'entrée tactile, procédé de détermination d'entrée tactile et programme de détermination d'entrée tactile

Publications (1)

Publication Number Publication Date
WO2018173180A1 true WO2018173180A1 (fr) 2018-09-27

Family

ID=59997784

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/011615 WO2018173180A1 (fr) 2017-03-23 2017-03-23 Dispositif de détermination d'entrée tactile, procédé de détermination d'entrée tactile et programme de détermination d'entrée tactile

Country Status (5)

Country Link
JP (1) JP6207804B1 (fr)
KR (1) KR20190112160A (fr)
CN (1) CN110431525A (fr)
DE (1) DE112017007110T5 (fr)
WO (1) WO2018173180A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1021034A (ja) * 1996-06-28 1998-01-23 Shimadzu Corp 数値入力装置
JPH11212726A (ja) * 1998-01-29 1999-08-06 Omron Corp 入力装置
WO2010071187A1 (fr) * 2008-12-18 2010-06-24 日本電気株式会社 Appareil de commande d'affichage de barre de défilement et procédé de commande d'affichage de barre de défilement
JP2012509393A (ja) * 2008-11-19 2012-04-19 ダウ コーニング コーポレーション シリコーン組成物およびその製造方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100064261A1 (en) * 2008-09-09 2010-03-11 Microsoft Corporation Portable electronic device with relative gesture recognition mode
KR101085702B1 (ko) * 2009-11-04 2011-11-22 삼성전자주식회사 터치스크린의 한글 입력 방법, 기록매체
JP5772773B2 (ja) * 2012-09-19 2015-09-02 コニカミノルタ株式会社 画像処理装置、操作標準化方法および操作標準化プログラム
JP2014203202A (ja) 2013-04-03 2014-10-27 キヤノン株式会社 情報処理装置、情報処理装置の制御方法、およびプログラム
US9582184B2 (en) * 2013-11-04 2017-02-28 Keysight Technologies, Inc. Touch screen control for adjusting a numerical value
CN103853495A (zh) * 2014-02-13 2014-06-11 喻应芝 车载设备触摸控制装置和方法
EP2930049B1 (fr) * 2014-04-08 2017-12-06 Volkswagen Aktiengesellschaft Interface utilisateur et procédé d'adaptation d'une vue sur une unité d'affichage
JP2016126715A (ja) 2015-01-08 2016-07-11 オンキヨー株式会社 電子機器、及び、電子機器の制御プログラム


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113590014A (zh) * 2021-07-16 2021-11-02 日立楼宇技术(广州)有限公司 基于姿态动作的电梯召唤方法、装置和计算机设备
CN113590014B (zh) * 2021-07-16 2023-09-26 日立楼宇技术(广州)有限公司 基于姿态动作的电梯召唤方法、装置和计算机设备

Also Published As

Publication number Publication date
JPWO2018173180A1 (ja) 2019-03-28
KR20190112160A (ko) 2019-10-02
JP6207804B1 (ja) 2017-10-04
DE112017007110T5 (de) 2019-11-21
CN110431525A (zh) 2019-11-08

Similar Documents

Publication Publication Date Title
US10627990B2 (en) Map information display device, map information display method, and map information display program
JP5900500B2 (ja) 携帯電子機器及びキー表示プログラム
EP2530573B1 (fr) Procédé de contrôle tactile et appareil électronique
JPH11203044A (ja) 情報処理システム
JP2007293849A (ja) 機能アイコンの表示システム及び方法
KR101885132B1 (ko) 사용자 단말에서 터치 입력 장치 및 방법
JP6253284B2 (ja) 情報処理装置およびその制御方法、プログラム、記録媒体
JP6207804B1 (ja) タッチ入力判定装置、タッチ入力判定方法、及びタッチ入力判定プログラム
JP6342297B2 (ja) 表示制御装置および表示制御方法
JP5542624B2 (ja) プラント監視装置
WO2018173181A1 (fr) Dispositif de détermination d'entrée tactile, dispositif d'entrée de panneau tactile, procédé de détermination d'entrée tactile et programme de détermination d'entrée tactile
JP2001195170A (ja) 携帯型電子機器、入力制御装置、及び記憶媒体
US20140317568A1 (en) Information processing apparatus, information processing method, program, and information processing system
JP2018085071A (ja) 表示システム
KR101899884B1 (ko) 그래픽 사용자 인터페이스 장치 및 방법
JPH09128194A (ja) 表示監視装置
WO2015123835A1 (fr) Positionnement de curseur
US7106314B2 (en) User interface and method of adapting a sensor signal to actuate multiple dimensions
JP7257248B2 (ja) タッチパネル用コントローラ及びプログラム
JP2019067127A (ja) 表示制御装置
US10921894B2 (en) User interface device
JP6496345B2 (ja) 数値制御装置
JP2020086637A (ja) 情報処理装置およびその制御方法、並びにプログラム
JP2020013472A (ja) 画像出力装置、制御方法及びプログラム
JP5489218B2 (ja) 金融取引処理装置、画面切替方法、及びプログラム

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017535719

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17901902

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20197027062

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 17901902

Country of ref document: EP

Kind code of ref document: A1