WO2018122891A1 - Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program - Google Patents

Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program

Info

Publication number
WO2018122891A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch panel
sight
line
determination
screen
Prior art date
Application number
PCT/JP2016/088610
Other languages
English (en)
Japanese (ja)
Inventor
佐々木 雄一
森 健太郎
堀 淳志
絢子 永田
丸山 清泰
美穂 石川
山崎 聡
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2017522564A priority Critical patent/JP6177482B1/ja
Priority to PCT/JP2016/088610 priority patent/WO2018122891A1/fr
Priority to KR1020197017714A priority patent/KR102254794B1/ko
Priority to CN201680091771.3A priority patent/CN110114749B/zh
Priority to DE112016007545.6T priority patent/DE112016007545T5/de
Publication of WO2018122891A1 publication Critical patent/WO2018122891A1/fr


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map

Definitions

  • The present invention relates to a touch panel input device that receives a touch gesture operation and outputs a signal based on that operation, and to a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program that receive operation information corresponding to a touch gesture operation and output a signal based on the input operation information.
  • In general, a user of a touch panel input device performs a touch gesture operation while looking at a GUI (Graphical User Interface) screen displayed on the touch panel. However, in the "no-gaze state", in which the user does not direct his or her line of sight toward the screen, such an operation cannot be performed in the usual way.
  • Patent Document 1 proposes a device that determines an operation part (hereinafter, "display component") displayed on a touch panel based on the similarity between a shape formed by connecting the contact points of a plurality of fingers touching the touch panel and a preset shape.
  • Patent Document 2 proposes a device that determines, when the area of a region defined by the positions of a plurality of fingers in contact with the touch panel is equal to or larger than a preset threshold, that the user (for example, a visually impaired person) is performing a touch gesture operation without looking at the screen of the touch panel, that is, in the "no-gaze state" in which the user does not direct his or her line of sight toward the screen.
  • The object of the present invention is to provide a touch panel input device that enables input operations by touch gesture to be performed easily and accurately by appropriately switching between the contents of the operation mode for the gaze state, in which touch gesture operations are performed using display components displayed on the touch panel, and the contents of the operation mode for the no-gaze state, in which touch gesture operations are performed using the entire screen of the touch panel as the reception area, and to provide a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program that enable the same.
  • A touch panel input device according to the present invention includes a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the operation, and a touch gesture determination device that receives the operation information and generates a selection value based on it. The touch gesture determination device includes: a line-of-sight presence determination unit that determines whether the touch gesture operation is an operation in the gaze state, in which the user's line of sight is directed at the screen, or an operation in the no-gaze state, in which it is not, and outputs line-of-sight determination information indicating the result of the determination; an operation mode switching unit that outputs command information based on the line-of-sight determination information; an operation determination unit that generates the selection value based on the operation information, using a determination method according to the command information; and a display control unit that causes the touch panel to display an image according to the command information as the operation image.
  • The operation determination unit includes a display component gesture determination unit that, in the gaze state, determines the selection value from operation information based on a touch gesture operation performed on a display component displayed as the operation image on the touch panel, and an entire screen gesture determination unit that, in the no-gaze state, determines the selection value from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
  • A touch gesture determination method according to the present invention receives operation information from a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the operation, and generates a selection value based on that information. The method includes: a gaze presence/absence determination step of determining whether the touch gesture operation is an operation in the gaze state or in the no-gaze state and outputting gaze determination information indicating the result; an operation determination step of generating the selection value based on the operation information, using a determination method according to command information based on the gaze determination information; and a display control step of causing the touch panel to display an image according to the command information as the operation image. In the gaze state, the selection value is determined from operation information based on a touch gesture operation performed on a display component; in the no-gaze state, it is determined from operation information based on a touch gesture operation performed on the entire screen.
  • According to the present invention, input operations by touch gesture can be performed easily and accurately by appropriately switching between the contents of the operation mode for the gaze state, in which touch gesture operations use display components displayed on the touch panel, and the contents of the operation mode for the no-gaze state, in which the entire screen of the touch panel serves as the reception area for touch gesture operations.
  • FIG. 1 is a functional block diagram showing a schematic configuration of a touch panel input device according to Embodiment 1.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration of the touch panel input device according to Embodiment 1.
  • FIGS. 3(a) and 3(b) are diagrams showing examples of the screen of the touch panel in the touch panel input device according to Embodiment 1.
  • FIG. 4 is a flowchart showing an operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 1.
  • FIG. 5 is a diagram showing an example of the screen of the touch panel in the touch panel input device according to a modification of Embodiment 1.
  • FIG. 6 is a functional block diagram showing a schematic configuration of a touch panel input device according to Embodiment 2.
  • FIG. 7 is a flowchart illustrating an operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 2.
  • FIG. 8 is a diagram illustrating a method for acquiring history information of touch gesture operations in the touch panel input device according to Embodiment 2.
  • FIG. 9 is a diagram showing an example in which input cannot be performed as intended when a touch gesture operation is performed in the no-gaze state, that is, without looking at the screen of the touch panel, in a touch panel input device.
  • FIG. 10 is a diagram illustrating an example of a touch gesture operation performed in the no-gaze state, without looking at the screen of the touch panel, in the touch panel input device according to Embodiment 2.
  • FIG. 11 is a diagram illustrating a scene in which the orientation, angle, and enlargement/reduction amount of a camera are adjusted without looking at the touch panel at hand, in the touch panel input device according to Embodiment 2.
  • FIG. 12 is a functional block diagram showing a schematic configuration of a touch panel input device according to Embodiment 3.
  • FIG. 13 is a diagram illustrating an example of the hardware configuration of the touch panel input device according to Embodiment 3.
  • FIG. 14 is a flowchart illustrating an operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 3.
  • FIGS. 15(a) and 15(b) are diagrams showing the method of determining the gaze state and the no-gaze state in the touch panel input device according to Embodiment 3.
  • The touch panel input device includes a touch panel having a touch operation screen (operation screen) and a touch gesture determination device that receives operation information from the touch panel.
  • The touch panel input device is mounted on a target device or connected so as to be able to communicate with it. The present invention can be applied to, for example, the operation screen of an electric appliance, a camera, or factory equipment as the target device, an operation screen mounted on a car, a ship, an aircraft, or the like, or the operation screen of a portable information terminal such as a smartphone or tablet terminal.
  • The touch panel input device can provide a signal (for example, a selection value) based on operation information input by a touch gesture operation (also referred to as a "touch operation") on the operation screen of the touch panel to the target device on which it is mounted or with which it can communicate.
  • the touch panel is a touch gesture input unit that accepts a touch gesture operation performed by the user.
  • The touch gesture operation is an information input operation performed by a specific movement of the user's finger (or the user's palm, or both the fingers and the palm).
  • Touch gesture operations can include a tap, which is an operation of tapping the operation screen of the touch panel with a finger; a flick, which is an operation of flicking the operation screen with a finger; and a swipe, which is an operation of sliding a finger across the operation screen.
  • They can also include a drag, which is an operation of dragging a display component on the touch panel with a finger; a pinch-in, which is an operation of narrowing the interval between multiple fingers pinching on the operation screen; and a pinch-out, which is an operation of widening the interval between the fingers.
  • the touch gesture operation can include an operation using a touch pen which is a pen-type input auxiliary tool.
  • FIG. 1 is a functional block diagram showing a schematic configuration of a touch panel input device 1 according to Embodiment 1 of the present invention.
  • the touch panel input device 1 according to the first embodiment includes a touch gesture determination device 10 and a touch panel 20.
  • the touch gesture determination device 10 is a device that can execute the touch gesture determination method according to the first embodiment and the touch gesture determination program according to the first embodiment.
  • The touch panel 20 includes an operation panel unit 21 that receives a touch gesture operation performed by the user and outputs operation information (also referred to as "touch information") A0 corresponding to the operation, and a display panel unit 22 that is arranged to overlap the operation panel unit 21 and can display an operation image such as a GUI screen.
  • the display panel unit 22 is, for example, a liquid crystal display.
  • The touch gesture determination device 10 includes an operation information input unit 11, a gaze presence/absence determination unit 12 serving as an operation mode determination unit, an operation mode switching unit 13, an operation determination unit 14, a notification unit 15, and a display control unit 16.
  • the operation determination unit 14 includes a display component gesture determination unit 141 and a whole screen gesture determination unit 142.
  • the operation information input unit 11 receives operation information (operation signal) A0 output from the operation panel unit 21.
  • the operation information input unit 11 outputs the input information A1 corresponding to the received operation information A0 to the line-of-sight presence determination unit 12 and the operation determination unit 14.
  • the input information A1 is information corresponding to the operation information A0, and may be the same information as the operation information A0.
  • The line-of-sight presence determination unit 12 determines whether a touch gesture operation is an operation performed by the user while looking at the screen of the touch panel 20, that is, in the gaze state in which the user's line of sight is directed at the screen, or an operation performed without looking at the screen, that is, in the no-gaze state in which the user's line of sight is not directed at the screen.
  • the operation mode based on the presence of the line of sight is an operation mode in which a touch gesture operation is performed on the display component displayed on the touch panel 20, for example.
  • the operation mode on the premise of the state where there is no line of sight is, for example, an operation mode in which the entire screen of the touch panel 20 is set as one operation effective area for accepting a touch gesture operation.
  • The line-of-sight presence determination unit 12 outputs line-of-sight determination information A2 indicating the result of determining, from the input information A1 based on the user's touch gesture operation, whether the operation is an operation in the gaze state or an operation in the no-gaze state.
  • The determination by the gaze presence/absence determination unit 12 can be performed based on the degree of similarity between the touch gesture operation indicated by the input information A1 and predetermined touch gesture operation patterns stored in a storage unit (for example, the memory 32 in FIG. 2, described later).
  • Based on the line-of-sight determination information A2 received from the line-of-sight presence determination unit 12, the operation mode switching unit 13 outputs command information A3 for switching (setting) the display content of the screen of the touch panel 20, the determination method used by the operation determination unit 14 for the input information A1 based on touch gesture operations on the touch panel 20, and the notification method used by the notification unit 15.
  • the operation determination unit 14 receives the command information A3 from the operation mode switching unit 13, and switches the determination method of the input information A1 according to the command information A3.
  • the entire screen gesture determination unit 142 determines a touch gesture operation on the entire screen of the touch panel 20 and determines a selection value A7 that is an output signal based on the input information A1.
  • The determination of touch gesture operations with respect to the entire screen of the touch panel 20 is applied to a limited narrow area of the touch panel 20 (a part of the screen) for touch gesture operations in the gaze state, and to a wide area (the entire screen of the touch panel 20) for touch gesture operations in the no-gaze state.
  • The display component gesture determination unit 141 determines the touch gesture operation from the display component displayed as the operation image by the display panel unit 22 and the input information A1, and determines the selection value A7, which is an output signal based on the display component and the input information A1.
  • The determination of touch gesture operations on display components of the touch panel 20 is applied to a wide area of the touch panel 20 (the entire screen) for touch gesture operations in the gaze state, and to a limited narrow area (a part of the screen) for touch gesture operations in the no-gaze state.
  • the display control unit 16 outputs an image signal A6 of an operation image displayed on the display panel unit 22 of the touch panel 20.
  • the display control unit 16 changes the display content of the screen of the display panel unit 22 according to the operation determination information A4 received from the operation determination unit 14 and the command information A3 received from the operation mode switching unit 13.
  • The notification unit 15 switches the method of notifying the user of information between touch gesture operation in the gaze state and touch gesture operation in the no-gaze state.
  • the notification unit 15 issues a notification of notification contents according to the command information A3 or outputs a notification signal.
  • the notification unit 15 notifies the status of the user's touch gesture operation by, for example, sound, screen display, vibration by a vibrator, or lamp lighting.
  • the notification unit 15 changes the notification content according to the operation determination information A4 received from the operation determination unit 14 and the command information A3 received from the operation mode switching unit 13.
  • When the notification by the notification unit 15 is by sound, the notification unit 15 outputs a notification signal to a speaker serving as an audio output unit (the speaker 33 shown in FIG. 2).
  • When the notification by the notification unit 15 is an image display, the notification unit 15 sends notification information A5 to the display control unit 16, and the display control unit 16 transmits an image signal based on the notification information to the display panel unit 22.
  • The selection value A7 output from the operation determination unit 14 is determined by the operation determination unit 14 based on the input information A1; an application program or the like of the device on which the touch panel input device 1 is mounted performs device control and the like based on the selection value A7.
  • FIG. 2 is a diagram illustrating an example of a hardware (H / W) configuration of the touch panel input device 1 according to the first embodiment.
  • the touch panel input device 1 according to the first embodiment includes a touch panel 20, a processor 31, a memory 32, and a speaker 33.
  • The touch gesture determination device 10 shown in FIG. 1 can be realized (for example, by a computer) using the memory 32 as a storage device that stores a touch gesture determination program as software and the processor 31 as an information processing unit that executes the touch gesture determination program stored in the memory 32.
  • the components 11 to 16 in FIG. 1 correspond to the processor 31 that executes the touch gesture determination program in FIG.
  • a part of the touch gesture determination device 10 shown in FIG. 1 can be realized by the memory 32 shown in FIG. 2 and the processor 31 that executes the touch gesture determination program.
  • the speaker 33 is a sound output unit used when, for example, a touch gesture operation state performed without a line of sight is notified by a sound such as an announcement.
  • the touch panel input device 1 may include an additional device such as a vibrator, a lamp, and a transmission device for wirelessly transmitting a notification signal, instead of the speaker 33 or as an additional configuration.
  • FIG. 3A and 3B are diagrams showing examples of the screen of touch panel 20 and touch gesture operation in touch panel input device 1 according to Embodiment 1.
  • FIG. 3A shows an example of the screen of the touch panel 20 in the operation mode with the line of sight.
  • FIG. 3A is an example of a screen when the line-of-sight presence / absence determination unit 12 determines from the input information A1 indicating the touch gesture operation that the operation mode is the line-of-sight presence state.
  • the user can switch display contents on the screen by performing an operation of swiping the screen left or right with a finger.
  • the list 231 is a list displayed when the line-of-sight presence determination unit 12 determines that there is a line-of-sight presence state.
  • the list 231 is composed of a plurality of buttons (square areas with numbers), and the list 231 can be moved in the same direction by tracing (sliding) the finger upward or downward.
  • When the line-of-sight presence determination unit 12 determines the gaze state, the user determines (confirms) the selection value by tapping a button (touching the button with a finger and releasing it within a predetermined time without moving the finger).
  • FIG. 3B shows an example of the screen of the touch panel 20 in the operation mode in a state where there is no line of sight.
  • FIG. 3B is an example of the screen when the line-of-sight presence/absence determination unit 12 determines from the input information A1 indicating the touch gesture operation that the operation mode is the no-gaze state.
  • The cancel area 241 in the no-gaze state is a rectangular area surrounding ">> Cancel >>"; when a finger is traced (slid) within this rectangular area from left to right in the predetermined direction, the operation mode for the no-gaze state is canceled and the device switches to the operation mode for the gaze state.
  • FIG. 4 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination apparatus 10 in the touch panel input device 1 according to the first embodiment.
  • In step ST101, the operation mode switching unit 13 sets the gaze-state operation mode as the initial operation mode.
  • That is, the operation mode switching unit 13 sets the screen of the touch panel 20 to the screen of the gaze-state operation mode, and sets the input method of the touch panel 20 to input via display components used in the gaze-state operation mode.
  • the initial operation mode may be an operation mode without a line of sight.
  • In step ST102, the operation information input unit 11 acquires operation information A0 (coordinates indicating the positions of finger contact points, finger contact states, finger identification numbers for a plurality of contact points, and so on) from the touch panel 20 and stores it in a storage unit (for example, the memory 32 shown in FIG. 2).
  • In step ST103, the line-of-sight presence/absence determination unit 12 determines, from the history information of the operation information A0 (information on finger contact points) stored in the storage unit, whether the touch gesture operation is an operation in the gaze state or an operation in the no-gaze state, that is, whether or not the line of sight is present (operation mode determination).
  • The line-of-sight presence determination unit 12 determines the no-gaze state, for example, when any of the following determination conditions (1) to (3) is satisfied, and determines the gaze state in other cases.
  • Determination condition (1): a finger touches one point on the screen of the touch panel 20 and the contact position does not move for a time equal to or longer than a predetermined threshold (a long press).
  • Determination condition (2): multiple fingers touch a plurality of contact points on the screen of the touch panel 20 and keep touching for a time equal to or longer than a predetermined threshold.
  • Determination condition (3): the palm keeps touching the screen of the touch panel 20 (that is, a wide contact area equal to or larger than a predetermined threshold is kept in contact for a time equal to or longer than a predetermined threshold). However, other determination criteria may also be used. (A code sketch of these conditions is shown below, after the palm-touch refinements.)
  • When determination condition (3) for the no-gaze state in step ST103 is satisfied, the coordinates of the detected contact points and the number of contact points may be unstable (that is, they may fluctuate greatly over time). For this reason, using the history information of the operation information A0 (input information A1) for the period from the present back to a point a predetermined time earlier, it may be determined that the touch is a palm touch when the degree of coordinate fluctuation of the contact points within this period (for example, the maximum movement amount) exceeds a predetermined threshold.
  • In some cases, the operation information A0 from the touch panel 20 indicates an abnormal contact state during an operation in which the palm is brought into contact with the screen. For this reason, when the history information of the operation information A0 for the period from the present back to a point a predetermined time earlier contains a history entry indicating an abnormal contact state, it may be determined that the touch is a palm touch.
  • In step ST103, it may also be erroneously determined that the fingers have left the screen of the touch panel 20 during an operation of bringing the palm into contact with the screen, and the operation may then be erroneously determined to be a touch gesture operation of tapping a button. For this reason, a standby process may be used: using the history information of the operation information A0 for the period from the present back to a point a predetermined time earlier, it is checked whether unstable touch detection (that is, detection of a situation in which the number and positions of contact points fluctuate greatly over time) occurs for a predetermined time after all fingers have left the touch panel 20, or whether a notification of an abnormal contact state occurs; when these are detected, it may be determined that the touch is a palm touch.
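  • As a concrete illustration, determination conditions (1) to (3) above might be coded as in the following sketch (Python). The threshold values, the touch-sample format, and the 10-pixel movement tolerance for a "long press" are assumptions; the patent only speaks of predetermined thresholds.

```python
# Sketch of the no-gaze determination of step ST103, under assumed
# thresholds and an assumed sample format. Each sample in `history` is a
# dict: {'time': seconds, 'points': [(x, y), ...], 'area': contact_px}.

LONG_PRESS_SEC = 2.0    # condition (1): single-point long press
MULTI_TOUCH_SEC = 1.0   # condition (2): sustained multi-finger touch
MOVE_TOL_PX = 10        # contact counts as "not moved" below this
PALM_AREA_PX = 5000     # condition (3): wide (palm-sized) contact area
PALM_HOLD_SEC = 1.0

def is_no_gaze_state(history):
    """Return True when any of determination conditions (1)-(3) holds."""
    if len(history) < 2:
        return False
    duration = history[-1]['time'] - history[0]['time']

    # Condition (1): one contact point that has not moved for >= threshold.
    if all(len(s['points']) == 1 for s in history):
        x0, y0 = history[0]['points'][0]
        moved = max(abs(x - x0) + abs(y - y0)
                    for s in history for (x, y) in s['points'])
        if moved < MOVE_TOL_PX and duration >= LONG_PRESS_SEC:
            return True

    # Condition (2): several fingers kept touching for >= threshold.
    if all(len(s['points']) >= 2 for s in history) and duration >= MULTI_TOUCH_SEC:
        return True

    # Condition (3): palm touch -- a wide contact area kept for >= threshold.
    if all(s['area'] >= PALM_AREA_PX for s in history) and duration >= PALM_HOLD_SEC:
        return True

    return False
```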
  • In step ST103, the line-of-sight presence determination unit 12 determines the gaze state when the finger is traced as instructed on the screen. For example, on the no-gaze-state screen shown in FIG. 3B, when the touch starts inside the cancel area 241 and the finger is slid so as to pass from the left mark ">>" through the characters "Cancel" to the right mark ">>", the line-of-sight presence determination unit 12 determines that the operation is a switching operation to the gaze state. Note that it may also be determined not to treat the trace as a switching operation when the finger traces continuously from outside the cancel area 241 into the cancel area 241, when the trace protrudes above or below the cancel area 241, or when the trace continues further to the right after the cancel area 241 has been traced in the no-gaze state.
  • The condition under which the line-of-sight presence determination unit 12 determines a switching operation to the gaze state in step ST103 is not limited to sliding the finger from the left mark ">>" in FIG. 3B through the characters "Cancel" to the right mark ">>"; it may also be satisfied when, in the no-gaze state, the cancel area 241 is touched and the finger is moved to the right by a predetermined distance or more.
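  • A minimal sketch of this cancel-area determination follows (Python). The rectangle geometry checks mirror the description above; treating the trace as complete only when it reaches the right edge of the area is an assumption, since the text also allows the simpler "moved right by a predetermined distance" criterion.

```python
def is_cancel_swipe(trace, cancel_rect, min_right_px=None):
    """Detect the left-to-right trace through the ">> Cancel >>" area.

    `trace`: list of (x, y) contact coordinates in time order.
    `cancel_rect`: (left, top, right, bottom) of the cancel area 241.
    `min_right_px`: if given, use the alternative criterion of moving
    right by at least this distance instead of crossing the whole area.
    """
    if not trace:
        return False
    left, top, right, bottom = cancel_rect

    # The touch must start inside the cancel area (entering from outside
    # does not count as a switching operation).
    x0, y0 = trace[0]
    if not (left <= x0 <= right and top <= y0 <= bottom):
        return False

    # Protruding above or below the cancel area invalidates the trace.
    if any(not (top <= y <= bottom) for _, y in trace):
        return False

    if min_right_px is not None:
        return trace[-1][0] - x0 >= min_right_px  # simpler criterion
    return trace[-1][0] >= right                  # traced across the area
```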
  • In step ST104, when the gaze presence/absence determination has changed (YES), the process proceeds to steps ST105 to ST108, and the operation mode switching unit 13 outputs command information A3 for switching the operation screen of the touch panel 20 and its input method, and for switching the notification method used by the notification unit 15.
  • In step ST104, when the gaze presence/absence determination has not changed (NO), the process proceeds to step ST109.
  • In step ST105, the operation determination unit 14 receives the command information A3 from the operation mode switching unit 13 and issues a command to switch the input method effective for the entire screen, and the entire screen gesture determination unit 142 switches that input method.
  • In step ST106, the operation determination unit 14 receives the command information A3 from the operation mode switching unit 13 and issues a command to switch the input method effective for display components, and the display component gesture determination unit 141 switches that input method.
  • the order of processing in steps ST105 and ST106 may be reversed.
  • In step ST107, the display control unit 16 receives the command information A3 for switching the screen from the operation mode switching unit 13 and switches the screen. For example, upon receiving command information A3 instructing a screen switch, the display control unit 16 displays the gaze-state screen shown in FIG. 3A in the gaze state, and displays the no-gaze-state screen shown in FIG. 3B in the no-gaze state.
  • In step ST108, the notification unit 15 receives the command information A3 for switching the notification content from the operation mode switching unit 13 and switches the notification content. For example, in the gaze state, the notification unit 15 outputs a voice announcement from the speaker such as "If you press and hold the screen, the mode switches to operation without viewing the screen", and on switching to the no-gaze state it announces, for example, "Because of the long press, switching to the operation screen for operating without viewing. Touching the upper part of the screen increases the selection value, touching the lower part decreases it, and tapping two points confirms the selection value." At this time, the notification unit 15 may announce the current selection value by voice each time a tap occurs on the upper or lower part of the screen.
  • The notification by the notification unit 15 is not limited to voice announcements; the user may also be notified by a display device different from the touch panel 20 being operated, by a voice announcement from a smartwatch, or by vibration of the touch panel.
  • In step ST109, the operation determination unit 14 determines (confirms) the selection value based on the input information. For example, in the gaze-state operation mode, the operation determination unit 14 determines the number displayed on a button as the selection value when that button is tapped. In the no-gaze-state operation mode, on the other hand, the operation determination unit 14 increases the selection value when a touch occurs on the upper part of the screen and decreases it when a touch occurs on the lower part, and confirms the selection value when two touches occur and the fingers are released at the same time. When the operation determination unit 14 accepts a touch gesture operation, the display control unit 16 changes the display content according to the operation, and the notification unit 15 reports the operation status according to the operation.
  • In step ST109, if the cancel area 241 is tapped in the no-gaze state, this is not an operation of tracing the cancel area 241, so it is treated as a touch on the lower part of the screen and the selection value is decreased.
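  • The no-gaze branch of step ST109 could be sketched as follows (Python). For simplicity the sketch confirms the value as soon as a two-point touch is seen, whereas the text above confirms it when the two fingers are released at the same time; the upper/lower screen-half split is taken from the description.

```python
class NoGazeSelector:
    """Sketch of the selection-value logic of step ST109 in the
    no-gaze operation mode: upper-half touches increment the value,
    lower-half touches decrement it, two-point touches confirm it."""

    def __init__(self, screen_height, initial_value=0):
        self.screen_height = screen_height
        self.value = initial_value
        self.confirmed = None   # set when the user confirms

    def on_touch(self, points):
        """Handle one touch event; `points` is a list of (x, y)."""
        if len(points) >= 2:
            # Two simultaneous touch points: confirm the selection value
            # (simplified; the text confirms on simultaneous release).
            self.confirmed = self.value
        elif len(points) == 1:
            _, y = points[0]
            if y < self.screen_height / 2:
                self.value += 1   # touch on the upper part of the screen
            else:
                self.value -= 1   # lower part; a tap on the cancel area
                                  # 241 also lands here, as noted above
        return self.value
```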
  • The determination by the gaze presence/absence determination unit 12 in step ST103 and the determination by the operation determination unit 14 in step ST109 can be realized, for example, by storing an operation determination table such as Table 1 in the storage unit of the gaze presence/absence determination unit 12 and processing according to its contents.
  • As described above, in the first embodiment, depending on whether the operation information A0 based on the user's touch gesture operation represents a touch gesture operation in the gaze state or in the no-gaze state, the determination method applied to the input information A1 in the operation determination unit 14, the display control method of the display control unit 16, and the notification method of the notification unit 15 can be changed. For this reason, when the user is looking at the screen of the touch panel 20, gaze-state touch gesture operations are possible, such as tapping with a finger a button displayed as a display component on the screen, or tracing with a finger a list displayed on the screen.
  • When the user is not looking at the screen, the user can operate by means of touch gesture operations accepted over the entire screen, together with notification of the operation state.
  • Furthermore, regardless of whether a gaze-state or a no-gaze-state touch gesture operation is performed, the operation is appropriately determined to be a gaze-state or a no-gaze-state touch gesture operation, so input operations by touch gesture can be performed accurately.
  • When an operation that would not occur as a gaze-state touch gesture operation is performed, it is determined that the touch gesture operation is in the no-gaze state, and when an operation that would not occur as a no-gaze-state touch gesture operation is performed, it can be determined that the touch gesture operation is in the gaze state. In this case, since misjudgments in the gaze presence/absence determination are few, the determination accuracy of the operation mode improves.
  • As a modification, the gaze presence/absence determination unit 12 may determine the no-gaze operation mode when three or more touch points (finger contact points) occur. In this case, a two-point touch gesture operation (for example, a pinch operation or a rotation operation) can still be used in the gaze state without being misjudged as a no-gaze operation.
  • In the gaze presence/absence determination in step ST103 of FIG. 4, it may be determined that the state is the no-gaze state when the content of the screen of the touch panel 20 has remained unchanged for at least a predetermined time since a finger first contacted the screen. For example, even if a finger is brought into contact with the screen and a tracing operation occurs, the gaze presence/absence determination unit 12 may determine the no-gaze state if no screen scrolling by a left/right swipe of the finger and no list scrolling by a vertical tracing operation occurs.
  • Conversely, the gaze presence/absence determination unit 12 may determine the gaze state when it detects vertical, horizontal, or diagonal movement of a finger contact point.
  • The gaze presence/absence determination unit 12 may also determine the gaze state when a specific locus, such as a circle, a square, or a character, is drawn by movement of a finger contact point.
  • The determination by the gaze presence/absence determination unit 12 in step ST103 of FIG. 4 may use, as a condition for changing from the no-gaze state to the gaze state, the elapse of a predetermined time without any input operation from the touch panel 20.
  • That is, the device may be configured to return automatically to the gaze state when no touch gesture operation occurs for at least a predetermined threshold time after switching to the no-gaze state. In this case, the user can again operate without looking at the screen by once more performing a touch gesture operation determined as the no-gaze state, such as a long press, guided by the voice announcement of the notification unit 15.
  • In the no-gaze state, the selection value may also be increased by rotating two touching points clockwise, like turning a dial, and decreased by rotating them counterclockwise.
  • When the two-point tap is used as the confirmation operation, the confirmation and the rotation operation may be confused; therefore, for example, a tap occurring twice within a predetermined time (a double tap) may be used as the selection value confirmation operation instead.
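  • A double-tap check of this kind is only a few lines (Python); the 0.4-second window is an assumed value for the "predetermined time".

```python
def is_double_tap(tap_times, window_sec=0.4):
    """Return True when the last two taps fall within the window.
    `tap_times` is a list of tap timestamps in seconds, oldest first."""
    return len(tap_times) >= 2 and tap_times[-1] - tap_times[-2] <= window_sec
```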
  • Embodiment 2. 2-1 Configuration
  • In the first embodiment, touch gesture operation in the no-gaze operation mode is limited to simple touch gesture operations that use the entire screen of the touch panel 20 as the operation effective area.
  • In the second embodiment, an apparatus and method are described that prevent unintended operations caused by not looking at the screen and that enable more complicated touch gesture operations in the no-gaze operation mode.
  • FIG. 6 is a functional block diagram showing a schematic configuration of the touch panel input device 2 according to Embodiment 2 of the present invention.
  • the touch panel input device 2 includes a touch panel 20 and a touch gesture determination device 10a.
  • the touch gesture determination device 10a is a device that can execute the touch gesture determination method according to the second embodiment and the touch gesture determination program according to the second embodiment.
  • The touch gesture determination device 10a according to the second embodiment differs from the touch gesture determination device 10 according to the first embodiment in that it includes a gesture operation storage unit 17 and a gesture correction unit 18. Except for this point, the second embodiment is the same as the first embodiment, so the differences are mainly described below.
  • the gesture operation storage unit 17 stores the history information of the touch gesture operation recognized by the operation determination unit 14.
  • the gesture correction unit 18 corrects the touch gesture operation recognition result determined by the operation determination unit 14 from the touch gesture operation data stored in the gesture operation storage unit 17 and determines the selection value A8.
  • the H / W configuration of the touch panel input device 2 includes a touch panel 20, a processor 31, a memory 32, and a speaker 33, as shown in FIG.
  • The touch gesture determination device 10a shown in FIG. 6 can be realized (for example, by a computer) using the memory 32 as a storage device that stores a touch gesture determination program as software and the processor 31 as an information processing unit that executes the touch gesture determination program stored in the memory 32.
  • the components 11 to 18 in FIG. 6 correspond to the processor 31 that executes the touch gesture determination program in FIG.
  • part of the touch gesture determination device 10a illustrated in FIG. 6 may be realized by the memory 32 illustrated in FIG. 2 and the processor 31 that executes the touch gesture determination program.
  • FIG. 7 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination apparatus 10a in the touch panel input device 2 according to the second embodiment.
  • In FIG. 7, processing steps that are the same as those shown in FIG. 4 are given the same step numbers.
  • The operation of the touch gesture determination device 10a according to the second embodiment differs from the operation of the touch gesture determination device 10 in the first embodiment in that, after the touch gesture operation determination step ST109, there are a processing step for storing the touch gesture operation (step ST201) and a gesture correction processing step (step ST202). Except for these points, the operation of the touch gesture determination device 10a in the second embodiment is the same as that in the first embodiment.
  • In step ST201, the gesture operation storage unit 17 stores the touch gesture operation recognized by the operation determination unit 14 as history information.
  • An example of the history information of the touch gesture operation is shown in Table 2.
  • FIG. 8 is a diagram illustrating a method for acquiring the history information of the touch gesture operation in the touch panel input device 2 according to the second embodiment.
  • the touch gesture operation history information shown in Table 2 is calculated by, for example, the method shown in FIG.
  • The amount of movement in the rotation direction, that is, the angle change amount H [pixel], is the length of the perpendicular Q2 dropped from the point P3 touched by the current touch gesture operation onto the straight line Q1 connecting the two points P1 and P2 touched by the previous touch gesture operation.
  • The distance change amount D [pixel] is the amount of change between the current two-point distance L2 (the distance between P1 and P3) and the previous two-point distance L1 (the distance between P1 and P2).
  • The movement amount M [pixel] is the amount of movement between I2, the current center position between the two points P1 and P3, and I1, the previous center position between the two points P1 and P2.
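  • Read literally, the construction of FIG. 8 gives the following computation (a Python sketch; the sign conventions, for example whether D is signed, are not stated in the text and are assumptions).

```python
import math

def gesture_history_deltas(p1, p2, p3):
    """Compute the FIG. 8 history quantities (all in pixels):
    H: rotation-direction movement = length of the perpendicular Q2 from
       the current point P3 to the line Q1 through the previous points
       P1 and P2;
    D: distance change = |P1-P3| - |P1-P2| (current minus previous);
    M: movement amount = distance between the midpoint of (P1, P3) and
       the midpoint of (P1, P2).
    P1 and P2 are assumed to be distinct points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3

    # H: point-to-line distance via the cross-product formula.
    q1_len = math.hypot(x2 - x1, y2 - y1)  # this also equals L1
    h = abs((x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)) / q1_len

    # D: change of the two-point distance (L2 - L1).
    d = math.hypot(x3 - x1, y3 - y1) - q1_len

    # M: movement of the center position between the two points.
    i1 = ((x1 + x2) / 2, (y1 + y2) / 2)
    i2 = ((x1 + x3) / 2, (y1 + y3) / 2)
    m = math.hypot(i2[0] - i1[0], i2[1] - i1[1])

    return h, d, m
```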
  • In step ST202, the gesture correction unit 18 corrects the recognition result when it finds, from the history information stored in the gesture operation storage unit 17, that a touch gesture operation performed without looking at the screen did not input the intended operation information to the touch gesture determination device 10a.
  • FIG. 9 is a diagram illustrating an example of a touch gesture operation in which operation information as intended is not input to the touch gesture determination device 10a when a touch gesture operation is performed without looking at the screen of the touch panel 20 in the touch panel input device 2.
  • As an example, a touch gesture operation can be given in which the selection value is increased by repeatedly drawing a circle with a finger clockwise (a first rotation direction) on the screen of the touch panel 20, and decreased by repeatedly drawing a circle with a finger counterclockwise (a second rotation direction opposite to the first).
  • The gesture correction unit 18 estimates the shape of the circle by a fitting method such as the least squares method, using the touch trajectory information accumulated in the gesture operation storage unit 17 over the period from the present back to a past point a predetermined time earlier, and updates the position of the circle. After updating the position of the circle, the gesture correction unit 18 calculates the angle change amount as the angle formed by the previous finger contact point, the center position of the circle, and the current finger contact point. In this way, as shown in FIG. 9, a touch gesture operation of repeatedly drawing a circle can be realized without looking at the screen of the touch panel 20.
  • When the fitted shape is an ellipse, the gesture correction unit 18 may recalculate the coordinate locus, correcting the ellipse into a circle by changing its vertical and horizontal scales, and then calculate the angle formed by the previous contact point on the circle, the center position of the circle, and the current finger contact point.
  • The operation determination unit 14 then determines the circle-drawing touch gesture operation. The initial position of the circle may be the center position of the screen of the touch panel 20, or may be obtained from the locus over a predetermined time by the least squares method or the like.
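  • The patent names only "a fitting method such as the least squares method"; the algebraic (Kasa) least-squares circle fit below is one common concrete choice, shown here as a sketch together with the angle-change computation.

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit to a touch trajectory.
    Solves x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense;
    the center is (-a/2, -b/2). Needs >= 3 non-collinear points."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    rhs = -(pts[:, 0] ** 2 + pts[:, 1] ** 2)
    (a, b, _c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return (-a / 2.0, -b / 2.0)

def angle_change(center, prev_point, cur_point):
    """Signed angle at the circle center between the previous and the
    current finger contact point, wrapped to [-pi, pi)."""
    cx, cy = center
    a0 = np.arctan2(prev_point[1] - cy, prev_point[0] - cx)
    a1 = np.arctan2(cur_point[1] - cy, cur_point[0] - cx)
    return (a1 - a0 + np.pi) % (2 * np.pi) - np.pi
```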
  • FIG. 10 is a diagram illustrating an example of a touch gesture operation performed without looking at the screen of the touch panel 20 in the touch panel input device 2 according to the second embodiment.
  • The operation that triggers the operation determination unit 14 to start determining a circle-drawing touch gesture operation may be a decrease in the number of contact points after a plurality of contact points have been touched with a plurality of fingertips, as shown on the left side of FIG. 10. For example, as shown in FIG. 10, when the number of touch points becomes one after a three-point touch, the operation determination unit 14 starts determining a circle-drawing touch gesture operation centered on the center of gravity of the three points.
  • Thereafter, the gesture correction unit 18 may correct the rotation amount by updating the position of the circle.
  • FIG. 11 is a diagram illustrating a scene in which the orientation, angle, and enlargement/reduction amount of a camera are adjusted without looking at the touch panel at hand, in the touch panel input device 2 according to Embodiment 2.
  • the camera video display device 401 displays video shot by the camera.
  • the notification unit 15 superimposes and displays the current camera operation status (direction, enlargement ratio, rotation angle) on the camera video display device 401.
  • The hand operation screen 402 passes touch information to the operation information input unit 11, switches the screen display between the gaze state and the no-gaze state, and changes the display content of the screen according to the touch gesture operation determined by the operation determination unit 14.
  • The screen 403 is the screen displayed when the line-of-sight presence determination unit 12 determines the gaze state; on it, a touch gesture operation of displaying a manual and switching the screen by tapping a hyperlink is possible.
  • The screen 404 is the screen displayed when the gaze presence/absence determination unit 12 determines the no-gaze state.
  • The line-of-sight presence determination unit 12 determines the no-gaze state from a touch operation of three or more points, and determines that the gaze state exists when all fingers have been released and a predetermined time has elapsed.
  • The operation determination unit 14 determines whether an enlargement/reduction, movement, or rotation touch gesture operation occurs after the three-or-more-point touch operation, and, according to the amount of each operation, displays an image in which the orientation or the enlargement/reduction amount of the camera is changed, such as the enlarged/reduced camera image 407, the moved camera image 408, or the rotated camera image 409.
  • Another example of a touch gesture operation that does not work as intended when performed without looking at the screen is an operation of pressing the touch panel strongly.
  • When the gaze presence/absence determination unit determines the no-gaze state by a long press, a tendency to press the finger strongly appears when the user long-presses without looking at the screen.
  • Based on the history information of the touch gesture operation shown in Table 2, the gesture correction unit 18 judges such a press from the fact that the movement amount of the finger position does not change by more than a certain value, or that the movement amount falls below a certain value in spite of the pressing pressure.
  • the correction processing of the gesture correction unit 18 can be applied not only to a long press but also to a tap or a two-point operation.
  • For example, the gesture correction unit 18 obtains the total movement amount of three or more contact points, the total change amount of the distance between two contact points, and the total change amount in the rotation direction, over the period from the present back to a past point a predetermined time earlier, as accumulated in the gesture operation storage unit 17, and outputs as the intended operation the operation corresponding to the largest of these totals. In this way, the operation proceeds as the user intended.
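  • A sketch of this largest-total selection follows (Python). Comparing the three raw pixel totals directly, without per-quantity scaling, is an assumption; the patent does not say how the totals are made commensurable.

```python
def intended_operation(history):
    """Pick the intended gesture from accumulated change amounts.

    `history` is a list of (h, d, m) tuples per sample over the recent
    window (see FIG. 8): h = rotation-direction change, d = two-point
    distance change, m = movement amount. The operation whose total is
    largest wins."""
    totals = {
        'rotation': sum(abs(h) for h, _, _ in history),
        'enlarge/reduce': sum(abs(d) for _, d, _ in history),
        'move': sum(abs(m) for _, _, m in history),
    }
    return max(totals, key=totals.get)   # e.g. 'move'
```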
  • 2-3 Effects. As described above, in the second embodiment, by accumulating and correcting the history of gestures recognized by the operation determination unit 14, unintended operations caused by operating without looking at the screen are prevented, and even more complicated touch gesture operations can be performed without looking at the screen.
  • Embodiment 3. 3-1 Configuration
  • In the first and second embodiments, the gaze presence/absence determination unit 12 performs the gaze presence/absence determination (operation mode determination) based on the operation information A0 from the touch gesture operation.
  • In that case, an operation used for the gaze presence/absence determination cannot also be used as a touch gesture operation for actual information input. Therefore, in the third embodiment, instead of performing the gaze presence/absence determination (operation mode determination) from the operation information A0 based on the touch gesture operation, a camera serving as an imaging device that captures the user's face is provided, and the gaze presence/absence determination is performed from the camera image of the user's face.
• FIG. 12 is a functional block diagram showing a schematic configuration of the touch panel input device 3 according to the third embodiment. In FIG. 12, components that are the same as or correspond to those shown in FIG. 1 are given the same reference numerals as in FIG. 1.
• The touch panel input device 3 includes a touch panel 20 and a touch gesture determination device 10b.
• The touch gesture determination device 10b is a device that can execute the touch gesture determination method according to the third embodiment and the touch gesture determination program according to the third embodiment.
• The touch gesture determination device 10b differs from the touch gesture determination device 10 of the first embodiment in that the line-of-sight presence determination unit 12b performs the line-of-sight presence determination (operation mode determination) from the camera image of the user's face acquired from the camera 34 through the camera image input unit 19.
• Except for this point, the third embodiment is the same as the first embodiment.
• The camera image input unit 19 and the line-of-sight presence determination unit 12b of the third embodiment can also be applied to the touch panel input device 2 according to the second embodiment.
• The camera image input unit 19 receives a camera image (image data) acquired by the camera 34 photographing the user's face.
• The line-of-sight presence determination unit 12b, serving as the operation mode determination unit, receives the camera image of the user's face from the camera image input unit 19, performs image processing that extracts image data of the entire face and of the eyes, and detects the orientation of the face and the direction of the line of sight. If the eyes face the screen of the touch panel 20, it determines that the operation mode is the line-of-sight-present state; if the eyes do not face the screen of the touch panel 20, it determines that the operation mode is the no-line-of-sight state.
• FIG. 13 is a diagram illustrating an example of the H/W configuration of the touch panel input device 3 according to the third embodiment.
• The H/W configuration of the touch panel input device 3 according to the third embodiment differs from that of the touch panel input device 1 according to the first embodiment in that the camera 34 is provided and in the touch gesture determination program stored in the memory 32b. Except for these points, the H/W configuration of the third embodiment is the same as that of the first embodiment.
• The camera 34 stores the camera image (image data) of the user's face acquired by shooting in the memory 32b serving as a storage unit; the processor 31 obtains the camera image from the memory 32b in the processing of the camera image input unit 19, and the line-of-sight presence determination unit 12b performs image processing on it.
• FIG. 14 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device 10b in the touch panel input device 3 according to the third embodiment.
• In FIG. 14, steps that are the same as those shown in FIG. 4 are given the same step numbers.
• The operation of the touch gesture determination device 10b according to the third embodiment differs from that of the touch gesture determination device 10 in the first embodiment in that the line-of-sight presence determination unit 12b performs the line-of-sight presence determination from the camera image of the user's face acquired through the camera image input unit 19 (steps ST301 and ST302), and in that the operation information A0 from the touch gesture operation is acquired after the line-of-sight presence determination based on the camera image (step ST303). Except for these points, the operation of the touch panel input device 3 according to the third embodiment is the same as that of the first embodiment. Therefore, the operations that differ from those of the first embodiment are described below; the overall flow is sketched immediately below.
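• A compact Python sketch of this flow (the objects and methods stand in for the units of this embodiment and are hypothetical, not an API defined in this disclosure):

```python
def determination_loop(camera, gaze_unit, touch_panel, operation_unit):
    """Sketch of the FIG. 14 flow with hypothetical stand-in objects."""
    while True:
        frame = camera.read()                  # ST301: camera image input
        gaze_present = gaze_unit.judge(frame)  # ST302: line-of-sight judgment
        op_info = touch_panel.poll()           # ST303: touch operation info
        if op_info is not None:
            # dispatch to display-component or full-screen determination
            operation_unit.dispatch(op_info, gaze_present)
```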
• In step ST301, the camera image input unit 19 receives the image data of the user's face from the camera 34 and stores it in the memory 32b.
• In step ST302, the line-of-sight presence determination unit 12b receives the image data of the user's face stored in the memory 32b by the camera image input unit 19, as shown in FIG. 15A, and performs image processing that extracts the face orientation and the line of sight. If the line of sight is directed toward the screen of the touch panel 20, as shown on the left side of FIG. 15, it determines that the line-of-sight state is present; if the line of sight does not face the direction toward the screen of the touch panel 20, as illustrated on the right side of FIG. 15, it determines that the line-of-sight state is absent.
• The orientation of the face can be detected by, for example, a cascade classifier based on Haar features; the line of sight can be detected by, for example, extracting the eye region as a facial part, applying an edge filter to the eye image, extracting the outer contour of the iris, and fitting an ellipse to it to estimate the position of the pupil.
• The face orientation and line-of-sight detection methods are not limited to these examples; other methods may be employed. A sketch using stock OpenCV components follows.
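• The following Python sketch approximates the example above using OpenCV 4 stock Haar cascades, an edge filter, and ellipse fitting; the thresholds and the pupil-centering heuristic are illustrative assumptions, not the method fixed by this disclosure:

```python
import cv2

# stock cascade files shipped with the opencv-python distribution
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def gaze_toward_screen(frame) -> bool:
    """Rough sketch: require a frontal face, then a pupil-like ellipse
    roughly centred in the detected eye region."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return False               # no frontal face: not facing the panel
    x, y, w, h = faces[0]
    face_roi = gray[y:y + h, x:x + w]
    eyes = eye_cascade.detectMultiScale(face_roi)
    if len(eyes) == 0:
        return False
    ex, ey, ew, eh = eyes[0]
    eye_img = face_roi[ey:ey + eh, ex:ex + ew]
    edges = cv2.Canny(eye_img, 50, 150)       # edge filter on the eye image
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if len(c) >= 5:                       # fitEllipse needs >= 5 points
            (cx, _cy), _axes, _angle = cv2.fitEllipse(c)  # iris/pupil fit
            if abs(cx - ew / 2) < ew * 0.2:   # pupil near the eye centre
                return True                   # line of sight toward screen
    return False
```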
• ≪3-3≫ Effects
• As described above, in the third embodiment, the camera image of the camera 34 is used to determine whether or not the user's line of sight is directed to the screen of the touch panel 20, so the line-of-sight presence determination (operation mode determination) can be performed easily and accurately even when the user performs no touch gesture operation for switching the operation mode.
• Further, since the line-of-sight presence determination is performed based on the camera image, no touch gesture operation needs to be reserved for it, and thus no operation is restricted in its use for touch gesture input.
• When the touch panel 20 is viewed (when the line-of-sight state is present), a touch gesture operation can be performed by button operations using normal display components.
• When the touch panel 20 is not viewed (when the line-of-sight state is absent), for example while inspecting equipment or while watching a large-screen display device installed at a distant position, the entire screen of the touch panel 20 can be set as an operation-effective area that accepts operations. For this reason, when the touch panel 20 is not viewed, the portable terminal device can be used as an operation device for equipment inspection or for a large-screen display device installed at a distance, as in the sketch below.
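• A minimal Python sketch of this mode-dependent effective area (all names are hypothetical): with the line of sight present, only the display components respond; without it, the whole screen feeds one full-screen gesture handler:

```python
def handle_touch(x, y, gaze_present, buttons, full_screen_handler):
    """Route a touch according to the operation mode (hypothetical names)."""
    if gaze_present:
        # line-of-sight mode: normal button operation on display components
        for btn in buttons:
            bx, by, bw, bh = btn["rect"]
            if bx <= x < bx + bw and by <= y < by + bh:
                return btn["action"]()
        return None  # touches outside any display component are ignored
    # no-line-of-sight mode: the entire screen is the operation-effective
    # area, so every touch goes to the full-screen gesture handler
    return full_screen_handler(x, y)
```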
  • Embodiments 1 to 3 above are examples of the present invention, and various modifications are possible within the scope of the present invention.

Abstract

A touch panel input device (1) includes: a line-of-sight presence determination unit (12) that determines whether a touch gesture operation performed by a user takes place within or outside the user's line of sight, and outputs line-of-sight determination information (A2); an operation mode switching unit (13) that outputs instruction information (A3) according to the line-of-sight determination information (A2); an operation determination unit (14) that generates a selected value based on operation information (A0), using a determination method decided according to the instruction information (A3); and a display control unit (16) that causes a touch panel (20) to display an image according to the instruction information (A3). The operation determination unit (14) includes: a display component gesture determination unit (141) that determines a selected value from operation information (A0) generated according to a touch gesture operation performed by the user, within the user's line of sight, on a display component displayed as an operation image; and a full-screen gesture determination unit (142) that determines a selected value from operation information (A0) generated according to a touch gesture operation performed by the user, outside the user's line of sight, on the entire screen.
PCT/JP2016/088610 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program WO2018122891A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2017522564A JP6177482B1 (ja) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program
PCT/JP2016/088610 WO2018122891A1 (fr) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program
KR1020197017714A KR102254794B1 (ko) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program
CN201680091771.3A CN110114749B (zh) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and recording medium
DE112016007545.6T DE112016007545T5 (de) 2016-12-26 2016-12-26 Touch panel input device, touch gesture assessment device, touch gesture assessment method, and touch gesture assessment program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/088610 WO2018122891A1 (fr) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program

Publications (1)

Publication Number Publication Date
WO2018122891A1 (fr)

Family

ID=59559217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/088610 WO2018122891A1 (fr) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program

Country Status (5)

Country Link
JP (1) JP6177482B1 (fr)
KR (1) KR102254794B1 (fr)
CN (1) CN110114749B (fr)
DE (1) DE112016007545T5 (fr)
WO (1) WO2018122891A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021240671A1 (fr) * 2020-05-27 2021-12-02 三菱電機株式会社 Gesture detection device and gesture detection method
WO2021240668A1 (fr) * 2020-05-27 2021-12-02 三菱電機株式会社 Gesture detection device and gesture detection method

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112019007585B4 (de) * 2019-09-04 2023-10-12 Mitsubishi Electric Corporation Touch panel device, operation identification method, and operation identification program
CN111352529B (zh) * 2020-02-20 2022-11-08 Oppo(重庆)智能科技有限公司 Touch event reporting method and apparatus, terminal, and storage medium
CN113495620A (zh) * 2020-04-03 2021-10-12 百度在线网络技术(北京)有限公司 Interaction mode switching method and apparatus, electronic device, and storage medium
JP7014874B1 (ja) 2020-09-24 2022-02-01 レノボ・シンガポール・プライベート・リミテッド Information processing apparatus and information processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000231446A (ja) * 1999-02-10 2000-08-22 Sharp Corp Display-integrated tablet device and storage medium storing a tablet automatic correction program
JP2003240560A (ja) * 2002-02-13 2003-08-27 Alpine Electronics Inc Screen control device using line of sight
JP2006017478A (ja) * 2004-06-30 2006-01-19 Xanavi Informatics Corp Navigation device
JP2007302223A (ja) * 2006-04-12 2007-11-22 Hitachi Ltd Non-contact input operation device for in-vehicle apparatus
JP2008024070A (ja) * 2006-07-19 2008-02-07 Xanavi Informatics Corp In-vehicle information terminal
JP2008195142A (ja) * 2007-02-09 2008-08-28 Aisin Aw Co Ltd Operation support device and operation support method for in-vehicle equipment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101055193A (zh) * 2006-04-12 2007-10-17 株式会社日立制作所 Non-contact input operation device for in-vehicle apparatus
KR101252169B1 (ko) * 2011-05-27 2013-04-05 엘지전자 주식회사 Mobile terminal and operation control method thereof
US20140002376A1 (en) * 2012-06-29 2014-01-02 Immersion Corporation Method and apparatus for providing shortcut touch gestures with haptic feedback
KR102268045B1 (ko) * 2014-06-27 2021-06-22 엘지전자 주식회사 Apparatus and method for providing product information of a product displayed in a show window
JP6385173B2 (ja) 2014-07-15 2018-09-05 三菱電機株式会社 Elevator touch-panel destination floor registration operation panel and user determination method for the same
JP6214779B2 (ja) 2014-09-05 2017-10-18 三菱電機株式会社 In-vehicle device control system
KR102216944B1 (ko) 2014-09-23 2021-02-18 대원정밀공업(주) Pumping device
KR102073222B1 (ko) * 2014-12-18 2020-02-04 한국과학기술원 User terminal and haptic service providing method thereof


Also Published As

Publication number Publication date
DE112016007545T5 (de) 2019-09-19
KR102254794B1 (ko) 2021-05-21
CN110114749B (zh) 2022-02-25
KR20190087510A (ko) 2019-07-24
JPWO2018122891A1 (ja) 2018-12-27
JP6177482B1 (ja) 2017-08-09
CN110114749A (zh) 2019-08-09

Similar Documents

Publication Publication Date Title
JP6177482B1 (ja) Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program
US9111076B2 (en) Mobile terminal and control method thereof
JP4275151B2 (ja) Red-eye correction method and apparatus using a user-adjustable threshold
CN104932809B (zh) Apparatus and method for controlling a display panel
US20090262187A1 (en) Input device
KR20150090840A (ko) Device and method for protecting an area of a display screen
TWI658396B (zh) Interface control method and electronic device
JP5222967B2 (ja) Mobile terminal
JPWO2013141161A1 (ja) Information terminal, input reception control method, and input reception control program
JP2007172303A (ja) Information input system
CN107450820B (zh) Interface control method and mobile terminal
US11354031B2 (en) Electronic apparatus, computer-readable non-transitory recording medium, and display control method for controlling a scroll speed of a display screen
US10802620B2 (en) Information processing apparatus and information processing method
JP2015176311A (ja) Terminal and control method
JP2015118507A (ja) Object selection method, apparatus, and computer program
JP6329373B2 (ja) Electronic device and program for controlling the electronic device
WO2017215211A1 (fr) Image display method based on a smart terminal having a touch screen, and electronic apparatus
JP6616379B2 (ja) Electronic device
WO2019148904A1 (fr) Method for scaling the screen of smart glasses, and smart glasses
JP2020017218A (ja) Electronic device, control program, and display control method
WO2018123701A1 (fr) Electronic device, control method therefor, and program
JP6686885B2 (ja) Information processing apparatus, information processing method, and program
JP7179334B2 (ja) Gesture recognition device and program for gesture recognition device
US20210278899A1 (en) Display control method, display control system and wearable device
JP2017102676A (ja) Portable terminal device, operation device, information processing method, and program

Legal Events

Date Code Title Description
ENP Entry into the national phase: Ref document number 2017522564; Country of ref document: JP; Kind code of ref document: A
121 Ep: the epo has been informed by wipo that ep was designated in this application: Ref document number 16925576; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase: Ref document number 20197017714; Country of ref document: KR; Kind code of ref document: A
122 Ep: pct application non-entry in european phase: Ref document number 16925576; Country of ref document: EP; Kind code of ref document: A1