WO2018122891A1 - Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program - Google Patents
- Publication number
- WO2018122891A1 (application PCT/JP2016/088610)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch panel
- sight
- line
- determination
- screen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/0969—Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
Definitions
- The present invention relates to a touch panel input device that receives a touch gesture operation, accepts operation information corresponding to the touch gesture operation, and outputs a signal based on the input operation information, and to a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program.
- A user of a touch panel input device ordinarily performs a touch gesture operation while looking at a GUI (Graphical User Interface) screen displayed on the touch panel; in a state where the user cannot look at the screen, such GUI-based touch gesture operation cannot be performed.
- Patent Document 1 proposes a device that determines an operation part (hereinafter referred to as a "display component") displayed on a touch panel, based on the similarity between a shape formed by connecting the contact points of a plurality of fingers touching the touch panel and a preset shape.
- Patent Document 2 proposes an apparatus that, when the area of a region defined by the positions of a plurality of fingers in contact with the touch panel is equal to or larger than a preset threshold, determines that the user (for example, a visually impaired person) is performing the touch gesture operation in a "no gaze state," that is, without directing his or her line of sight toward the screen of the touch panel.
- An object of the present invention is to provide a touch panel input device that enables input by touch gesture operation to be performed easily and accurately by appropriately switching between the contents of an operation mode for the gaze state, in which touch gesture operations are performed using display components displayed on the touch panel, and the contents of an operation mode for the no-gaze state, in which touch gesture operations are performed using the entire screen of the touch panel as the reception area, and to provide a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program for the same purpose.
- A touch panel input device according to the present invention includes a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the touch gesture operation, and a touch gesture determination device that receives the operation information and generates a selection value based on it.
- The touch gesture determination device determines whether the touch gesture operation is an operation in the gaze state, in which the user's line of sight is directed at the screen, or an operation in the no-gaze state, in which the user's line of sight is not directed at the screen.
- It includes a line-of-sight presence determination unit that outputs line-of-sight determination information indicating the result of this determination, an operation mode switching unit that outputs command information based on the line-of-sight determination information, an operation determination unit that generates the selection value from the operation information using a determination method according to the command information, and a display control unit that causes the touch panel to display an image according to the command information as the operation image.
- The operation determination unit includes a display component gesture determination unit that, in the gaze state, determines the selection value from operation information based on a touch gesture operation performed on a display component displayed as the operation image on the touch panel, and an entire screen gesture determination unit that, in the no-gaze state, determines the selection value from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
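The division of labor between the two determination units can be sketched as follows. This is only an illustration of the idea, not the patented implementation: the function name, the button layout, and the left/right half-screen split are all our own assumptions.

```python
SCREEN_WIDTH = 800  # assumed screen width in pixels


def determine_selection(gaze_present, touch_point, buttons):
    """Return a selection value from one touch point.

    buttons maps a selection value to its rectangle (left, top, right, bottom).
    """
    x, y = touch_point
    if gaze_present:
        # Display component gesture determination: hit-test the displayed buttons.
        for value, (left, top, right, bottom) in buttons.items():
            if left <= x <= right and top <= y <= bottom:
                return value
        return None  # touched outside every display component
    # Entire screen gesture determination: the whole screen is the reception
    # area, so even a coarse touch selects something (here: left/right half).
    return "left" if x < SCREEN_WIDTH / 2 else "right"
```

In the gaze state a touch outside every button selects nothing, while in the no-gaze state any touch anywhere on the screen maps to a selection, which is the point of widening the reception area.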
- A touch gesture determination method according to the present invention generates a selection value from operation information output by a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to that operation.
- The method includes a gaze presence/absence determination step of outputting gaze determination information indicating whether the touch gesture operation is an operation in the gaze state or an operation in the no-gaze state, an operation determination step of generating the selection value from the operation information using a determination method according to command information based on the gaze determination information, and a display control step of causing the touch panel to display an image according to the command information as the operation image.
- In the gaze state, the selection value is determined from operation information based on a touch gesture operation performed on a display component.
- According to the present invention, input by touch gesture operation can be performed easily and accurately by appropriately switching between the contents of the operation mode for the gaze state, in which touch gesture operations use the display components displayed on the touch panel, and the contents of the operation mode for the no-gaze state, in which the entire screen of the touch panel serves as the reception area for touch gesture operations.
- FIG. 2 is a diagram illustrating an example of a hardware configuration of the touch panel input device according to Embodiment 1.
- FIGS. 3(a) and 3(b) are diagrams showing an example of the screen of the touch panel in the touch panel input device according to Embodiment 1.
- FIG. 6 is a flowchart showing an operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 1.
- A diagram shows an example of the screen of the touch panel in a touch panel input device according to a modification of Embodiment 1.
- A flowchart illustrates an operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 2.
- A diagram illustrates a method for acquiring history information of touch gesture operations in the touch panel input device according to Embodiment 2.
- A diagram shows an example in which an input cannot be performed as intended when a touch gesture operation is performed in the no-gaze state, in which the user does not look at the screen of the touch panel.
- A diagram illustrates an example of a touch gesture operation performed in the no-gaze state in the touch panel input device according to Embodiment 2.
- FIG. 10 illustrates an example of a hardware configuration of a touch panel input device according to Embodiment 3.
- A flowchart illustrates an operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 3.
- Diagrams (a) and (b) show a method of determining the gaze state and the no-gaze state in the touch panel input device according to Embodiment 3.
- the touch panel input device includes a touch panel having a touch operation screen (operation screen) and a touch gesture determination device that receives operation information on the touch panel.
- The touch panel input device is mounted on, or communicably connected to, a target device. The present invention can be applied to, for example, the operation screen of an electric appliance, a camera, or factory equipment as the target device, an operation screen mounted on a car, a ship, an aircraft, or the like, and the operation screen of a portable information terminal such as a smartphone or a tablet terminal.
- The touch panel input device can provide a signal (for example, a selection value) based on operation information input by a touch gesture operation (also referred to as a "touch operation") on the operation screen of the touch panel to the target device on which the touch panel input device is mounted or with which it can communicate.
- the touch panel is a touch gesture input unit that accepts a touch gesture operation performed by the user.
- The touch gesture operation is an information input operation performed by a specific movement of the user's finger(s), palm, or both.
- The touch gesture operation can include a tap, which is an operation of lightly striking the operation screen of the touch panel with a finger; a flick, which is an operation of flicking the operation screen with a finger; and a swipe, which is an operation of sliding a finger across the operation screen.
- The touch gesture operation can also include a drag, which is an operation of moving a display component on the touch panel with a finger; a pinch-in, which is an operation of narrowing the interval between multiple fingers pinching on the operation screen; and a pinch-out, which is an operation of widening the interval between multiple fingers on the operation screen.
- the touch gesture operation can include an operation using a touch pen which is a pen-type input auxiliary tool.
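The single-finger gestures listed above (tap, flick, swipe) can be told apart from the distance and duration of a stroke. The sketch below is illustrative only; the threshold values are invented placeholders, and a real device would tune them (pinch-in/pinch-out would additionally require two contact points).

```python
import math

TAP_MAX_MOVE = 10.0      # px: a tap barely moves
TAP_MAX_TIME = 0.3       # s:  a tap releases quickly
FLICK_MIN_SPEED = 800.0  # px/s: a flick is a fast, short slide


def classify(start, end, duration):
    """Classify one single-finger stroke as 'tap', 'flick', or 'swipe'."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    dist = math.hypot(dx, dy)
    if dist <= TAP_MAX_MOVE and duration <= TAP_MAX_TIME:
        return "tap"
    if dist / duration >= FLICK_MIN_SPEED:
        return "flick"
    return "swipe"
```

For example, a 200 px stroke completed in 0.1 s (2000 px/s) classifies as a flick, while the same stroke over a full second classifies as a swipe.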
- FIG. 1 is a functional block diagram showing a schematic configuration of a touch panel input device 1 according to Embodiment 1 of the present invention.
- the touch panel input device 1 according to the first embodiment includes a touch gesture determination device 10 and a touch panel 20.
- the touch gesture determination device 10 is a device that can execute the touch gesture determination method according to the first embodiment and the touch gesture determination program according to the first embodiment.
- The touch panel 20 includes an operation panel unit 21 that receives a touch gesture operation performed by the user and outputs operation information (also referred to as "touch information") A0 corresponding to the touch gesture operation, and a display panel unit 22 that is arranged so as to overlap the operation panel unit 21 and can display an operation image such as a GUI screen.
- the display panel unit 22 is, for example, a liquid crystal display.
- The touch gesture determination device 10 includes an operation information input unit 11, a gaze presence/absence determination unit 12 as an operation mode determination unit, an operation mode switching unit 13, an operation determination unit 14, a notification unit 15, and a display control unit 16.
- the operation determination unit 14 includes a display component gesture determination unit 141 and a whole screen gesture determination unit 142.
- the operation information input unit 11 receives operation information (operation signal) A0 output from the operation panel unit 21.
- the operation information input unit 11 outputs the input information A1 corresponding to the received operation information A0 to the line-of-sight presence determination unit 12 and the operation determination unit 14.
- the input information A1 is information corresponding to the operation information A0, and may be the same information as the operation information A0.
- The line-of-sight presence determination unit 12 determines whether a touch gesture operation is an operation performed while the user is looking at the screen of the touch panel 20 (that is, an operation in the gaze state, in which the user's line of sight is directed at the screen) or an operation performed without looking at the screen (that is, an operation in the no-gaze state, in which the user's line of sight is not directed at the screen).
- The operation mode premised on the gaze state is, for example, an operation mode in which touch gesture operations are performed on display components displayed on the touch panel 20.
- The operation mode premised on the no-gaze state is, for example, an operation mode in which the entire screen of the touch panel 20 is set as one operation-effective area that accepts touch gesture operations.
- The line-of-sight presence determination unit 12 generates, from the input information A1 based on the user's touch gesture operation, line-of-sight determination information A2 indicating the result of determining whether the touch gesture operation is an operation in the gaze state or an operation in the no-gaze state, and outputs it.
- The gaze presence/absence determination unit 12 includes a storage unit (for example, the memory in FIG. 2 described later) that stores a predetermined touch gesture operation pattern, and the determination can be performed based on the degree of similarity between the touch gesture operation indicated by the input information A1 and the touch gesture operation pattern stored in the storage unit.
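One possible reading of "degree of similarity to a stored pattern" is to compare the observed trajectory against each stored pattern point by point. Everything in this sketch (equal-length trajectories, mean point distance, the threshold value) is an illustrative assumption, not the patented method.

```python
import math


def similarity(traj_a, traj_b):
    """Mean point-to-point distance between two equal-length trajectories
    (smaller means more similar)."""
    assert len(traj_a) == len(traj_b)
    return sum(math.dist(p, q) for p, q in zip(traj_a, traj_b)) / len(traj_a)


def matches_pattern(trajectory, patterns, threshold=20.0):
    """True if the trajectory is close enough to any stored gesture pattern."""
    return any(similarity(trajectory, p) <= threshold for p in patterns)
```

In practice the observed trajectory would first be resampled to the same number of points as the stored patterns; that step is omitted here for brevity.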
- Based on the line-of-sight determination information A2 received from the line-of-sight presence determination unit 12, the operation mode switching unit 13 outputs command information A3 for switching (setting) the display content of the screen of the touch panel 20, the method by which the operation determination unit 14 judges the input information A1 based on touch gesture operations on the touch panel 20, and the notification method used by the notification unit 15.
- the operation determination unit 14 receives the command information A3 from the operation mode switching unit 13, and switches the determination method of the input information A1 according to the command information A3.
- The entire screen gesture determination unit 142 determines a touch gesture operation made on the entire screen of the touch panel 20 and determines the selection value A7, which is an output signal, based on the input information A1.
- Determination of touch gesture operations on the entire screen is applied to a limited, narrow area of the touch panel 20 (a part of the screen) in the gaze state, and to a wide area (the entire screen of the touch panel 20) in the no-gaze state.
- The display component gesture determination unit 141 determines a touch gesture operation from the display component displayed as the operation image by the display panel unit 22 and from the input information A1, and determines the selection value A7, which is an output signal based on the display component and the input information A1.
- Determination of touch gesture operations on display components of the touch panel 20 is applied to a wide area (the entire screen of the touch panel 20) in the gaze state, and to a limited, narrow area (a part of the screen) in the no-gaze state.
- the display control unit 16 outputs an image signal A6 of an operation image displayed on the display panel unit 22 of the touch panel 20.
- the display control unit 16 changes the display content of the screen of the display panel unit 22 according to the operation determination information A4 received from the operation determination unit 14 and the command information A3 received from the operation mode switching unit 13.
- the notification unit 15 switches a method of notifying information to the user between a touch gesture operation in a line-of-sight state and a touch gesture operation in a state without a line of sight.
- the notification unit 15 issues a notification of notification contents according to the command information A3 or outputs a notification signal.
- the notification unit 15 notifies the status of the user's touch gesture operation by, for example, sound, screen display, vibration by a vibrator, or lamp lighting.
- the notification unit 15 changes the notification content according to the operation determination information A4 received from the operation determination unit 14 and the command information A3 received from the operation mode switching unit 13.
- When the notification by the notification unit 15 is made by sound, the notification unit 15 outputs a notification signal to a speaker (shown in FIG. 2) as an audio output unit.
- When the notification by the notification unit 15 is an image display, the notification unit 15 sends notification information A5 to the display control unit 16, and the display control unit 16 transmits an image signal based on the notification information to the display panel unit 22.
- The selection value A7 output from the operation determination unit 14 is determined by the operation determination unit 14 based on the input information A1; an application program or the like of the device on which the touch panel input device 1 is mounted performs device control and the like based on the selection value A7.
- FIG. 2 is a diagram illustrating an example of a hardware (H / W) configuration of the touch panel input device 1 according to the first embodiment.
- the touch panel input device 1 according to the first embodiment includes a touch panel 20, a processor 31, a memory 32, and a speaker 33.
- The touch gesture determination device 10 shown in FIG. 1 can be realized (for example, by a computer) using the memory 32 as a storage device that stores a touch gesture determination program as software and the processor 31 as an information processing unit that executes the touch gesture determination program stored in the memory 32.
- The components 11 to 16 in FIG. 1 correspond to the processor 31 executing the touch gesture determination program in FIG. 2.
- a part of the touch gesture determination device 10 shown in FIG. 1 can be realized by the memory 32 shown in FIG. 2 and the processor 31 that executes the touch gesture determination program.
- the speaker 33 is a sound output unit used when, for example, a touch gesture operation state performed without a line of sight is notified by a sound such as an announcement.
- the touch panel input device 1 may include an additional device such as a vibrator, a lamp, and a transmission device for wirelessly transmitting a notification signal, instead of the speaker 33 or as an additional configuration.
- FIG. 3A and 3B are diagrams showing examples of the screen of touch panel 20 and touch gesture operation in touch panel input device 1 according to Embodiment 1.
- FIG. 3A shows an example of the screen of the touch panel 20 in the operation mode with the line of sight.
- FIG. 3A is an example of a screen when the line-of-sight presence / absence determination unit 12 determines from the input information A1 indicating the touch gesture operation that the operation mode is the line-of-sight presence state.
- the user can switch display contents on the screen by performing an operation of swiping the screen left or right with a finger.
- the list 231 is a list displayed when the line-of-sight presence determination unit 12 determines that there is a line-of-sight presence state.
- the list 231 is composed of a plurality of buttons (square areas with numbers), and the list 231 can be moved in the same direction by tracing (sliding) the finger upward or downward.
- When the line-of-sight presence determination unit 12 determines that the gaze state is present, the user taps a button (touches the button with a finger and releases it within a predetermined time without moving the finger) to determine (confirm) the selected value.
- FIG. 3B shows an example of the screen of the touch panel 20 in the operation mode in a state where there is no line of sight.
- FIG. 3B is an example of a screen when the line-of-sight presence/absence determination unit 12 determines from the input information A1 indicating the touch gesture operation that the operation mode is the no-gaze state.
- The cancel area 241 in the no-gaze state is a rectangular area surrounding ">> Cancel >>"; when the user traces (slides) a finger through this rectangular area in the predetermined direction from left to right, the operation mode for the no-gaze state is canceled and the device switches to the operation mode for the gaze state.
- FIG. 4 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination apparatus 10 in the touch panel input device 1 according to the first embodiment.
- In step ST101, the operation mode switching unit 13 sets the operation mode for the gaze state as the initial operation mode.
- The operation mode switching unit 13 sets the screen of the touch panel 20 to the screen for the gaze-state operation mode, and sets the input method of the touch panel 20 to the display-component-based input method used in the gaze-state operation mode.
- the initial operation mode may be an operation mode without a line of sight.
- The operation information input unit 11 acquires the operation information A0 (coordinates indicating the positions of finger contact points, finger contact states, finger identification numbers for a plurality of contact points, and the like) from the touch panel 20 and stores it in a storage unit (for example, the memory 32 shown in FIG. 2).
- The line-of-sight presence/absence determination unit 12 determines from the history information of the operation information A0 (information on finger contact points) stored in the storage unit whether the touch gesture operation is an operation in the gaze state or an operation in the no-gaze state, that is, whether or not the line of sight is present (operation mode determination).
- The line-of-sight presence determination unit 12 determines that the no-gaze state applies when, for example, any of the following determination conditions (1) to (3) is satisfied, and otherwise determines that the gaze state applies.
- Determination condition (1): A finger touches one point on the screen of the touch panel 20, and the contact position does not move for a time equal to or longer than a predetermined threshold.
- Determination condition (2): Multiple fingers touch a plurality of contact points on the screen of the touch panel 20 and keep touching for a time equal to or longer than a predetermined threshold.
- Determination condition (3): The palm keeps touching the screen of the touch panel 20 (that is, a wide area equal to or larger than a predetermined threshold remains in contact for a time equal to or longer than a predetermined threshold).
- However, other criteria may also be used as the determination conditions.
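The three conditions can be sketched as predicates over a recent history of contact snapshots. In this sketch, `history` is assumed to be a chronological list of `{finger_id: (x, y)}` dictionaries sampled at a fixed rate, and every threshold value is an invented placeholder; in particular, "a wide contact area" is crudely approximated by a minimum number of simultaneous contact points.

```python
import math

HOLD_TIME_SAMPLES = 30  # stand-in for "a time equal to or greater than a threshold"
MOVE_TOLERANCE = 5.0    # px: "the contact position has not moved"
PALM_MIN_FINGERS = 4    # crude stand-in for "a wide area kept in contact"


def held_without_moving(history, n_fingers):
    """Conditions (1) and (2): n contact point(s) held nearly still."""
    if len(history) < HOLD_TIME_SAMPLES:
        return False
    recent = history[-HOLD_TIME_SAMPLES:]
    first = recent[0]
    if len(first) != n_fingers or any(set(snap) != set(first) for snap in recent):
        return False
    return all(
        math.dist(snap[fid], first[fid]) <= MOVE_TOLERANCE
        for snap in recent
        for fid in first
    )


def no_gaze_state(history):
    """True when any of determination conditions (1)-(3) holds."""
    cond1 = held_without_moving(history, 1)   # one still finger
    cond2 = held_without_moving(history, 2)   # multiple still fingers
    cond3 = bool(history) and len(history[-1]) >= PALM_MIN_FINGERS  # palm touch
    return cond1 or cond2 or cond3
```

A finger that drifts beyond the tolerance within the window fails conditions (1) and (2), so ordinary swipes in the gaze state do not trigger the no-gaze mode.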
- When determination condition (3) for the no-gaze state is satisfied in step ST103, the coordinates of the detected contact points and the number of contact points may be unstable (that is, may fluctuate greatly over time). For this reason, using the history information of the operation information A0 (input information A1) for the period from a predetermined time ago up to the present, it may be determined that a palm touch has occurred when the degree of coordinate blur of the contact points within this period (for example, the maximum movement amount) exceeds a predetermined threshold.
- The operation information A0 from the touch panel 20 may indicate that the contact state is abnormal during an operation in which the palm is brought into contact with the screen. For this reason, using the history information of the operation information A0 for the period from a predetermined time ago up to the present, it may be determined that a palm touch has occurred when there is a history indicating an abnormal contact state within this period.
- step ST103 it is erroneously determined that the finger has been removed from the screen of the touch panel 20 during the operation of bringing the palm into contact with the screen. May be erroneously determined as a touch gesture operation by tapping a button. For this reason, using the history information of the operation information A0 during the period from the current time to the time point that is predetermined, the time is unstable for a predetermined time after all the fingers of the touch panel 20 are separated during this period. There is a standby process to detect whether touch detection (that is, detection of a situation in which the number and position of contact points fluctuate greatly with time) is not detected, or whether there is no notification of an abnormal contact state. When these are detected, it may be determined that the palm touches.
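The palm-touch checks described above can be sketched as follows. This is a minimal illustration, not the specification's implementation; the history data shapes and the drift threshold are assumed values:

```python
# Palm-touch inference over a recent window of operation history:
# a palm touch is inferred when an abnormal contact state was reported,
# when the number of contact points fluctuates, or when contact-point
# coordinates drift more than a threshold. All values are illustrative.

MAX_COORD_DRIFT = 40   # pixels (assumed threshold)


def is_palm_touch(history):
    """history: list of samples, each a dict with
    'points': list of (x, y) contact coordinates, and
    'abnormal': bool abnormal-contact flag from the touch panel."""
    if any(sample['abnormal'] for sample in history):
        return True
    counts = {len(sample['points']) for sample in history}
    if len(counts) > 1:          # number of contact points is unstable
        return True
    # maximum coordinate movement of the first contact point in the window
    xs = [s['points'][0][0] for s in history if s['points']]
    ys = [s['points'][0][1] for s in history if s['points']]
    if xs and max(xs) - min(xs) > MAX_COORD_DRIFT:
        return True
    if ys and max(ys) - min(ys) > MAX_COORD_DRIFT:
        return True
    return False
```
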
- In step ST103, the line-of-sight presence/absence determination unit 12 determines that the line of sight is present when the finger is moved as instructed on the screen. For example, on the screen in the no-line-of-sight state shown in FIG. 3B, when a touch is started inside the cancel area 241 and the finger is slid from the left mark “>>” past the characters “Cancel” to the right mark “>>”, the line-of-sight presence/absence determination unit 12 determines that the operation is a switching operation to the line-of-sight presence state. Note that, in the determination of the switching operation in step ST103, the operation may be determined not to be a switching operation when the trace is continued from outside the cancel area 241 into the cancel area 241, when the trace protrudes above or below the cancel area 241, or when the cancel area 241 is traced in the reverse direction. Furthermore, the condition under which the line-of-sight presence/absence determination unit 12 determines that the operation is a switching operation to the line-of-sight presence state is not limited to the case where the finger is slid from the left mark “>>” in FIG. 3B past the characters “Cancel” to the right mark “>>”; it may also be the case where, in the no-line-of-sight state, the cancel area 241 is touched and the finger is moved rightward by a predetermined distance or more.
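The switching-gesture check described above can be sketched as follows; the cancel-area bounds, the required slide distance, and the trajectory format are illustrative assumptions, not values from the specification:

```python
# A touch that starts inside the cancel area, stays inside it, and
# slides rightward by at least a threshold distance is treated as a
# switch to the line-of-sight presence state.

CANCEL_AREA = (0, 400, 480, 80)   # x, y, width, height (assumed layout)
MIN_RIGHT_SLIDE = 100             # required rightward travel in pixels


def inside_cancel_area(x, y):
    ax, ay, w, h = CANCEL_AREA
    return ax <= x <= ax + w and ay <= y <= ay + h


def is_switch_to_gaze_operation(trajectory):
    """trajectory: list of (x, y) contact coordinates in time order."""
    if len(trajectory) < 2:
        return False
    x0, y0 = trajectory[0]
    # The trace must start inside the cancel area ...
    if not inside_cancel_area(x0, y0):
        return False
    # ... stay inside it (a trace leaving above/below is rejected) ...
    if not all(inside_cancel_area(x, y) for x, y in trajectory):
        return False
    # ... and move rightward by at least the threshold distance.
    return trajectory[-1][0] - x0 >= MIN_RIGHT_SLIDE
```
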
- In step ST104, when the line-of-sight presence/absence determination has changed (in the case of YES), the process proceeds to steps ST105 to ST108, and the operation mode switching unit 13 outputs the command information A3 for switching the input method by switching the operation screen of the touch panel 20 and for switching the notification method by the notification unit 15.
- In step ST104, when the line-of-sight presence/absence determination has not changed (in the case of NO), the process proceeds to step ST109.
- In step ST105, the operation determination unit 14 receives the command information A3 from the operation mode switching unit 13 and issues a command to switch the input method effective for the entire screen, and the entire screen gesture determination unit 142 switches the input method effective for the entire screen.
- The operation determination unit 14 also receives the command information A3 from the operation mode switching unit 13 and issues a command to switch the input method effective for the display components, and the display component gesture determination unit 141 switches the input method effective for the display components.
- the order of processing in steps ST105 and ST106 may be reversed.
- The display control unit 16 receives the command information A3 for switching the screen from the operation mode switching unit 13 and switches the screen. For example, on receiving the command information A3 instructing screen switching, the display control unit 16 displays the screen for the line-of-sight presence state (for example, the screen shown in FIG. 3A) when the line of sight is present, and displays the screen for the no-line-of-sight state (for example, the screen shown in FIG. 3B) when the line of sight is absent.
- The notification unit 15 receives the command information A3 for switching the notification content from the operation mode switching unit 13 and switches the notification content. For example, in the line-of-sight presence state, the notification unit 15 outputs a voice announcement from the speaker such as “If you press and hold the screen, the mode switches to operation without viewing the screen”, and on switching to the no-line-of-sight state it outputs announcements such as “Because of the long press, switching to the screen for operating without viewing” and “In the no-line-of-sight state, touching the upper side of the screen increases the selection value, touching the lower side decreases it, and tapping two points confirms the selection value”. At this time, the notification unit 15 may notify the current selection value by voice announcement every time an upper or lower tap occurs on the screen.
- The notification from the notification unit 15 is not limited to a voice announcement; the notification may also be given on a display device different from the touch panel 20 being operated, by a voice announcement from a smart watch, or by vibration of the touch panel.
- In step ST109, the operation determination unit 14 determines the selection value based on the input information. For example, in the operation mode with the line of sight present, the operation determination unit 14 determines the number displayed on a tapped button as the selection value. On the other hand, in the operation mode without the line of sight, the operation determination unit 14 increases the selection value when a touch occurs on the upper side of the screen and decreases the selection value when a touch occurs on the lower side of the screen. The operation determination unit 14 confirms the selection value when two touches occur and the fingers are released at the same time. When the operation determination unit 14 accepts a touch gesture operation, the display control unit 16 changes the display content according to the touch gesture operation, and the notification unit 15 notifies the operation status according to the touch gesture operation.
- In step ST109, when the cancel area 241 in the no-line-of-sight state is tapped, this is not an operation of tracing the cancel area 241; it is therefore regarded as a touch on the lower side of the screen, and the selection value is decreased.
- The determination of the line-of-sight presence/absence determination unit 12 in step ST103 and the determination operation of the operation determination unit 14 in step ST109 can be realized, for example, by storing an operation determination table such as Table 1 in the storage unit of the line-of-sight presence/absence determination unit 12 and processing according to its contents.
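The mode-dependent determination that an operation table such as Table 1 encodes might be sketched as follows; the event names and the class shape are assumptions for illustration, not part of the specification:

```python
# In the line-of-sight presence mode, tapping a button yields the
# number displayed on it; in the no-line-of-sight mode, an upper-side
# touch increments the selection value, a lower-side touch decrements
# it, and a simultaneous two-point tap confirms it.

class OperationDeterminer:
    def __init__(self):
        self.value = 0
        self.confirmed = None

    def handle(self, gaze_present, event, payload=None):
        if gaze_present:
            if event == 'button_tap':        # payload: number on the button
                self.value = payload
                self.confirmed = payload
        else:
            if event == 'upper_touch':
                self.value += 1
            elif event == 'lower_touch':
                self.value -= 1
            elif event == 'two_point_tap':   # confirm in no-gaze mode
                self.confirmed = self.value
        return self.value
```
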
- According to whether the operation information A0 based on the user's touch gesture operation represents a touch gesture operation in the line-of-sight presence state or a touch gesture operation in the no-line-of-sight state, the determination method of the input information A1 in the operation determination unit 14, the display control method of the display control unit 16, and the notification method of the notification unit 15 can be changed. For this reason, when the user is looking at the screen of the touch panel 20, a touch gesture operation in the line-of-sight presence state is possible, such as tapping with a finger a button displayed as a display component on the screen or tracing with a finger a list displayed on the screen.
- When the user is not looking at the screen, the user can perform operations through touch gesture operations using the entire screen and notifications of the operation state.
- Furthermore, regardless of whether a touch gesture operation is performed in the line-of-sight presence state or in the no-line-of-sight state, the operation is appropriately determined as a touch gesture operation of the corresponding state, so that input operations by touch gesture operations can be performed accurately.
- When an operation that does not occur as a touch gesture operation in the line-of-sight presence state is performed, it can be determined that the touch gesture operation is in the no-line-of-sight state; conversely, when an operation that does not occur as a touch gesture operation in the no-line-of-sight state is performed, it can be determined that the touch gesture operation is in the line-of-sight presence state. In this case, since misjudgments in the line-of-sight presence/absence determination are few, the determination accuracy of the operation mode is improved.
- In addition, the line-of-sight presence/absence determination unit 12 may determine that the operation mode is the no-line-of-sight state when three or more touch points (finger contact points) occur. In this case, a two-point touch operation (for example, a pinch operation, a rotation operation, etc.) can still be used as a touch gesture operation in the line-of-sight presence state.
- In the line-of-sight presence/absence determination in step ST103 of FIG. 4, when a state in which the content of the screen of the touch panel 20 is not changed continues for a predetermined time or longer from the start of finger contact with the screen, it may be determined that there is no line of sight. For example, even if a finger is brought into contact with the screen of the touch panel 20 and a tracing operation occurs, if no screen scrolling by a left/right swipe of the finger and no list scrolling by a vertical tracing operation occur, the line-of-sight presence/absence determination unit 12 may determine that there is no line of sight.
- Conversely, the line-of-sight presence/absence determination unit 12 may determine that the line of sight is present when it detects any of a vertical movement, a horizontal movement, and a diagonal movement of the finger contact point.
- The line-of-sight presence/absence determination unit 12 may also determine that the line of sight is present when a specific locus, such as a circle, a square, or a character, is drawn by movement of the finger contact point.
- The determination by the line-of-sight presence/absence determination unit 12 in step ST103 of FIG. 4 may use, as a condition for changing from the no-line-of-sight state to the line-of-sight presence state, the elapse of a predetermined time without any input operation from the touch panel 20.
- Alternatively, it may be configured to return automatically to the line-of-sight presence state when the state without any touch gesture operation continues for a predetermined threshold time or longer after switching to the no-line-of-sight state. In this case, the user can still operate without looking at the screen by again performing a touch gesture operation that is determined as the no-line-of-sight state, such as a long press, guided by the voice announcement of the notification unit 15.
- In the no-line-of-sight state, the selection value may be increased by touching two points on the screen and rotating them like a dial in the clockwise direction, and decreased by rotating them in the counterclockwise direction.
- When a two-point tap is set as the determining operation, the determination and the rotation operation may be confused with each other. In that case, the selection value determining operation may instead be a tap occurring twice within a predetermined time (a double tap).
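The double-tap alternative can be sketched as below; the 0.5-second window is an assumed value, not one from the specification:

```python
# Two taps within a predetermined interval are treated as the
# selection-value determining operation.

DOUBLE_TAP_WINDOW = 0.5   # seconds (illustrative)


def is_decision_double_tap(tap_times):
    """tap_times: timestamps (seconds) of successive taps, in order."""
    return any(t2 - t1 <= DOUBLE_TAP_WINDOW
               for t1, t2 in zip(tap_times, tap_times[1:]))
```
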
- Embodiment 2
- 《2-1》 Configuration
- In Embodiment 1, the touch gesture operation in the operation mode without the line of sight is limited to simple touch gesture operations using the entire screen of the touch panel 20 as the operation effective area.
- In Embodiment 2, an apparatus and method are described that prevent unintended operations caused by not looking at the screen and that enable more complicated touch gesture operations in the operation mode without the line of sight.
- FIG. 6 is a functional block diagram showing a schematic configuration of the touch panel input device 2 according to Embodiment 2 of the present invention.
- the touch panel input device 2 includes a touch panel 20 and a touch gesture determination device 10a.
- the touch gesture determination device 10a is a device that can execute the touch gesture determination method according to the second embodiment and the touch gesture determination program according to the second embodiment.
- the touch gesture determination device 10a according to the second embodiment is different from the touch gesture determination device 10 according to the first embodiment in that it includes a gesture operation storage unit 17 and a gesture correction unit 18. Except for this point, the second embodiment is the same as the first embodiment. Therefore, the differences will be mainly described below.
- the gesture operation storage unit 17 stores the history information of the touch gesture operation recognized by the operation determination unit 14.
- The gesture correction unit 18 corrects the recognition result of the touch gesture operation determined by the operation determination unit 14, using the touch gesture operation data stored in the gesture operation storage unit 17, and determines the selection value A8.
- the H / W configuration of the touch panel input device 2 includes a touch panel 20, a processor 31, a memory 32, and a speaker 33, as shown in FIG.
- The touch gesture determination device 10a shown in FIG. 6 can be realized (for example, by a computer) using the memory 32 as a storage device that stores the touch gesture determination program as software and the processor 31 as an information processing unit that executes the touch gesture determination program stored in the memory 32. In this case, the components 11 to 18 in FIG. 6 correspond to the processor 31 executing the touch gesture determination program. A part of the touch gesture determination device 10a illustrated in FIG. 6 may also be realized by the memory 32 illustrated in FIG. 2 and the processor 31 that executes the touch gesture determination program.
- FIG. 7 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination apparatus 10a in the touch panel input device 2 according to the second embodiment.
- In FIG. 7, processing steps that are the same as those shown in FIG. 4 are given the same step numbers.
- The operation of the touch gesture determination device 10a according to the second embodiment differs from that of the touch gesture determination device 10 in the first embodiment in that, after the touch gesture operation determination step ST109, there are a processing step for storing the touch gesture operation (step ST201) and a gesture correction processing step (step ST202). Except for these points, the operation of the touch gesture determination device 10a in the second embodiment is the same as that in the first embodiment.
- In step ST201, the gesture operation storage unit 17 stores the touch gesture operation recognized by the operation determination unit 14 as history information.
- An example of the history information of the touch gesture operation is shown in Table 2.
- FIG. 8 is a diagram illustrating a method for acquiring the history information of the touch gesture operation in the touch panel input device 2 according to the second embodiment.
- the touch gesture operation history information shown in Table 2 is calculated by, for example, the method shown in FIG.
- The movement amount in the rotation direction, that is, the angle change amount H [pixel], is obtained by dropping a perpendicular Q2 from the point P3 touched by the current touch gesture operation onto the straight line Q1 connecting the two points P1 and P2 touched by the previous touch gesture operation, and taking the length of the perpendicular Q2.
- the distance change amount D [pixel] is a change amount between the current distance between two points (distance between P1 and P3) L2 and the previous distance between two points (distance between P1 and P2) L1.
- The movement amount M [pixel] is the amount of movement between the center position I2 of the current two points (P1 and P3) and the center position I1 of the previous two points (P1 and P2).
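The three history quantities of Table 2 and FIG. 8 follow from plane geometry. A sketch, with points as (x, y) tuples in pixel units:

```python
import math


def gesture_history(p1, p2, p3):
    """Compute (H, D, M) from previous touch points p1, p2 and
    current touch point p3, as described for FIG. 8."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Angle change H: length of the perpendicular dropped from P3
    # onto the straight line Q1 through P1 and P2.
    q1_len = math.hypot(x2 - x1, y2 - y1)
    h = abs((x2 - x1) * (y1 - y3) - (x1 - x3) * (y2 - y1)) / q1_len
    # Distance change D: current two-point distance L2 (P1 to P3)
    # minus previous two-point distance L1 (P1 to P2).
    l1 = q1_len
    l2 = math.hypot(x3 - x1, y3 - y1)
    d = l2 - l1
    # Movement M: shift of the midpoint between the two touch points.
    i1 = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    i2 = ((x1 + x3) / 2.0, (y1 + y3) / 2.0)
    m = math.hypot(i2[0] - i1[0], i2[1] - i1[1])
    return h, d, m
```
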
- In step ST202, the gesture correction unit 18 corrects the recognition result when, judging from the history information stored in the gesture operation storage unit 17, a touch gesture operation has been performed without looking at the screen and the intended operation information has therefore not been input to the touch gesture determination device 10a.
- FIG. 9 is a diagram illustrating an example of a touch gesture operation in which operation information as intended is not input to the touch gesture determination device 10a when a touch gesture operation is performed without looking at the screen of the touch panel 20 in the touch panel input device 2.
- An example is a touch gesture operation in which the selection value is increased by repeatedly drawing a circle with a finger clockwise, as a first rotation direction, on the screen of the touch panel 20, and decreased by repeatedly drawing a circle with a finger counterclockwise, as a second rotation direction opposite to the first rotation direction.
- In this case, the gesture correction unit 18 fits a circle shape, using a fitting method such as the least squares method, to the touch trajectory information accumulated in the gesture operation storage unit 17 over a period from the present time back to a past time point a predetermined time earlier, and updates the position of the circle. After updating the position of the circle, the gesture correction unit 18 calculates the angle change amount by calculating the angle formed by the previous finger contact point, the center position of the circle, and the current finger contact point. In this way, as shown in FIG. 9, a touch gesture operation of repeatedly drawing a circle without looking at the screen of the touch panel 20 can be realized.
- When the shape of the drawn circle is an ellipse, the gesture correction unit 18 may correct it to a circle by recalculating the coordinate locus with the vertical and horizontal scales of the ellipse changed, and may then calculate the angle formed by the previous contact point on the circle, the center position of the circle, and the current finger contact point.
- When the operation determination unit 14 determines a touch gesture operation for drawing a circle, the initial position of the circle may be the center position of the screen of the touch panel 20, or may be obtained from the locus over a predetermined time by the least squares method or the like.
- FIG. 10 is a diagram illustrating an example of a touch gesture operation performed without looking at the screen of the touch panel 20 in the touch panel input device 2 according to the second embodiment.
- The operation that triggers the operation determination unit 14 to start determining a touch gesture operation for drawing a circle may be a decrease in the number of contact points after a plurality of contact points have been touched with a plurality of fingertips, as shown on the left side of FIG. 10. For example, as shown in FIG. 10, when the number of touch points becomes one after a three-point touch, the operation determination unit 14 starts determining a touch gesture operation for drawing a circle around the center of gravity (centroid) of the three points.
- the gesture correction unit 18 may correct the rotation amount by updating the position of the circle.
- FIG. 11 is a diagram illustrating a scene in which the orientation, angle, and enlargement/reduction amount of a camera are adjusted in the touch panel input device 2 according to the second embodiment without looking at the touch panel at hand.
- the camera video display device 401 displays video shot by the camera.
- the notification unit 15 superimposes and displays the current camera operation status (direction, enlargement ratio, rotation angle) on the camera video display device 401.
- The hand operation screen 402 passes touch information to the operation information input unit 11, switches the screen display between the line-of-sight presence state and the no-line-of-sight state, and changes the display content of the screen according to the touch gesture operation determined by the operation determination unit 14.
- The screen 403 is the screen displayed when the line-of-sight presence/absence determination unit 12 determines that the line of sight is present; on this screen, touch gesture operations such as displaying a manual and switching the screen by tapping a hyperlink are possible.
- the screen 404 is a screen when the gaze presence / absence determination unit 12 determines that there is no gaze state.
- On this screen, the line-of-sight presence/absence determination unit 12 determines the no-line-of-sight state on a touch operation with three or more points, and determines the line-of-sight presence state when all fingers have been released and a predetermined time has elapsed.
- The operation determination unit 14 determines an enlargement/reduction, movement, or rotation touch gesture operation following the touch operation with three or more points, and according to the amount of each operation, an image with the camera direction or enlargement/reduction amount changed is displayed, such as the enlarged/reduced camera image 407, the moved camera image 408, or the rotated camera image 409.
- Another example of a touch gesture operation that does not work as intended when performed without looking at the screen is an operation of strongly pressing the touch panel.
- When the line-of-sight presence/absence determination unit determines the no-line-of-sight state by a long press, a tendency to press the finger strongly appears when the user performs the long press without looking at the screen.
- The gesture correction unit 18 determines the long press from the history information of the touch gesture operation shown in Table 2, using the fact that the movement amount of the finger position does not change by more than a certain value, or that the movement amount becomes less than a certain value when the pressing pressure is weak.
- the correction processing of the gesture correction unit 18 can be applied not only to a long press but also to a tap or a two-point operation.
- The gesture correction unit 18 obtains, for the period from the current time back to a past time point a predetermined time earlier, accumulated in the gesture operation storage unit 17, the total value of the movement amounts of three or more contact points, the total value of the change amounts of the distance between two contact points, and the total value of the change amounts in the rotation direction, and outputs the operation corresponding to the largest of these total values as the intended operation. In this way, the operation can be performed as intended.
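The correction rule above can be sketched as follows, with each history entry reduced to its movement, distance-change, and rotation-change amounts (the M, D, H of Table 2); the operation labels are illustrative:

```python
# Accumulate, over the recent history window, the totals of contact
# movement, two-point distance change, and rotation-direction change,
# and keep only the operation with the largest total.

def dominant_operation(samples):
    """samples: list of (move_m, dist_d, rot_h) tuples per history entry."""
    totals = {
        'move': sum(abs(m) for m, _, _ in samples),
        'pinch': sum(abs(d) for _, d, _ in samples),
        'rotate': sum(abs(h) for _, _, h in samples),
    }
    return max(totals, key=totals.get)
```
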
- 《2-3》 Effects
- As described above, in the second embodiment, by accumulating and correcting the history of gestures recognized by the operation determination unit 14, unintended operations caused by operating without looking at the screen are prevented, and even more complicated touch gesture operations can be performed without looking at the screen.
- Embodiment 3
- 《3-1》 Configuration
- In Embodiments 1 and 2, the line-of-sight presence/absence determination unit 12 performs the line-of-sight presence/absence determination (operation mode determination) based on the operation information A0 derived from the touch gesture operation.
- In this case, an operation used for the line-of-sight presence/absence determination cannot be used as a touch gesture operation for the original information input. Therefore, in the third embodiment, instead of performing the line-of-sight presence/absence determination (operation mode determination) from the operation information A0 based on the touch gesture operation, a camera serving as an imaging device that captures the user's face is provided, and the line-of-sight presence/absence determination is performed from the camera image of the user's face.
- FIG. 12 is a functional block diagram showing a schematic configuration of the touch panel input device 3 according to the third embodiment. In FIG. 12, components that are the same as or correspond to the components shown in FIG. 1 are given the same reference numerals as those shown in FIG. 1.
- the touch panel input device 3 includes a touch panel 20 and a touch gesture determination device 10b.
- the touch gesture determination device 10b is a device that can execute the touch gesture determination method according to the third embodiment and the touch gesture determination program according to the third embodiment.
- The touch gesture determination device 10b differs from the touch gesture determination device 10 in the first embodiment in that the line-of-sight presence/absence determination unit 12b performs the line-of-sight presence/absence determination (operation mode determination) from the camera image of the user's face acquired from the camera 34 through the camera image input unit 19. Except for this point, the third embodiment is the same as the first embodiment.
- the camera image input unit 19 and the gaze presence / absence determination unit 12b in the third embodiment can be applied to the touch panel input device 2 according to the second embodiment.
- the camera image input unit 19 receives a camera image (image data) acquired by the camera 34 shooting a user's face.
- The line-of-sight presence/absence determination unit 12b as the operation mode determination unit receives the camera image of the user's face from the camera image input unit 19, performs image processing to extract image data of the entire face and of the eyes, and detects the orientation of the face and the direction of the line of sight of the eyes. If the eyes are directed toward the screen of the touch panel 20, it determines that the operation mode is the line-of-sight presence state; if not, it determines that the operation mode is the no-line-of-sight state.
- FIG. 13 is a diagram illustrating an example of the H / W configuration of the touch panel input device 3 according to the third embodiment.
- The H/W configuration of the touch panel input device 3 according to the third embodiment differs from that of the touch panel input device 1 according to the first embodiment in that the camera 34 is provided and in the touch gesture determination program stored in the memory 32b. Except for these points, the H/W configuration of the third embodiment is the same as that of the first embodiment.
- The camera 34 stores the camera image (image data) of the user's face acquired by camera shooting in the memory 32b as a storage unit; the processor 31 obtains the camera image from the memory 32b in the processing of the camera image input unit 19, and image processing is performed by the line-of-sight presence/absence determination unit 12b.
- FIG. 14 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination apparatus 10b in the touch panel input device 3 according to the third embodiment.
- In FIG. 14, processing steps that are the same as those shown in FIG. 4 are given the same step numbers.
- The operation of the touch gesture determination device 10b according to Embodiment 3 differs from that of the touch gesture determination device 10 in Embodiment 1 in that the line-of-sight presence/absence determination unit 12b performs the line-of-sight presence/absence determination from the camera image of the user's face acquired through the camera image input unit 19 (steps ST301 and ST302), and in that the operation information A0 from the touch gesture operation is acquired after the line-of-sight presence/absence determination based on the camera image (step ST303). Except for these points, the operation of the touch panel input device 3 according to the third embodiment is the same as that of the first embodiment. Therefore, the operations different from those of the first embodiment are described below.
- In step ST301, the camera image input unit 19 receives the image data of the user's face from the camera 34 and stores it in the memory 32b.
- In step ST302, the line-of-sight presence/absence determination unit 12b receives the image data of the user's face stored in the memory 32b by the camera image input unit 19, as shown in FIG. 15A, and performs image processing to extract the face orientation and the line of sight. When the line of sight is directed toward the screen of the touch panel 20 (as illustrated on the left side of the figure), it determines the line-of-sight presence state; when the line of sight is not directed toward the screen of the touch panel 20 (as illustrated on the right side of the figure), it determines the no-line-of-sight state.
- The orientation of the face can be detected by, for example, a cascade discriminator based on Haar features. The line of sight can be detected by, for example, extracting the eye region as a facial part, applying an edge filter to the eye image, extracting the outer contour of the iris, and fitting an ellipse to estimate the position of the pupil.
- the face orientation and line-of-sight detection methods are not limited to these examples, and other methods may be employed.
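Whatever detection method is used, the final determination reduces to a threshold test on the estimated direction. The sketch below assumes the upstream detection stage already yields yaw/pitch angles in degrees of the gaze relative to the screen normal; the angular tolerance is an illustrative assumption:

```python
# Gaze presence decision from estimated gaze angles: the line of sight
# is considered to fall on the touch panel when both angular offsets
# from the screen normal are within an assumed tolerance.

GAZE_TOLERANCE_DEG = 15.0   # assumed half-angle of "looking at screen"


def gaze_present(yaw_deg, pitch_deg):
    """True if the estimated gaze direction falls on the touch panel."""
    return (abs(yaw_deg) <= GAZE_TOLERANCE_DEG
            and abs(pitch_deg) <= GAZE_TOLERANCE_DEG)
```
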
- 《3-3》 Effects
- As described above, in the third embodiment, the camera image of the camera 34 is used to determine whether or not the user's line of sight is directed at the screen of the touch panel 20, so that the line-of-sight presence/absence determination (operation mode determination) can be performed easily and accurately even if the user performs no touch gesture operation for switching the operation mode.
- In the third embodiment, the line-of-sight presence/absence determination is performed based on the camera image; therefore, no touch gesture operation needs to be reserved for the determination, eliminating restrictions on the use of touch gesture operations.
- When the touch panel 20 is viewed, for example, when inspecting equipment or when viewing a large-screen display device installed at a distant position, a touch gesture operation can be performed by button operations using normal display components. When the touch panel 20 is not viewed (when there is no line of sight), the entire screen of the touch panel 20 can be set as the operation effective area that accepts operations. For this reason, when the touch panel 20 is not viewed, the portable terminal device can be used as an operation device for equipment inspection or for a large-screen display device installed at a distance.
- Embodiments 1 to 3 above are examples of the present invention, and various modifications are possible within the scope of the present invention.
Abstract
Description
A touch panel input device according to an aspect of the present invention includes: a touch panel that displays an operation image on a screen, accepts a touch gesture operation by a user, and outputs operation information corresponding to the touch gesture operation; and a touch gesture determination device that receives the operation information and generates a selection value based on the operation information. The touch gesture determination device includes: a line-of-sight presence determination unit that determines whether the touch gesture operation is an operation in a gaze-present state, in which the user's line of sight is directed at the screen, or an operation in a gaze-absent state, in which the user's line of sight is not directed at the screen, and outputs line-of-sight determination information indicating the result of the determination; an operation mode switching unit that outputs command information based on the line-of-sight determination information; an operation determination unit that generates the selection value based on the operation information, using a determination method according to the command information; and a display control unit that causes the touch panel to display an image according to the command information as the operation image. The operation determination unit includes: a display component gesture determination unit that, in the gaze-present state, determines the selection value from operation information based on a touch gesture operation performed on a display component displayed on the touch panel as the operation image; and a whole-screen gesture determination unit that, in the gaze-absent state, determines the selection value from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
A touch gesture determination method according to another aspect of the present invention receives the operation information from such a touch panel and generates a selection value based on the operation information. The method includes: a line-of-sight presence determination step of determining whether the touch gesture operation is an operation in the gaze-present state or an operation in the gaze-absent state, and outputting line-of-sight determination information indicating the result of the determination; an operation determination step of generating the selection value based on the operation information, using a determination method according to command information based on the line-of-sight determination information; and a display control step of causing the touch panel to display an image according to the command information as the operation image. In the operation determination step, the selection value is determined, in the gaze-present state, from operation information based on a touch gesture operation performed on a display component displayed on the touch panel as the operation image, and, in the gaze-absent state, from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
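As a minimal illustration of the gaze-dependent split described above, the following Python sketch dispatches one touch point either to a display-component handler or to a whole-screen handler. All names and the simple left/right mapping are hypothetical, chosen only for illustration; the patent does not prescribe this code.

```python
# Hypothetical sketch of the gaze-dependent dispatch described above.
# Function and variable names are illustrative, not from the patent.

def determine_selection(gaze_present, touch_point, buttons, screen_width):
    """Return a selection value for one touch point.

    gaze_present -- result of the line-of-sight presence determination
    buttons      -- {label: (x0, y0, x1, y1)} display components
    """
    x, y = touch_point
    if gaze_present:
        # Gaze-present state: only taps on display components count.
        for label, (x0, y0, x1, y1) in buttons.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return label
        return None  # a tap outside any component is ignored
    # Gaze-absent state: the entire screen is the operation area;
    # here the horizontal position is mapped to a coarse selection value.
    return "DOWN" if x < screen_width / 2 else "UP"
```

Used this way, the same touch event yields different selection values depending on the gaze determination, which is the core of the claimed device.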
<< 1 >> Embodiment 1
<< 1-1 >> Configuration
FIG. 1 is a functional block diagram showing a schematic configuration of a touch panel input device 1 according to Embodiment 1 of the present invention. As shown in FIG. 1, the touch panel input device 1 according to Embodiment 1 includes a touch gesture determination device 10 and a touch panel 20. The touch gesture determination device 10 is a device capable of executing the touch gesture determination method according to Embodiment 1 and the touch gesture determination program according to Embodiment 1.
<< 1-2 >> Operation
FIG. 4 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device 10 in the touch panel input device 1 according to Embodiment 1.
In the next step ST103, the line-of-sight presence determination unit 12 performs the gaze presence determination based on the operation information. For example, it determines that the user's line of sight is in the gaze-absent state when one of the following judgment conditions is satisfied:
- Judgment condition (1): a finger touches one point on the screen of the touch panel 20 and the contact position does not move for a time equal to or longer than a predetermined threshold.
- Judgment condition (2): a plurality of fingers touch a plurality of contact points on the screen of the touch panel 20 and remain in contact for a time equal to or longer than a predetermined threshold.
- Judgment condition (3): a palm remains in contact with the screen of the touch panel 20 (that is, an area wider than a predetermined threshold remains in contact for a time equal to or longer than a predetermined threshold).
However, other judgment criteria may be used.
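The three judgment conditions above can be sketched as a single predicate over a summarized touch state. The thresholds and the event format below are assumptions for illustration, not values taken from the patent.

```python
# Hedged sketch of judgment conditions (1)-(3). HOLD_TIME and PALM_AREA
# are assumed thresholds; the patent only requires "predetermined" ones.

HOLD_TIME = 1.0      # seconds a contact must persist (conditions 1 and 2)
PALM_AREA = 2000.0   # contact area in mm^2 treated as a palm (condition 3)

def is_gaze_absent(points, duration, moved, contact_area):
    """points: number of contact points, duration: contact time in s,
    moved: whether any contact position moved, contact_area: mm^2."""
    if points == 1 and duration >= HOLD_TIME and not moved:
        return True   # (1) long press at one stationary point
    if points >= 2 and duration >= HOLD_TIME:
        return True   # (2) several fingers held in contact
    if contact_area >= PALM_AREA and duration >= HOLD_TIME:
        return True   # (3) palm resting on the screen
    return False
```

A real implementation would derive these four quantities from the raw touch events delivered by the panel driver.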
<< 1-3 >> Effect
As described above, in Embodiment 1, the determination method of the input information A1 in the operation determination unit 14, the display control method of the display control unit 16, and the notification method of the notification unit 15 can be changed depending on whether the operation information A0 based on the user's touch gesture operation represents a touch gesture operation in the gaze-present state or in the gaze-absent state. Therefore, while looking at the screen of the touch panel 20, the user can perform gaze-present touch gesture operations, for example by tapping a button displayed on the screen as a display component with a finger, or by tracing a list displayed on the screen with a finger. While not looking at the screen, the user can operate the device through touch gesture operations accepted over the entire screen, combined with notifications of the operation state. In this way, in Embodiment 1, whichever of a gaze-present or gaze-absent touch gesture operation is performed on the touch panel 20, the operation is appropriately determined as such, so input operations by touch gesture can be performed accurately.
<< 1-4 >> Modification
As another example of the gaze presence determination in step ST103 of FIG. 4, the line-of-sight presence determination unit 12 may determine that the operation mode is the gaze-absent mode when three or more touch points (finger contact points) occur. In this case, two-point touch operations on the touch panel 20 (for example, pinch and rotation operations) remain available as ordinary touch gesture operations.
<< 2 >> Embodiment 2
<< 2-1 >> Configuration
In Embodiment 1, touch gesture operations in the gaze-absent operation mode are limited to simple touch gesture operations that use the entire screen of the touch panel 20 as the effective operation area. Embodiment 2 describes a device and a method that, in the gaze-absent operation mode, prevent unintended operations caused by not looking at the screen and enable more complex touch gesture operations.
<< 2-2 >> Operation
FIG. 7 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device 10a in the touch panel input device 2 according to Embodiment 2. In FIG. 7, processing steps that are the same as or correspond to those shown in FIG. 4 are given the same step numbers as in FIG. 4. The operation of the touch gesture determination device 10a in Embodiment 2 differs from that of the touch gesture determination device 10 in Embodiment 1 in that it includes, after the touch gesture operation determination step ST109, a processing step of storing the touch gesture operation (step ST201) and a gesture correction processing step (step ST202). Except for these points, the operation of the touch gesture determination device 10a in Embodiment 2 is the same as that of Embodiment 1.
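The store-then-correct loop of steps ST201/ST202 can be sketched as follows. The majority-vote correction shown here is an assumption made for illustration; the patent only states that recognition is corrected from the stored history, without fixing a particular correction rule.

```python
# Illustrative sketch of steps ST201 (store gesture history) and ST202
# (correct recognition against that history). The majority-vote rule and
# the buffer size are assumptions, not taken from the patent.

from collections import Counter, deque

class GestureHistory:
    def __init__(self, size=5):
        self.buffer = deque(maxlen=size)   # ST201: keep recent gestures

    def correct(self, recognized):
        """ST202: smooth a one-off misrecognition using the history."""
        self.buffer.append(recognized)
        label, count = Counter(self.buffer).most_common(1)[0]
        # Keep the raw result unless the history clearly disagrees.
        return label if count > len(self.buffer) // 2 else recognized
```

With this rule, a single "pinch" misread in the middle of a run of "rotate" gestures is reported as "rotate", which is the kind of unintended-operation suppression Embodiment 2 aims at.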
<< 2-3 >> Effect
As described above, in Embodiment 2, by recording and correcting the history of gestures recognized by the operation determination unit 14, unintended operations caused by operating without looking at the screen can be prevented, and even more complex touch gesture operations can be performed without looking at the screen.
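One concrete correction of this kind, described in the claims, compares totals accumulated over a recent window: total translation of the contact points, total change in inter-finger distance (pinch), and total rotation, and treats the largest total as the intended operation. A hedged two-finger sketch, with window handling and units assumed for illustration:

```python
# Sketch of the dominant-gesture correction: over a recent window, total
# translation, total inter-finger distance change (pinch), and total
# rotation are compared, and the largest total wins. Sampling and units
# are assumptions for illustration.

import math

def dominant_gesture(samples):
    """samples: list of (p1, p2) two-finger positions over the window."""
    move = pinch = rotate = 0.0
    for (a1, a2), (b1, b2) in zip(samples, samples[1:]):
        # translation of the midpoint between the two fingers
        mx0, my0 = (a1[0] + a2[0]) / 2, (a1[1] + a2[1]) / 2
        mx1, my1 = (b1[0] + b2[0]) / 2, (b1[1] + b2[1]) / 2
        move += math.hypot(mx1 - mx0, my1 - my0)
        # change of the inter-finger distance
        d0 = math.hypot(a2[0] - a1[0], a2[1] - a1[1])
        d1 = math.hypot(b2[0] - b1[0], b2[1] - b1[1])
        pinch += abs(d1 - d0)
        # angle change of the finger-to-finger vector, scaled to arc length
        ang0 = math.atan2(a2[1] - a1[1], a2[0] - a1[0])
        ang1 = math.atan2(b2[1] - b1[1], b2[0] - b1[0])
        rotate += abs(ang1 - ang0) * (d0 + d1) / 2
    totals = {"move": move, "pinch": pinch, "rotate": rotate}
    return max(totals, key=totals.get)
```

Because small incidental drift always accompanies a pinch or a rotation, comparing accumulated totals rather than instantaneous deltas is what keeps the dominant intent from being mistaken for a secondary one.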
<< 3 >> Embodiment 3
<< 3-1 >> Configuration
In Embodiments 1 and 2, the line-of-sight presence determination unit 12 performs the gaze presence determination (operation mode determination) from the operation information A0 based on touch gesture operations. However, in the touch panel input devices 1 and 2 according to Embodiments 1 and 2, the operations used for the gaze presence determination cannot also serve as touch gesture operations for their original purpose of information input. Therefore, in Embodiment 3, instead of performing the gaze presence determination (operation mode determination) from the operation information A0 based on touch gesture operations, a camera is provided as an imaging device that photographs the user's face, and the gaze presence determination is performed from the camera image of the user's face.
<< 3-2 >> Operation
FIG. 14 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device 10b in the touch panel input device 3 according to Embodiment 3. In FIG. 14, processing steps that are the same as or correspond to those shown in FIG. 4 are given the same step numbers as in FIG. 4. The operation of the touch gesture determination device 10b in Embodiment 3 differs from that of the touch gesture determination device 10 in Embodiment 1 in that the line-of-sight presence determination unit 12b performs the gaze presence determination from a camera image of the user's face acquired through the camera image input unit 19 (steps ST301 and ST302), and in that the operation information A0 from a touch gesture operation is acquired after the gaze presence determination based on the camera image (step ST303). Except for these points, the operation of the touch panel input device 3 according to Embodiment 3 is the same as that of Embodiment 1. The operations that differ from Embodiment 1 are therefore described below.
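The camera-first ordering of steps ST301 to ST303 can be sketched as follows. The head-pose thresholds and the idea of estimating gaze from face yaw/pitch are stand-in assumptions; the patent does not prescribe a specific gaze-estimation algorithm.

```python
# Hedged sketch of steps ST301-ST303: estimate gaze from a camera frame
# first, then choose the mode used to interpret the touch input. The
# yaw/pitch pose estimate and its limits are illustrative assumptions.

def gaze_on_screen(yaw_deg, pitch_deg, yaw_limit=15.0, pitch_limit=10.0):
    """ST301/ST302: a face pose roughly toward the panel counts as
    gaze-present; anything else (including no detected face) is absent."""
    if yaw_deg is None or pitch_deg is None:   # no face in the frame
        return False
    return abs(yaw_deg) <= yaw_limit and abs(pitch_deg) <= pitch_limit

def select_mode(yaw_deg, pitch_deg):
    """Choose the operation mode before reading touch input (ST303)."""
    return "display_component" if gaze_on_screen(yaw_deg, pitch_deg) \
        else "whole_screen"
```

Because the mode is fixed before any touch arrives, every touch gesture stays available for information input, which is the advantage Embodiment 3 claims over the touch-based determination of Embodiments 1 and 2.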
<< 3-3 >> Effect
As described above, in the touch panel input device 3, the touch gesture determination device 10b, the touch gesture determination method, and the touch gesture determination program according to Embodiment 3, the camera image from the camera 34 is used to determine whether the user's line of sight is in the gaze-present state, in which it is directed at the screen of the touch panel 20, or in the gaze-absent state, which is any state other than the gaze-present state. As a result, the gaze presence determination (operation mode determination) can be performed easily and accurately without the user performing a touch gesture operation to switch the operation mode.
Claims (21)
- A touch panel input device comprising: a touch panel that displays an operation image on a screen, accepts a touch gesture operation by a user, and outputs operation information corresponding to the touch gesture operation; and a touch gesture determination device that receives the operation information and generates a selection value based on the operation information, wherein the touch gesture determination device includes: a line-of-sight presence determination unit that determines whether the touch gesture operation is an operation in a gaze-present state, in which the user's line of sight is directed at the screen, or an operation in a gaze-absent state, in which the user's line of sight is not directed at the screen, and outputs line-of-sight determination information indicating the result of the determination; an operation mode switching unit that outputs command information based on the line-of-sight determination information; an operation determination unit that generates the selection value based on the operation information, using a determination method according to the command information; and a display control unit that causes the touch panel to display an image according to the command information as the operation image, and wherein the operation determination unit includes: a display component gesture determination unit that, in the gaze-present state, determines the selection value from operation information based on a touch gesture operation performed on a display component displayed on the touch panel as the operation image; and a whole-screen gesture determination unit that, in the gaze-absent state, determines the selection value from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
- The touch panel input device according to claim 1, wherein the touch gesture determination device further includes a notification unit that issues a notification of notification content according to the command information or outputs a notification signal.
- The touch panel input device according to claim 1 or 2, wherein the line-of-sight presence determination unit determines that the user's line of sight is in the gaze-absent state when, as the touch gesture operation, any of the following occurs: a long-press operation of pressing the touch panel continuously for a time equal to or longer than a predetermined first threshold, a multi-point operation of bringing a plurality of fingers into contact with the touch panel, or a palm operation of bringing a palm into contact with the touch panel.
- The touch panel input device according to claim 1 or 2, wherein the line-of-sight presence determination unit determines that the user's line of sight is in the gaze-absent state when, as the touch gesture operation, an operation that does not change the display content of the touch panel continues for a time equal to or longer than a predetermined second threshold.
- The touch panel input device according to any one of claims 1 to 4, wherein the line-of-sight presence determination unit determines that the user's line of sight is in the gaze-absent state when an operation is performed in which a specific area of the touch panel is traced in a predetermined direction without straying outside the specific area.
- The touch panel input device according to any one of claims 1 to 5, wherein the line-of-sight presence determination unit determines that the user's line of sight is in the gaze-absent state when a specific trajectory is drawn on the screen of the touch panel.
- The touch panel input device according to any one of claims 1 to 6, wherein the line-of-sight presence determination unit determines that the user's line of sight is in the gaze-absent state when there is no operation on the screen of the touch panel for a time equal to or longer than a predetermined third threshold.
- The touch panel input device according to claim 1 or 2, further comprising a camera that outputs a camera image by photographing the user's face, wherein the line-of-sight presence determination unit performs the determination according to the camera image.
- The touch panel input device according to any one of claims 1 to 8, wherein, when the line-of-sight presence determination unit has determined the gaze-absent state, the operation determination unit increases the selection value when a predetermined area on a first end side of the screen of the touch panel is touched, and decreases the selection value when a second end side of the screen of the touch panel, opposite to the first end side, is touched.
- The touch panel input device according to any one of claims 1 to 8, wherein, when the line-of-sight presence determination unit has determined the gaze-absent state, the operation determination unit selects, as the selection value, a value corresponding to the position at which an operation of tracing the screen is performed on the screen of the touch panel.
- The touch panel input device according to any one of claims 1 to 8, wherein, when the line-of-sight presence determination unit has determined the gaze-absent state, the operation determination unit increases the selection value when an operation of rotating a plurality of contact points in a first rotation direction is performed on the screen of the touch panel, and decreases the selection value when an operation of rotating them in a second rotation direction, opposite to the first rotation direction, is performed.
- The touch panel input device according to any one of claims 1 to 8, wherein the touch gesture determination device further includes: a gesture operation storage unit that stores history information of the touch gesture operation; and a gesture correction unit that corrects the recognition of the touch gesture operation by the operation determination unit based on the history information stored in the gesture operation storage unit.
- The touch panel input device according to claim 12, wherein, when the line-of-sight presence determination unit has determined the gaze-absent state, the operation determination unit increases the selection value when an operation of rotating a plurality of contact points in a first rotation direction is performed on the screen of the touch panel, and decreases the selection value when an operation of rotating them in a second rotation direction, opposite to the first rotation direction, is performed, and the gesture correction unit corrects the position and trajectory of the circle drawn by the rotating operation.
- The touch panel input device according to claim 12, wherein, in an operation of repeatedly drawing a circle by moving one contact point on the screen of the touch panel, the initial position of the contact point is the centroid or center point of a plurality of contact points.
- The touch panel input device according to claim 12, wherein, when the line-of-sight presence determination unit has determined the gaze-absent state, the gesture correction unit acquires, for the period from a predetermined time ago up to the present, the movement amount of three or more contact points input on the screen of the touch panel, the expansion/contraction amount indicated by the change in the distance between two contact points, and the rotation angle, obtains the total movement amount of the contact points, the total change in the distance between the contact points, and the total change in the rotation direction, and outputs, as the intended operation, the operation corresponding to the largest of these totals.
- The touch panel input device according to any one of claims 9, 10, and 13, wherein, when the line-of-sight presence determination unit has determined the gaze-present state, the operation determination unit confirms the selection value when a multi-point touch operation occurs on the screen of the touch panel.
- The touch panel input device according to any one of claims 11, 14, and 15, wherein, when the line-of-sight presence determination unit has determined the gaze-absent state, the operation determination unit confirms the selection value when a double tap occurs on the screen of the touch panel.
- The touch panel input device according to any one of claims 9 to 11, 13, and 15, wherein, when the line-of-sight presence determination unit has determined the gaze-absent state, the operation determination unit confirms the selection value if there is no operation on the screen of the touch panel for a predetermined time.
- A touch gesture determination device that receives operation information from a touch panel that displays an operation image on a screen, accepts a touch gesture operation by a user, and outputs the operation information corresponding to the touch gesture operation, and that generates a selection value based on the operation information, the touch gesture determination device comprising: a line-of-sight presence determination unit that determines whether the touch gesture operation is an operation in a gaze-present state, in which the user's line of sight is directed at the screen, or an operation in a gaze-absent state, in which the user's line of sight is not directed at the screen, and outputs line-of-sight determination information indicating the result of the determination; an operation mode switching unit that outputs command information based on the line-of-sight determination information; an operation determination unit that generates the selection value based on the operation information, using a determination method according to the command information; and a display control unit that causes the touch panel to display an image according to the command information as the operation image, wherein the operation determination unit includes: a display component gesture determination unit that, in the gaze-present state, determines the selection value from operation information based on a touch gesture operation performed on a display component displayed on the touch panel as the operation image; and a whole-screen gesture determination unit that, in the gaze-absent state, determines the selection value from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
- A touch gesture determination method for receiving operation information from a touch panel that displays an operation image on a screen, accepts a touch gesture operation by a user, and outputs the operation information corresponding to the touch gesture operation, and for generating a selection value based on the operation information, the method comprising: a line-of-sight presence determination step of determining whether the touch gesture operation is an operation in a gaze-present state, in which the user's line of sight is directed at the screen, or an operation in a gaze-absent state, in which the user's line of sight is not directed at the screen, and outputting line-of-sight determination information indicating the result of the determination; an operation determination step of generating the selection value based on the operation information, using a determination method according to command information based on the line-of-sight determination information; and a display control step of causing the touch panel to display an image according to the command information as the operation image, wherein, in the operation determination step, the selection value is determined, in the gaze-present state, from operation information based on a touch gesture operation performed on a display component displayed on the touch panel as the operation image, and, in the gaze-absent state, from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
- A touch gesture determination program for causing a computer, in order to receive operation information from a touch panel that displays an operation image on a screen, accepts a touch gesture operation by a user, and outputs the operation information corresponding to the touch gesture operation, and to generate a selection value based on the operation information, to execute: a line-of-sight presence determination process of determining whether the touch gesture operation is an operation in a gaze-present state, in which the user's line of sight is directed at the screen, or an operation in a gaze-absent state, in which the user's line of sight is not directed at the screen, and outputting line-of-sight determination information indicating the result of the determination; an operation determination process of generating the selection value based on the operation information, using a determination method according to command information based on the line-of-sight determination information; and a display control process of causing the touch panel to display an image according to the command information as the operation image, wherein, in the operation determination process, the selection value is determined, in the gaze-present state, from operation information based on a touch gesture operation performed on a display component displayed on the touch panel as the operation image, and, in the gaze-absent state, from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020197017714A KR102254794B1 (en) | 2016-12-26 | 2016-12-26 | Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program |
CN201680091771.3A CN110114749B (en) | 2016-12-26 | 2016-12-26 | Touch panel input device, touch gesture determination method, and recording medium |
DE112016007545.6T DE112016007545T5 (en) | 2016-12-26 | 2016-12-26 | TASTFIELD INPUT DEVICE, TASTGEST EVALUATION DEVICE, TASTGEST ASSESSMENT PROCEDURE, AND TASTGEST ASSESSMENT PROGRAM |
JP2017522564A JP6177482B1 (en) | 2016-12-26 | 2016-12-26 | Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program |
PCT/JP2016/088610 WO2018122891A1 (en) | 2016-12-26 | 2016-12-26 | Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/088610 WO2018122891A1 (en) | 2016-12-26 | 2016-12-26 | Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018122891A1 true WO2018122891A1 (en) | 2018-07-05 |
Family
ID=59559217
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/088610 WO2018122891A1 (en) | 2016-12-26 | 2016-12-26 | Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program |
Country Status (5)
Country | Link |
---|---|
JP (1) | JP6177482B1 (en) |
KR (1) | KR102254794B1 (en) |
CN (1) | CN110114749B (en) |
DE (1) | DE112016007545T5 (en) |
WO (1) | WO2018122891A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021240671A1 (en) * | 2020-05-27 | 2021-12-02 | 三菱電機株式会社 | Gesture detection device and gesture detection method |
WO2021240668A1 (en) * | 2020-05-27 | 2021-12-02 | 三菱電機株式会社 | Gesture detection device and gesture detection method |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6952942B2 (en) * | 2019-09-04 | 2021-10-27 | 三菱電機株式会社 | Touch panel device, operation identification method, and operation identification program |
CN111352529B (en) * | 2020-02-20 | 2022-11-08 | Oppo(重庆)智能科技有限公司 | Method, device, terminal and storage medium for reporting touch event |
CN113495620A (en) * | 2020-04-03 | 2021-10-12 | 百度在线网络技术(北京)有限公司 | Interactive mode switching method and device, electronic equipment and storage medium |
JP7014874B1 (en) | 2020-09-24 | 2022-02-01 | レノボ・シンガポール・プライベート・リミテッド | Information processing equipment and information processing method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000231446A (en) * | 1999-02-10 | 2000-08-22 | Sharp Corp | Display integrated type tablet device and storage medium stored with automatic tablet correction program |
JP2003240560A (en) * | 2002-02-13 | 2003-08-27 | Alpine Electronics Inc | Display controller using eyes |
JP2006017478A (en) * | 2004-06-30 | 2006-01-19 | Xanavi Informatics Corp | Navigation system |
JP2007302223A (en) * | 2006-04-12 | 2007-11-22 | Hitachi Ltd | Non-contact input device for in-vehicle apparatus |
JP2008024070A (en) * | 2006-07-19 | 2008-02-07 | Xanavi Informatics Corp | On-vehicle information terminal |
JP2008195142A (en) * | 2007-02-09 | 2008-08-28 | Aisin Aw Co Ltd | Operation supporting device and method for on-vehicle equipment |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101055193A (en) * | 2006-04-12 | 2007-10-17 | 株式会社日立制作所 | Noncontact input operation device for in-vehicle apparatus |
KR101252169B1 (en) * | 2011-05-27 | 2013-04-05 | 엘지전자 주식회사 | Mobile terminal and operation control method thereof |
US20140002376A1 (en) * | 2012-06-29 | 2014-01-02 | Immersion Corporation | Method and apparatus for providing shortcut touch gestures with haptic feedback |
KR102268045B1 (en) * | 2014-06-27 | 2021-06-22 | 엘지전자 주식회사 | An apparatus and method for proceeding information of products being displayed in the show window |
JP6385173B2 (en) | 2014-07-15 | 2018-09-05 | 三菱電機株式会社 | User judgment method on elevator touch panel type destination floor registration operation panel and elevator touch panel type destination floor registration operation panel |
WO2016035207A1 (en) | 2014-09-05 | 2016-03-10 | 三菱電機株式会社 | Control system for in-vehicle device |
KR102216944B1 (en) | 2014-09-23 | 2021-02-18 | 대원정밀공업(주) | Pumping device for vehicle seat |
KR102073222B1 (en) * | 2014-12-18 | 2020-02-04 | 한국과학기술원 | User terminal and method for providing haptic service of the same |
- 2016-12-26 CN CN201680091771.3A patent/CN110114749B/en active Active
- 2016-12-26 WO PCT/JP2016/088610 patent/WO2018122891A1/en active Application Filing
- 2016-12-26 KR KR1020197017714A patent/KR102254794B1/en active IP Right Grant
- 2016-12-26 JP JP2017522564A patent/JP6177482B1/en active Active
- 2016-12-26 DE DE112016007545.6T patent/DE112016007545T5/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000231446A (en) * | 1999-02-10 | 2000-08-22 | Sharp Corp | Display integrated type tablet device and storage medium stored with automatic tablet correction program |
JP2003240560A (en) * | 2002-02-13 | 2003-08-27 | Alpine Electronics Inc | Display controller using eyes |
JP2006017478A (en) * | 2004-06-30 | 2006-01-19 | Xanavi Informatics Corp | Navigation system |
JP2007302223A (en) * | 2006-04-12 | 2007-11-22 | Hitachi Ltd | Non-contact input device for in-vehicle apparatus |
JP2008024070A (en) * | 2006-07-19 | 2008-02-07 | Xanavi Informatics Corp | On-vehicle information terminal |
JP2008195142A (en) * | 2007-02-09 | 2008-08-28 | Aisin Aw Co Ltd | Operation supporting device and method for on-vehicle equipment |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021240671A1 (en) * | 2020-05-27 | 2021-12-02 | 三菱電機株式会社 | Gesture detection device and gesture detection method |
WO2021240668A1 (en) * | 2020-05-27 | 2021-12-02 | 三菱電機株式会社 | Gesture detection device and gesture detection method |
Also Published As
Publication number | Publication date |
---|---|
CN110114749A (en) | 2019-08-09 |
JPWO2018122891A1 (en) | 2018-12-27 |
JP6177482B1 (en) | 2017-08-09 |
DE112016007545T5 (en) | 2019-09-19 |
KR102254794B1 (en) | 2021-05-21 |
CN110114749B (en) | 2022-02-25 |
KR20190087510A (en) | 2019-07-24 |
Similar Documents
Publication | Title |
---|---|
JP6177482B1 (en) | Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program |
US9111076B2 (en) | Mobile terminal and control method thereof |
JP4275151B2 (en) | Red-eye correction method and apparatus using user-adjustable threshold |
US20090262187A1 (en) | Input device |
JP5889397B2 (en) | Information terminal, input reception control method, and input reception control program |
KR20150090840A (en) | Device and method for shielding region of display screen of device |
TWI658396B (en) | Interface control method and electronic device using the same |
JP5222967B2 (en) | Mobile device |
US20150103001A1 (en) | Touch control method and electronic device using the same |
JP2007172303A (en) | Information input system |
CN107450820B (en) | Interface control method and mobile terminal |
US20200034032A1 (en) | Electronic apparatus, computer-readable non-transitory recording medium, and display control method |
JP2015176311A (en) | Terminal and control method |
US10802620B2 (en) | Information processing apparatus and information processing method |
JP2015118507A (en) | Method, device, and computer program for selecting object |
JP6329373B2 (en) | Electronic device and program for controlling electronic device |
WO2017215211A1 (en) | Picture display method based on intelligent terminal having touch screen, and electronic apparatus |
JP6616379B2 (en) | Electronics |
JP2020017218A (en) | Electronic device, control program, and display control method |
WO2018123701A1 (en) | Electronic device, method for control thereof and program |
JP6686885B2 (en) | Information processing apparatus, information processing method, and program |
JP7179334B2 (en) | Gesture recognition device and program for gesture recognition device |
US20210278899A1 (en) | Display control method, display control system and wearable device |
JP2017102676A (en) | Portable terminal device, operation device, information processing method, and program |
WO2014013752A1 (en) | Display device, control method therefor, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2017522564
Country of ref document: JP
Kind code of ref document: A

121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 16925576
Country of ref document: EP
Kind code of ref document: A1

ENP | Entry into the national phase |
Ref document number: 20197017714
Country of ref document: KR
Kind code of ref document: A

122 | EP: PCT application non-entry in European phase |
Ref document number: 16925576
Country of ref document: EP
Kind code of ref document: A1