CN110114749A - Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program - Google Patents

Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program

Info

Publication number
CN110114749A
Authority
CN
China
Prior art keywords
sight
touch
picture
touch gestures
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201680091771.3A
Other languages
Chinese (zh)
Other versions
CN110114749B (en)
Inventor
佐佐木雄一
森健太郎
堀淳志
永田绚子
丸山清泰
石川美穗
山崎聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp
Publication of CN110114749A
Application granted
Publication of CN110114749B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R 16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/36 Input/output arrangements for on-board computers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G08G 1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G 1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch panel input device (1) includes: a line-of-sight presence determination unit (12) that determines whether a touch gesture operation is an operation performed in a with-line-of-sight state or an operation performed in a without-line-of-sight state and outputs line-of-sight determination information (A2); an operation mode switching unit (13) that outputs command information (A3) based on the line-of-sight determination information (A2); an operation determination unit (14) that generates a selection value based on operation information (A0) by a determination method that follows the command information (A3); and a display control unit (16) that causes the touch panel (20) to display an image corresponding to the command information (A3). The operation determination unit (14) includes: a display-component gesture determination unit (141) that, in the with-line-of-sight state, determines the selection value from operation information (A0) based on a touch gesture operation performed on a display component shown as an operation image; and a whole-screen gesture determination unit (142) that, in the without-line-of-sight state, determines the selection value from operation information (A0) based on a touch gesture operation performed on the entire screen.

Description

Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program
Technical field
The present invention relates to a touch panel input device that accepts touch gesture operations and outputs a signal based on those operations, and to a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program that receive operation information corresponding to a touch gesture operation and output a signal based on the received operation information.
Background art
In general, a user of a touch panel input device performs touch gesture operations while watching a GUI (Graphical User Interface) screen displayed on the touch panel. However, while performing a task such as driving a car or adjusting the position of a camera, the user cannot perform touch gesture operations while watching the screen of the touch panel; that is, the user cannot operate in the state in which the line of sight is directed at the screen of the touch panel (hereinafter the "with-line-of-sight state").
As a countermeasure, Patent Document 1 proposes a device that identifies an operation component displayed on the touch panel (hereinafter a "display component") from the similarity between a shape obtained by connecting the contact points of multiple fingers touching the touch panel and a predetermined shape.
In addition, Patent Document 2 proposes a device that, when the area of a region defined by the positions of multiple fingers touching the touch panel is equal to or greater than a predetermined threshold, judges that the user is a person who does not watch the screen of the touch panel, that is, a person (for example, a visually impaired person) who performs touch gesture operations in the state in which the line of sight is not directed at the screen of the touch panel (hereinafter the "without-line-of-sight state").
Prior art documents
Patent documents
Patent Document 1: International Publication No. 2016/035207
Patent Document 2: Japanese Unexamined Patent Application Publication No. 2016-020267
Summary of the invention
Problem to be solved by the invention
However, in the devices described in Patent Documents 1 and 2 above, the user must consciously perform a touch gesture operation that forms a specific shape with multiple fingers, which can reduce attention to other tasks performed in parallel with the touch gesture operation.
In addition, in the device described in Patent Document 1, the user must direct the line of sight at the screen of the touch panel and perform the touch gesture operation on a display component shown on the touch panel, so it is sometimes difficult to perform the touch gesture operation correctly on the display component.
In addition, in the device described in Patent Document 2, the user must perform the touch gesture operation within the operation effective area of the touch panel, so appropriate operation information sometimes cannot be input without watching the screen of the touch panel.
An object of the present invention is to provide a touch panel input device that can accept input based on touch gesture operations easily and correctly by appropriately switching between the content of an operation mode for the with-line-of-sight state, in which touch gesture operations use the display components shown on the touch panel, and the content of an operation mode for the without-line-of-sight state, in which the entire screen of the touch panel is treated as the area that accepts touch gesture operations, and to provide a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program that can accept such input easily and correctly by appropriately switching between the content of the operation mode for the with-line-of-sight state and the content of the operation mode for the without-line-of-sight state.
Means for solving the problems
A touch panel input device according to one aspect of the present invention includes: a touch panel that displays an operation image on its screen, accepts a touch gesture operation by a user, and outputs operation information corresponding to the touch gesture operation; and a touch gesture determination device that receives the operation information and generates a selection value based on the operation information. The touch gesture determination device includes: a line-of-sight presence determination unit that determines whether the touch gesture operation is an operation in the with-line-of-sight state, in which the user's line of sight is directed at the screen, or an operation in the without-line-of-sight state, in which the user's line of sight is not directed at the screen, and outputs line-of-sight determination information indicating the result of the determination; an operation mode switching unit that outputs command information based on the line-of-sight determination information; an operation determination unit that generates the selection value based on the operation information by a determination method that follows the command information; and a display control unit that causes the touch panel to display an image corresponding to the command information as the operation image. The operation determination unit includes: a display-component gesture determination unit that, in the with-line-of-sight state, determines the selection value from operation information based on a touch gesture operation performed on a display component shown on the touch panel as the operation image; and a whole-screen gesture determination unit that, in the without-line-of-sight state, determines the selection value from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
A touch gesture determination method according to another aspect of the present invention receives operation information from a touch panel, which displays an operation image on its screen, accepts a touch gesture operation by a user, and outputs the operation information corresponding to the touch gesture operation, and generates a selection value based on the operation information. The touch gesture determination method includes the following steps: a line-of-sight presence determination step of determining whether the touch gesture operation is an operation in the with-line-of-sight state, in which the user's line of sight is directed at the screen, or an operation in the without-line-of-sight state, in which the user's line of sight is not directed at the screen, and outputting line-of-sight determination information indicating the result of the determination; an operation determination step of generating the selection value based on the operation information by a determination method that follows command information based on the line-of-sight determination information; and a display control step of causing the touch panel to display an image corresponding to the command information as the operation image. In the operation determination step, in the with-line-of-sight state the selection value is determined from operation information based on a touch gesture operation performed on a display component shown on the touch panel as the operation image, and in the without-line-of-sight state the selection value is determined from operation information based on a touch gesture operation performed on the entire screen of the touch panel.
Effects of the invention
According to the present invention, input based on touch gesture operations can be performed easily and correctly by appropriately switching between the content of the operation mode for the with-line-of-sight state, in which touch gesture operations use the display components shown on the touch panel, and the content of the operation mode for the without-line-of-sight state, in which the entire screen of the touch panel is treated as the area that accepts touch gesture operations.
Brief description of the drawings
Fig. 1 is a functional block diagram showing the general configuration of the touch panel input device according to Embodiment 1 of the present invention.
Fig. 2 is a diagram showing an example of the hardware configuration of the touch panel input device according to Embodiment 1.
Figs. 3(a) and 3(b) are diagrams showing examples of the screen of the touch panel in the touch panel input device according to Embodiment 1.
Fig. 4 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 1.
Fig. 5 is a diagram showing an example of the screen of the touch panel in a modification of the touch panel input device according to Embodiment 1.
Fig. 6 is a functional block diagram showing the general configuration of the touch panel input device according to Embodiment 2 of the present invention.
Fig. 7 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 2.
Fig. 8 is a diagram showing how the history information of touch gesture operations is obtained in the touch panel input device according to Embodiment 2.
Fig. 9 is a diagram showing an example in which a touch gesture operation performed in the without-line-of-sight state, that is, without watching the screen of the touch panel, cannot be input as intended.
Fig. 10 is a diagram showing an example of a touch gesture operation performed in the without-line-of-sight state, that is, without watching the screen of the touch panel, in the touch panel input device according to Embodiment 2.
Fig. 11 is a diagram showing an example in which the direction, angle, and zoom amount of a camera in a camera imaging device communicating with the touch panel input device according to Embodiment 2 are adjusted without watching the screen of the touch panel input device.
Fig. 12 is a functional block diagram showing the general configuration of the touch panel input device according to Embodiment 3 of the present invention.
Fig. 13 is a diagram showing an example of the hardware configuration of the touch panel input device according to Embodiment 3.
Fig. 14 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device in the touch panel input device according to Embodiment 3.
Figs. 15(a) and 15(b) are diagrams showing the method of determining the with-line-of-sight state and the without-line-of-sight state in the touch panel input device according to Embodiment 3.
Specific embodiments
Embodiments of the present invention are described below with reference to the drawings. In the following description, a touch panel input device includes a touch panel having a touch operation screen (operation screen) and a touch gesture determination device that receives operation information from the touch panel. The touch panel input device is mounted on a target device, or is connected to the target device so that the two can communicate, and can therefore be applied to the operation screen of an electrical appliance, a camera, a factory machine, a vehicle such as a car, ship, or aircraft, or a portable information terminal such as a smartphone or tablet terminal, each serving as the target device. The touch panel input device can supply a signal (for example, a selection value) based on the operation information input by a touch gesture operation (also simply called a "touch operation") on the operation screen of the touch panel to the target device on which it is mounted or with which it can communicate.
The touch panel is a touch gesture input device that accepts touch gesture operations by the user. A touch gesture operation is an input operation based on a specific movement of the user's fingers, palm, or fingers and palm. Touch gesture operations can include a tap, in which the operation screen of the touch panel is lightly struck with a finger; a flick, in which a finger strikes and sweeps across the operation screen; and a swipe, in which a finger slides along the operation screen. Touch gesture operations can also include a drag, in which a display component on the touch panel is moved with a finger; a pinch-in, in which the interval between multiple fingers on the operation screen is narrowed; and a pinch-out, in which the interval between multiple fingers on the operation screen is widened. Touch gesture operations may also include operations performed with a stylus used as a pen-type input aid.
" 1 " embodiment 1
" 1-1 " structure
Fig. 1 is the functional block diagram for showing the outline structure of panel input device 1 of embodiments of the present invention 1.Such as Shown in Fig. 1, the panel input device 1 of embodiment 1 has touch gestures decision maker 10 and touch panel 20.Touch hand Gesture decision maker 10 is that the touch gestures for the touch gestures determination method and embodiment 1 for being able to carry out embodiment 1 determine journey The device of sequence.
As shown in Fig. 1, the touch panel 20 includes an operation panel unit 21 that accepts touch gesture operations by the user and outputs operation information (also called "touch information") A0 corresponding to the touch gesture operation, and a display panel unit 22 that is overlaid on the operation panel unit 21 and can display operation images such as a GUI screen. The display panel unit 22 is, for example, a liquid crystal display.
As shown in Fig. 1, the touch gesture determination device 10 includes an operation information input unit 11, a line-of-sight presence determination unit 12 serving as an operation mode determination unit, an operation mode switching unit 13, an operation determination unit 14, a notification unit 15, and a display control unit 16. The operation determination unit 14 includes a display-component gesture determination unit 141 and a whole-screen gesture determination unit 142.
The operation information input unit 11 receives the operation information (operation signal) A0 output from the operation panel unit 21. The operation information input unit 11 outputs input information A1 corresponding to the received operation information A0 to the line-of-sight presence determination unit 12 and the operation determination unit 14. The input information A1 is information corresponding to the operation information A0, and may be identical to the operation information A0.
Based on the input information A1 received from the operation information input unit 11, the line-of-sight presence determination unit 12 determines whether the touch gesture operation is an operation performed while the user watches the screen of the touch panel 20, that is, an operation in the with-line-of-sight state in which the user's line of sight is directed at the screen, or an operation performed while the user does not watch the screen of the touch panel 20, that is, an operation in the without-line-of-sight state in which the user's line of sight is not directed at the screen. The operation mode that assumes the with-line-of-sight state is, for example, an operation mode in which touch gesture operations are performed on the display components shown on the touch panel 20. The operation mode that assumes the without-line-of-sight state is, for example, an operation mode in which the entire screen of the touch panel 20 is used as a single operation effective area that accepts touch gesture operations.
Based on the input information A1 derived from the user's touch gesture operation, the line-of-sight presence determination unit 12 sends line-of-sight determination information A2, which indicates whether the touch gesture operation is an operation in the with-line-of-sight state or an operation in the without-line-of-sight state, to the operation mode switching unit 13. For example, the line-of-sight presence determination unit 12 has a storage unit (for example, the memory in Fig. 2, described later) that stores predetermined touch gesture operation patterns, and can make the determination from the similarity between the touch gesture operation indicated by the input information A1 and the patterns stored in the storage unit.
Based on the line-of-sight determination information A2 received from the line-of-sight presence determination unit 12, the operation mode switching unit 13 outputs command information A3 for switching (setting) the display content of the screen of the touch panel 20, the method by which the operation determination unit 14 evaluates the input information A1 based on touch gesture operations from the touch panel 20, and the notification method of the notification unit 15.
The operation determination unit 14 receives the command information A3 from the operation mode switching unit 13 and switches the method of evaluating the input information A1 according to the command information A3.
The whole-screen gesture determination unit 142 evaluates touch gesture operations performed on the entire screen of the touch panel 20 and determines the selection value A7, which is the output signal based on the input information A1. Regarding the determination of touch gesture operations on the entire screen, in touch gesture operations in the with-line-of-sight state this determination is used only in a limited, relatively narrow region of the touch panel 20 (a part of the screen), whereas in touch gesture operations in the without-line-of-sight state it is used over a wide region of the touch panel 20 (the entire screen).
The display-component gesture determination unit 141 evaluates touch gesture operations based on the display components shown as operation images by the display panel unit 22 and on the input information A1, and determines the selection value A7, which is the output signal based on the display components and the input information A1. Regarding the determination of touch gesture operations on display components, in touch gesture operations in the with-line-of-sight state this determination is used over a wide region of the touch panel 20 (the entire screen), whereas in touch gesture operations in the without-line-of-sight state it is used only in a limited, relatively narrow region (a part of the screen).
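To make this division of labour concrete, the following is a minimal sketch of how a touch could be routed either to a display-component hit test (the role of unit 141) or to a whole-screen handler (the role of unit 142) depending on the current mode. All names and the button data layout are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch only; hypothetical names, not the patent's implementation.
WITH_SIGHT, WITHOUT_SIGHT = "with_sight", "without_sight"

def hit_button(x, y, buttons):
    """Unit 141 idea: return the value of the displayed button containing the touch point."""
    for left, top, width, height, value in buttons:
        if left <= x < left + width and top <= y < top + height:
            return value
    return None  # the touch did not land on any display component

def determine_selection(mode, x, y, buttons, whole_screen_handler):
    """Route a touch to the display-component test (141) or the whole-screen handler (142)."""
    if mode == WITH_SIGHT:
        return hit_button(x, y, buttons)
    return whole_screen_handler(x, y)

# Example: a 100x40 button at (10, 10) carrying the selection value 3.
value = determine_selection(WITH_SIGHT, 20, 25, [(10, 10, 100, 40, 3)],
                            lambda x, y: None)  # returns 3
```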
The display control unit 16 outputs an image signal A6 for the operation image shown by the display panel unit 22 of the touch panel 20. The display control unit 16 changes the display content of the display panel unit 22 according to operation determination information A4 received from the operation determination unit 14 and the command information A3 received from the operation mode switching unit 13.
The notification unit 15 switches the method of notifying the user between touch gesture operations in the with-line-of-sight state and touch gesture operations in the without-line-of-sight state. The notification unit 15 issues a notification of the announced content, or outputs a notification signal, according to the command information A3. For example, based on the input information A1 received via the operation determination unit 14, the notification unit 15 notifies the user of the status of the touch gesture operation by, for example, sound, screen display, vibration of a vibrator, or lighting of a lamp. In this way, the notification unit 15 changes the notification content according to the operation determination information A4 received from the operation determination unit 14 and the command information A3 received from the operation mode switching unit 13. When the notification is given by sound, the notification unit 15 outputs a notification signal to a loudspeaker serving as an audio output unit. The loudspeaker is shown in Fig. 2, described later. When the notification is an image display, the notification unit 15 sends notification information A5 to the display control unit 16, and the display control unit 16 sends an image signal based on the notification information to the display panel unit 22.
The selection value A7 output from the operation determination unit 14 is the selection value that the operation determination unit 14 has determined from the input information A1; an application program or the like of the device equipped with the touch panel input device 1 performs equipment control and the like according to the selection value A7.
Fig. 2 is a diagram showing an example of the hardware (H/W) configuration of the touch panel input device 1 of Embodiment 1. As shown in Fig. 2, the touch panel input device 1 of Embodiment 1 includes the touch panel 20, a processor 31, a memory 32, and a loudspeaker 33. The touch gesture determination device 10 shown in Fig. 1 can be realized (for example, by a computer) using the memory 32 as a storage device that stores the touch gesture determination program as software and the processor 31 as an information processing unit that executes the touch gesture determination program stored in the memory 32. In this case, the structural elements 11 to 16 in Fig. 1 correspond to the processor 31 in Fig. 2 executing the touch gesture determination program. A part of the touch gesture determination device 10 shown in Fig. 1 may also be realized by the memory 32 shown in Fig. 2 and the processor 31 that executes the touch gesture determination program.
The loudspeaker 33 is an audio output unit used, for example, to notify the status of a touch gesture operation performed in the without-line-of-sight state by sound such as a spoken announcement. Instead of the loudspeaker 33, or in addition to it, the touch panel input device 1 may have auxiliary devices such as a vibrator, a lamp, or a transmitter that wirelessly transmits a notification signal.
Figs. 3(a) and 3(b) are diagrams showing examples of the screen of the touch panel 20 and of touch gesture operations in the touch panel input device 1 of Embodiment 1.
Fig. 3(a) shows an example of the screen of the touch panel 20 in the operation mode for the with-line-of-sight state, that is, the screen shown when the line-of-sight presence determination unit 12 has determined from the input information A1 representing the touch gesture operation that the with-line-of-sight state applies. By swiping the screen to the left or right with a finger, the user can switch the display content of the screen. The list 231 is a list displayed when the line-of-sight presence determination unit 12 has determined that the with-line-of-sight state applies. The list 231 consists of multiple buttons (the numbered square areas), and dragging (sliding) a finger upward or downward moves the list 231 in the same direction. For example, when the button 232 is touched with a finger, the line-of-sight presence determination unit 12 determines that the with-line-of-sight state applies, and tapping the button (touching it with a finger and lifting the finger within a predetermined time without moving it) decides (confirms) the selection value.
Fig. 3(b) shows an example of the screen of the touch panel 20 in the operation mode for the without-line-of-sight state, that is, the screen shown when the line-of-sight presence determination unit 12 has determined from the input information A1 representing the touch gesture operation that the without-line-of-sight state applies. For example, the cancel region 241 in the without-line-of-sight state is the rectangular area surrounding ">> cancel >>"; when a finger is stroked (slid) in the predetermined direction, from left to right, within this rectangular area, the operation mode for the without-line-of-sight state is cancelled and the device switches to the operation mode for the with-line-of-sight state.
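As a rough illustration of this cancel-region check, the sketch below tests whether a recorded one-finger trace starts inside the cancel rectangle and moves rightward to its right edge without straying above or below it. The function name, the region tuple layout, and the exact rejection rules are assumptions made for this sketch, not details specified by the patent.

```python
def is_cancel_swipe(trace, region):
    """trace: list of (x, y) contact points in time order for one finger.
    region: (left, top, width, height) of the cancel rectangle 241.
    Returns True for a left-to-right stroke across ">> cancel >>"."""
    if not trace:
        return False
    left, top, width, height = region
    x0, y0 = trace[0]
    # The stroke must begin inside the cancel region.
    if not (left <= x0 <= left + width and top <= y0 <= top + height):
        return False
    for x, y in trace[1:]:
        if not (top <= y <= top + height):
            return False      # strayed above or below the region: not a cancel swipe
        if x >= left + width:
            return True       # reached the right edge of the region
    return False
```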
" 1-2 " movement
Fig. 4 is the movement (touching for showing the touch gestures decision maker 10 in the panel input device 1 of embodiment 1 Touch gesture determination method) flow chart.
In step ST101, the operation mode switching unit 13 sets the operation mode for the with-line-of-sight state as the initial operation mode. At this time, the operation mode switching unit 13 outputs the command information A3 so that the screen of the touch panel 20 becomes the screen of the operation mode for the with-line-of-sight state and the input method of the touch panel 20 becomes the input method using the display components of that operation mode. The initial operation mode may also be set to the operation mode for the without-line-of-sight state.
In the following step ST102, the operation information input unit 11 obtains the operation information A0 (the coordinates of the finger contact points, the contact state of the fingers, the identification numbers of multiple contact points, and so on) from the touch panel 20 and stores it in the storage unit (for example, the memory 32 shown in Fig. 2).
In the following step ST103, the line-of-sight presence determination unit 12 determines, from the history of the operation information A0 stored in the storage unit (information related to the finger contact points and the like), whether the touch gesture operation is an operation in the with-line-of-sight state or an operation in the without-line-of-sight state; that is, it performs the line-of-sight presence determination (operation mode determination).
In step ST103, the line-of-sight presence determination unit 12 determines that the without-line-of-sight state applies when, for example, any one of the following decision conditions (1) to (3) is satisfied, and determines that the with-line-of-sight state applies otherwise.
Decision condition (1): a finger touches one point on the screen of the touch panel 20 and the contact position does not move for a predetermined threshold time or more.
Decision condition (2): multiple fingers touch multiple contact points on the screen of the touch panel 20 and the contact continues for a predetermined threshold time or more.
Decision condition (3): a palm continuously touches the screen of the touch panel 20 (that is, continuous contact over an area wider than a predetermined threshold continues for a predetermined threshold time or more).
However, other criteria may also be used as the determination criteria.
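A minimal sketch of how decision conditions (1) to (3) could be evaluated against the stored contact history is shown below. The ContactSample structure and all threshold values are illustrative assumptions, not values from the patent; in particular, palm contact is approximated here simply by a large number of simultaneous contact points.

```python
from dataclasses import dataclass

@dataclass
class ContactSample:
    t: float        # time stamp in seconds
    points: list    # list of (x, y) contact coordinates at time t

HOLD_TIME = 1.0     # predetermined threshold time [s] (assumed value)
MOVE_TOL = 10.0     # allowed movement for a "stationary" finger [px] (assumed)
PALM_POINTS = 5     # contact-point count treated as palm contact (assumed)

def is_without_sight(history):
    """Return True when the contact history satisfies any of conditions (1)-(3)."""
    if not history:
        return False
    if history[-1].t - history[0].t < HOLD_TIME:
        return False
    counts = [len(s.points) for s in history]
    # Condition (1): one finger held at one point without moving.
    if all(c == 1 for c in counts):
        xs = [s.points[0][0] for s in history]
        ys = [s.points[0][1] for s in history]
        if max(xs) - min(xs) <= MOVE_TOL and max(ys) - min(ys) <= MOVE_TOL:
            return True
    # Condition (2): several fingers kept in continuous contact.
    if all(c >= 2 for c in counts):
        return True
    # Condition (3): palm contact, approximated by a wide cluster of contact points.
    if max(counts) >= PALM_POINTS:
        return True
    return False
```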
When decision condition (3) for the without-line-of-sight state is examined in step ST103, the coordinates and the number of the detected contact points may be unstable (may change greatly over time). Therefore, palm contact may also be determined by using the history of the operation information A0 (input information A1) during the period from the current time back to a predetermined earlier time, based on the degree of coordinate shift (coordinate movement amount) of the contact points in that period (for example, when the maximum movement amount exceeds a predetermined threshold).
In addition, when a palm touches the screen as in decision condition (3) for the without-line-of-sight state in step ST103, the operation information A0 from the touch panel 20 may indicate an abnormal contact state. Therefore, palm contact may also be determined by using the history of the operation information A0 during the period from the current time back to a predetermined earlier time, when a history entry indicating an abnormal contact state exists in that period.
In addition, when a palm touches the screen as in decision condition (3) for the without-line-of-sight state in step ST103, the device may misjudge that a finger has left the screen of the touch panel 20 and, as a result, may misjudge that a touch gesture operation has been performed on a button, that is, a display component on the screen. Therefore, using the history of the operation information A0 during the period from the current time back to a predetermined earlier time, standby processing may be inserted that waits until no unstable touch is detected (that is, until the number and positions of the contact points no longer change greatly over time), until all fingers have been off the touch panel 20 for a predetermined time or more, or until no abnormal contact state is reported; when these are detected, the contact is determined to have been palm contact.
In addition, in step ST103, the line-of-sight presence determination unit 12 determines that the with-line-of-sight state applies when a finger touches the display on the screen and strokes it. For example, when the screen for the without-line-of-sight state shown in Fig. 3(b) is displayed, the line-of-sight presence determination unit 12 determines that the operation is a switching operation to the with-line-of-sight state when the touch starts inside the cancel region 241 and the finger slides from the left mark ">>" through the characters "cancel" past the right mark ">>" and is then lifted, or when the finger slides from the left mark ">>" through the characters "cancel" past the right mark ">>" and beyond the right boundary of the cancel region 241. Regarding the determination of the switching operation to the with-line-of-sight state in step ST103, the with-line-of-sight state is not determined when, in the without-line-of-sight state, the stroke continues from outside the cancel region 241 into it, when the stroke leaves the cancel region 241 through its upper or lower side midway, or when the finger strokes rightward through the cancel region 241 and then returns leftward.
In addition, the condition under which the line-of-sight presence determination unit 12 determines in step ST103 that the operation is a switching operation to the with-line-of-sight state is not limited to sliding the finger from the left mark ">>" in Fig. 3(b) through the characters "cancel" past the right mark ">>"; it may also be touching the cancel region 241 in the without-line-of-sight state and moving rightward by a predetermined distance or more.
In step ST104, when the line-of-sight presence determination has changed (YES), the processing proceeds to steps ST105 to ST108, and the operation mode switching unit 13 outputs the command information A3 for switching the operation screen of the touch panel 20, switching the input method, and switching the notification method of the notification unit 15. When no change in the line-of-sight presence determination has occurred in step ST104 (NO), the processing proceeds to step ST109.
In step ST105, the operation determination unit 14 receives the command information A3 from the operation mode switching unit 13 and issues an order to switch the input method effective for the entire screen, and the whole-screen gesture determination unit 142 switches the input method effective for the entire screen.
In the following step ST106, the operation determination unit 14 receives the command information A3 from the operation mode switching unit 13 and issues an order to switch the input method effective for the display components, and the display-component gesture determination unit 141 switches the input method effective for the display components. The order of steps ST105 and ST106 may be reversed.
In the following step ST107, the display control unit 16 receives the command information A3 ordering a screen switch from the operation mode switching unit 13 and switches the screen. For example, upon receiving the command information A3 ordering a screen switch, the display control unit 16 displays, in the with-line-of-sight state, the screen for the with-line-of-sight state shown, for example, in Fig. 3(a), and displays, in the without-line-of-sight state, the screen for the without-line-of-sight state shown, for example, in Fig. 3(b).
In the following step ST108, the notification unit 15 receives the command information A3 from the operation mode switching unit 13 ordering a switch of the notification content, and switches the notification content. For example, in the with-line-of-sight state, the notification unit 15 outputs from the loudspeaker a spoken announcement such as "Press and hold the screen to switch to the mode for operating without watching the screen." When switching to the without-line-of-sight state, it announces, for example, "A long press was performed, so the screen has been switched to the operation screen for persons with disabilities." Then, in the without-line-of-sight state, it makes the loudspeaker output a spoken announcement such as "Touching the upper part of the screen increases the selection value, and touching the lower part decreases it. The selection value is confirmed by a double tap." At this time, the notification unit 15 may also announce the current selection value by spoken announcement whenever a tap occurs on the upper or lower part of the screen.
In step ST108, the notification by the notification unit 15 is not limited to a spoken announcement; the notification may also be given by a display device other than the touch panel 20 being operated, by a spoken announcement from a smartwatch, or by vibrating the touch panel.
In step ST109, the operation determination unit 14 decides (confirms) the selection value according to the input information. For example, in the operation mode for the with-line-of-sight state, when a displayed button is tapped, the operation determination unit 14 decides the number shown on the button as the selection value. On the other hand, in the operation mode for the without-line-of-sight state, the operation determination unit 14 increases the selection value when a touch occurs on the upper part of the screen and decreases it when a touch occurs on the lower part of the screen. When two touches occur and the fingers leave at the same time, the operation determination unit 14 confirms the selection value. When a touch gesture operation has been accepted by the operation determination unit 14, the display control unit 16 changes the display content according to the touch gesture operation, and the notification unit 15 notifies the operation status according to the touch gesture operation.
In addition, in step ST109, when the cancel region 241 in the without-line-of-sight state is merely tapped, this is not a stroke across the cancel region 241 and is therefore treated as an operation on the lower part of the screen, so the selection value is decreased.
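The whole-screen behaviour of step ST109 can be pictured as a small state update: a tap in the upper half of the screen increments the selection value, a tap in the lower half decrements it, and a double tap confirms it. The class name, the DOUBLE_TAP_WINDOW value, and the use of a double tap as the single confirmation gesture are simplifying assumptions for this sketch (the text also mentions confirmation by lifting two touching fingers at once).

```python
DOUBLE_TAP_WINDOW = 0.4   # seconds between taps counted as a double tap (assumed value)

class WholeScreenSelector:
    """Selection-value handling in the without-line-of-sight operation mode (sketch)."""
    def __init__(self, initial_value=0):
        self.value = initial_value
        self.confirmed = False
        self._last_tap_time = None

    def on_tap(self, y, screen_height, t):
        # Two taps in quick succession confirm the current selection value.
        if self._last_tap_time is not None and t - self._last_tap_time <= DOUBLE_TAP_WINDOW:
            self.confirmed = True
        elif y < screen_height / 2:
            self.value += 1   # tap on the upper half of the screen
        else:
            self.value -= 1   # tap on the lower half of the screen
        self._last_tap_time = t
        return self.value, self.confirmed
```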
The storage unit of the line-of-sight presence determination unit 12 stores an operation decision table such as Table 1; processing according to the content of this table realizes the line-of-sight presence determination of step ST103 by the determination unit 12 and the decision actions of the operation determination unit 14 in step ST109.
[table 1]
" 1-3 " effect
As described above, in the embodiment 1, the operation that can be operated according to the touch gestures based on user is believed Breath A0 has the touch gestures under sight state to operate or ignore the touch gestures operation under linear state, change operation determination unit The notification method of the determination method of input information A1, the display control method of display control section 16, notification unit 15 in 14.Cause This, user is clicked in picture using finger in the state of watching the picture of touch panel 20 and is shown as display unit Button, or using the list shown in finger stroke picture, the touch gestures operation that thus, it is possible to carry out having sight state.This Outside, user can accept and operate shape by using what the touch gestures of picture entirety operated in the state of not watching picture The notice of state is operated.In this way, in the embodiment 1, in touch panel 20, either carrying out the touch for having sight state Gesture operation still carries out ignoring the touch gestures operation of linear state, the touch gestures for all being suitably judged to having under sight state It operates or ignores the touch gestures operation under linear state, therefore, can correctly carry out the input behaviour operated based on touch gestures Make.
In addition, in the embodiment 1, in the case where there is sight state, carrying out not as the touch gestures for having sight state In the case where operating the operation generated, it can be determined that be the touch gestures operation for ignoring linear state, in the case where ignoring linear state, It has carried out in the case where not operating the operation generated as the touch gestures for ignoring linear state, it can be determined that being to have sight state Touch gestures operation.In this case, sight, whether there is or not the misinterpretation of judgement is less, therefore, the judgement precision of operation mode improves.
" 1-4 " variation
As the sight in the step ST103 of Fig. 4 whether there is or not another example of judgement, whether there is or not determination units 12 to produce for sight In the case where 3 raw or more touch point (contact point of finger), it is judged to ignoring the operation mode under linear state.Such case Under, the operation (such as the operation such as kneading, rotation) for being able to use 2 touches in touch panel 20 is operated as touch gestures.
In addition, as another example of the line-of-sight presence determination in step ST103 of Fig. 4, the without-line-of-sight state may be determined when the content of the screen of the touch panel 20 has not changed for a predetermined time or more since a finger started touching it. For example, even when a stroke operation is performed with one finger touching the screen of the touch panel 20, if no screen switching by a left or right swipe and no list scrolling by an up or down stroke occurs, the line-of-sight presence determination unit 12 can determine that the without-line-of-sight state applies.
In addition, as another example of the line-of-sight presence determination in step ST103 of Fig. 4, the line-of-sight presence determination unit 12 may determine that the with-line-of-sight state applies when any of vertical, horizontal, or diagonal movement of the finger contact point is detected.
In addition, as another example of the line-of-sight presence determination in step ST103 of Fig. 4, the line-of-sight presence determination unit 12 may determine that the with-line-of-sight state applies when the movement of the finger contact point traces a particular trajectory such as a circle, a square, or a character.
In addition, as a condition for the determination of the line-of-sight presence determination unit 12 in step ST103 of Fig. 4 to change from the without-line-of-sight state to the with-line-of-sight state, the passage of a predetermined time with no input operation from the touch panel 20 may also be set. For example, the device may be configured so that, after switching to the without-line-of-sight state, it automatically returns to the with-line-of-sight state when the state with no touch gesture operation continues for a predetermined threshold time or more. In this case, by relying on the spoken announcement of the notification unit 15 and again performing a touch gesture operation, such as a long press, that is determined to indicate the without-line-of-sight state, the user can perform touch gesture operations without watching the screen.
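The auto-return variation above amounts to a simple inactivity timeout, as in the following sketch; the class name and the TIMEOUT value are assumptions, not values given in the patent.

```python
import time

TIMEOUT = 30.0   # seconds without any touch gesture operation (assumed value)

class ModeTimeout:
    """Return to the with-line-of-sight mode when no operation occurs for TIMEOUT seconds."""
    def __init__(self):
        self.last_operation = time.monotonic()

    def on_operation(self):
        self.last_operation = time.monotonic()

    def should_return_to_with_sight(self):
        return time.monotonic() - self.last_operation >= TIMEOUT
```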
In addition, as another operation method for the without-line-of-sight state in step ST109 of Fig. 4, instead of the method of increasing and decreasing the selection value by dividing the screen into upper and lower tap areas, an image in which multiple buttons for numerical input are arranged may be displayed on the screen as shown in Fig. 5, and the notification unit 15 may read out the item of the button corresponding to the position stroked by the finger.
In addition, as another operation method for the without-line-of-sight state in step ST109 of Fig. 4, a two-finger touch may be made on the screen and rotated like turning a dial, increasing the selection value when the rotation is clockwise and decreasing it when it is counterclockwise. In this case, if a double tap is set as the confirmation operation, the confirmation operation and the rotation operation may be confused; therefore, for example, the confirmation operation for the selection value may be limited to the case where two taps occur in succession within a predetermined time (that is, a double tap occurs).
" 2 " embodiment 2
" 2-1 " structure
In above embodiment 1, the touch gestures operation ignored under the operation mode of linear state is defined in and sets touch surface The picture entirety of plate 20 is to operate the simple touch gestures operation of effective coverage.In embodiment 2, illustrate ignoring threadiness It under the operation mode of state, prevents from not watching being not intended to operate caused by picture, is able to carry out more complicated touch gestures operation Device and method.
Fig. 6 is a functional block diagram showing the general configuration of a touch panel input device 2 according to Embodiment 2 of the present invention. In Fig. 6, structural elements identical or corresponding to those shown in Fig. 1 are given the same reference labels as in Fig. 1. The touch panel input device 2 includes a touch panel 20 and a touch gesture determination device 10a. The touch gesture determination device 10a is a device that can execute the touch gesture determination method of Embodiment 2 and the touch gesture determination program of Embodiment 2. The touch gesture determination device 10a in Embodiment 2 differs from the touch gesture determination device 10 in Embodiment 1 in that it has a gesture operation storage unit 17 and a gesture correction unit 18. Except for this point, Embodiment 2 is the same as Embodiment 1. Therefore, the following description focuses on the differences.
The gesture operation storage unit 17 stores the history information of the touch gesture operations recognized by the operation determination unit 14. The gesture correction unit 18 corrects the recognition result of the touch gesture operation determined by the operation determination unit 14, based on the touch gesture operation data stored in the gesture operation storage unit 17, and determines a selection value A8.
As shown in Fig. 2, the H/W configuration of the touch panel input device 2 of Embodiment 2 includes the touch panel 20, the processor 31, the memory 32, and the loudspeaker 33. The touch gesture determination device 10a shown in Fig. 6 can be realized (for example, by a computer) using the memory 32 as a storage device that stores the touch gesture determination program as software and the processor 31 as an information processing unit that executes the touch gesture determination program stored in the memory 32. In this case, the structural elements 11 to 18 in Fig. 6 correspond to the processor 31 in Fig. 2 executing the touch gesture determination program. A part of the touch gesture determination device 10a shown in Fig. 6 may also be realized by the memory 32 shown in Fig. 2 and the processor 31 that executes the touch gesture determination program.
" 2-2 " movement
Fig. 7 is the movement (touching for showing the touch gestures decision maker 10a in the panel input device 2 of embodiment 2 Touch gesture determination method) flow chart.In Fig. 7, marked to the identical or corresponding processing step of processing step shown in Fig. 4 Number of steps identical with number of steps shown in Fig. 4.The movement and reality of touch gestures decision maker 10a in embodiment 2 Apply the movement of the touch gestures decision maker 10 in mode 1 the difference is that, touch gestures operation determination step There is the processing step (step ST201) and gesture correction process step (step of storage touch gestures operation after ST109 ST202).Other than this point, the movement of the touch gestures decision maker 10a in embodiment 2 is identical as embodiment 1.
In step ST201 of Fig. 7, the gesture operation storage unit 17 stores the touch gesture operations recognized by the operation determination unit 14 as history information. Table 2 shows an example of the history information of touch gesture operations.
[table 2]
Fig. 8 is a diagram showing how the history information of touch gesture operations is obtained in the touch panel input device 2 of Embodiment 2. In step ST201, the history information of touch gesture operations shown in Table 2 is calculated, for example, by the method shown in Fig. 8. In the example of Fig. 8, a perpendicular Q2 is drawn from the point P3 touched in the current touch gesture operation to the straight line Q1 connecting the two points P1 and P2 touched in the previous touch gesture operation, and the movement amount in the rotation direction, that is, the angle variation H [pixels], is the length of the perpendicular Q2. The distance variation D [pixels] is the variation between the current two-point distance L2 (the distance between P1 and P3) and the previous two-point distance L1 (the distance between P1 and P2). The movement amount M [pixels] is the movement between the midpoint I2 of the current two points (P1 and P3) and the midpoint I1 of the previous two points (P1 and P2).
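Under the geometry described for Fig. 8, the three history quantities can be computed as in the sketch below. The function name is an assumption; the point order follows the description (P1 and P2 from the previous operation, P3 from the current one).

```python
import math

def gesture_history_metrics(p1, p2, p3):
    """Return (H, D, M) in pixels for the two-finger gesture history of Fig. 8.
    H: length of the perpendicular from P3 to the line P1-P2 (rotation amount).
    D: change of the two-point distance, |P1-P3| - |P1-P2|.
    M: movement of the midpoint of the two touch points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Perpendicular distance from P3 to the line Q1 through P1 and P2.
    line_len = math.hypot(x2 - x1, y2 - y1)
    h = abs((x2 - x1) * (y1 - y3) - (x1 - x3) * (y2 - y1)) / line_len if line_len else 0.0
    # Change between the current and previous two-point distances.
    d = math.hypot(x3 - x1, y3 - y1) - line_len
    # Movement from the midpoint I1 of (P1, P2) to the midpoint I2 of (P1, P3).
    i1 = ((x1 + x2) / 2, (y1 + y2) / 2)
    i2 = ((x1 + x3) / 2, (y1 + y3) / 2)
    m = math.hypot(i2[0] - i1[0], i2[1] - i1[1])
    return h, d, m
```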
In step ST202, when a touch gesture operation is performed without watching the screen, the gesture correction unit 18 corrects the recognition result, based on the history information stored in the gesture operation storage unit 17, so that operation information that does not reflect the user's intention is not input to the touch gesture determination device 10a.
Fig. 9 shows an example of a touch gesture operation in which, when the operation is performed without watching the screen of the touch panel 20 in the touch panel input device 2, the intended operation information is not input to the touch gesture determination device 10a. As shown in Fig. 9, consider the following touch gesture operation: the selection value is increased by repeatedly drawing a circle with a finger clockwise, the first rotation direction, on the screen of the touch panel 20, and is decreased by repeatedly drawing a circle with a finger counterclockwise, the second rotation direction, which is opposite to the first.
When a circle is drawn with a finger while the eyes are not directed at the touch panel 20, the position of the circle shifts slightly even if the user intends to draw a circle, and the angle variation H [pixel] relative to the movement amount M [pixel] changes; it is therefore difficult to make the adjustment needed to obtain a predetermined operation amount. Therefore, the gesture correction unit 18 estimates the shape of the circle by a fitting method such as the least squares method, using the touch track information accumulated in the gesture operation storage unit 17 during the period from the current time point back to a time point a specified time earlier, and updates the position of the circle. After updating the position of the circle, the gesture correction unit 18 calculates the angle formed by the previous contact point of the finger, the center of the circle, and the current contact point of the finger, and thereby obtains the variation of the angle. In this way, as shown in Fig. 9, the touch gesture operation of repeatedly drawing a circle can be realized even when the eyes are not directed at the screen of the touch panel 20.
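A least-squares circle fit over the accumulated track, followed by the angle computation around the updated centre, could look like the following sketch. It uses a simple algebraic (Kåsa-style) fit; the function names and the use of NumPy are assumptions made for illustration, not the patent's implementation.

```python
import math
import numpy as np

def fit_circle(points):
    """Least-squares circle fit to the accumulated touch track.

    points: list of (x, y) samples from the look-back window.
    Fits x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense and
    returns (cx, cy, r) of the estimated circle.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x * x + y * y)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = math.sqrt(cx * cx + cy * cy - c)
    return cx, cy, r

def rotation_delta(center, prev_point, cur_point):
    """Signed angle (radians) swept between the previous and current
    contact points, measured around the fitted circle centre; the sign
    distinguishes the two rotation directions."""
    cx, cy = center
    a_prev = math.atan2(prev_point[1] - cy, prev_point[0] - cx)
    a_cur = math.atan2(cur_point[1] - cy, cur_point[0] - cx)
    # Wrap into [-pi, pi) so repeated circles accumulate correctly.
    return (a_cur - a_prev + math.pi) % (2.0 * math.pi) - math.pi
```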
The gesture correction unit 18 may not only update the position of the circle; when the shape of the circle is elliptical, it may also rescale the vertical-to-horizontal ratio of the ellipse to correct it into a circle, recalculate the coordinate track, and again calculate the angle formed by the previous contact point of the finger, the position of the circle center, and the current contact point of the finger.
Before the gesture correction unit 18 corrects the position of the circle, the operation determination unit 14 determines the touch gesture operation of drawing a circle. The initial position of the circle at this time may be the center of the screen of the touch panel 20, or it may be obtained from the track over a specified time by the least squares method or the like.
Fig. 10 is a diagram showing an example of a touch gesture operation performed without watching the screen of the touch panel 20 in the touch panel input device 2 of embodiment 2. The operation that triggers the operation determination unit 14 to start determining the circle-drawing touch gesture operation may be the following: a touch with multiple contact points by multiple fingertips, as shown on the left side of Fig. 10, in which the number of contacting fingers is less than a predetermined number. For example, as shown in Fig. 10, when the contact points become one after a three-finger touch, the operation determination unit 14 may start determining the circle-drawing touch gesture operation with the centroid (center of gravity) of the three positions as the center; thereafter, when history data of a predetermined amount or more has been accumulated, the gesture correction unit 18 performs the correction of the rotation amount based on the update of the circle position.
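A minimal sketch of this trigger, assuming the history is kept as a list of per-frame contact lists, is shown below; the three-finger limit and the helper name are illustrative.

```python
def circle_gesture_start(contact_history, max_fingers=3):
    """Decide whether to start the circle-drawing gesture and pick the
    initial circle centre.

    contact_history: list of touch frames, each a list of (x, y) contacts,
                     oldest first.
    Returns the initial centre (centroid of the preceding multi-finger
    touch) once the contacts have dropped to a single point, else None.
    """
    if not contact_history or len(contact_history[-1]) != 1:
        return None  # gesture starts only once a single contact remains
    # Find the most recent earlier frame with the multi-finger touch.
    for frame in reversed(contact_history[:-1]):
        if 2 <= len(frame) <= max_fingers:
            cx = sum(p[0] for p in frame) / len(frame)
            cy = sum(p[1] for p in frame) / len(frame)
            return (cx, cy)
    return None
```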
As another example of a touch gesture operation that does not follow the intention because it is performed without watching the screen, multi-point touch gesture operations such as pinching, rotating, and parallel translation can be cited. Fig. 11 is a diagram showing a scene in which, in the touch panel input device 2 of embodiment 2, the direction, angle, and zoom amount of a camera are adjusted without watching the touch panel input device 2 at hand.
In Fig. 11, the camera display device 401 displays the image captured by the camera. In addition, on the camera display device 401, the notification unit 15 superimposes and displays the current camera operation status (direction, magnification, rotation angle).
The operation screen 402 at hand delivers touch information to the operation information input unit 11, switches the screen display between the line-of-sight present state and the line-of-sight absent state, and changes the display content of the screen according to the touch gesture operation determined by the operation determination unit 14.
In the operation mode of the line-of-sight present state, the screen 403 is the screen displayed when the line-of-sight presence/absence determination unit 12 determines the line-of-sight present state; a manual is displayed, and a touch gesture operation of clicking a hyperlink to switch the image can be performed.
In the operation mode of the line-of-sight absent state, the screen 404 is the screen displayed when the line-of-sight presence/absence determination unit 12 determines the line-of-sight absent state.
The line-of-sight presence/absence determination unit 12 determines a touch operation of three or more points, and when a predetermined time has elapsed after all fingers have left, it determines the line-of-sight present state. The operation determination unit 14 determines whether a zoom, move, or rotate touch gesture operation is present after the touch operation of three or more points, and the camera images 407, 408, and 409 show the images after the direction and zoom amount of the camera have been changed according to the respective operation amounts of the zoom, move, and rotate operations.
As another example of a touch gesture operation that does not follow the intention because it is performed without watching the screen, there is an operation of pressing the touch panel forcefully. For example, when the line-of-sight presence/absence determination unit determines the line-of-sight absent state by a long press, there is a tendency for the finger to be pressed forcefully when the long press is performed without watching the screen.
When the finger is pressed forcefully, the touch position tends to shift slightly from the fingertip toward the finger pad. Therefore, the gesture correction unit 18 judges a long press from the history information of the touch gesture operation in Table 2, using the property that the movement amount of the finger position does not vary by more than a certain amount, or that the movement amount falls below a certain value as the pressing pressure weakens.
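As an illustration only, a long-press check based on these two properties might look like the following; the pixel thresholds are assumed values, not figures from the patent.

```python
def is_long_press(samples, drift_limit=12.0, settle_limit=2.0):
    """Judge a long press from the touch history of one contact.

    samples:      list of (t, x, y) touch samples, oldest first.
    drift_limit:  maximum total drift (pixels) allowed for the slight
                  fingertip-to-finger-pad shift of a hard press.
    settle_limit: per-update movement (pixels) below which the contact
                  is considered settled once the pressing pressure eases.
    """
    if len(samples) < 4:
        return False
    xs = [s[1] for s in samples]
    ys = [s[2] for s in samples]
    # Total drift of the contact away from its initial position.
    total_drift = max(
        math.hypot(x - xs[0], y - ys[0]) for x, y in zip(xs, ys)
    )
    # Per-update movement over the later half of the press.
    recent = samples[len(samples) // 2:]
    recent_move = max(
        math.hypot(b[1] - a[1], b[2] - a[2])
        for a, b in zip(recent, recent[1:])
    )
    # Long press: overall drift stays small and the contact settles.
    return total_drift <= drift_limit and recent_move <= settle_limit

import math  # required by the helpers above
```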
The correction processing of the gesture correction unit 18 can be applied not only to the long press but also to clicks and two-point operations.
In the case of a touch gesture operation of zooming, moving, or rotating performed without watching the screen, the touch gesture operation is sometimes misrecognized as a rotation operation even though zooming is intended. Therefore, the gesture correction unit 18 obtains, for the period from the current time point back to a time point a specified time earlier, accumulated in the gesture operation storage unit 17, the aggregate value of the movement amounts of the three or more contact points, the aggregate value of the variation of the distance between two contact points, and the aggregate value of the variation of the rotation direction, and outputs the operation having the largest of these aggregate values as the intended operation. In this way, the intended operation can be performed even when the touch gesture operation is done without watching the screen.
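The selection of the intended operation by comparing the aggregate values can be sketched as follows, assuming the per-update variation amounts are already available from the stored history; the names are illustrative.

```python
def intended_operation(deltas):
    """Pick the intended multi-touch operation over the look-back window.

    deltas: list of per-update (move, pinch, rotate) variation amounts
            accumulated from the current time point back to the
            specified earlier time point.
    Returns 'move', 'pinch' or 'rotate' - the operation whose aggregate
    variation is largest, output as the intended operation.
    """
    move_total = sum(abs(d[0]) for d in deltas)
    pinch_total = sum(abs(d[1]) for d in deltas)
    rotate_total = sum(abs(d[2]) for d in deltas)
    totals = {"move": move_total, "pinch": pinch_total, "rotate": rotate_total}
    return max(totals, key=totals.get)
```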
" 2-3 " effect
As described above, in embodiment 2, the history of the gestures recognized by the operation determination unit 14 is taken out and corrected; this makes it possible to prevent unintended operations caused by operating without watching the screen, and even more complicated touch gesture operations can be performed without watching the screen.
" 3 " embodiment 3
" 3-1 " structure
In embodiments 1 and 2, the line-of-sight presence/absence determination unit 12 performs the line-of-sight presence/absence determination (operation mode determination) based on the operation information A0 of the touch gesture operation. However, in the touch panel input devices 1 and 2 of embodiments 1 and 2, the operation used for the line-of-sight presence/absence determination cannot be used as an ordinary touch gesture operation for information input. Therefore, in embodiment 3, instead of performing the line-of-sight presence/absence determination (operation mode determination) based on the operation information A0 of the touch gesture operation, a camera serving as an imaging device that captures the user's face is provided, and the line-of-sight presence/absence determination is performed based on the camera image of the user's face.
Fig. 12 is a functional block diagram showing the outline structure of the touch panel input device 3 of embodiment 3. In Fig. 12, structural elements identical or corresponding to the structural elements shown in Fig. 1 are given the same labels as in Fig. 1. The touch panel input device 3 has the touch panel 20 and the touch gesture determination device 10b. The touch gesture determination device 10b is a device capable of executing the touch gesture determination method of embodiment 3 and the touch gesture determination program of embodiment 3. The touch gesture determination device 10b in embodiment 3 differs from the touch gesture determination device 10 in embodiment 1 in that the line-of-sight presence/absence determination unit 12b performs the line-of-sight presence/absence determination (operation mode determination) based on the camera image of the user's face obtained by the camera image input unit 19 from the camera 34. Apart from this point, embodiment 3 is the same as embodiment 1. The camera image input unit 19 and the line-of-sight presence/absence determination unit 12b of embodiment 3 can also be applied to the touch panel input device 2 of embodiment 2.
The camera image input unit 19 receives the camera image (image data) obtained by the camera 34 capturing the user's face. The line-of-sight presence/absence determination unit 12b, serving as the operation mode determination unit, receives the camera image of the user's face from the camera image input unit 19, performs image processing of extracting the image data of the whole face and of the eyes, and detects the direction of the face and the direction of the line of sight of the eyes. If the eyes are directed at the screen of the touch panel 20, the operation mode of the line-of-sight present state is determined; if the eyes are not directed at the screen of the touch panel 20, the operation mode of the line-of-sight absent state is determined.
Fig. 13 is a diagram showing an example of the hardware structure of the touch panel input device 3 of embodiment 3. In Fig. 13, structural elements identical or corresponding to the structural elements shown in Fig. 2 are given the same labels as in Fig. 2. The hardware structure of the touch panel input device 3 of embodiment 3 differs from that of the touch panel input device 1 of embodiment 1 in that the camera 34 is provided and in the touch gesture determination program stored in the memory 32b. Apart from these points, the hardware structure of embodiment 3 is the same as that of embodiment 1.
The camera 34 stores the camera image (image data) of the user's face obtained by camera capture in the memory 32b serving as the storage unit, and the processor 31 acquires the camera image from the memory 32b in the processing of the camera image input unit 19 and performs image processing by means of the line-of-sight presence/absence determination unit 12b.
" 3-2 " movement
Fig. 14 is a flowchart showing the operation (touch gesture determination method) of the touch gesture determination device 10b in the touch panel input device 3 of embodiment 3. In Fig. 14, processing steps identical or corresponding to the processing steps shown in Fig. 4 are given the same step numbers as in Fig. 4. The operation of the touch gesture determination device 10b in embodiment 3 differs from that of the touch gesture determination device 10 in embodiment 1 in that the line-of-sight presence/absence determination unit 12b performs the line-of-sight presence/absence determination based on the camera image of the user's face obtained through the camera image input unit 19 (steps ST301 and ST302), and in that the operation information A0 of the touch gesture operation is obtained after the line-of-sight presence/absence determination based on the camera image (step ST303). Apart from these points, the operation of the touch panel input device 3 of embodiment 3 is the same as in embodiment 1. Therefore, the operations that differ from those of embodiment 1 are described below.
In step ST301 of Fig. 14, the camera image input unit 19 receives the image data of the user's face from the camera 34 and stores the image data in the memory 32b.
(a) and (b) of Fig. 15 are diagrams showing the method of determining the line-of-sight present state and the line-of-sight absent state in the touch panel input device 3 of embodiment 3. In step ST302, the line-of-sight presence/absence determination unit 12b receives the image data of the user's face stored in the memory 32b by the camera image input unit 19 shown in (a) of Fig. 15, performs image processing of extracting the direction of the face and the line of sight, and, as shown on the left side of (b) of Fig. 15, determines the line-of-sight present state if the line of sight is directed toward the screen of the touch panel 20. As shown on the right side of (b) of Fig. 15, if the line of sight is not directed toward the screen of the touch panel 20, the line-of-sight presence/absence determination unit 12b determines the line-of-sight absent state.
For example, the direction of the face can be detected by a cascade classifier based on Haar feature values, and the line of sight can be detected by, for example, applying a contour extraction filter to the image of the eye portions of the extracted face, extracting the outer contour of the iris, and fitting an ellipse to estimate the position of the pupil. The detection methods of the face direction and the line of sight are not limited to these examples, and other methods may be used.
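As a rough illustration of such camera-based determination, the following sketch uses OpenCV's bundled Haar cascades and merely treats "a frontal face with two detectable eyes" as the line-of-sight present state; this is a much cruder criterion than the iris-contour ellipse fitting described above, and the cascade files and thresholds are assumptions.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def sight_present(frame_bgr):
    """Rough line-of-sight presence check from one camera frame.

    Returns True (treated as the line-of-sight present state) when a
    frontal face with at least two detectable eyes is found, i.e. the
    user is roughly facing the screen; otherwise False (line-of-sight
    absent state).
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) >= 2:
            return True
    return False
```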
" 3-3 " effect
As described above, in the touch panel input device 3, the touch gesture determination device 10b, the touch gesture determination method, and the touch gesture determination program of embodiment 3, whether the user's line of sight is in the line-of-sight present state, in which it is directed at the screen of the touch panel 20, or in the line-of-sight absent state, which is any state other than the line-of-sight present state, is determined using the camera image of the camera 34. As a result, the user can have the line-of-sight presence/absence determination (operation mode determination) performed easily and accurately without any touch gesture operation for switching the operation mode.
In addition, according to the touch panel input device 3, the touch gesture determination device 10b, the touch gesture determination method, and the touch gesture determination program of embodiment 3, the line-of-sight presence/absence determination is performed based on the camera image, so the restriction on the operations usable in the touch gesture operations originally intended for information input can be eliminated.
In addition, in a mobile communication terminal equipped with the touch panel input device 3 of embodiment 3, when repairing equipment or when watching a large-screen display device installed at a distant position, a touch gesture operation based on button operation of ordinary display components can be performed while watching the touch panel 20 (line-of-sight present state), and the whole screen of the touch panel 20 can be set as the effective region that accepts touch gesture operations when not watching the touch panel 20 (line-of-sight absent state). Therefore, when the touch panel 20 is not being watched, the mobile communication terminal can be used as an operation device for the maintenance or installation of equipment or for a large-screen display device at a distance.
The above embodiments 1 to 3 are illustrations of the present invention, and various changes can be made within the scope of the present invention.
Label declaration
1, 2, 3: touch panel input device; 10, 10a, 10b: touch gesture determination device; 11: operation information input unit; 12, 12b: line-of-sight presence/absence determination unit (operation mode determination unit); 13: operation mode switching unit; 14: operation determination unit; 15: notification unit; 16: display control unit; 17: gesture operation storage unit; 18: gesture correction unit; 19: camera image input unit; 20: touch panel; 21: display panel unit; 22: operation panel unit; 31: processor; 32, 32b: memory; 33: loudspeaker; 34: camera; 141: display component gesture determination unit; 142: whole-screen gesture determination unit; A0: operation information; A1: input information; A2: line-of-sight determination information; A3: command information; A4: operation determination information; A5: notification information; A6: image signal; A7, A8: selection value (output signal); A9: camera image.

Claims (21)

1. A touch panel input device, comprising:
a touch panel that displays an operation image on a screen, accepts a touch gesture operation of a user, and outputs operation information corresponding to the touch gesture operation; and
a touch gesture determination device that receives the operation information and generates a selection value based on the operation information,
characterized in that
the touch gesture determination device includes:
a line-of-sight presence/absence determination unit that determines whether the touch gesture operation is an operation in a line-of-sight present state, in which the user's line of sight is directed at the screen, or an operation in a line-of-sight absent state, in which the user's line of sight is not directed at the screen, and outputs line-of-sight determination information indicating the result of the determination;
an operation mode switching unit that outputs command information based on the line-of-sight determination information;
an operation determination unit that generates the selection value based on the operation information by a determination method corresponding to the command information; and
a display control unit that causes the touch panel to display an image corresponding to the command information as the operation image,
the operation determination unit including:
a display component gesture determination unit that, in the line-of-sight present state, determines the selection value based on the operation information of a touch gesture operation performed on a display component displayed on the touch panel as the operation image; and
a whole-screen gesture determination unit that, in the line-of-sight absent state, determines the selection value based on the operation information of a touch gesture operation performed on the whole screen of the touch panel.
2. The touch panel input device according to claim 1, characterized in that
the touch gesture determination device further has a notification unit that issues a notification of content corresponding to the command information or outputs a notification signal.
3. The touch panel input device according to claim 1 or 2, characterized in that
when, as the touch gesture operation, any of a long-press operation of continuously pressing the touch panel for a time equal to or longer than a predetermined 1st threshold, a multi-point operation of bringing multiple fingers into contact with the touch panel, and a palm operation of bringing a palm into contact with the touch panel occurs, the line-of-sight presence/absence determination unit determines that the user's line of sight is in the line-of-sight absent state.
4. The touch panel input device according to claim 1 or 2, characterized in that
when, as the touch gesture operation, an operation that does not change the display content of the touch panel continues for a time equal to or longer than a predetermined 2nd threshold, the line-of-sight presence/absence determination unit determines that the user's line of sight is in the line-of-sight absent state.
5. The touch panel input device according to any one of claims 1 to 4, characterized in that
when a stroke operation is performed in a predetermined direction within a specific region of the touch panel in such a manner as not to go out of the specific region, the line-of-sight presence/absence determination unit determines that the user's line of sight is in the line-of-sight absent state.
6. The touch panel input device according to any one of claims 1 to 5, characterized in that
when a specific track is drawn on the screen of the touch panel, the line-of-sight presence/absence determination unit determines that the user's line of sight is in the line-of-sight absent state.
7. The touch panel input device according to any one of claims 1 to 6, characterized in that
when no operation is performed on the screen of the touch panel for a time equal to or longer than a predetermined 3rd threshold, the line-of-sight presence/absence determination unit determines that the user's line of sight is in the line-of-sight absent state.
8. The touch panel input device according to claim 1 or 2, characterized in that
the touch panel input device further has a camera that outputs a camera image by capturing the face of the user, and
the line-of-sight presence/absence determination unit performs the determination based on the camera image.
9. The touch panel input device according to any one of claims 1 to 8, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight absent state,
the operation determination unit increases the selection value when a predetermined 1st end side of the screen of the touch panel is touched, and decreases the selection value when a 2nd end side of the screen of the touch panel, which is the side opposite to the 1st end side, is touched.
10. The touch panel input device according to any one of claims 1 to 8, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight absent state,
the operation determination unit selects, as the selection value, a value corresponding to the position on the screen of the touch panel at which a stroke operation on the screen has been performed.
11. The touch panel input device according to any one of claims 1 to 8, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight absent state,
the operation determination unit increases the selection value according to an operation of rotating multiple contact points in a 1st rotation direction on the screen of the touch panel, and decreases the selection value according to an operation of rotating in a 2nd rotation direction, which is the direction opposite to the 1st rotation direction.
12. The touch panel input device according to any one of claims 1 to 8, characterized in that
the touch gesture determination device further includes:
a gesture operation storage unit that stores history information of the touch gesture operation; and
a gesture correction unit that corrects the recognition of the touch gesture operation by the operation determination unit based on the history information stored by the gesture operation storage unit.
13. The touch panel input device according to claim 12, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight absent state,
the operation determination unit increases the selection value according to an operation of rotating multiple contact points in a 1st rotation direction on the screen of the touch panel, and decreases the selection value according to an operation of rotating in a 2nd rotation direction, which is the direction opposite to the 1st rotation direction, and
the gesture correction unit corrects the position and track of the circle drawn in the rotation operation.
14. The touch panel input device according to claim 12, characterized in that
in an operation of moving one contact point on the screen of the touch panel to repeatedly draw a circle, the initial position of the contact point is the centroid or center point of multiple contact points.
15. The touch panel input device according to claim 12, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight absent state,
the gesture correction unit obtains, for the period from the current time point back to a time point a predetermined time earlier, the movement amounts of three or more contact points input on the screen of the touch panel, the zoom amount indicated by the variation of the distance between two contact points, and the rotation angle, finds the aggregate value of the movement amounts of the contact points, the aggregate value of the variation of the distance between the contact points, and the aggregate value of the variation of the rotation direction of the rotation, and outputs the operation having the largest of these aggregate values as the intended operation.
16. The touch panel input device according to any one of claims 9, 10, and 13, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight present state and a touch operation of multiple points on the screen of the touch panel occurs, the operation determination unit determines the selection value.
17. The touch panel input device according to any one of claims 11, 14, and 15, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight absent state and a double tap on the screen of the touch panel occurs, the operation determination unit determines the selection value.
18. The touch panel input device according to any one of claims 9 to 11, 13, and 15, characterized in that
when the line-of-sight presence/absence determination unit determines the line-of-sight absent state and the operation on the screen of the touch panel does not continue for the predetermined time, the operation determination unit determines the selection value.
19. A touch gesture determination device that receives operation information from a touch panel and generates a selection value based on the operation information, the touch panel displaying an operation image on a screen, accepting a touch gesture operation of a user, and outputting the operation information corresponding to the touch gesture operation, characterized in that the touch gesture determination device includes:
a line-of-sight presence/absence determination unit that determines whether the touch gesture operation is an operation in a line-of-sight present state, in which the user's line of sight is directed at the screen, or an operation in a line-of-sight absent state, in which the user's line of sight is not directed at the screen, and outputs line-of-sight determination information indicating the result of the determination;
an operation mode switching unit that outputs command information based on the line-of-sight determination information;
an operation determination unit that generates the selection value based on the operation information by a determination method corresponding to the command information; and
a display control unit that causes the touch panel to display an image corresponding to the command information as the operation image,
the operation determination unit including:
a display component gesture determination unit that, in the line-of-sight present state, determines the selection value based on the operation information of a touch gesture operation performed on a display component displayed on the touch panel as the operation image; and
a whole-screen gesture determination unit that, in the line-of-sight absent state, determines the selection value based on the operation information of a touch gesture operation performed on the whole screen of the touch panel.
20. A touch gesture determination method that receives operation information from a touch panel and generates a selection value based on the operation information, the touch panel displaying an operation image on a screen, accepting a touch gesture operation of a user, and outputting the operation information corresponding to the touch gesture operation, characterized in that the touch gesture determination method has the following steps:
a line-of-sight presence/absence determination step of determining whether the touch gesture operation is an operation in a line-of-sight present state, in which the user's line of sight is directed at the screen, or an operation in a line-of-sight absent state, in which the user's line of sight is not directed at the screen, and outputting line-of-sight determination information indicating the result of the determination;
an operation determination step of generating the selection value based on the operation information by a determination method corresponding to command information based on the line-of-sight determination information; and
a display control step of causing the touch panel to display an image corresponding to the command information as the operation image,
wherein, in the operation determination step, in the line-of-sight present state, the selection value is determined based on the operation information of a touch gesture operation performed on a display component displayed on the touch panel as the operation image, and, in the line-of-sight absent state, the selection value is determined based on the operation information of a touch gesture operation performed on the whole screen of the touch panel.
21. A touch gesture determination program, characterized in that, in order to receive operation information from a touch panel and generate a selection value based on the operation information, the touch gesture determination program causes a computer to execute the following processing, the touch panel displaying an operation image on a screen, accepting a touch gesture operation of a user, and outputting the operation information corresponding to the touch gesture operation:
line-of-sight presence/absence determination processing of determining whether the touch gesture operation is an operation in a line-of-sight present state, in which the user's line of sight is directed at the screen, or an operation in a line-of-sight absent state, in which the user's line of sight is not directed at the screen, and outputting line-of-sight determination information indicating the result of the determination;
operation determination processing of generating the selection value based on the operation information by a determination method corresponding to command information based on the line-of-sight determination information; and
display control processing of causing the touch panel to display an image corresponding to the command information as the operation image,
wherein, in the operation determination processing, in the line-of-sight present state, the selection value is determined based on the operation information of a touch gesture operation performed on a display component displayed on the touch panel as the operation image, and, in the line-of-sight absent state, the selection value is determined based on the operation information of a touch gesture operation performed on the whole screen of the touch panel.
CN201680091771.3A 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination method, and recording medium Active CN110114749B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/088610 WO2018122891A1 (en) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination device, touch gesture determination method, and touch gesture determination program

Publications (2)

Publication Number Publication Date
CN110114749A true CN110114749A (en) 2019-08-09
CN110114749B CN110114749B (en) 2022-02-25

Family

ID=59559217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680091771.3A Active CN110114749B (en) 2016-12-26 2016-12-26 Touch panel input device, touch gesture determination method, and recording medium

Country Status (5)

Country Link
JP (1) JP6177482B1 (en)
KR (1) KR102254794B1 (en)
CN (1) CN110114749B (en)
DE (1) DE112016007545T5 (en)
WO (1) WO2018122891A1 (en)

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN111352529A (en) * 2020-02-20 2020-06-30 Oppo(重庆)智能科技有限公司 Method, device, terminal and storage medium for reporting touch event
CN113495620A (en) * 2020-04-03 2021-10-12 百度在线网络技术(北京)有限公司 Interactive mode switching method and device, electronic equipment and storage medium
CN114270299A (en) * 2019-09-04 2022-04-01 三菱电机株式会社 Touch panel device, operation recognition method, and operation recognition program

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
WO2021240668A1 (en) * 2020-05-27 2021-12-02 三菱電機株式会社 Gesture detection device and gesture detection method
WO2021240671A1 (en) * 2020-05-27 2021-12-02 三菱電機株式会社 Gesture detection device and gesture detection method
JP7014874B1 (en) 2020-09-24 2022-02-01 レノボ・シンガポール・プライベート・リミテッド Information processing equipment and information processing method

Citations (5)

Publication number Priority date Publication date Assignee Title
JP2000231446A (en) * 1999-02-10 2000-08-22 Sharp Corp Display integrated type tablet device and storage medium stored with automatic tablet correction program
JP2006017478A (en) * 2004-06-30 2006-01-19 Xanavi Informatics Corp Navigation system
CN101055193A (en) * 2006-04-12 2007-10-17 株式会社日立制作所 Noncontact input operation device for in-vehicle apparatus
JP5028043B2 (en) * 2006-07-19 2012-09-19 クラリオン株式会社 In-vehicle information terminal
CN103577103A (en) * 2012-06-29 2014-02-12 英默森公司 Method and apparatus for providing shortcut touch gestures with haptic feedback

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JP3909251B2 (en) * 2002-02-13 2007-04-25 アルパイン株式会社 Screen control device using line of sight
JP2007302223A (en) * 2006-04-12 2007-11-22 Hitachi Ltd Non-contact input device for in-vehicle apparatus
JP2008195142A (en) * 2007-02-09 2008-08-28 Aisin Aw Co Ltd Operation supporting device and method for on-vehicle equipment
KR101252169B1 (en) * 2011-05-27 2013-04-05 엘지전자 주식회사 Mobile terminal and operation control method thereof
KR102268045B1 (en) * 2014-06-27 2021-06-22 엘지전자 주식회사 An apparatus and method for proceeding information of products being displayed in the show window
JP6385173B2 (en) 2014-07-15 2018-09-05 三菱電機株式会社 User judgment method on elevator touch panel type destination floor registration operation panel and elevator touch panel type destination floor registration operation panel
JP6214779B2 (en) 2014-09-05 2017-10-18 三菱電機株式会社 In-vehicle device control system
KR102216944B1 (en) 2014-09-23 2021-02-18 대원정밀공업(주) Pumping device for vehicle seat
KR102073222B1 (en) * 2014-12-18 2020-02-04 한국과학기술원 User terminal and method for providing haptic service of the same

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
JP2000231446A (en) * 1999-02-10 2000-08-22 Sharp Corp Display integrated type tablet device and storage medium stored with automatic tablet correction program
JP2006017478A (en) * 2004-06-30 2006-01-19 Xanavi Informatics Corp Navigation system
CN101055193A (en) * 2006-04-12 2007-10-17 株式会社日立制作所 Noncontact input operation device for in-vehicle apparatus
JP5028043B2 (en) * 2006-07-19 2012-09-19 クラリオン株式会社 In-vehicle information terminal
CN103577103A (en) * 2012-06-29 2014-02-12 英默森公司 Method and apparatus for providing shortcut touch gestures with haptic feedback

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN114270299A (en) * 2019-09-04 2022-04-01 三菱电机株式会社 Touch panel device, operation recognition method, and operation recognition program
CN111352529A (en) * 2020-02-20 2020-06-30 Oppo(重庆)智能科技有限公司 Method, device, terminal and storage medium for reporting touch event
CN111352529B (en) * 2020-02-20 2022-11-08 Oppo(重庆)智能科技有限公司 Method, device, terminal and storage medium for reporting touch event
CN113495620A (en) * 2020-04-03 2021-10-12 百度在线网络技术(北京)有限公司 Interactive mode switching method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
JP6177482B1 (en) 2017-08-09
CN110114749B (en) 2022-02-25
KR20190087510A (en) 2019-07-24
DE112016007545T5 (en) 2019-09-19
JPWO2018122891A1 (en) 2018-12-27
WO2018122891A1 (en) 2018-07-05
KR102254794B1 (en) 2021-05-21

Similar Documents

Publication Publication Date Title
CN110114749A (en) Panel input device, touch gestures decision maker, touch gestures determination method and touch gestures decision procedure
US10438080B2 (en) Handwriting recognition method and apparatus
US10126826B2 (en) System and method for interaction with digital devices
US9733752B2 (en) Mobile terminal and control method thereof
US9329714B2 (en) Input device, input assistance method, and program
WO2017215375A1 (en) Information input device and method
CN108108117B (en) Screen capturing method and device and terminal
CN103502923A (en) Touch and non touch based interaction of a user with a device
WO2017047182A1 (en) Information processing device, information processing method, and program
WO2019033322A1 (en) Handheld controller, and tracking and positioning method and system
CN105320261A (en) Control method for mobile terminal and mobile terminal
CN105808129B (en) Method and device for quickly starting software function by using gesture
WO2018076609A1 (en) Terminal and method for operating terminal
US9948894B2 (en) Virtual representation of a user portion
WO2017215211A1 (en) Picture display method based on intelligent terminal having touch screen, and electronic apparatus
CN104516566A (en) Handwriting input method and device
CN114578956A (en) Equipment control method and device, virtual wearable equipment and storage medium
CN110947180A (en) Information processing method and device in game
CN114327047B (en) Device control method, device control apparatus, and storage medium
CN115167736B (en) Image-text position adjustment method, image-text position adjustment equipment and storage medium
CN115981481A (en) Interface display method, device, equipment, medium and program product
CN117555414A (en) Interaction method, interaction device, electronic equipment and storage medium
CN114637454A (en) Input method, input device and input device
CN117555413A (en) Interaction method, interaction device, electronic equipment and storage medium
CN117555412A (en) Interaction method, interaction device, electronic equipment and storage medium

Legal Events

Code | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant