WO2015093005A1 - Display system - Google Patents

Display system

Info

Publication number
WO2015093005A1
Authority
WO
WIPO (PCT)
Prior art keywords
input device
hand
display device
display
contact
Prior art date
Application number
PCT/JP2014/006126
Other languages
English (en)
Japanese (ja)
Inventor
田中 剛
祥 園田
仁和 下中
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2015093005A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention relates to an input device such as a touch pad, and to a display system including the input device and a display device.
  • as a device having an input device and a display device, a notebook personal computer, for example, is known.
  • in such a notebook personal computer, when an operator's finger contacts the input device, the display device shows a cursor at the position on the display device corresponding to the contact position (for example, Patent Document 1).
  • the display system includes an input device that detects contact, and a display device that is connected to the input device and has data for displaying a hand-shaped cursor.
  • when the input device detects contact, the display device displays the hand-shaped cursor. Since the display system displays a hand-shaped cursor on the display device, the operator can operate more intuitively.
  • FIG. 1 is an overall view of a display system according to an embodiment.
  • FIG. 2 is an overall view when a finger is in contact with the input device in FIG. 1.
  • FIG. 3 is an overall view when an input to the display system is confirmed.
  • FIG. 4 is an overall view when the finger is in contact with the input device while the display device of the display system displays the keyboard screen.
  • FIG. 5 is a diagram when the display device of the display system displays an audio screen.
  • FIG. 6 is a diagram of the display device when a finger is in contact with the input device in FIG. 5.
  • FIG. 7 is a diagram when the display device of the display system displays a handwriting input screen.
  • FIG. 8 is a diagram of the display device when a finger is in proximity to the input device in FIG. 7.
  • FIG. 9 is a screen transition diagram of the display device of the display system.
  • FIG. 10 is a flowchart for determining whether contact with the input device of the display system is from the left side or the right side.
  • FIG. 11 is an explanatory diagram for determining whether contact with the input device of the display system is from the left side or the right side.
  • FIG. 1 is an overall view of a display system according to an embodiment.
  • the display system 10 includes an input device 20 and a display device 30.
  • the display system 10 further includes a cable 40.
  • the input device 20 includes a touch pad 21, a home switch 22, a return switch 23, and a housing 24.
  • the touch pad 21 is, for example, a sensor that measures capacitance. When the operator's finger touches the touch pad 21, the touch pad 21 detects the change in capacitance, detects that it has been touched, and further detects where it has been touched. When the operator's finger approaches the touch pad 21 without touching it, that is, when the finger is in proximity to the touch pad 21, the touch pad 21 detects the change in capacitance, detects that an object is close to it, and also detects the position of that object. The touch pad 21 distinguishes between contact and proximity based on the difference in capacitance.
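As an illustration of this contact/proximity distinction, the following is a minimal sketch in Python. The threshold values and the grid-scanning helper are illustrative assumptions, not values from the patent; a real touch controller would also filter noise and track multiple objects.

```python
# Minimal sketch: distinguishing "contact" from "proximity" (hover) on a
# capacitive touch pad such as touch pad 21. The thresholds are assumed.

HOVER_THRESHOLD = 30    # capacitance delta: object near the surface
TOUCH_THRESHOLD = 120   # capacitance delta: object touching the surface

def classify(delta):
    """Classify one sensor cell from its capacitance change above baseline."""
    if delta >= TOUCH_THRESHOLD:
        return "contact"
    if delta >= HOVER_THRESHOLD:
        return "proximity"
    return "none"

def strongest_response(grid):
    """Return (state, (row, col)) for the strongest cell in a 2D delta grid,
    i.e. both whether something touches or hovers and where it does so."""
    value, pos = max(
        ((v, (r, c)) for r, row in enumerate(grid) for c, v in enumerate(row)),
        key=lambda item: item[0],
    )
    state = classify(value)
    return state, (pos if state != "none" else None)

# Example: a hovering finger raises nearby cells a little; a touch raises
# the cell under the fingertip a lot.
print(strongest_response([[0, 25, 40], [10, 150, 35], [0, 20, 5]]))
# -> ('contact', (1, 1))
```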
  • the home switch 22 is a switch for switching the image displayed on the display device 30 to the main menu 31.
  • the display device 30 displays the main menu 31.
  • the return switch 23 is a switch that returns the input to the input device 20 to the previous state.
  • the housing 24 holds a touch pad 21, a home switch 22, and a return switch 23.
  • the display device 30 is electrically connected to the input device 20 by the cable 40.
  • the display device 30 is a liquid crystal display, a plasma display, an organic electroluminescence display, or the like.
  • the display device 30 displays not only the main menu 31 but also a keyboard screen 32, an audio screen 33, a handwriting input screen 34, and the like which will be described later.
  • the main menu 31 includes a selection menu for calling a keyboard screen 32, an audio screen 33, a handwriting input screen 34, and the like.
  • the display device 30 displays a main menu 31.
  • the main menu 31 is a screen on which a keyboard icon 31A, an audio icon 31B, a navigation icon 31C, a handwriting input icon 31D, a test icon 31E, and a game icon 31F are displayed.
  • FIG. 2 is an overall view when the finger of the left hand 51 is in contact with the input device 20 in FIG. 1. The index finger, which is a part of the left hand 51, contacts the touch pad 21, and at this time the display device 30 displays the left hand cursor 35.
  • the left hand cursor 35 has a transparent fingertip and the entire left hand other than the fingertip is translucent.
  • the left hand cursor 35 overlaps the main menu 31, but at least the portion where the left hand cursor 35 overlaps the keyboard icon 31A, which is part of the selection menu, is transparent, so the left hand cursor 35 does not hide the keyboard icon 31A.
  • the display device 30 has data for displaying the left hand cursor 35.
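To illustrate how a cursor whose fingertip is fully transparent and whose remaining area is translucent can avoid hiding the icon underneath, here is a small alpha-compositing sketch in Python. The pixel values and the 0.5 alpha for the translucent part are assumptions for illustration; the patent does not specify how the cursor image is blended.

```python
# Illustrative "source over" alpha blending of a hand-cursor pixel onto the
# screen. A fingertip pixel carries alpha 0.0, so the icon below stays fully
# visible; the rest of the hand carries a partial alpha (assumed 0.5 here),
# so the icon remains readable through the cursor.

def blend(cursor_rgba, screen_rgb):
    r, g, b, a = cursor_rgba
    sr, sg, sb = screen_rgb
    return (round(r * a + sr * (1 - a)),
            round(g * a + sg * (1 - a)),
            round(b * a + sb * (1 - a)))

fingertip = (230, 190, 160, 0.0)   # fully transparent fingertip
palm      = (230, 190, 160, 0.5)   # translucent rest of the hand
icon_blue = (0, 80, 200)

print(blend(fingertip, icon_blue))  # -> (0, 80, 200): icon unchanged
print(blend(palm, icon_blue))       # -> (115, 135, 180): icon shows through
```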
  • the left hand 51 is in contact with the position on the touch pad 21 corresponding to the keyboard icon 31A of the main menu 31. That is, the display system 10 recognizes that the keyboard icon 31A is selected. At this time, the display device 30 inverts the brightness of the keyboard icon 31A to convey to the operator, in an easy-to-understand manner, which icon is selected.
  • FIG. 3 is an overall view when an input to the display system 10 is confirmed.
  • when the left hand 51 is released from the touch pad 21, the display system 10 confirms the selection of the icon that was being selected. That is, the display system 10 confirms the selection of the keyboard icon 31A.
  • the display device 30 then displays the keyboard screen 32 of FIG. 4.
  • FIG. 4 is an overall view when the finger of the left hand 51 is in contact with the input device 20 while the display device 30 of the display system 10 displays the keyboard screen 32. Since the left hand 51 is in contact with the touch pad 21 of the input device 20, the display device 30 displays the left hand cursor 35. At this time, as in FIG. 2, the left hand cursor 35 is transparent at the fingertip and the entire left hand other than the fingertip is translucent. In the state of FIG. 4, the character “A” on the keyboard screen 32 is selected. When the left hand 51 is released from the touch pad 21 in this state, the display system 10 confirms the selection of the character “A” on the keyboard screen 32.
  • FIG. 5 is a diagram when the display device 30 of the display system 10 displays the audio screen 33.
  • the display device 30 displays the audio screen 33.
  • FIG. 6 is a diagram of the display device 30 when the finger of the right hand is in contact with the input device 20 in FIG.
  • the display device 30 displays the right hand cursor 36.
  • the right hand cursor 36 is transparent at the fingertip, and the entire right hand other than the fingertip is translucent.
  • the display device 30 has data for displaying the right hand cursor 36.
  • FIG. 7 is a diagram when the display device 30 of the display system 10 displays the handwriting input screen 34.
  • the display device 30 displays a handwriting input screen 34.
  • FIG. 8 is a diagram of the display device 30 when a finger is in proximity to the input device 20 in FIG.
  • the handwriting input screen 34 in FIG. 8 is a screen for direct handwriting input with a finger instead of input using a keyboard or icons.
  • the display device 30 displays a pencil cursor 37 when a finger approaches the touch pad 21.
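As a small illustration of how the displayed cursor could depend on both the active screen and the detected state, here is a hedged sketch in Python; the screen and cursor names are assumptions for illustration, not identifiers from the patent.

```python
def cursor_for(screen, state, side):
    """Pick the cursor to display from the active screen, the detected state
    ('contact', 'proximity' or 'none'), and the side the contact comes from."""
    if screen == "handwriting_screen" and state == "proximity":
        return "pencil_cursor"        # pencil cursor 37 when a finger approaches (FIG. 8)
    if state == "contact":
        return "left_hand_cursor" if side == "right" else "right_hand_cursor"
    return None                        # otherwise no cursor is shown

print(cursor_for("handwriting_screen", "proximity", None))  # -> pencil_cursor
print(cursor_for("main_menu", "contact", "right"))          # -> left_hand_cursor
```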
  • FIG. 9 is a screen transition diagram of the display device 30 of the display system 10.
  • the main menu 31 transitions to the keyboard screen 32, the audio screen 33, the handwriting input screen 34, and the like. From the keyboard screen 32, the audio screen 33, the handwriting input screen 34, and the like, the display transitions back to the main menu 31.
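The transitions of FIG. 9 can be summarized as a small lookup table, as in the following sketch; the string names for screens and events are illustrative assumptions, not identifiers from the patent.

```python
# Screen transition table: selecting an icon on the main menu 31 moves to the
# corresponding screen; the home switch 22 returns to the main menu.
TRANSITIONS = {
    "main_menu": {
        "keyboard_icon": "keyboard_screen",
        "audio_icon": "audio_screen",
        "handwriting_icon": "handwriting_screen",
    },
    "keyboard_screen": {"home": "main_menu"},
    "audio_screen": {"home": "main_menu"},
    "handwriting_screen": {"home": "main_menu"},
}

def next_screen(current, event):
    """Return the screen shown after 'event' occurs on 'current'; unknown
    events leave the current screen unchanged."""
    return TRANSITIONS.get(current, {}).get(event, current)

assert next_screen("main_menu", "keyboard_icon") == "keyboard_screen"
assert next_screen("keyboard_screen", "home") == "main_menu"
```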
  • FIG. 10 is a flowchart for determining whether contact with the input device 20 of the display system 10 is from the left side or the right side.
  • FIG. 11 is an explanatory diagram for determining whether contact with the input device 20 of the display system 10 is from the left side or the right side.
  • the touch pad 21 determines that there is a hover state if even a part of an object is in the hover state.
  • when there is no hover state, the display device 30 does not display the hand cursor (S400).
  • the touch pad 21 detects the position of the object in the hover state, that is, the hover coordinates (S120).
  • the touch pad 21 detects the presence or absence of contact with itself (S130). When there is no contact, the display device 30 does not display the hand cursor (S400). When there is contact, the touch pad 21 detects the position of contact with itself, that is, touch coordinates (S140).
  • the display system 10 calculates the angle of the line connecting the touch coordinates to the hover coordinates (S150). As shown in FIG. 11, the reference line for calculating this angle is the positive direction of the X axis of the XY coordinate system set on the same plane as the touch pad 21.
  • the display system 10 determines whether this angle is within a predetermined angle range (S160). When the angle is within the predetermined angle range, the display system 10 determines that the contact is from the right side (S200), and the display device 30 displays the left hand cursor 35 (S210). When the angle is outside the predetermined angle range, the display system 10 determines that the contact is from the left side (S300), and the display device 30 displays the right hand cursor 36 (S310).
  • in other words, when the angle is within the predetermined range, the display system 10 judges that the touch pad 21 is touched from the right side of the input device 20. In the present embodiment, when there is contact from the right side, the display system 10 regards it as contact by the left hand 51, and the display device 30 displays the left hand cursor 35.
  • when there is contact from the left side, the display system 10 regards it as contact by the right hand 52, and the display device 30 displays the right hand cursor 36.
  • the input device 20 is disposed between a right seat and a left seat, for example, between the driver seat and the passenger seat of an automobile. Therefore, contact from the right side is determined to be contact by the left hand of the person in the right seat, and contact from the left side is determined to be contact by the right hand of the person in the left seat.
  • whether contact from the right side is regarded as contact by the left hand or the right hand, and whether contact from the left side is regarded as contact by the left hand or the right hand, is set in advance in consideration of the position of the input device 20 and the assumed manner of use.
  • when the angle of the line connecting the touch coordinates to the hover coordinates is 90 degrees or 270 degrees, it may be set in advance whether the operation is regarded as being performed from the left side or from the right side.
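Putting the flow of FIG. 10 and the geometry of FIG. 11 together, the following is a minimal Python sketch. It assumes the touch pad reports hover and touch coordinates in an XY coordinate system on its surface, and it assumes a concrete angle range for contact from the right side (here -90 to +90 degrees, meaning the hovering hand lies toward the positive X side of the touching fingertip); the patent only states that the range is predetermined.

```python
import math

# Assumed angle range (degrees) that counts as "contact from the right side";
# the patent leaves the actual range to be set in advance per installation.
RIGHT_SIDE_RANGE = (-90.0, 90.0)

def choose_cursor(touch_xy, hover_xy, right_range=RIGHT_SIDE_RANGE):
    """Decide which hand cursor to show from touch and hover coordinates."""
    if touch_xy is None or hover_xy is None:
        return None                   # no hover or no contact: no hand cursor (S400)
    dx = hover_xy[0] - touch_xy[0]
    dy = hover_xy[1] - touch_xy[1]
    # Angle of the line from the touch coordinates to the hover coordinates,
    # measured against the positive X axis of the pad's XY plane (S150).
    angle = math.degrees(math.atan2(dy, dx))
    lo, hi = right_range
    if lo <= angle <= hi:             # S160: within the predetermined range
        return "left_hand_cursor"     # contact from the right side (S200, S210)
    return "right_hand_cursor"        # contact from the left side (S300, S310)

# Example: the fingertip touches at (50, 20) while the rest of the hand hovers
# around (70, 25); the hand extends toward the right edge, so the contact is
# treated as the left hand of the person in the right seat.
print(choose_cursor((50, 20), (70, 25)))   # -> left_hand_cursor
```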
  • since the display system 10 displays a hand-shaped cursor on the display device 30 when the input device 20 is touched, the operator can operate intuitively. Furthermore, since the display system 10 detects whether the input device 20 is touched from the left side or the right side, and the display device 30 displays the hand-shaped cursor corresponding to the hand with which the operator is operating, the operator can operate even more intuitively.
  • the display system displays a hand-shaped cursor on the display device when there is input to the input device, so the operator can operate intuitively, and the system is therefore useful as a display system.

Abstract

The present invention relates to a display system comprising an input device that detects touch contact and a display device connected to the input device, the display device having data for displaying a hand-shaped cursor. When the input device detects touch contact on the input device, the display device displays the hand-shaped cursor. The display system allows an operator to operate intuitively because, when the operator touches the input device to provide input to the input device, the display device displays the hand-shaped cursor.
PCT/JP2014/006126 2013-12-16 2014-12-09 Display system WO2015093005A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361916292P 2013-12-16 2013-12-16
US61/916,292 2013-12-16

Publications (1)

Publication Number Publication Date
WO2015093005A1 (fr) 2015-06-25

Family

ID=53402384

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/006126 WO2015093005A1 (fr) 2013-12-16 2014-12-09 Display system

Country Status (1)

Country Link
WO (1) WO2015093005A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006072854A (ja) * 2004-09-03 2006-03-16 Matsushita Electric Ind Co Ltd Input device
JP2007276615A (ja) * 2006-04-06 2007-10-25 Denso Corp Prompter-type operation device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022127852A1 (fr) * 2020-12-18 2022-06-23 华为技术有限公司 Method and apparatus for displaying a finger touch operation
CN114721576A (zh) * 2020-12-18 2022-07-08 华为技术有限公司 Finger touch operation display method and apparatus

Similar Documents

Publication Publication Date Title
JP4743267B2 (ja) Information processing apparatus, information processing method, and program
JP4372188B2 (ja) Information processing apparatus and display control method
JP5640486B2 (ja) Information display device
US20140380209A1 (en) Method for operating portable devices having a touch screen
US20140062875A1 (en) Mobile device with an inertial measurement unit to adjust state of graphical user interface or a natural language processing unit, and including a hover sensing function
US20120068946A1 (en) Touch display device and control method thereof
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
JP5779156B2 (ja) Information input device, input method thereof, and computer-executable program
TW201337717A (zh) Touch-controllable electronic device
JP5846129B2 (ja) Information processing terminal and control method thereof
EP2835722A1 (fr) Input device
JP2012099005A (ja) Input device, input method, and input program
JP2014176019A (ja) Portable information processing device, input method thereof, and computer-executable program
JP5845585B2 (ja) Information processing apparatus
US20110119579A1 (en) Method of turning over three-dimensional graphic object by use of touch sensitive input device
JP2006085218A (ja) Touch panel operation device
WO2015093005A1 (fr) Display system
WO2018123320A1 (fr) User interface device and electronic apparatus
JP2011221822A (ja) Information processing device, operation input device, information processing system, information processing method, program, and information storage medium
TWI439922B (zh) Handheld electronic device and control method thereof
US20180292924A1 (en) Input processing apparatus
JP5330175B2 (ja) Touch pad, information processing terminal, touch pad control method, and program
WO2014013587A1 (fr) Display apparatus
US11893229B2 (en) Portable electronic device and one-hand touch operation method thereof
US20150138102A1 (en) Inputting mode switching method and system utilizing the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 14871047
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 14871047
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP