US20140006996A1 - Visual proximity keyboard - Google Patents

Visual proximity keyboard

Info

Publication number
US20140006996A1
Authority
US
United States
Prior art keywords
input device
input
interface system
display
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/985,011
Other languages
English (en)
Inventor
Apolon Ivankovic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2011900465A0
Application filed by Individual
Publication of US20140006996A1
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • an interface system for facilitating human interfacing with a computing device comprising:
  • the interface system may be arranged to facilitate display of a representation of the object relative to the input layout of the input device.
  • the representation of the object may indicate a distance between at least a portion of the object and the input layout of the input device.
  • the indication of the distance between the at least a portion of the object and the input layout is represented as colour or shading information.
  • the indication of the distance between the at least a portion of the object and the input layout is provided by altering a transparency level of a portion of the representation corresponding to the at least a portion of the object.
  • the interface system may be arranged such that the displayed representation of the at least a portion of the object becomes more transparent the further away it is from the input layout; a sketch of this distance-to-transparency mapping follows this list.
  • the interface system may be arranged to facilitate visual representation on the computing device display of a touch event, the touch event corresponding to when the object touches the input device.
  • the touch event may be represented by highlighting an area of a representation of the input device that corresponds to a location of the touch event; a sketch of this highlighting step follows this list.
  • the input device comprises a touch screen interface.
  • the input device may be arranged to enable an input layout of the touch screen interface to be altered, wherein the interface system is arranged to facilitate display of the altered input layout on the computing device display.
  • the input device may comprise separate first and second input device portions.
  • the first and second input device portions may be couplable together in a releasably engagable configuration.
  • the interface system may be arranged to facilitate display of visual information on the computing device display indicative of an input layout of each of the first and second input device portions.
  • the system may be arranged to facilitate displaying the representations of the layouts of the first and second input device portions on the computing device display separately.
  • a method of interfacing with a computing device comprising the steps of:
  • a computer program arranged when loaded into a computing device to instruct the computing device to operate in accordance with the system of the first aspect of the present invention.
  • a computer readable medium having a computer readable program code embodied therein for causing a computing device to operate in accordance with the system of the first aspect of the present invention.
  • a data signal having a computer readable program code embodied therein to cause a computing device to operate in accordance with the system of the first aspect of the present invention.
  • FIG. 3a is a top view of an input device of the interface system of FIG. 1, the input device being shown in a coupled configuration;
  • FIG. 4 is a flow diagram of a method of interfacing with a computing device in accordance with an embodiment of the present invention.
  • the interface system comprises a touch based input device, for example a keyboard, arranged to detect touch based inputs.
  • the touch based input device may, for example, be a conventional type keyboard having physical keys and that detects keystrokes as the keys are depressed by a user.
  • the touch based input device may be a touch screen based keyboard, for example a touch screen that is arranged to display an input layout and that detects when a user touches parts of the screen corresponding to inputs of the input layout.
  • the interface system is arranged to detect a position of an object relative to the touch based input device. Since a user typically uses their hands to enter information via the touch based input device, the user's hands will typically be the object detected by the interface system. The relative position of the user's hands with respect to the touch based input device can then be visually represented, for example on a display of the computing device, so as to assist the user in using the touch based input device.
  • Visually representing the relative position of the user's hands with respect to the touch based input device provides visual feedback to the user indicating where their hands are in relation to the touch based input device.
  • the user can use this visual feedback to arrange their fingers over the keys they desire to touch so as to enter desired information. This can be of particular advantage when the user cannot, or finds it difficult to, look at the input device when inputting information but is able to view the display of the computing device.
  • the touch screen based input device is also arranged to provide haptic feedback to the user.
  • the touch screen based input device can be arranged so as to provide the user with physical feedback coinciding with when the user inputs information, analogous to feedback a user would feel when inputting information via a traditional keyboard.
  • the representation 24 of the user's hand also indicates how far parts of the hand are from the input layout of the input device 14.
  • the further away a part of the user's hand is from the input layout, the lighter the shading used in the corresponding portion of the representation 24.
  • portions 26 of the representation 24 corresponding to finger tips of the user are shaded darker than portions 28 of the representation 24 corresponding to intermediate finger portions, indicating that the finger tips are closer to the input layout of the input device 14 than the intermediate finger portions.
  • while shading is used in this example to provide an indication of how far parts of the user's hand are from the input layout of the input device 14, it will be appreciated that colours could be used for a similar purpose, wherein different colours correspond to different distances from the input device 14.
  • the interface system 10 may be arranged to no longer display the representation 24 when the user's hand is beyond a predefined distance from the input device 14.
  • the input device 14 is arranged to enable an input layout of its touch screen interface to be altered, and in particular to change dynamically based on current user interface input needs. For example, if the user is entering information into a field that requires only numbers, the touch screen interface of the input device is arranged to display only numbers, and to display the full standard alphanumeric keyboard face at other times. To cater for this, the interface system 10 is arranged to facilitate display of the altered input layout on the computing device display 18; a sketch of this layout switching follows this list.
  • Each input device portion 30, 30′ has a respective input layout 32, 32′.
  • the input layout 32 of the first input device portion 30 substantially corresponds to an input layout that would typically be found on a left-hand side of a standard keyboard
  • the input layout 32′ of the second input device portion 30′ substantially corresponds to an input layout that would typically be found on a right-hand side of a standard keyboard.
  • the interface system 10 is arranged to display the input layout of each of the first and second input device portions 30, 30′ on the computing device display 18 as respective representations 22, 22′, as shown in FIG. 2; a sketch of this split rendering follows this list.
  • the method 50 can be carried out using the interface system 10 described herein.
  • the interface system 10 can be used for different applications.
  • the input device 14 can be arranged in the split configuration wherein the first input device portion 30 is mounted on a left arm of a chair and the second input device portion 30′ is mounted on a right arm of the chair.
  • repetitive stress problems associated with typical keyboard use wherein a user places their hands out in front of them can be avoided as the user can instead rest their arms on the left and right arms of the chair.
  • the user is still able to enter information via the input device 14 since they are provided with visual feedback on the display 18 as to the relative position of their hands with respect to the input device 14.
  • the user is not required to look down to orient their hands with respect to the first and second input device portions 30, 30′.
  • the interface system 10 can be used to assist incapacitated users. For example, if a user is incapacitated and is required to lie flat on their back for long periods of time, the first and second input device portions 30, 30′ can be placed on respective sides of the user's body next to each hand. This can enable the user to input information via the input device 14 with minimal arm movement while lying in this supine position.
  • virtual reality glasses are provided with orientation sensors, for example sensors based on accelerometer technology, so as to allow an orientation of the glasses to be determined.
  • the orientation of the glasses is communicated as orientation information to the interface system 10, such as via a Bluetooth connection, and the interface system 10 is arranged to use the orientation information to determine when to display a representation 22 of an input layout of the touch based input device 14 and a representation 24 of the user's hand; a sketch of this orientation gating follows this list.
  • the interface system 10 is arranged to not display the representations 22, 24. Instead, the user is presented with a full view of their virtual reality environment.
  • the change in orientation of the glasses is detected by the orientation sensors and the respective orientation information is communicated to the interface system 10.
  • the interface system 10 is arranged to display the representations 22, 24.
  • the representations 22, 24 can, for example, be shown in a location of the virtual reality environment that would correspond to a position of the input device 14 relative to the user in the real world.
  • the interface system 10 can be arranged to provide visual feedback regarding the position of the user's hands with respect to the input device 14 via the HUD. This can enable the user to concentrate on the view and the HUD while still inputting information via the input device 14.
  • HUD: heads-up display
  • system 10 or method 50 may be implemented as a computer program that is arranged, when loaded into a computing device, to instruct the computing device to operate in accordance with the system 10 or method 50 .
  • system 10 or method 50 may be provided in the form of a computer readable medium having a computer readable program code embodied therein for causing a computing device to operate in accordance with the system 10 or method 50 .
  • system 10 or method 50 may be provided in the form of a data signal having a computer readable program code embodied therein to cause a computing device to operate in accordance with the system 10 or method 50 .
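
As flagged above, here is a minimal sketch of the distance-to-transparency mapping: parts of the displayed hand representation 24 are drawn more opaque the closer they are to the input layout, and the representation is hidden entirely beyond a predefined distance. This is an editorial illustration, not code from the patent; the cut-off value, the linear fade, and all names are assumptions.

```python
# Illustrative sketch of the proximity-to-transparency mapping described in
# the Definitions above. The 50 mm cut-off and the linear fade are assumed;
# the patent only specifies that further parts render more transparently
# and that display may cease beyond a predefined distance.

MAX_VISIBLE_DISTANCE_MM = 50.0  # assumed cut-off; hand not drawn beyond this


def alpha_for_distance(distance_mm: float) -> float:
    """Map a sensed distance to an opacity in [0.0, 1.0].

    1.0 means fully opaque (part touching or nearly touching the layout),
    0.0 means fully transparent (part beyond the predefined distance).
    """
    if distance_mm >= MAX_VISIBLE_DISTANCE_MM:
        return 0.0
    if distance_mm <= 0.0:
        return 1.0
    # Linear fade: closer parts of the hand are drawn more opaque/darker.
    return 1.0 - (distance_mm / MAX_VISIBLE_DISTANCE_MM)


if __name__ == "__main__":
    # Fingertips close to the keys render more opaque than higher hand parts.
    for part, d in [("fingertip", 4.0), ("mid finger", 18.0),
                    ("palm", 35.0), ("wrist", 60.0)]:
        print(f"{part:>10}: distance {d:5.1f} mm -> alpha {alpha_for_distance(d):.2f}")
```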
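
A hedged sketch of the touch-event highlighting: when the object touches the input device, the key region of the on-screen representation under the touch location is highlighted. The key geometry, hit-test, and names below are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch: highlight the area of the keyboard representation
# that corresponds to the location of a touch event. Key sizes and the
# rectangular hit-test are assumptions for demonstration only.

from dataclasses import dataclass


@dataclass
class Key:
    label: str
    x: float  # left edge in input-device coordinates
    y: float  # top edge
    w: float
    h: float
    highlighted: bool = False

    def contains(self, tx: float, ty: float) -> bool:
        return self.x <= tx < self.x + self.w and self.y <= ty < self.y + self.h


def handle_touch_event(keys: list[Key], tx: float, ty: float) -> None:
    """Highlight the key region under a touch at (tx, ty), clearing others."""
    for key in keys:
        key.highlighted = key.contains(tx, ty)


if __name__ == "__main__":
    row = [Key("Q", 0, 0, 10, 10), Key("W", 10, 0, 10, 10), Key("E", 20, 0, 10, 10)]
    handle_touch_event(row, 14.2, 5.0)
    print([f"{k.label}{'*' if k.highlighted else ''}" for k in row])  # W highlighted
```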
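
A sketch of the dynamic layout switching described for the touch screen input device: a numbers-only field selects a numeric layout, any other field restores the full alphanumeric keyboard face, and the computing device display mirrors whichever layout is active. The layout contents and function names are assumptions.

```python
# Illustrative sketch of the dynamically altered input layout. The patent
# describes the behaviour (numeric layout for number-only fields, full
# keyboard otherwise, mirrored to the display); the specifics here are assumed.

NUMERIC_LAYOUT = ["7 8 9", "4 5 6", "1 2 3", "0"]
ALPHANUMERIC_LAYOUT = [
    "1 2 3 4 5 6 7 8 9 0",
    "Q W E R T Y U I O P",
    "A S D F G H J K L",
    "Z X C V B N M",
]


def layout_for_field(field_type: str) -> list[str]:
    """Choose the active input layout from the focused field's type."""
    return NUMERIC_LAYOUT if field_type == "number" else ALPHANUMERIC_LAYOUT


def mirror_to_display(layout: list[str]) -> None:
    # Stand-in for facilitating display of the altered layout on display 18.
    print("\n".join(layout))
    print("-" * 20)


if __name__ == "__main__":
    mirror_to_display(layout_for_field("number"))  # numeric keypad shown
    mirror_to_display(layout_for_field("text"))    # full keyboard face shown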
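
A sketch of the split-configuration rendering: the first and second input device portions 30, 30′ carry what would be the left-hand and right-hand halves of a standard keyboard, and their representations 22, 22′ can be displayed separately. The halves and spacing below are illustrative assumptions.

```python
# Illustrative sketch of displaying the layouts 32, 32' of the two input
# device portions either as separate representations or as one joined
# layout when the portions are coupled. Row contents are assumed.

LEFT_PORTION = ["Q W E R T", "A S D F G", "Z X C V"]   # input layout 32
RIGHT_PORTION = ["Y U I O P", "H J K L", "B N M"]      # input layout 32'


def render_portions(separate: bool) -> str:
    """Render the two portions side by side (split) or joined (coupled)."""
    if separate:
        gap = " " * 8  # visual separation between representations 22 and 22'
        return "\n".join(l.ljust(12) + gap + r
                         for l, r in zip(LEFT_PORTION, RIGHT_PORTION))
    return "\n".join(l + "  " + r for l, r in zip(LEFT_PORTION, RIGHT_PORTION))


if __name__ == "__main__":
    print(render_portions(separate=True))   # split configuration
    print(render_portions(separate=False))  # coupled configuration
```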
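
Finally, a sketch of the orientation gating used with the virtual reality glasses: orientation information from the glasses' sensors decides whether the representations 22, 24 are overlaid or the full virtual reality view is shown. The pitch threshold and update handling are assumptions; the patent does not specify them.

```python
# Illustrative sketch of showing/hiding the keyboard and hand
# representations from glasses orientation. The -30 degree pitch
# threshold and the message format are assumed for demonstration.

SHOW_BELOW_PITCH_DEG = -30.0  # assumed: looking down this far shows the keyboard


def should_show_keyboard(pitch_deg: float) -> bool:
    """Decide from head pitch whether to overlay representations 22, 24."""
    return pitch_deg <= SHOW_BELOW_PITCH_DEG


def on_orientation_update(pitch_deg: float) -> None:
    # Called whenever new orientation information arrives from the glasses,
    # e.g. over a Bluetooth connection as described above.
    if should_show_keyboard(pitch_deg):
        print(f"pitch {pitch_deg:+.0f} deg: overlay keyboard and hand representations")
    else:
        print(f"pitch {pitch_deg:+.0f} deg: show full virtual reality view")


if __name__ == "__main__":
    for pitch in (5.0, -10.0, -45.0):  # simulated stream of sensor updates
        on_orientation_update(pitch)
```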

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AU2011900465A0 (en) 2011-02-13 Visual Proximity Keyboard
AU2011900465 2011-02-13
PCT/AU2012/000122 WO2012106766A1 2011-02-13 2012-02-10 Visual proximity keyboard

Publications (1)

Publication Number Publication Date
US20140006996A1 (en) 2014-01-02

Family

ID=46638072

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/985,011 Abandoned US20140006996A1 (en) 2011-02-13 2012-02-10 Visual proximity keyboard

Country Status (3)

Country Link
US (1) US20140006996A1
AU (1) AU2012214109A1
WO (1) WO2012106766A1

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9388363B2 (en) * 2013-03-15 2016-07-12 Megasonic Sweeping, Incorporated Ultrasonic and megasonic method for extracting palm oil

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4518955B2 * 2002-11-29 2010-08-04 Koninklijke Philips Electronics N.V. User interface using a displaced representation of the contact area
US7653883B2 (en) * 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US20090251422A1 (en) * 2008-04-08 2009-10-08 Honeywell International Inc. Method and system for enhancing interaction of a virtual keyboard provided through a small touch screen
GB2462579A (en) * 2008-06-10 2010-02-17 Sony Service Ct Touch screen display including proximity sensor
US20100225588A1 (en) * 2009-01-21 2010-09-09 Next Holdings Limited Methods And Systems For Optical Detection Of Gestures

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6600480B2 (en) * 1998-12-31 2003-07-29 Anthony James Francis Natoli Virtual reality keyboard system and method
US6611253B1 (en) * 2000-09-19 2003-08-26 Harel Cohen Virtual input environment
US20090013279A1 (en) * 2007-07-03 2009-01-08 Apple Inc. Form-field mask for sensitive data
US20110248941A1 (en) * 2010-03-17 2011-10-13 Samer Abdo System and method for capturing hand annotations

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11422670B2 (en) 2017-10-24 2022-08-23 Hewlett-Packard Development Company, L.P. Generating a three-dimensional visualization of a split input device

Also Published As

Publication number Publication date
AU2012214109A1 (en) 2013-09-12
WO2012106766A1 2012-08-16

Similar Documents

Publication Publication Date Title
US9652146B2 (en) Ergonomic motion detection for receiving character input to electronic devices
US8638315B2 (en) Virtual touch screen system
US6882337B2 (en) Virtual keyboard for touch-typing using audio feedback
EP3100151B1 Virtual mouse for a touch screen device
US20190227688A1 (en) Head mounted display device and content input method thereof
US20120179963A1 (en) Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display
TW201344500A Electronic system
US8543942B1 (en) Method and system for touch-friendly user interfaces
US20150220156A1 (en) Interface system for a computing device with visual proximity sensors and a method of interfacing with a computing device
JP2012027957A Information processing apparatus, program, and pointing method
US11392237B2 (en) Virtual input devices for pressure sensitive surfaces
US20140006996A1 (en) Visual proximity keyboard
US20200356258A1 (en) Multi-Perspective Input For Computing Devices
US20150309601A1 (en) Touch input system and input control method
EP3293624A1 Input device and method
WO2016079931A1 User interface with touch sensor
US20100265107A1 (en) Self-description of an adaptive input device
JP7012780B2 Game device and program
WO2015093005A1 Display system
US20230114333A1 (en) Method and device for managing multiple presses on a touch-sensitive surface
TWM532594U Keyboard for a head-mounted display device
AU2013204699A1 (en) A headphone set and a connector therefor
KR20170130989A Eyeball mouse
KR20150049661A Apparatus and method for processing touchpad input information
CN103383589A Electronic computing system

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION