WO2013103968A2 - Touchscreen control (Commande à écran tactile) - Google Patents

Touchscreen control (Commande à écran tactile)

Info

Publication number
WO2013103968A2
Authority
WO
WIPO (PCT)
Prior art keywords
display
primary
mouse
changing
picture
Prior art date
Application number
PCT/US2013/020544
Other languages
English (en)
Other versions
WO2013103968A3 (fr)
Inventor
Jack Atzmon
Original Assignee
Ergowerx International, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ergowerx International, Llc filed Critical Ergowerx International, Llc
Publication of WO2013103968A2
Publication of WO2013103968A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This invention is related to an input device for a computer and more particularly to a touchscreen input device for a computer.
  • Input devices for computers are well known in the art.
  • the movement of the mouse in an X-Y plane typically actuates a mechanical, optical, or electrical device within the mouse that produces X and Y position signals that are conveyed to the computer.
  • the computer typically uses the mouse X and Y position signals to manipulate the display of the computer screen, allowing a user to control a program.
  • a computer mouse also typically has one or more buttons which allow the user to further control a computer program.
  • the mouse and mouse buttons allow the user to move a cursor or other on-screen pointer to a specific area of the computer screen and depress the one or more buttons, which are actuated by pressing them downward, to activate specific computer program functions.
  • a computer mouse is one of the indispensable peripherals for controlling a computer.
  • Computer mice include the conventional roller-type mouse, the track-ball type mouse, the touch-pad type mouse, and the optical mouse.
  • the mouse is typically one of a laser mouse, an optical mouse, a mechanical ball mouse, and the like.
  • the top surface of the mouse comprises a display.
  • the touchscreen mouse/controller is a smartphone, tablet computer, or special purpose controller having a screen and other necessary components to achieve the disclosed features, generally referred to as the Device.
  • the Device would act like a mouse when dragged along a surface.
  • the Device would use one or more of standard mouse technology, GPS, motion sensors, or a camera in the device to determine motion; other well-known mouse technologies can also be used for motion detection (a sketch of mapping such motion samples to a cursor position follows this list).
  • the Device can control a television, tablet computer, computer or the like.
  • the Device would act like a mini-tablet, allowing a user to use his or her fingers to drag a piece of the primary screen around the Device display.
  • a specific multi-finger swipe is used to replace or supplement the primary display with content from the secondary display (a sketch of one such gesture detector follows this list).
  • the Device provides a section of the primary screen on its own display. This display of the Device is referred to as a second display.
  • the second display provides full touch control of the primary display even if the primary display is an ordinary monitor.
  • the second display could change without changing the primary display.
  • the second display could be used as a magnifier for a specific portion of the primary display without changing the primary display (a sketch of such a crop-and-magnify step follows this list).
  • the second display provides a user with unique images, such as a list of the fields that need to be entered, together with the ability to enter them. For example, if a fillable form is displayed on the primary display, preferably only the fillable portion is presented on the second display (a sketch of this field-filtering step follows this list).
  • data can be input via one or more of the primary display and the secondary screen.
  • a keyboard display can be generated on one of the primary display or the second display.
  • a cursor or the like can be moved using the touchscreen function of the Device and alphanumeric characters, radio buttons, or the like can be selected.
  • the alphanumeric characters, radio buttons, or the like are only displayed on one of the Device and the primary display.
  • the second display could lock onto any field or page from the primary display that the user wishes, allowing the user to watch a stock ticker all day, monitor an online auction, or the like.
  • the second display could be used with a television allowing you to preview channels on the second display and pick the channel for the primary television. In this manner, the channels can be scrolled through on the second display and ultimately selected for display on the primary television.
  • picture-in-picture TV options can be dragged onto the second display so two shows can be viewed at once, one on the primary and one on the secondary. Alternatively, a third show can be watched on the secondary screen with the picture-in-picture TV options on the primary screen.
  • video conferencing applications, such as Skype or another application, can be dragged onto the second display to allow the user to move about the work area freely.
  • the device is a smartphone having a display, a microprocessor, and memory.
  • a program is stored on a nonvolatile memory that enables the smartphone to perform one or more of the above functions.
  • the program is stored remotely.
  • the microprocessor is preferably used to run a program that provides the disclosed capabilities.
  • Figure 1 is a top view of a device according to an embodiment of the invention.
  • Figure 1 is a top view of a device 10 according to an embodiment of the invention.
  • the device 10 is configured to at least control a device 20.
  • the device 10 comprises a right mouse button 12, a left mouse button 14, and a scroll element 16.
  • the right mouse button 12, left mouse button 14, and scroll element 16 are implemented as touch-sensitive controls but can also be implemented as mechanical buttons. It should be noted that the mouse controls are provided by an application being run on the touch device 10 (a sketch of mapping touch coordinates to these controls follows this list).
  • the mouse further comprises at least one touch sensitive function button or touch sensitive area 22.
  • the touch sensitive area 22 is user configurable.
  • Cursor control for the device 10 can be provided by a position-determining module that uses standard mouse technology, GPS, motion sensors, or a camera in the device to determine motion.
  • Other well-known mouse technology can likewise be used for motion detection.
  • the touch sensitive area is used for overall device functionality such as data input into device 10, placing calls, and viewing data.
  • the device 10 includes a display.
  • the display is one of an LCD display, a plasma display, an LED display, or the like.
  • the display typically comprises an array of pixels arranged in two orthogonal dimensions to form a two-dimensional display, with X and Y coordinates used to indicate the location of each pixel in the array.
  • the display is configured to display a portion of the image presented on device 20 to which the device is coupled. In one embodiment, display 20 is arranged a distance from a surface of the mouse body.
  • the device 10 is connected to the device 20 via at least one of a wired connection 18, e.g., a PS/2 or USB connection, or a wireless connection, e.g., a Bluetooth connection.
  • the mouse positioning data as well as the image data and mouse button controls are transferred between the device 10 and the device 20 via this connection (a sketch of one possible packet format for this link follows this list).
  • the device 10 includes a port 34.
  • This external port 34 can be a USB port or memory port used to plug in additional devices.
  • the device 10 comprises a memory.
  • the memory is configured to store photos for use as a Screensaver for display.
  • the memory can also store button
  • the device 10 preferably includes a microprocessor.
  • the microprocessor, in conjunction with the memory, provides functionality for the display and the various touch-sensitive surfaces.
  • the user can configure the touch sensitive areas 22 of the device 10.
  • the touchscreen can be used for fingerprint recognition or other input.
  • a central processing unit (CPU) or processor performs the functions that will be described below.
  • the CPU/processor can be any of a number of well-known devices.
  • the device 10 includes the memory, which may comprise both random access memory (RAM) and read-only memory (ROM).
  • the display of device 10 is configurable by at least one of the microprocessor and the device 20 to which it is connected. The user is able to configure the display to display all or part of the image displayed on the device 20 to which the device 10 is connected.
  • a region of the display proximate to the cursor is magnified.
  • a zoom function can be easily implemented without varying the main display of the device 20 as the magnified image is displayed on the display of device 10.
  • the display for the device 10 can be configured using the device 20 or, alternatively, the display can be configured using a touch menu generated by the microprocessor.
  • the touchscreen can be used to enter passwords that are stored in the memory. Alternatively, the passwords are entered using device 20. Passwords can also be stored on a USB fob that is plugged into port 34.
  • the device 10 is a cellular telephone running a mouse application.
  • the display of the telephone functions as the second display.
  • a user is able to have a selected portion of the device 20 displayed on the cellular telephone's display.
  • the cellular telephone's camera is used for cursor control of the device 20.
  • position sensors in the cellular telephone are used for cursor control.
  • the Device 10 provides a section of the primary screen of device 20 on its own display. This display of the Device 10 is referred to as a second display.
  • the second display provides a user with unique images, such as a list of the fields that need to be entered, together with the ability to enter them. For example, if a fillable form is displayed on the primary display device 20, preferably only the fillable portion is presented on the device 10.
  • the second display of device 10 could lock onto any field or page from the primary display of device 20 that the user wishes, allowing the user to watch a stock ticker all day, monitor an online auction, or the like.
  • the second display of device 10 could be used with a television allowing you to preview channels on the second display and pick the channel for the primary television. In this manner, the channels can be scrolled through on the second display and ultimately selected for display on the primary television.
  • Skype or another video conferencing application can be dragged onto the second display to allow the user to move about the work area freely.
  • device 10 is a smartphone capable of one or more of: acting like a mouse when dragged along a surface;
  • a computer storage media having embodied thereon computer-useable instructions that, when executed, perform a method, the method comprising one or more of:
  • the present invention may be described herein in terms of functional block components, code listings, optional selections and various processing steps. It should be appreciated that such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions.
  • the present invention may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
  • the software elements of the present invention may be implemented with any programming or scripting language such as C, C++, C#, Java, COBOL, assembler, PERL, or the like, with the various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements.
  • the present invention may employ any number of conventional techniques for data transmission, signaling, data processing, network control, and the like.
  • the present invention may be embodied as a method, a data processing system, a device for data processing, and/or a computer program product. Accordingly, the present invention may take the form of an entirely software embodiment, an entirely hardware embodiment, or an embodiment combining aspects of both software and hardware. Furthermore, the present invention may take the form of a computer program product on a computer-readable storage medium having computer-readable program code means embodied in the storage medium. Any suitable computer-readable storage medium may be utilized, including hard disks, CD-ROM, optical storage devices, magnetic storage devices, and/or the like.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the function specified in the flowchart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • any databases, systems, or components of the present invention may consist of any combination of databases or components at a single location or at multiple locations, wherein each database or system includes any of various suitable security features, such as firewalls, access codes, encryption, de-encryption, compression, decompression, and/or the like.
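
The Device is described above as determining its own motion (from standard mouse technology, GPS, motion sensors, or a camera) and using that motion to drive a cursor on the primary display. The disclosure gives no implementation, so the sketch below is only a minimal Python illustration of turning relative motion samples into a clamped cursor position; the class name, sensitivity value, and screen size are assumptions made for the example.

```python
class CursorTracker:
    """Accumulate relative motion samples into an absolute cursor position,
    clamped to the primary display's pixel grid (illustrative sketch only)."""

    def __init__(self, screen_width: int, screen_height: int, sensitivity: float = 1.5):
        self.width = screen_width
        self.height = screen_height
        self.sensitivity = sensitivity  # assumed gain from motion units to pixels
        # Start the cursor in the middle of the primary display.
        self.x = screen_width / 2
        self.y = screen_height / 2

    def update(self, dx: float, dy: float) -> tuple[int, int]:
        """Apply one relative motion sample (e.g., from an optical sensor,
        motion sensor, or camera-based tracker) and return the new pixel position."""
        self.x = min(max(self.x + dx * self.sensitivity, 0), self.width - 1)
        self.y = min(max(self.y + dy * self.sensitivity, 0), self.height - 1)
        return int(self.x), int(self.y)


if __name__ == "__main__":
    tracker = CursorTracker(1920, 1080)
    for sample in [(10, 0), (0, -5), (300, 300)]:
        print(tracker.update(*sample))
```

The resulting (x, y) pairs would be the mouse positioning data sent from device 10 to device 20 over connection 18.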
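
The list above also mentions a specific multi-finger swipe that replaces or supplements the primary display with content from the secondary display. The detector sketched below is one plausible way to recognise such a swipe from per-finger start and end positions; the three-finger requirement, distance threshold, and spread tolerance are assumptions, not values taken from the disclosure.

```python
def detect_multifinger_swipe(touches, min_fingers=3, min_distance=150.0, max_spread=60.0):
    """Return 'left', 'right', 'up', or 'down' if all tracked fingers moved roughly
    together far enough in one direction, else None.

    `touches` is a list of ((start_x, start_y), (end_x, end_y)) tuples, one per finger.
    """
    if len(touches) < min_fingers:
        return None
    deltas = [(ex - sx, ey - sy) for (sx, sy), (ex, ey) in touches]
    avg_dx = sum(dx for dx, _ in deltas) / len(deltas)
    avg_dy = sum(dy for _, dy in deltas) / len(deltas)
    # All fingers must move in roughly the same direction (small spread around the mean).
    if any(abs(dx - avg_dx) > max_spread or abs(dy - avg_dy) > max_spread for dx, dy in deltas):
        return None
    if abs(avg_dx) >= abs(avg_dy):
        if abs(avg_dx) < min_distance:
            return None
        return "right" if avg_dx > 0 else "left"
    if abs(avg_dy) < min_distance:
        return None
    return "down" if avg_dy > 0 else "up"


if __name__ == "__main__":
    three_finger_swipe = [((100, 300), (420, 310)),
                          ((160, 360), (470, 355)),
                          ((210, 420), (520, 430))]
    # A detected swipe could trigger swapping secondary-display content onto the primary display.
    print(detect_multifinger_swipe(three_finger_swipe))  # right
```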
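
For the second display acting as a window onto, or magnifier of, a section of the primary screen, one straightforward approach (assuming the Device has access to the primary frame as a pixel array) is to crop a region centred on the cursor and rescale it by nearest-neighbour sampling, leaving the primary display untouched. The function below is a hedged sketch of that step; its name and parameters are invented for the example.

```python
def magnify_region(frame, cx, cy, src_w, src_h, out_w, out_h):
    """Crop a src_w x src_h window of `frame` centred on (cx, cy) and scale it
    to out_w x out_h by nearest-neighbour sampling. `frame` is a row-major list
    of rows of pixel values; the primary display itself is left unchanged."""
    height, width = len(frame), len(frame[0])
    # Clamp the crop window so it stays inside the primary frame.
    left = min(max(cx - src_w // 2, 0), width - src_w)
    top = min(max(cy - src_h // 2, 0), height - src_h)
    out = []
    for oy in range(out_h):
        sy = top + (oy * src_h) // out_h
        row = [frame[sy][left + (ox * src_w) // out_w] for ox in range(out_w)]
        out.append(row)
    return out


if __name__ == "__main__":
    # Toy 8x8 "primary display" whose pixel value encodes its own coordinates.
    frame = [[(x, y) for x in range(8)] for y in range(8)]
    zoomed = magnify_region(frame, cx=4, cy=4, src_w=4, src_h=4, out_w=8, out_h=8)
    print(zoomed[0][0], zoomed[-1][-1])  # corners of the magnified 4x4 window
```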
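
For the fillable-form behaviour, where only the enterable fields of a form shown on the primary display are presented on the second display, a minimal filtering step might look like the sketch below. The widget-description format is an assumption made for the example; the patent does not define one.

```python
# Hypothetical description of what is currently shown on the primary display.
PRIMARY_WIDGETS = [
    {"id": "title",   "kind": "label", "text": "Shipping details"},
    {"id": "name",    "kind": "text",  "label": "Full name", "value": ""},
    {"id": "street",  "kind": "text",  "label": "Street", "value": ""},
    {"id": "express", "kind": "radio", "label": "Express shipping", "value": False},
    {"id": "logo",    "kind": "image", "text": ""},
]

FILLABLE_KINDS = {"text", "radio", "checkbox", "dropdown"}


def fillable_fields(widgets):
    """Keep only the widgets a user can enter data into, in display order."""
    return [w for w in widgets if w["kind"] in FILLABLE_KINDS]


def apply_entry(widgets, field_id, value):
    """Record a value entered on the second display back into the form model."""
    for w in widgets:
        if w["id"] == field_id:
            w["value"] = value
            return w
    raise KeyError(field_id)


if __name__ == "__main__":
    second_display = fillable_fields(PRIMARY_WIDGETS)
    print([w["label"] for w in second_display])   # only the enterable fields
    apply_entry(PRIMARY_WIDGETS, "name", "Jane Doe")
    print(PRIMARY_WIDGETS[1]["value"])
```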
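
The mouse controls of device 10 (right button 12, left button 14, scroll element 16, and the configurable touch area 22) are provided by an application running on the touch device, which implies mapping each touch coordinate to the control it falls on. The hit-testing sketch below is a hypothetical illustration; the rectangle layout and names are not taken from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """A rectangular touch-sensitive area on the device's screen."""
    name: str
    x: int
    y: int
    width: int
    height: int

    def contains(self, tx: int, ty: int) -> bool:
        return self.x <= tx < self.x + self.width and self.y <= ty < self.y + self.height


# Hypothetical layout: left/right buttons along the top edge, a scroll strip
# between them, and a user-configurable function area below (cf. items 12, 14, 16, 22).
REGIONS = [
    Region("left_button", 0, 0, 200, 120),
    Region("scroll_element", 200, 0, 80, 120),
    Region("right_button", 280, 0, 200, 120),
    Region("touch_area_22", 0, 120, 480, 200),
]


def control_for_touch(tx: int, ty: int) -> str | None:
    """Return the name of the control under a touch point, or None."""
    for region in REGIONS:
        if region.contains(tx, ty):
            return region.name
    return None


if __name__ == "__main__":
    print(control_for_touch(50, 40))    # left_button
    print(control_for_touch(240, 60))   # scroll_element
    print(control_for_touch(100, 400))  # None (outside all regions)
```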
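
Finally, the mouse positioning data, image data, and button states travel between device 10 and device 20 over the wired or wireless connection 18, but no wire format is specified. The sketch below shows one possible fixed-size packet for the pointer and button portion of that traffic using Python's struct module; the field layout, magic byte, and message type are invented for illustration.

```python
import struct

# Hypothetical packet: magic byte, message type, relative x, relative y, button bitmask.
# '<' = little-endian; 'B' = unsigned byte, 'h' = signed 16-bit integer.
_POINTER_FORMAT = "<BBhhB"
_MAGIC = 0xA7
MSG_POINTER = 0x01


def encode_pointer(dx: int, dy: int, buttons: int) -> bytes:
    """Pack one relative pointer update (e.g., left button = bit 0, right = bit 1)."""
    return struct.pack(_POINTER_FORMAT, _MAGIC, MSG_POINTER, dx, dy, buttons)


def decode_pointer(packet: bytes) -> tuple[int, int, int]:
    """Unpack a pointer update; raises ValueError on a malformed packet."""
    magic, msg_type, dx, dy, buttons = struct.unpack(_POINTER_FORMAT, packet)
    if magic != _MAGIC or msg_type != MSG_POINTER:
        raise ValueError("not a pointer packet")
    return dx, dy, buttons


if __name__ == "__main__":
    packet = encode_pointer(dx=12, dy=-3, buttons=0b01)  # left button held
    print(len(packet), decode_pointer(packet))           # 7 (12, -3, 1)
```

A companion message type could carry compressed image tiles for the second display using the same framing, though the disclosure leaves that choice open.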

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a device, and an application for the device, that causes the device to act like a mouse, act like a mini-tablet, present a section of a primary screen on its own display, provide touch control of the primary display, change its display without changing the primary display, present unique images including a list of the fields that need to be entered together with the ability to enter them, lock onto any field or page, preview television channels and select the channel for the primary television or display, show picture-in-picture, and run applications in such a way that the primary display is not affected.
PCT/US2013/020544 2012-01-06 2013-01-07 Commande à écran tactile WO2013103968A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261583990P 2012-01-06 2012-01-06
US61/583,990 2012-01-06

Publications (2)

Publication Number Publication Date
WO2013103968A2 (fr) 2013-07-11
WO2013103968A3 (fr) 2015-06-18

Family

ID=48745551

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/020544 WO2013103968A2 (fr) 2012-01-06 2013-01-07 Commande à écran tactile

Country Status (1)

Country Link
WO (1) WO2013103968A2 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015010148A (ja) * 2013-06-28 2015-01-19 住友化学株式会社 樹脂含浸シート
JP2015010147A (ja) * 2013-06-28 2015-01-19 住友化学株式会社 樹脂含浸シート
GB2533569A (en) * 2014-12-19 2016-06-29 Displaylink Uk Ltd A display system and method
JP2017033153A (ja) * 2015-07-30 2017-02-09 シャープ株式会社 情報処理装置、情報処理プログラムおよび情報処理方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7899915B2 (en) * 2002-05-10 2011-03-01 Richard Reisman Method and apparatus for browsing using multiple coordinated device sets
JP4262712B2 (ja) * 2005-10-24 2009-05-13 ソニー・エリクソン・モバイルコミュニケーションズ株式会社 携帯端末装置、マウスアプリケーションプログラム、及び携帯端末装置をワイヤレスマウス装置として用いる方法
US20070124792A1 (en) * 2005-11-30 2007-05-31 Bennett James D Phone based television remote control
US9955206B2 (en) * 2009-11-13 2018-04-24 The Relay Group Company Video synchronized merchandising systems and methods
US20110183654A1 (en) * 2010-01-25 2011-07-28 Brian Lanier Concurrent Use of Multiple User Interface Devices
US20110191813A1 (en) * 2010-02-04 2011-08-04 Mike Rozhavsky Use of picture-in-picture stream for internet protocol television fast channel change

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015010148A (ja) * 2013-06-28 2015-01-19 住友化学株式会社 樹脂含浸シート
JP2015010147A (ja) * 2013-06-28 2015-01-19 住友化学株式会社 樹脂含浸シート
GB2533569A (en) * 2014-12-19 2016-06-29 Displaylink Uk Ltd A display system and method
GB2533569B (en) * 2014-12-19 2019-06-26 Displaylink Uk Ltd A display system and method
US11080820B2 (en) 2014-12-19 2021-08-03 Displaylink (Uk) Limited System and method for displaying a portion of an image displayed on a screen on a display of a mobile device in magnified form
JP2017033153A (ja) * 2015-07-30 2017-02-09 シャープ株式会社 情報処理装置、情報処理プログラムおよび情報処理方法

Also Published As

Publication number Publication date
WO2013103968A3 (fr) 2015-06-18

Similar Documents

Publication Publication Date Title
EP2631776B1 (fr) Commander des vues dans un dispositif d'affichage à écran tactile
US20180101248A1 (en) Head-mounted display system, head-mounted display, and head-mounted display control program
JP6551502B2 (ja) ヘッドマウントディスプレイ、情報処理方法、及びプログラム
US10942620B2 (en) Information processing apparatus, information processing method, program, and information processing system
EP3144775B1 (fr) Système de traitement d'informations et procédé de traitement d'informations
US10360871B2 (en) Method for sharing screen with external display device by electronic device and electronic device
US20160370970A1 (en) Three-dimensional user interface for head-mountable display
EP2926215B1 (fr) Procédé et appareil pour faciliter une interaction avec un objet pouvant être vu par l'intermédiaire d'un afficheur
US20150074573A1 (en) Information display device, information display method and information display program
JP7064040B2 (ja) 表示システム、及び表示システムの表示制御方法
JP7005161B2 (ja) 電子機器及びその制御方法
CN107003719A (zh) 计算装置、用于控制所述计算装置的方法以及多显示器系统
US9377901B2 (en) Display method, a display control method and electric device
CN105210144A (zh) 显示控制装置、显示控制方法和记录介质
US10095384B2 (en) Method of receiving user input by detecting movement of user and apparatus therefor
CN113301506B (zh) 信息共享方法、装置、电子设备及介质
WO2017113757A1 (fr) Procédé d'agencement d'une interface enveloppante, procédés de permutation de contenu et liste de permutation dans un environnement immersif tridimensionnel
KR20150094967A (ko) 적어도 하나의 어플리케이션을 실행하는 전자 장치 및 그 제어 방법
WO2013103968A2 (fr) Commande à écran tactile
US20180203602A1 (en) Information terminal device
KR20140141305A (ko) 화면분할이 가능한 이동통신단말기 및 그 제어방법
EP3109734A1 (fr) Interface utilisateur tridimensionnelle pour afficheur
CA2749267A1 (fr) Souris d'ordinateur
US10013024B2 (en) Method and apparatus for interacting with a head mounted display
US20110267271A1 (en) Mouse

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13733753

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 13733753

Country of ref document: EP

Kind code of ref document: A2