WO2009000280A1 - Method and system providing three-dimensional input for hand-held devices - Google Patents

Method and system providing three-dimensional input for hand-held devices

Info

Publication number
WO2009000280A1
Authority
WO
WIPO (PCT)
Prior art keywords
pointer
axis
device display
hand
freedom
Prior art date
Application number
PCT/EG2007/000021
Other languages
English (en)
Inventor
Cherif Atia Algreatly
Original Assignee
Cherif Atia Algreatly
Priority date
Filing date
Publication date
Application filed by Cherif Atia Algreatly filed Critical Cherif Atia Algreatly
Priority to PCT/EG2007/000021 priority Critical patent/WO2009000280A1/fr
Priority to US11/906,520 priority patent/US20080062126A1/en
Publication of WO2009000280A1 publication Critical patent/WO2009000280A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0338 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1006 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom

Definitions

  • This invention relates generally to computer input devices, methods, and systems, and, more particularly, to a hand-held device that enables the user to interact with various 3D applications.
  • The use of three-dimensional applications on hand-held devices is spreading rapidly to serve many different purposes; for example, GPS devices display 3D models of buildings and landscapes, cell phones present 3D interfaces and menus, and gaming devices display 3D characters and virtual environments.
  • The operation of a hand-held device differs from that of a computer: 3D computer input devices that need a surface for support are not practical for it at all.
  • A hand-held device is mostly operated through its keyboard while the user holds it in one hand, and in many cases the user must operate the keyboard with the fingers of the same hand that holds the device.
  • The present invention introduces this solution: it presents an efficient 3D input method and system for hand-held devices that enables the user to completely control various 3D applications with one finger of the holding hand, using the keyboard of the hand-held device, in a simple and fast way. Accordingly, the user can operate several 3D applications that used to be limited to the computer, for example Internet world mapping such as Google Earth, interactive three-dimensional graphics such as virtual reality, and professional 3D software such as AutoCAD.
  • The present 3D input method and system can also be used with a computer, where users can move, edit, or navigate in 3D on the computer display in a practical and intuitive manner.
  • The present 3D input system for hand-held devices comprises three main elements.
  • The first element is a 3D input method that utilizes a 5-way button, or five keys of the hand-held device keyboard located in a cross shape relative to each other, to provide six degrees of freedom.
  • The second element is a pointer that rotates on the hand-held device display to target a specific spot or object in the virtual 3D environment.
  • The third element is a net of nodes on which the pointer moves to reach its target in 3D on the hand-held device display.
  • The first element of the present invention, the 3D input method, utilizes a 5-way button on the hand-held device keyboard to provide six degrees of freedom.
  • The first degree of freedom represents a movement along the x-axis of the device display.
  • The second degree of freedom represents a movement along the y-axis of the device display.
  • The third degree of freedom represents a movement along the direction of the pointer in 3D on the device display.
  • The fourth degree of freedom represents a rotation about the x-axis.
  • The fifth degree of freedom represents a rotation about the y-axis.
  • The sixth degree of freedom represents a rotation about the pointer on the device display. (These six degrees of freedom are summarized in the short sketch below.)
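As a purely illustrative aid (not part of the patent text), the six degrees of freedom listed above can be captured in software as a simple enumeration; all identifiers below are assumptions:

```python
from enum import Enum

class DegreeOfFreedom(Enum):
    """Hypothetical labels for the six degrees of freedom described above."""
    MOVE_X = "movement along the x-axis of the device display"
    MOVE_Y = "movement along the y-axis of the device display"
    MOVE_POINTER = "movement along the direction of the pointer in 3D"
    ROTATE_X = "rotation about the x-axis"
    ROTATE_Y = "rotation about the y-axis"
    ROTATE_POINTER = "rotation about the pointer on the device display"

print(DegreeOfFreedom.MOVE_POINTER.value)
```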
  • FIG. 1 illustrates a 5-way button comprised of five positions named +x, -x, +y, -y, and z that are located in a cross shape relative to each other to represent, in top view, the six directions of the x, y, and z-axes of the Cartesian coordinate system illustrated in FIG. 2.
  • The "+x" position represents the positive direction of the x-axis.
  • The "-x" position represents the negative direction of the x-axis.
  • The "+y" position represents the positive direction of the y-axis.
  • The "-y" position represents the negative direction of the y-axis.
  • The "z" position represents both the positive and negative directions of the z-axis.
  • Each press on one of the five positions of the 5-way button generates a unique signal indicating that a specific position is pressed.
  • Each two different successive presses on one or two positions of said 5-way button generate two unique successive signals that represent one of the six degrees of freedom.
  • The user moves his/her finger horizontally to press the "-x" position then the "+x" position to represent moving along the positive x-axis.
  • The user moves his/her finger horizontally to press the "+x" position then the "-x" position to represent moving along the negative x-axis.
  • The user moves his/her finger vertically to press the "-y" position then the "+y" position to represent moving along the positive y-axis.
  • The user moves his/her finger vertically to press the "+y" position then the "-y" position to represent moving along the negative y-axis.
  • The user moves his/her finger up to press the "z" position then the "+y" position to represent moving along the positive z-axis.
  • The user moves his/her finger down to press the "z" position then the "-y" position to represent moving along the negative z-axis.
  • The height of the "z" button is lower than that of the other four buttons.
  • The user presses twice with his/her finger on the "+y" position to represent a clockwise rotation about the x-axis.
  • The user presses twice with his/her finger on the "-y" position to represent a counter-clockwise rotation about the x-axis.
  • The user presses twice with his/her finger on the "+x" position to represent a clockwise rotation about the y-axis.
  • The user presses twice with his/her finger on the "-x" position to represent a counter-clockwise rotation about the y-axis.
  • The user moves his/her finger clockwise to press, respectively, any two successive positions such as "+y" and "+x", "+x" and "-y", "-y" and "-x", or "-x" and "+y" to represent a clockwise rotation about the z-axis.
  • The user moves his/her finger counter-clockwise to press, respectively, any two successive positions such as "+y" and "-x", "-x" and "-y", "-y" and "+x", or "+x" and "+y" to represent a counter-clockwise rotation about the z-axis.
  • FIG. 3 illustrates a table that indicates the user's finger movements representing three degrees of freedom: moving along the x, y, or z-axis.
  • FIG. 4 illustrates another table that indicates the user's finger movements representing the other three degrees of freedom: rotating about the x, y, or z-axis. As shown in these two tables, each degree of freedom is represented by one movement of the user's finger on the positions of the 5-way button, except rotating about the z-axis, which can be represented by more than one finger movement. (The sketch below illustrates this pair-of-presses lookup.)
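Read together, the tables of FIGS. 3 and 4 amount to a lookup from an ordered pair of presses to one degree of freedom. The following is a minimal sketch of that lookup; the position labels mirror FIG. 1, while the function name and the returned strings are illustrative assumptions rather than part of the patent:

```python
# Mapping of two successive presses on the 5-way button to a degree of freedom,
# following the descriptions of FIGS. 3 and 4 above.
GESTURES = {
    ("-x", "+x"): "move +x",   ("+x", "-x"): "move -x",
    ("-y", "+y"): "move +y",   ("+y", "-y"): "move -y",
    ("z",  "+y"): "move +z",   ("z",  "-y"): "move -z",
    ("+y", "+y"): "rotate cw about x",  ("-y", "-y"): "rotate ccw about x",
    ("+x", "+x"): "rotate cw about y",  ("-x", "-x"): "rotate ccw about y",
    # any clockwise-adjacent pair rotates clockwise about the z-axis
    ("+y", "+x"): "rotate cw about z",  ("+x", "-y"): "rotate cw about z",
    ("-y", "-x"): "rotate cw about z",  ("-x", "+y"): "rotate cw about z",
    # any counter-clockwise-adjacent pair rotates counter-clockwise about the z-axis
    ("+y", "-x"): "rotate ccw about z", ("-x", "-y"): "rotate ccw about z",
    ("-y", "+x"): "rotate ccw about z", ("+x", "+y"): "rotate ccw about z",
}

def decode(first_press: str, second_press: str) -> str:
    """Return the degree of freedom encoded by two successive presses."""
    return GESTURES.get((first_press, second_press), "unrecognized gesture")

# Example: pressing "-x" then "+x" represents moving along the positive x-axis.
assert decode("-x", "+x") == "move +x"
```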
  • The second element of the present invention is the pointer, which is illustrated in FIG. 5.1.
  • The pointer appears on the hand-held device display 110; it is comprised of a line 120 connecting two points or ends, where the first end 130 is located in the center of the hand-held device display and the second end 140 is located on one node of the virtual 3D environment, as will be described subsequently.
  • In FIG. 5.1 the pointer is targeting a cube 150 on the hand-held device display.
  • Each degree of freedom provided by the 5-way button manipulates the pointer and the virtual camera to move or rotate in a specific direction on the hand-held device display. For example, providing a movement along the positive x-axis moves the pointer and the virtual camera along the positive x-axis of the hand-held device display, as illustrated in FIG. 5.2. Providing a movement along the negative x-axis moves the pointer and the virtual camera along the negative x-axis of the hand-held device display, as illustrated in FIG. 5.3.
  • Providing a movement along the positive y-axis moves the pointer and the virtual camera along the positive y-axis of the hand-held device display, as illustrated in FIG. 5.4.
  • Providing a movement along the negative y-axis moves the pointer and the virtual camera along the negative y-axis of the hand-held device display, as illustrated in FIG. 5.5.
  • Providing a movement along the positive z-axis moves the virtual camera in the direction of the pointer in 3D on the hand-held device display, as illustrated in FIG. 5.6.
  • Providing a movement along the negative z-axis moves the virtual camera in the opposite direction of the pointer in 3D on the hand-held device display, as illustrated in FIG. 5.7.
  • The third element of the present invention is the net of nodes, which results from intersecting hidden lines parallel to the x, y, and z-axes of the virtual 3D environment on the hand-held device display.
  • Each intersection is considered one node, and each node can be defined by a unique ID and an identified position in 3D such as (x, y, z).
  • FIG. 6 illustrates a cube with a plurality of intersecting hidden lines forming a number of nodes 160; as shown in the figure, the x, y, and z-axes of the cube are numbered to indicate the order of the hidden lines relative to the origin.
  • The second end of the pointer 140 is moved from one node to another when the pointer is manipulated to move in 3D on the hand-held device display.
  • If the second end of the pointer is located on node (0, 0, 0) of the cube and the pointer is rotated clockwise about the y-axis of the hand-held device display, then the second end of the pointer is moved parallel to the xy-plane of the cube to reach, respectively, the nodes (1, 0, 0), (2, 0, 0), (3, 0, 0), (4, 0, 0), (5, 0, 0), (6, 0, 0), (6, 1, 0), (6, 2, 0), and (6, 3, 0).
  • Similarly, the second end of the pointer can be moved on the yz-plane of the cube to reach, respectively, the nodes (0, 0, 1), (0, 0, 2), (0, 0, 3), (0, 0, 4), (0, 0, 5), (0, 0, 6), (0, 1, 6), (0, 2, 6), and (0, 3, 6). (A small sketch of such a surface node grid follows.)
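A minimal sketch of such a net of nodes, assuming the cube of FIG. 6 is drawn with six divisions per edge (as the node coordinates above suggest); the function name and ID scheme are assumptions, not from the patent:

```python
# Hidden grid lines on a cube with n divisions per edge produce surface nodes,
# each with a unique ID and an (x, y, z) position the pointer's second end can snap to.
def build_surface_nodes(n: int) -> dict:
    """Return {node_id: (x, y, z)} for every grid intersection on the cube's outer surface."""
    nodes = {}
    node_id = 0
    for x in range(n + 1):
        for y in range(n + 1):
            for z in range(n + 1):
                # keep only intersections lying on one of the six faces
                if 0 in (x, y, z) or n in (x, y, z):
                    nodes[node_id] = (x, y, z)
                    node_id += 1
    return nodes

nodes = build_surface_nodes(6)      # six divisions per edge, as in FIG. 6
print(len(nodes), nodes[0])         # -> 218 surface nodes, first node at (0, 0, 0)
```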
  • Each target of the pointer, such as an icon, menu, or object, is reachable through this net of nodes.
  • FIG. 7 illustrates a diagram that shows the three main elements of the present invention.
  • The first element 170 is the 3D input method using the 5-way button.
  • The second element 180 is the pointer that is moved in 3D on the hand-held device display.
  • The third element 190 is the net of nodes that covers the outer surface of the virtual 3D environment on the hand-held device display.
  • The first element 170 can be five keys on a hand-held device keyboard located in a cross shape relative to each other; it can also be any input device that provides six degrees of freedom.
  • The second element 180 can be a computer cursor that can be moved on the hand-held device display and that, at the same time, turns to function as the pointer of the present invention when the six degrees of freedom are provided.
  • The third element 190 can be the points of intersection between the pointer's line and the planes of the virtual 3D environment, where these intersection points are calculated with each rotation or movement of the pointer in 3D.
  • In the description above, the 5-way button moves both the pointer and the virtual camera together at the same time on the hand-held device display; however, it is possible to move the pointer alone without moving the virtual camera.
  • This can be done by pressing once on the "+x" position to rotate the pointer clockwise about the y-axis, once on the "-x" position to rotate it counter-clockwise about the y-axis, once on the "+y" position to rotate it clockwise about the x-axis, or once on the "-y" position to rotate it counter-clockwise about the x-axis.
  • The duration of this single press is supposed to differ from the first press of the tables of FIGS. 3 and 4; for example, if this single press lasts two seconds or more, then the first press of the mentioned tables must last less than two seconds.
  • The present invention is not limited to hand-held devices only but applies to the computer as well.
  • The 5-way button can be incorporated on top of a computer mouse or keyboard. It is also possible to use one of the 3D computer mice that provide six degrees of freedom in place of the 5-way button.
  • The computer cursor in this case turns into the pointer of the present invention when the 5-way button or the 3D mouse starts to provide six degrees of freedom to the computer system, and returns to functioning as a traditional cursor when the mouse is moved on a mouse-pad or surface.
  • The computer system can calculate the intersection points between the pointer's line and the planes of the virtual 3D environment when the pointer is moved in 3D, as an alternative to the net of nodes (see the sketch below).
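As one hedged illustration of that alternative (function name, argument order, and tolerance are assumptions), the intersection of the pointer's line with a plane of the virtual 3D environment can be recomputed after every pointer movement:

```python
def intersect_pointer_with_plane(origin, direction, plane_point, plane_normal):
    """Return the intersection point of the pointer's ray with a plane, or None if there is none."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:                     # pointer is parallel to the plane
        return None
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:                                 # plane lies behind the pointer's first end
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# Example: a pointer from the display center along +z hits the plane z = 5 at (0, 0, 5).
print(intersect_pointer_with_plane((0, 0, 0), (0, 0, 1), (0, 0, 5), (0, 0, 1)))
```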
  • The user presses twice on the "z" position of the 5-way button to indicate that the following input of the 5-way button represents moving objects; when finished, the user again presses twice on the "z" position to indicate that the following input of the 5-way button represents targeting objects.
  • In the moving mode, the provided six degrees of freedom represent a movement along, or a rotation about, the x, y, or z-axis of the virtual 3D environment on the hand-held device display. (A small sketch of this mode toggle follows.)
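A minimal sketch of that double-press mode toggle, assuming a simple software state holder (class and method names are assumptions, not from the patent):

```python
class InputMode:
    """Tracks whether the six degrees of freedom target objects or move the targeted object."""
    def __init__(self):
        self.mode = "targeting"            # default mode, per the description above
        self._z_presses = 0

    def press_z(self):
        """Register one press on the "z" position; two in a row toggle the mode."""
        self._z_presses += 1
        if self._z_presses == 2:
            self.mode = "moving" if self.mode == "targeting" else "targeting"
            self._z_presses = 0

    def press_other(self):
        """Any other position resets the double-press counter."""
        self._z_presses = 0

modes = InputMode()
modes.press_z(); modes.press_z()
print(modes.mode)                          # -> "moving"
```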
  • FIG. 8.1 illustrates a cylinder 200 that has a circular hole 210 on its outer surface; the cylinder is located on the xy-plane of the virtual 3D environment on the hand-held device display 220, and two dotted lines 230 and 240 indicate the distance between the center of one base of the cylinder and the x and y-axes.
  • The first end of the pointer 250 is located in the center of the hand-held device display, the second end of the pointer 260 is targeting the center of the lower base of the cylinder, and the pointer's line 270 connects the first end and the second end of the pointer.
  • FIGS. 8.2 and 8.3 illustrate moving the second end of the pointer and the cylinder, respectively, in the direction of the positive and negative x-axis when the 5-way button provides a movement along the positive or negative x-axis.
  • FIGS. 8.4 and 8.5 illustrate moving the second end of the pointer and the cylinder, respectively, in the direction of the positive and negative y-axis when the 5-way button provides a movement along the positive or negative y-axis.
  • FIGS. 8.6 and 8.7 illustrate moving the second end of the pointer and the cylinder, respectively, in the direction of the positive and negative z-axis when the 5-way button provides a movement along the positive or negative z-axis.
  • FIGS. 8.8 and 8.9 illustrate rotating the cylinder, respectively, clockwise and counterclockwise about the x-axis when the 5-way button provides a clockwise rotation or counterclockwise rotation about the x-axis.
  • FIGS. 8.10 and 8.11 illustrate rotating the cylinder, respectively, clockwise and counter-clockwise about the y-axis when the 5-way button provides a clockwise rotation or counter-clockwise rotation about the y-axis.
  • FIGS. 8.12 and 8.13 illustrate rotating the cylinder, respectively, clockwise and counter-clockwise about the z-axis when the 5-way button provides a clockwise rotation or counter-clockwise rotation about the z-axis.
  • Accordingly, there are two modes of manipulation: the first one is targeting objects in 3D, and the second one is moving objects in 3D.
  • Another major application of the present invention is enabling the user to navigate in 3D on the hand-held device display; such an application is important for GPS, virtual reality, and 3D games.
  • For navigation, the direction of the pointer in 3D is utilized to represent the direction of the virtual camera's orientation.
  • This enables the user to view the end of the virtual camera's path, which is the second end of the pointer, before reaching that position.
  • This method also enables the user to accurately determine the virtual camera's path in 3D, which follows the same direction as the pointer in 3D, as illustrated in the sketch below.
  • The user presses the "z" position three times before starting, to indicate that the following input of the 5-way button represents navigating in 3D.
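A brief sketch of that navigation step (pure vector arithmetic; the function name, step size, and example coordinates are assumptions): the camera looks along the pointer and advances from the pointer's first end toward its second end.

```python
import math

def step_camera(first_end, second_end, step):
    """Move the camera `step` units along the pointer's direction and return (position, unit_direction)."""
    direction = [b - a for a, b in zip(first_end, second_end)]
    length = math.sqrt(sum(c * c for c in direction))
    unit = [c / length for c in direction]
    position = [a + step * u for a, u in zip(first_end, unit)]
    return position, unit

# Example: the pointer runs from the display center (0, 0, 0) to a target at (3, 0, 4);
# one forward step of 1 unit moves the camera to (0.6, 0.0, 0.8), still looking the same way.
print(step_camera((0, 0, 0), (3, 0, 4), 1.0))
```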
  • FIG. 9 illustrates an example of a virtual reality application on the hand-held device display, where the user can navigate in 3D using the present invention.
  • This figure shows a 3D model 280 of buildings and landscape, and a pointer 290 that is targeting a specific spot on one of the buildings. It is important to note that in such examples the user does not suffer the projection-illusion problem that is very common when virtual reality applications are used on a computer display, where the traditional computer cursor cannot help in accurately navigating in 3D.
  • FIG. 10 illustrates a 3D interface comprised of three cylindrical strips 300, 310, and 320, where each one contains a number of different icons 330.
  • The first end of the pointer 340 is located on the axial center of the cylindrical strips, in the center of the device display, and the second end of the pointer 350 is located on one of the icons.
  • The user of the hand-held device can rotate the pointer to target any icon in any of the three cylindrical strips, move any icon from one strip to another to rearrange the groups of icons in each cylindrical strip, rotate any of the three cylindrical strips horizontally, or navigate in 3D to move the virtual camera to reach and penetrate any icon, if that icon functions as a door or window leading to another virtual 3D environment beyond the three cylindrical strips.
  • One important application of the present invention is enabling the user of the hand-held device to interact with different 3D games. For example, in shooting games the pointer can control the direction in which the player's head faces while aiming to shoot his/her target. In flying games the user can control the various rotations of an air vehicle, such as an airplane or rocket, about different directions or axes.
  • FIG. 1 is a 5-way button comprised of five positions named +x, -x, +y, -y, and z that are located in a cross shape relative to each other.
  • FIG. 2 is the x, y, and z-axes of the Cartesian coordinate system forming a 3D cross shape.
  • FIG. 3 is a table indicating the user's finger movements on the 5-way button that represent three degrees of freedom: moving along the x, y, or z-axis.
  • FIG. 4 is a table indicating the user's finger movements on the 5-way button that represent the other three degrees of freedom: rotating about the x, y, or z-axis.
  • FIG. 5.1 is a pointer comprised of a first end located in the center of the hand-held device display and a second end targeting a cube on the hand-held device display.
  • FIGS. 5.2 to 5.13 show a pointer and a virtual camera's orientation being moved along or rotated about the x, y, and z-axes on the hand-held device display.
  • FIG. 6 is a cube, as an example of an object on the hand-held device display, divided by a plurality of intersecting hidden lines to form a net of nodes.
  • FIG. 7 is a diagram showing the three main elements of the present invention: the first element 170 is the 3D input method using a 5-way button, the second element 180 is the pointer, and the third element 190 is the net of nodes.
  • FIG. 8.1 is a cylinder on a hand-held device display that is targeted by a pointer to be moved in 3D.
  • FIGS. 8.2 to 8.13 show the cylinder on the hand-held device display being moved along or rotated about the x, y, and z-axes using the present invention.
  • FIG. 9 is an example of a virtual reality application on the hand-held device display where the user can navigate in 3D using the present invention.
  • FIG. 10 is a 3D interface comprised of three cylindrical strips, where each one contains a number of icons.

Best Mode for Carrying Out the Invention

  • The 3D input method can be used with any five keys of any hand-held device keyboard, such as that of a cell phone, GPS, or laptop.
  • The only condition is to assign five keys that are located in a cross shape relative to each other, as closely as possible.
  • For example, the keys labeled 6, 4, 2, 8, and 5 can replace the 5-way button; in the case of the computer keyboard, the K, H, U, N, and J keys can also replace the 5-way button (one possible assignment is sketched below).
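A minimal sketch of one such assignment; the exact key-to-position mapping below is an assumption inferred from the cross layout of each keypad, not stated in the text:

```python
# Assumed key-to-position assignments, inferred from the cross layout:
# on a phone keypad, 2/8/4/6 surround 5; on a computer keyboard, U/N/H/K surround J.
PHONE_KEYPAD = {"2": "+y", "8": "-y", "6": "+x", "4": "-x", "5": "z"}
COMPUTER_KEYBOARD = {"U": "+y", "N": "-y", "K": "+x", "H": "-x", "J": "z"}

def key_to_position(key: str, layout: dict) -> str:
    """Translate a raw key press into a 5-way button position, if the key is assigned."""
    return layout.get(key.upper(), "unassigned")

print(key_to_position("u", COMPUTER_KEYBOARD))   # -> "+y"
print(key_to_position("5", PHONE_KEYPAD))        # -> "z"
```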
  • A PCB (printed circuit board) processes the raw analog signals and converts them into digital signals that can be used by the microprocessor of the computer system.
  • The sensor continuously generates data corresponding to the duration of the finger press.
  • The computer system utilizes the amount of time that any of the five positions of the 5-way button indicated in the 2nd-pressing column of FIG. 3 or FIG. 4 is held as a value for the movement distance along the x, y, or z-axis, or as a value for the rotation angle about the x, y, or z-axis.
  • The digital sensor provides five independent digital ON-OFF signals in the directions North, East, South, West, and Origin, where these directions are associated, respectively, with the "+y", "+x", "-y", "-x", and "z" positions of the 5-way button. For example, if the user presses the "+x" position, which is the "East" point of the 5-way digital button, then a (0,1,0,0,0) signal is generated, and if the user then presses the "+y" position, which is the "North" button, then a (1,0,0,0,0) signal is generated. Accordingly, the computer system translates these two button presses as a counter-clockwise rotation about the z-axis, as defined in the table of FIG. 4.
  • The value of the rotation, meaning the rotation angle, depends on the amount of time the user keeps the "+y" position (the "North" button of the 5-way digital button) pressed, where the default is to return the digital sensors to the (0,0,0,0,0) state once the user releases. (A small sketch of this translation follows.)
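A hedged sketch of how those two sensor readings and the hold time could be interpreted in software; the scale factor and all function names are assumptions, not from the patent:

```python
SENSOR_ORDER = ("+y", "+x", "-y", "-x", "z")      # North, East, South, West, Origin

def reading_to_position(reading):
    """Map a (N, E, S, W, Origin) ON-OFF tuple to a 5-way button position, or None when idle."""
    for bit, position in zip(reading, SENSOR_ORDER):
        if bit:
            return position
    return None                                    # (0, 0, 0, 0, 0): nothing pressed

DEGREES_PER_SECOND = 30.0                          # assumed scale, not specified in the patent

def rotation_angle(hold_seconds: float) -> float:
    """Turn the hold time of the second press into a rotation angle."""
    return DEGREES_PER_SECOND * hold_seconds

first = reading_to_position((0, 1, 0, 0, 0))       # "+x" (East)
second = reading_to_position((1, 0, 0, 0, 0))      # "+y" (North)
# ("+x", "+y") selects a counter-clockwise rotation about z per FIG. 4; holding 1.5 s gives 45 degrees.
print(first, second, rotation_angle(1.5))
```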
  • The present invention thereby enables the user of hand-held devices to operate other 3D applications that used to be limited to the computer, for example Internet world mapping such as Google Earth, interactive three-dimensional graphics such as virtual reality, and professional 3D software such as AutoCAD.
  • The present invention can also be used with the computer, where it enables users to move, edit, or navigate in 3D on the computer display.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method and system providing three-dimensional input for hand-held devices, allowing the user to move, navigate, and edit in three dimensions with a single finger in an intuitive manner. The invention utilizes a 5-way button on the hand-held device keyboard to provide six degrees of freedom, yielding three-dimensional movement of a pointer on the hand-held device display and serving a plurality of 3D applications such as GPS use, virtual reality, and games.
PCT/EG2007/000021 2006-07-06 2007-06-28 Method and system providing three-dimensional input for hand-held devices WO2009000280A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/EG2007/000021 WO2009000280A1 (fr) 2007-06-28 2007-06-28 Method and system providing three-dimensional input for hand-held devices
US11/906,520 US20080062126A1 (en) 2006-07-06 2007-10-01 3D method and system for hand-held devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EG2007/000021 WO2009000280A1 (fr) 2007-06-28 2007-06-28 Method and system providing three-dimensional input for hand-held devices

Publications (1)

Publication Number Publication Date
WO2009000280A1 true WO2009000280A1 (fr) 2008-12-31

Family

ID=40185209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EG2007/000021 WO2009000280A1 (fr) 2006-07-06 2007-06-28 Method and system providing three-dimensional input for hand-held devices

Country Status (1)

Country Link
WO (1) WO2009000280A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120047817A (ko) * 2010-10-28 2012-05-14 Honeywell International Inc. Display system for controlling a selection symbol within an image

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5652603A (en) * 1994-06-16 1997-07-29 Abrams; Daniel Lawrence 3-D computer input device
US5729249A (en) * 1991-11-26 1998-03-17 Itu Research, Inc. Touch sensitive input control device
US5889505A (en) * 1996-04-04 1999-03-30 Yale University Vision-based six-degree-of-freedom computer input device
EP1283495A2 (fr) * 2001-08-10 2003-02-12 Wacom Co., Ltd Six degrees of freedom information indicator

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5729249A (en) * 1991-11-26 1998-03-17 Itu Research, Inc. Touch sensitive input control device
US5652603A (en) * 1994-06-16 1997-07-29 Abrams; Daniel Lawrence 3-D computer input device
US5889505A (en) * 1996-04-04 1999-03-30 Yale University Vision-based six-degree-of-freedom computer input device
EP1283495A2 (fr) * 2001-08-10 2003-02-12 Wacom Co., Ltd Six degrees of freedom information indicator

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JONES P.E. ET AL.: "Low-cost 3D input device for next-generation user interface", PROCEEDINGS OF 1997 INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATIONS AND SIGNAL PROCESSING, 1997. ICICS, NEW YORK, NY, USA, IEEE US, 9 September 1997 (1997-09-09) - 12 September 1997 (1997-09-12) *
SELAVO L. ET AL.: "SeeMote: in-situ visualization and logging device for wireless sensor networks", 3RD INTERNATIONAL CONFERENCE ON BROADBAND COMMUNICATIONS, NETWORKS AND SYSTEMS, SAN JOSE, CA, USA, 1 October 2006 (2006-10-01) - 5 October 2006 (2006-10-05), XP031155858 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120047817A (ko) * 2010-10-28 2012-05-14 Honeywell International Inc. Display system for controlling a selection symbol within an image
EP2447822A3 (fr) * 2010-10-28 2017-03-29 Honeywell International Inc. Display system for controlling a selection symbol within an image

Similar Documents

Publication Publication Date Title
US20080062126A1 (en) 3D method and system for hand-held devices
US7969418B2 (en) 3-D computer input device and method
US9789391B2 (en) Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US9489053B2 (en) Skeletal control of three-dimensional virtual world
US7683883B2 (en) 3D mouse and game controller based on spherical coordinates system and system for use
US9134797B2 (en) Systems and methods for providing haptic feedback to touch-sensitive input devices
US9317108B2 (en) Hand-held wireless electronic device with accelerometer for interacting with a display
US9950256B2 (en) High-dimensional touchpad game controller with multiple usage and networking modalities
JP2004525675A (ja) Remote control of game and home entertainment devices
JP2012507802A (ja) Control of and access to content using motion processing on a mobile device
KR20030009919A (ko) Input device for computer games having inertial sensors
Hürst et al. Multimodal interaction concepts for mobile augmented reality applications
WO2018196552A1 (fr) Hand display method and apparatus for use in a virtual reality scene
AU2010350920A1 (en) Gun-shaped game controller
Chen et al. An integrated framework for universal motion control
WO2009000280A1 (fr) Method and system providing three-dimensional input for hand-held devices
Scerbo et al. Design issues when using commodity gaming devices for virtual object manipulation
JP6980748B2 (ja) Display control program, display control method, and display control system
US20230195243A1 (en) Directional input device for computer mouse
US20240019926A1 (en) Information processing apparatus, method, computer program and system
Yang et al. An intuitive human-computer interface for large display virtual reality applications
Park et al. 3D Gesture-based view manipulator for large scale entity model review
Imanullah et al. IMU-based joystick: Usage and interaction schemes in simulation game
Nguyen 3DTouch: Towards a Wearable 3D Input Device for 3D Applications
Murphy et al. Assessment of gesture-based natural interface systems in serious games

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07722745

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07722745

Country of ref document: EP

Kind code of ref document: A1