WO2003079179A1 - Motion mouse system - Google Patents

Motion mouse system

Info

Publication number
WO2003079179A1
Authority
WO
Grant status
Application
Patent type
Prior art keywords
indicators
motion mouse
status information
motion
mouse
Prior art date
Application number
PCT/KR2003/000524
Other languages
French (fr)
Inventor
Seonghee Han
Original Assignee
Softronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0325 Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Abstract

An apparatus for inputting three-dimensional movements and status information includes a motion mouse having at least one indicator for showing the status information, arranged on a first surface of the motion mouse, and at least one control switch for controlling a status of the indicator, arranged on a second surface of the motion mouse; a camera for capturing an image of the motion mouse; and an image analyzer for analyzing the image of the motion mouse by using the lens formula to recognize the three-dimensional movements and the status information of the motion mouse. The indicators may differ from each other in color, size or shape to effectively represent a rotational movement of the motion mouse. The apparatus inputs three-dimensional movements of the motion mouse in a simple and cost-effective manner without transmitting or receiving any data between a computer and a pointing device through a wired or wireless connection.

Description

MOTION MOUSE SYSTEM

Technical Field

The present invention relates to an apparatus and a method for inputting three-dimensional coordinates and status information of a pointing device into an information device such as a computer; and, more particularly, to an apparatus and a method for analyzing an image obtained by capturing a movement of a pointing device in a three-dimensional space and recognizing changes of three-dimensional coordinates and status information of the pointing device.

Background Art

Generally, a device called a "mouse" has been used as a pointing device for positioning a cursor on the display screen of a computer. Such a pointing device detects its relative displacement over a plane by employing an optical method or a mechanical method that detects the movement of a ball installed inside the pointing device. The detected displacement is converted into a displacement of the cursor on the display screen. However, since the mouse is designed to move along a two-dimensional plane, it cannot transfer three-dimensional movements to the computer.

Meanwhile, as users of personal computers or CAD (computer-aided design) systems require a device for inputting positions and shapes of various three-dimensional objects, a variety of pointing devices for inputting three-dimensional coordinates have been proposed. Such three-dimensional pointing devices may measure movement along the z axis in addition to two-dimensional movement over the x-y plane. Further, they may employ an accelerometer for detecting rotational movement about the x, y and z axes.

However, the prior art pointing device may inconvenience its user, since it may need a supplementary device such as a supporting board on which the pointing device moves. Further, a prior art three-dimensional pointing device has to be located within a short distance from the computer when the two are wire-connected. If the pointing device is wirelessly connected to the computer through, e.g., IR (infrared) light, the limitation on the distance between the pointing device and the computer may be overcome. Even in this case, however, if the IR light emitting direction of one side, e.g., the pointing device, does not coincide with the IR light reception direction of the other side, e.g., the computer, data communications between the two sides may be interrupted.

Disclosure of Invention

It is, therefore, an object of the present invention to provide an apparatus and a method for analyzing an image obtained by capturing a movement of a pointing device in a three-dimensional space in front of a computer and recognizing changes of three-dimensional coordinates and status information of the pointing device.

In accordance with one aspect of the present invention, there is provided an apparatus for inputting three-dimensional coordinates and status information of a motion mouse, including: the motion mouse including more than one indicator, arranged on a first surface of the motion mouse, for representing the status information, and more than one control switch, arranged on a second surface of the motion mouse, for controlling the status information represented by the indicators; a camera for capturing images of the indicators arranged on the first surface; and an image analyzer for analyzing the captured images and recognizing the three-dimensional coordinates and status information of the motion mouse.

In accordance with another aspect of the present invention, there is provided a method for inputting three-dimensional coordinates and status information of a motion mouse including more than one indicator, arranged on a first surface of the motion mouse, for representing the status information, and more than one control switch, arranged on a second surface of the motion mouse, for controlling the status information represented by the indicators, the method including the steps of: capturing images of the indicators; extracting areas of the indicators by using information on colors, sizes and/or shapes of the indicators or locations of the indicators on the first surface; determining center coordinates of the areas of the indicators; and determining changes of three-dimensional positions, an amount of rotation and status information of the motion mouse by using the determined center coordinates.

Brief Description of Drawings

The above and other objects and features of the present invention will become apparent from the following description of preferred embodiments given in conjunction with the accompanying drawings, in which:

Fig. 1 sets forth an apparatus for inputting three-dimensional coordinates and status information of a motion mouse in accordance with a preferred embodiment of the present invention;

Fig. 2 depicts a motion mouse in accordance with a preferred embodiment of the present invention;

Fig. 3 illustrates a rotation of a front surface of a motion mouse in accordance with a preferred embodiment of the present invention, the front surface having two illuminators thereon;

Fig. 4 provides a flowchart showing a method for inputting three-dimensional coordinates and status information of a motion mouse in accordance with a preferred embodiment of the present invention;

Fig. 5 presents a diagram of a lens and an object for describing the concept of the lens formula; and

Fig. 6 charts a rotation of a front surface of a motion mouse in accordance with a preferred embodiment of the present invention and changes of locations of illuminators attached on the front surface in accordance with the rotation of the front surface.

Best Mode for Carrying Out the Invention

Referring to Fig. 1, there is provided a configuration of an apparatus for inputting three-dimensional coordinates and status information of a motion mouse in accordance with a preferred embodiment of the present invention. As shown in Fig. 1, the apparatus includes a computer 110, a camera 120 connected to the computer 110 for capturing images, and a pointing device 200 (hereinafter referred to as "motion mouse") for representing three-dimensional positions and status information. A user of the computer 110 represents three-dimensional positions and status information by moving the motion mouse 200 in a three-dimensional space in front of the camera 120. The camera 120 captures images of the movements of the motion mouse 200, and analysis software executed in the computer 110 analyzes the captured images to recognize the three-dimensional positions and the status information of the motion mouse 200.

Herein, since the motion mouse 200 does not have to transmit or receive any information through a wired or wireless connection, the motion mouse 200 can move freely in a three-dimensional space in front of the camera 120. For instance, the motion mouse 200 may rotate or move in left/right and up/down directions, or any combination thereof. The rotational movement of the motion mouse 200 may be decomposed into rotational components about the x, y and z axes (the z axis being the axis passing through the center of the lens of the camera 120, i.e., its optical axis).

Fig. 2 illustrates a configuration of the motion mouse 200 in accordance with a preferred embodiment of the present invention. The motion mouse 200 includes a plurality of illuminators 211 to 213 and a plurality of switches 221 to 223 for controlling ON/OFF status of the illuminators 211 to 213. The switches 221 to 223 respectively control flickering of the illuminators 211 to 213, so that a combination of the ON/OFF status of the illuminators 211 to 213 represents status information such as selection/cancellation of a specific function, scrolling of a computer screen and three-dimensional movements of the motion mouse 200.

Although the illuminators 211 to 213 have been described to be arranged in a line on the front surface of the motion mouse 200 in Fig. 2, the illuminators 211 to 213 may be arranged asymmetrically with respect to a center of the front surface of the motion mouse 200. When the illuminators 211 to 213 are arranged asymmetrically with respect to the center of the front surface of the motion mouse 200, the motion mouse 200 can represent any rotational movements. Further, as shown in Fig. 2, even when the illuminators 211 to 213 are arranged symmetrically with respect to the center of the front surface of the motion mouse 200, the illuminators 211 to 213 may be configured to have different sizes, shapes and/or colors. The number of the illuminators may be varied as long as the motion mouse can represent various three-dimensional movements.

Fig. 3 shows an exemplary image of the front surface of the motion mouse 200, as captured by the camera 120. In Fig. 3, the motion mouse 200 employs two illuminators L1 and L2, each of which has a different size. In this case, although the motion mouse 200 has only two illuminators L1 and L2, it can still represent various rotational movements.

In general, the illuminators 211 to 213 may be designed to emit visible rays. Alternatively, in order to analyze an image captured by the camera 120 more rapidly and precisely, the illuminators 211 to 213 may be implemented to emit infrared rays. In this case, an infrared filter may be installed in front of the camera 120 to isolate the infrared rays emitted by the illuminators 211 to 213. The illuminators 211 to 213 need a power supply, installed inside the motion mouse 200 or the computer 110, to produce the rays. However, the illuminators 211 to 213 may instead be implemented by using fluorescent materials, which eliminates any need for a power supply. In order to easily distinguish the illuminators 211 to 213 from one another, they may have different sizes, shapes and/or colors (or frequencies). Further, shutters may be installed in front of the illuminators 211 to 213 to replace their flickering (i.e., ON/OFF) functions. In this case, the control switches 221 to 223 are configured to control the opening and shutting movements of the shutters.

In the meantime, analysis software is executed in the computer 110 for analyzing the images of the motion mouse 200 captured by the camera 120. In the following, a method for analyzing the images of the motion mouse in accordance with a preferred embodiment of the present invention will be described in detail with reference to Fig. 4.

First, the camera 120 captures the movements of the motion mouse 200 and transfers the captured images of the motion mouse 200 to the computer 110 (step 402). Then, the analysis software, which is executed in the computer 110, performs analysis of the captured images.

The analysis software extracts areas of the illuminators in the images of the motion mouse 200 (step 404). The analysis software may preprocess the images to easily extract the areas of the illuminators from the images. For example, a threshold value may be predetermined to convert pixel levels of the images into binary values, i.e., 0 and 1, such that, when a pixel level of the images is larger than or equal to the threshold value, a high level, e.g., 1, is assigned to the pixel. Further, when the pixel level is smaller than the threshold value, a low level, e.g., 0, is assigned to the pixel. Meanwhile, the computer 110 stores information on locations, colors, sizes and/or shapes of the illuminators arranged on the front surface of the motion mouse 200, which is utilized by the analysis software to extract more precisely the areas of the illuminators.
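As a minimal sketch of this preprocessing, the thresholding step might look like the following Python fragment. The threshold value of 200 and the toy frame are purely illustrative; the patent only requires some predetermined threshold.

```python
import numpy as np

def binarize(gray, threshold=200):
    """Assign 1 to pixels at or above the threshold (bright
    illuminator spots) and 0 to all others, as in the step 404
    preprocessing."""
    return (gray >= threshold).astype(np.uint8)

# Toy 4x4 "frame" in which two pixels belong to illuminators.
frame = np.array([[10,  10,  10, 10],
                  [10, 250,  10, 10],
                  [10,  10, 240, 10],
                  [10,  10,  10, 10]])
mask = binarize(frame)   # only the two bright pixels survive
```

In a real analyzer the stored colors, sizes and shapes of the illuminators would then be used to separate the surviving pixels into per-illuminator areas.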

Thereafter, center coordinates of the extracted areas of the illuminators are calculated (step 406). The center coordinates of the extracted areas may be set to, e.g., the centroids of the areas.
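The centroid computation of step 406 can be sketched as follows; the function name and the toy mask are illustrative, not part of the patent.

```python
import numpy as np

def centroid(mask):
    """Center coordinates of one extracted illuminator area, taken
    as the centroid of the nonzero pixels of its binary mask."""
    ys, xs = np.nonzero(mask)
    return float(ys.mean()), float(xs.mean())

# A 2x2 illuminator blob inside a 5x5 mask.
mask = np.zeros((5, 5), dtype=np.uint8)
mask[1:3, 2:4] = 1
cy, cx = centroid(mask)   # centroid lands between the four pixels
```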

The analysis software then calculates amounts of forward/backward, left/right and up/down movements or rotations of the motion mouse 200 based on the center coordinates of the illuminators and the distances between the center coordinates (step 410). Herein, the analysis software utilizes the lens formula in calculating changes in the position of the motion mouse 200 along the z axis. For instance, as shown in Fig. 5, if it is assumed that the distance between an object O and a lens 510, the distance between an image I and the lens 510, and the focal distance of the lens 510 are b, a and f, respectively, a mathematical expression ("the lens formula") can be formulated as shown in Equation (1).

1/a + 1/b = 1/f        Equation (1)

In this case, a magnification m of the lens 510 can be determined as expressed in Equation (2).

m = h/H = a/b        Equation (2)

Therefore, the analysis software determines the magnification m by comparing the known distance between the illuminators of the motion mouse 200 (i.e., the size H of the object) with the pixel distance between the center coordinates of the illuminators (i.e., the size h of the image). Further, the analysis software estimates the distance b by using the fixed distance a in accordance with Equation (2). Accordingly, by measuring changes in the pixel distance (i.e., the size h of the image) between the center coordinates of the illuminators, changes in the positions of the illuminators along the z axis can be calculated.
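Assuming a known image distance a and a known sensor pixel pitch (neither of which the patent fixes numerically), the depth estimate b = a·H/h implied by Equation (2) can be sketched as:

```python
def object_distance(h_pixels, H_mm, a_mm, pixel_pitch_mm):
    """Estimate the lens-to-mouse distance b from the thin-lens
    magnification m = h/H = a/b, i.e. b = a / m.
    All distances in millimeters; h is measured in pixels."""
    h_mm = h_pixels * pixel_pitch_mm   # image-plane size of the spot spacing
    m = h_mm / H_mm                    # magnification
    return a_mm / m                    # b = a / m

# Illustrative numbers: illuminators 50 mm apart appear 100 px apart,
# image distance a = 5 mm, pixel pitch 0.005 mm.
b = object_distance(h_pixels=100, H_mm=50.0, a_mm=5.0, pixel_pitch_mm=0.005)
```

A change in the measured pixel spacing h between frames then translates directly into a change of b, i.e., motion along the z axis.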

Meanwhile, the amounts of rotation of the motion mouse 200 about the x, y and z axes may be calculated by measuring changes in the positions of the illuminators and the distances between those positions. For instance, as illustrated in Fig. 6, where four illuminators L1 to L4 having different shapes or colors are arranged crosswise with respect to a center C on the front surface of the motion mouse 200, the coordinate (Δx, Δy) of the center C on the x-y plane and the amount (Δθ) of rotation of the illuminators L1 to L4 about the z axis may be estimated. Further, by measuring changes in the ratio of the distance between the illuminators L1 and L3 to that between the illuminators L2 and L4, the amount of rotation of the motion mouse 200 about the x axis or the y axis can be determined.
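A rough sketch of these rotation estimates, assuming the illuminator centers have already been extracted (the function names and coordinate convention are hypothetical):

```python
import math

def roll_angle(c1, c2):
    """Rotation about the z axis: angle of the line joining two
    illuminator centers (x, y) relative to horizontal, in degrees."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    return math.degrees(math.atan2(dy, dx))

def tilt_ratio(d13, d24):
    """Ratio of the L1-L3 spacing to the L2-L4 spacing; a drift away
    from its rest value indicates foreshortening, i.e. tilt about
    the x or y axis."""
    return d13 / d24

theta = roll_angle((0.0, 0.0), (1.0, 1.0))  # a 45-degree roll
r = tilt_ratio(80.0, 100.0)                  # one arm foreshortened
```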

The analysis software recognizes the ON/OFF status or brightness of the illuminators while calculating the changes in their positions (step 408). The ON/OFF status or brightness of the illuminators is then converted into information such as selection/cancellation of specific functions or scrolling of a display screen.
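This conversion might be sketched as a simple lookup table over the three illuminators' states; the specific pattern-to-command assignments below are hypothetical, since the patent does not define them.

```python
# Hypothetical mapping from combined illuminator ON/OFF states to
# status commands; the actual assignments are not specified.
COMMANDS = {
    (1, 0, 0): "select",
    (0, 1, 0): "cancel",
    (0, 0, 1): "scroll",
    (0, 0, 0): "idle",
}

def decode_status(states):
    """Translate the detected ON/OFF states of the three
    illuminators into a status command (step 408)."""
    return COMMANDS.get(tuple(states), "unknown")
```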

The determined three-dimensional positions and status information of the motion mouse 200 are then stored in a memory of the computer 110 to be provided to an application program (step 412). The application program may be a browser program for navigating a three-dimensional space composed by using three-dimensional graphics technology, or a CAD program for editing three-dimensional objects. Although the method of the present invention has been described as being executed by software installed in the computer 110, part or all of the steps of the method may be executed by dedicated hardware.

As described above, the method and apparatus for inputting three-dimensional coordinates and status information of a motion mouse in accordance with the present invention transfer changes of the three-dimensional coordinates and status information of the motion mouse without performing wired or wireless data transmission between the motion mouse and the computer. Therefore, the method and apparatus of the present invention have the advantage that a user can input three-dimensional movements of a pointing device more intuitively. These aspects of the present invention contribute to the convenience and portability of a pointing device.

While the invention has been shown and described with respect to the preferred embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.

Claims

1. An apparatus for inputting three-dimensional coordinates and status information of a motion mouse, comprising: the motion mouse including more than one indicator, which are arranged on a first surface of the motion mouse, for representing the status information and more than one control switch, which are arranged on a second surface of the motion mouse, for controlling the status information represented by the indicators; a camera for capturing images of the indicators arranged on the first surface; and an image analyzer for analyzing the captured images and recognizing three-dimensional coordinates and status information of the motion mouse.
2. The apparatus of claim 1, wherein the indicators are illuminators, which are switchable ON or OFF, and each of the control switches controls the ON/OFF of the corresponding illuminator.
3. The apparatus of claim 2, wherein the illuminators emit infrared lights and the camera includes a filter for filtering the infrared lights.
4. The apparatus of claim 2, wherein the illuminators have different colors, sizes and/or shapes.
5. The apparatus of claim 2, wherein the illuminators are arranged asymmetrically with respect to a center of the first surface.
6. The apparatus of claim 1, wherein the motion mouse further includes shutters for opening and shutting the indicators, wherein the indicators have different colors, sizes and/or shapes and the control switches control the opening and shutting of the shutters.
7. The apparatus of claim 6, wherein the illuminators are arranged asymmetrically with respect to a center of the first surface.
8. The apparatus of claim 1, wherein the image analyzer performs the steps of: determining center coordinates of the indicators; calculating pixel distances between the center coordinates; and calculating changes in three-dimensional positions and an amount of rotation of the indicators by using the pixel distances and the lens formula.
9. The apparatus of claim 8, wherein the image analyzer analyzes the captured images to determine ON/OFF status, colors, sizes and/or shapes of the indicators, thereby determining the status information of the motion mouse.
10. A method for inputting three-dimensional coordinates and status information of a motion mouse including more than one indicator, which are arranged on a first surface of the motion mouse, for representing the status information and more than one control switch, which are arranged on a second surface of the motion mouse, for controlling the status information represented by the indicators, the method comprising the steps of: capturing images of the indicators; extracting areas of the indicators by using information on colors, sizes and/or shapes of the indicators or locations of the indicators on the first surface; determining center coordinates of the areas of the indicators; and determining changes of three-dimensional positions, an amount of rotation and status information of the motion mouse by using the determined center coordinates.
11. The method of claim 10, wherein the step of determining changes of the three-dimensional positions includes the steps of: calculating pixel distances between the center coordinates; and calculating changes in three-dimensional positions and an amount of rotation of the indicators by using the pixel distances and the lens formula.
12. The method of claim 11, wherein the step of determining changes in the three-dimensional positions further includes a step of analyzing the captured images to determine ON/OFF status, colors, sizes and/or shapes of the indicators, thereby determining the status information of the motion mouse.
PCT/KR2003/000524 2002-03-18 2003-03-18 Motion mouse system WO2003079179A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR20020014592A KR20030075399A (en) 2002-03-18 2002-03-18 Motion Mouse System
KR10-2002-0014592 2002-03-18

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2003215939A AU2003215939A1 (en) 2002-03-18 2003-03-18 Motion mouse system

Publications (1)

Publication Number Publication Date
WO2003079179A1 (en) 2003-09-25

Family

ID=28036101

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2003/000524 WO2003079179A1 (en) 2002-03-18 2003-03-18 Motion mouse system

Country Status (2)

Country Link
KR (1) KR20030075399A (en)
WO (1) WO2003079179A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005116809A2 (en) 2004-05-24 2005-12-08 3D For All Számítástechnikai Fejlesztö Kft. System and method for operating in virtual 3d space and system for selecting an operation via a visualizing system
WO2007035314A2 (en) * 2005-09-15 2007-03-29 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
WO2008120189A1 (en) * 2007-03-29 2008-10-09 Cam-Trax Technologies Ltd System and method for tracking an electronic device
US7961907B2 (en) 2006-06-13 2011-06-14 Sunplus Technology Co., Ltd. Portable electronic device
WO2011127646A1 (en) * 2010-04-13 2011-10-20 Nokia Corporation An apparatus, method, computer program and user interface
EP2383635A3 (en) * 2010-04-29 2012-05-02 AU Optronics Corporation Camera based touch system
GB2473168B (en) * 2008-06-04 2013-03-06 Hewlett Packard Development Co System and method for remote control of a computer
US8542907B2 (en) 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US8840470B2 (en) 2008-02-27 2014-09-23 Sony Computer Entertainment America Llc Methods for capturing depth data of a scene and applying computer actions
US8961313B2 (en) 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
US8976265B2 (en) 2002-07-27 2015-03-10 Sony Computer Entertainment Inc. Apparatus for image and sound capture in a game environment
US9381424B2 (en) 2002-07-27 2016-07-05 Sony Interactive Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US9474968B2 (en) 2002-07-27 2016-10-25 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US9573056B2 (en) 2005-10-26 2017-02-21 Sony Interactive Entertainment Inc. Expandable control device via hardware attachment
US9682320B2 (en) 2002-07-22 2017-06-20 Sony Interactive Entertainment Inc. Inertially trackable hand-held controller
US9682319B2 (en) 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
US10099147B2 (en) 2004-08-19 2018-10-16 Sony Interactive Entertainment Inc. Using a portable device to interface with a video game rendered on a main display

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100807620B1 (en) * 2006-09-25 2008-03-03 엠텍비젼 주식회사 Apparatus for controlling moving of pointer and method thereof
KR100807625B1 (en) * 2006-09-25 2008-03-03 엠텍비젼 주식회사 Apparatus for controlling moving of pointer and method thereof
US9032334B2 (en) 2011-12-21 2015-05-12 Lg Electronics Inc. Electronic device having 3-dimensional display and method of operating thereof
WO2013094786A1 (en) * 2011-12-21 2013-06-27 Lg Electronics Inc. Electronic device having 3-dimensional display and method of operating thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5297061A (en) * 1993-05-19 1994-03-22 University Of Maryland Three dimensional pointing device monitored by computer vision
JPH0934633A (en) * 1995-07-17 1997-02-07 Sanyo Electric Co Ltd Space mouse and space mouse system
JPH09265346A (en) * 1996-03-29 1997-10-07 Bijiyuaru Sci Kenkyusho:Kk Space mouse, mouse position detection device and visualization device
JP2001060141A (en) * 1999-08-23 2001-03-06 Canon Inc Coordinate input device, control method therefor and computer readable memory


Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9682320B2 (en) 2002-07-22 2017-06-20 Sony Interactive Entertainment Inc. Inertially trackable hand-held controller
US8976265B2 (en) 2002-07-27 2015-03-10 Sony Computer Entertainment Inc. Apparatus for image and sound capture in a game environment
US9474968B2 (en) 2002-07-27 2016-10-25 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US9381424B2 (en) 2002-07-27 2016-07-05 Sony Interactive Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US9682319B2 (en) 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
WO2005116809A2 (en) 2004-05-24 2005-12-08 3D For All Számítástechnikai Fejlesztö Kft. System and method for operating in virtual 3d space and system for selecting an operation via a visualizing system
US10099147B2 (en) 2004-08-19 2018-10-16 Sony Interactive Entertainment Inc. Using a portable device to interface with a video game rendered on a main display
WO2007035314A3 (en) * 2005-09-15 2007-06-21 Sony Computer Entertainment Inc Computer image and audio processing of intensity and input devices for interfacing with a computer program
WO2007035314A2 (en) * 2005-09-15 2007-03-29 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US9573056B2 (en) 2005-10-26 2017-02-21 Sony Interactive Entertainment Inc. Expandable control device via hardware attachment
US7961907B2 (en) 2006-06-13 2011-06-14 Sunplus Technology Co., Ltd. Portable electronic device
EP2284664A3 (en) * 2007-03-29 2011-05-04 Y.T. Ventures Ltd System and method for tracking an electronic device
US8094885B2 (en) 2007-03-29 2012-01-10 Y.T. Ventures Ltd System and method for tracking an electronic device
WO2008120189A1 (en) * 2007-03-29 2008-10-09 Cam-Trax Technologies Ltd System and method for tracking an electronic device
US8542907B2 (en) 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US8840470B2 (en) 2008-02-27 2014-09-23 Sony Computer Entertainment America Llc Methods for capturing depth data of a scene and applying computer actions
GB2473168B (en) * 2008-06-04 2013-03-06 Hewlett Packard Development Co System and method for remote control of a computer
US8736549B2 (en) 2008-06-04 2014-05-27 Hewlett-Packard Development Company, L.P. System and method for remote control of a computer
US8961313B2 (en) 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
US9535493B2 (en) 2010-04-13 2017-01-03 Nokia Technologies Oy Apparatus, method, computer program and user interface
WO2011127646A1 (en) * 2010-04-13 2011-10-20 Nokia Corporation An apparatus, method, computer program and user interface
US8338725B2 (en) 2010-04-29 2012-12-25 Au Optronics Corporation Camera based touch system
EP2383635A3 (en) * 2010-04-29 2012-05-02 AU Optronics Corporation Camera based touch system

Also Published As

Publication number Publication date Type
KR20030075399A (en) 2003-09-26 application

Similar Documents

Publication Publication Date Title
Zhang et al. Visual panel: virtual mouse, keyboard and 3D controller with an ordinary piece of paper
US20110102570A1 (en) Vision based pointing device emulation
US20050275618A1 (en) Pointing device
US6624833B1 (en) Gesture-based input interface system with shadow detection
US7313255B2 (en) System and method for optically detecting a click event
US20050249386A1 (en) Pointing device having fingerprint image recognition function, fingerprint image recognition and pointing method, and method for providing portable terminal service using thereof
US20110243380A1 (en) Computing device interface
US20110234492A1 (en) Gesture processing
US6710770B2 (en) Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US20080036732A1 (en) Virtual Controller For Visual Displays
Harrison et al. OmniTouch: wearable multitouch interaction everywhere
US6764185B1 (en) Projector as an input and output device
US20110216007A1 (en) Keyboards and methods thereof
US20090278915A1 (en) Gesture-Based Control System For Vehicle Interfaces
US20110279397A1 (en) Device and method for monitoring the object's behavior
US7257255B2 (en) Capturing hand motion
US7598942B2 (en) System and method for gesture based control system
US6614422B1 (en) Method and apparatus for entering data using a virtual input device
US6554434B2 (en) Interactive projection system
US20110291988A1 (en) Method and system for recognition of user gesture interaction with passive surface video displays
US6594616B2 (en) System and method for providing a mobile input device
US20120044140A1 (en) Information display system and program, and optical input system, projection-type images and display apparatus
US7755608B2 (en) Systems and methods of interfacing with a machine
US20140369558A1 (en) Systems and methods for machine control
US8723789B1 (en) Two-dimensional method and system enabling three-dimensional user interaction with a device

Legal Events

Date Code Title Description
AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

121 EP: the EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
122 EP: PCT application non-entry into European phase
WWW WIPO information: withdrawn in national office

Country of ref document: JP

NENP Non-entry into the national phase in:

Ref country code: JP