GB2448319A - User Control of a Hand-Held Device - Google Patents

User Control of a Hand-Held Device

Info

Publication number
GB2448319A
GB2448319A
Authority
GB
United Kingdom
Prior art keywords
hand
held device
device
image
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0706846A
Other versions
GB0706846D0 (en)
Inventor
Joe Faith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northumbria University
Original Assignee
Northumbria University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northumbria University filed Critical Northumbria University
Priority to GB0706846A
Publication of GB0706846D0
Publication of GB2448319A
Application status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

A hand-held device (10) comprises a computing application which responds to directional commands of a user, a user-facing image registering unit to register a series of images, an image processing unit to derive motion data from the series of images corresponding to translational and/or rotational movement of the hand-held device in free space, and a direction control unit to convert the motion data into a directional command and to supply the directional command to the computing application.

Description

USER CONTROL OF A HAND-HELD DEVICE

The present invention relates in general to a hand-held device and to a method of controlling the hand-held device.

Hand-held devices are available in many different shapes and sizes and for many different functions. Examples include mobile electronic games consoles, personal music players and personal digital assistants (PDAs), as well as communication-oriented devices such as cellular telephones.

These hand-held devices typically contain computing applications requiring directional input from a user to control the movement of cursors, pointers or elements in games, the scrolling of a display screen, or navigation through a menu structure. A directional command is supplied through a keypad, thumbwheel, touchpad, joystick or similar manipulable input. Typically these manipulable inputs are finger-operated and can be difficult to use, particularly when the hand-held device is itself relatively small. The manipulable inputs tend to require relatively fine and accurate control by the user, and sometimes operations become frustratingly difficult.

It is often desired to operate the hand-held device independently in free space. This restricts the use of other known devices for providing a directional input, such as a mouse or trackball, which rely on a desk or other fixed operating surface.

One aim of the present invention is to provide a hand-held device, and a method of controlling the same, which is simple and intuitive for a user to operate. A preferred aim is to avoid or reduce the use of manipulable inputs such as a keypad. Another preferred aim is to reduce the level of user dexterity required to operate the device.

Other aims and advantages of the invention will be discussed below or will be apparent from the following description.

According to the present invention there is provided an apparatus and method as set forth in the appended claims.

Preferred features of the invention will be apparent from the dependent claims, and the description which follows.

Briefly, the present invention provides a hand-held device which carries an image receptor such as a camera. Images captured by the image receptor are processed to determine directional movements of the hand-held device. The detected movement is then used to control an operation or output of the hand-held device.

In a first aspect of the present invention there is provided a hand-held device, comprising: a computing application of the hand-held device which responds to directional commands of a user; a user-facing image registering unit to register a series of images; an image processing unit to derive motion data from the series of images corresponding to translational and/or rotational movement of the hand-held device in free space; and a direction control unit to convert the motion data into a directional command and to supply the directional command to the computing application.

According to another aspect of the present invention there is provided a method of controlling a hand-held device, comprising: registering a series of images taken from the hand-held device by a user-facing image registering unit thereof; deriving motion data from the series of images corresponding to translational or rotational movement of the hand-held device in free space; and converting the motion data into a direction command to control a computing application of the hand-held device.

All of the features described herein may be combined with any of the above aspects, in any combination.

For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example, to the accompanying diagrammatic drawings in which:

Figure 1 is a perspective view of a hand-held device as employed in a preferred embodiment of the present invention;
Figure 2 is a schematic overview of the hand-held device of a preferred embodiment of the present invention;
Figure 3 is a schematic overview of a method for controlling a hand-held device, according to a preferred embodiment of the present invention;
Figure 4 is a schematic illustration of a hand-held device showing example movement directions;
Figure 5 is a schematic view illustrating the field of view of a user-facing camera of a mobile telecommunications device;
Figure 6 is a schematic view illustrating the field of view of a scene-facing camera of the mobile telecommunications device;
Figure 7a is a schematic illustration of a tilting motion about horizontal or vertical axes of the mobile telecommunications device;
Figure 7b is a schematic illustration of a zooming motion of the mobile telecommunications device;
Figure 8 is a schematic illustration of four linear arrays used in analysis of images captured by the mobile telecommunications device;
Figure 9 is a graphic illustration of a match between arrays shown in Figure 8;
Figure 10 is a schematic illustration of filtering that can be applied; and
Figures 11a and 11b are schematic illustrations of the movement of an object between images in relation to the linear arrays shown in Figure 8.

Referring to Figure 1, a hand-held device 10 is shown according to a preferred embodiment of the present invention. In this example the hand-held device 10 is a communicator device such as a GSM cellular telephone.

The hand-held device 10 includes a display screen 11 and one or more user input keys or other manipulable inputs 12. Further, the hand-held device 10 carries a user-facing image receptor 15 such as a camera. In one embodiment the camera 15 is integrated within the hand-held device 10. In another embodiment (not shown) the camera 15 is removably attached to the hand-held device 10, such as with a clip-on fitting. In either case, it is preferred that the camera 15 is fixedly arranged in use with respect to a main body portion 10a of the hand-held device 10, such that the camera 15 faces the user and moves together with the hand-held device 10.

Figure 2 is a schematic representation of functional elements within the hand-held device 10. A control unit 16 receives image data from the camera 15. The control unit 16 includes an image processing unit 162 which performs a motion detection algorithm to produce motion data derived from the image data. Also, the control unit 16 includes a direction control unit 164 to translate the motion data into direction data, and thereby control a function or output of the hand-held device. The control unit 16 suitably includes a processor to perform computing operations, and has access to a memory 17 for data storage.

The hand-held device 10 suitably includes a microphone or other audio input 13 and a speaker or other audio output 14. In this case a radio frequency (RF) communication unit 18 is provided having an aerial 19 for wireless communication, such as using GSM standards. In other embodiments the hand-held device 10 is arranged for local communication using, for example, Bluetooth or IEEE 802.11 WLAN protocols.

Figure 3 is a schematic overview of a preferred method of controlling the hand-held device.

Referring to Figure 3, at step 300 a series of images are captured by the camera 15 and image data 301 is generated.

These images reflect the location and position (i.e. orientation) of the hand-held device 10 with respect to the user. The images can be a plurality of still images, or full motion video. In one embodiment, the camera 15 preferably supplies image data in the form of pixel values in a 2D image field.

Step 310 comprises producing motion data 302 from the image data 301. Here, the image processing unit 162 performs a motion detection algorithm to produce a motion data stream.

At step 320 the motion data 302 is supplied to the direction control unit 164 to control a function or operation of the hand-held device.
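A minimal sketch of this three-step loop follows (in Python; the class and method names, the camera and application interfaces, and the command dictionary are illustrative assumptions, since the patent does not prescribe a particular software structure):

```python
# Sketch of the Figure 3 pipeline: capture (step 300) -> motion
# detection (step 310) -> direction control (step 320).
# Names and structure are illustrative, not taken from the patent.

def detect_motion(prev_frame, curr_frame):
    # Placeholder: a concrete version, based on four linear arrays,
    # is sketched in the Algorithm section below.
    return 0.0, 0.0, 0.0  # (dX, dY, zoom)

class ImageProcessingUnit:
    """Derives motion data from successive camera frames (unit 162)."""
    def __init__(self):
        self.prev_frame = None

    def process(self, frame):
        if self.prev_frame is None:
            motion = (0.0, 0.0, 0.0)  # no motion until two frames exist
        else:
            motion = detect_motion(self.prev_frame, frame)
        self.prev_frame = frame
        return motion

class DirectionControlUnit:
    """Converts motion data into a directional command (unit 164)."""
    def convert(self, motion):
        dx, dy, zoom = motion
        return {"scroll_x": dx, "scroll_y": dy, "zoom": zoom}

def control_loop(camera, application):
    processing = ImageProcessingUnit()
    control = DirectionControlUnit()
    while application.running:
        frame = camera.capture()                      # step 300
        motion = processing.process(frame)            # step 310
        application.handle(control.convert(motion))   # step 320
```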

The images are preferably captured by the user holding the device 10 in free space with the camera 15 facing towards the user, as would be the case when making a video call, so that images of the user form a significant proportion of the captured content.

The camera 15 is fixedly carried by the device 10, such that movement of the device 10 causes images captured by the camera 15 to change. The changed image reflects the change of position of the device 10. Advantageously, the user moves the entire device 10, which requires relatively large motor movements. Most users find it much easier to make large-scale movements with larger motor muscles in their hand, arm or wrist as opposed to making very small movements with fine motor muscles in their fingers or thumb.

Controlling the hand-held device 10 using images from the camera 15 provides a more intuitive and simple user interface, compared with traditional keypads or other manipulable inputs. The user simply moves the whole device 10 rather than clicking a particular button.

The image-derived interface of the present invention also provides a richer experience for the user than can be achieved by conventional manipulable inputs. Most conventional user input techniques are restricted to translational movement in two directions (up-down and left-right). However, through suitable image signal processing by the image processing unit 162, with the present invention it is possible to distinguish three dimensions of translation (up-down, left-right and zoom-in/out) as well as three dimensions of rotation (pitch, roll and yaw). Although in practice few applications require control in all six of these dimensions of movement simultaneously, combinations of any two, three or more movements (such as pitch, roll and zoom) are immediately possible. Such combinations are especially useful within gaming applications, amongst others, replacing awkward and often unintuitive keypad combinations while still providing an equivalently complex user input.

Figure 4 is a schematic illustration of a hand-held device showing movement directions for X, Y and Z translations and R, P and Y rotations relative to the device.

Many mobile devices include two cameras: one facing backward (away from the user) to be used as an orthodox camera (see Figure 6, with dashed lines indicating a field of view); and one facing forward (towards the user) to support video calls (see Figure 5, with dashed lines indicating the field of view). Previous patent applications and prior art utilise the backward-facing camera to detect motion of the device and to use this information in a control mechanism. However, it is also possible to use the user-facing camera 15 in a way that produces a number of benefits, increasing the accuracy, reliability, and functionality of the existing technology.

The backward-facing camera, in normal use, is directed at some portion of the surrounding environment (Figure 6). In order to use this scene for motion detection it must contain sufficient features. Where the camera is directed at a uniform flooring surface, for example, this may not be the case and the control mechanism will fail. However, a typical user's face contains sufficient features to reliably detect motion, so a control mechanism using the user-facing camera 15 is less prone to failure (Figure 5).

A motion-detection control mechanism detects relative motion between the device 10 and the scene being viewed.

Thus, a mechanism that uses the backward-facing camera detects relative motion between the device and the surrounding environment. Therefore the user has to remain stationary with respect to their environment: the mechanism cannot be used while the user is walking or moving in a vehicle. However, the images produced from a user-facing camera 15 are dominated by the head and face of the user. As a result, a control mechanism using the user-facing camera detects relative motion between the device and the user's head: thus the mechanism can be used while the user is moving with respect to their environment.

A camera directed at the user's environment will detect features at a wide range of distances from the device (Figure 6). This makes it practically impossible to detect zooming (in-out) motions of the device (zooming is illustrated in Figure 7b), since the optic flow field due to zoom motions is not uniform across the image and is dependent on the distance of the object from the camera, with the magnitude of the flow reducing as the distance increases, unlike pitch or roll, which result in uniform translation of the optic flow. Previous patent applications and prior art have discussed the possibility of detecting zoom motions, but they have never been successfully and reliably implemented in practice on a mobile device.

However, a camera 15 facing the user has a large object at a relatively short distance, typically about 25cm (Figure 5). This has the effect of increasing the magnitude and uniformity of the optic flow, simplifying the problem of detecting zoom; this has now been demonstrated in practice.
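To make the distance dependence concrete (a first-order pinhole-camera approximation, offered as an illustration rather than taken from the specification): a feature at distance Z from the camera, imaged at radius r from the image centre, produces a radial optic flow of approximately v ≈ r.dZ/Z when the device moves by dZ along the camera axis. The flow magnitude therefore falls off as 1/Z, so a distant, depth-varying scene yields weak and non-uniform flow, whereas a face at a roughly constant Z of about 25cm yields strong, nearly uniform flow from which the zoom component is readily measured.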

Algorithm

The technique described below builds on the technique described in GB2,424,055; the reader is referred to that document for more information.

Many techniques are available to detect movements of the camera with respect to its environment using the optic flow field in a series of images. However, the limited processing power available on most mobile devices requires that, in order to be useful, an algorithm must be extremely efficient. One such algorithm is as follows.

Four linear arrays are positioned in the polar directions extending from the centre of each image (Figure 8), and the displacement of the image in the direction of each array between frames (dUp, dDown, dLeft and dRight) is measured by finding the best match between array values.
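A simple way to measure each such displacement is a best-match search: slide the previous frame's array over the current frame's array and keep the offset with the smallest mean absolute difference. This is a sketch under the assumption of a SAD-style match criterion, which the text does not specify:

```python
import numpy as np

def array_displacement(prev_array, curr_array, max_shift=8):
    """Return the shift d (in cells) that best aligns prev_array with
    curr_array, minimising the mean absolute difference over the
    overlapping cells. Positive d means the image content moved in
    the positive direction of the array."""
    prev_array = np.asarray(prev_array, dtype=float)
    curr_array = np.asarray(curr_array, dtype=float)
    n = len(prev_array)
    best_d, best_cost = 0, float("inf")
    for d in range(-max_shift, max_shift + 1):
        lo, hi = max(0, d), min(n, n + d)   # overlap of the shifted arrays
        if hi - lo < n // 2:                # insist on a reasonable overlap
            continue
        cost = np.mean(np.abs(curr_array[lo:hi] - prev_array[lo - d:hi - d]))
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```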

Figure 9 shows how a feature four cells in width has been recognised in the upper part of the Figure, representing a first image, and again in the lower part, representing a later image; the feature has moved to the right by d=4.

The net optic flow corresponding to translation of the image due to a tilting motion of the camera about the horizontal and vertical axes (dY and dX) (Figure 7a), and an expansion of the image due to a zooming motion (Figure 7b), can then be derived using the following equations:

(1): dX = (dRight - dLeft) / 2
(2): dY = (dUp - dDown) / 2
(3): zoom = (dUp + dDown + dLeft + dRight) / 4

In the example illustrated in Figures 11a and 11b, the displacements measured between the two frames are as follows: dUp=3, dDown=0, dLeft=3, dRight=0. Using equations (1) to (3) these measured values yield derived translations dX=-1.5, dY=1.5, zoom=1.5; i.e. the arrays have detected a net movement to the left, upwards, and a positive zoom towards the object.
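In code, equations (1) to (3) and the worked example read as follows (a direct transcription; the function name is arbitrary):

```python
def motion_from_displacements(d_up, d_down, d_left, d_right):
    """Combine the four array displacements into tilt and zoom
    components, per equations (1)-(3)."""
    dx = (d_right - d_left) / 2.0
    dy = (d_up - d_down) / 2.0
    zoom = (d_up + d_down + d_left + d_right) / 4.0
    return dx, dy, zoom

# Worked example of Figures 11a and 11b:
assert motion_from_displacements(3, 0, 3, 0) == (-1.5, 1.5, 1.5)
```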

The accuracy and reliability of this method may be further enhanced by:

- filtering the image such that the values of the cells in each of the arrays are taken to be the weighted sum of the image pixel values perpendicular to the direction of the array (Figure 10); and/or

- using separate arrays for each of the red, green, and blue colour channels returned by most digital cameras. The displacement in each colour channel is then found independently, and the results are combined to give a mean measured displacement with a higher accuracy than can be found by taking aggregate grey-scale image brightness or normalised colour values.
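The two refinements might be combined as sketched below; the Gaussian weighting profile and the equal-weight channel average are assumptions, since the text fixes neither. The sketch reuses array_displacement from above.

```python
import numpy as np

def gaussian_weights(strip_height, sigma=2.0):
    """Weights for summing pixels perpendicular to the array direction."""
    offsets = np.arange(strip_height) - (strip_height - 1) / 2.0
    w = np.exp(-0.5 * (offsets / sigma) ** 2)
    return w / w.sum()

def build_array(strip, weights):
    """Collapse a (length, strip_height) pixel strip into a 1D array:
    each cell is the weighted sum of pixels perpendicular to the array."""
    return strip @ weights

def colour_displacement(prev_strip, curr_strip, weights):
    """Measure the displacement independently in the R, G and B channels
    of (length, strip_height, 3) strips, then return the mean."""
    shifts = [
        array_displacement(build_array(prev_strip[:, :, c], weights),
                           build_array(curr_strip[:, :, c], weights))
        for c in range(3)
    ]
    return float(np.mean(shifts))
```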

The utility and functionality of this method may be increased by adjusting the aperture of both the linear arrays and perpendicular filters to weight features detected at the centre of the image more highly, such that the calculated values of the arrays are dominated by the head and face of the user, rather than their surroundings.

As a result, the algorithm will measure motions of the camera relative to the head of the user, rather than their surroundings, which enables the technique to be used as a control mechanism while the user is in motion with respect to their surroundings.

This effect of isolating and detecting the relative motion of the user and the device from that of the device and the surroundings may be further enhanced by filtering the image prior to use to detect skin tones on the basis of their colour, using an algorithm such as those described by Kakumanu et al ("A survey of skin-color modeling and detection methods", Pattern Recognition, 40:3, 2007).
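For illustration, one of the simple explicit RGB classifiers from the family surveyed by Kakumanu et al might be used; the particular thresholds below follow the widely quoted uniform-daylight rule and are an assumption, as the specification does not name a specific method:

```python
import numpy as np

def skin_mask(rgb):
    """Boolean mask of likely skin pixels for an (H, W, 3) uint8 RGB
    image, using a simple explicit thresholding rule. Pixels outside
    the mask can be zero-weighted before the linear arrays are built,
    so the motion estimate is dominated by the user's face."""
    r, g, b = (rgb[..., i].astype(int) for i in range(3))
    spread = rgb.max(axis=-1).astype(int) - rgb.min(axis=-1).astype(int)
    return ((r > 95) & (g > 40) & (b > 20) & (spread > 15)
            & (np.abs(r - g) > 15) & (r > g) & (r > b))
```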

The advantages of the method and apparatus described above are as follows:

- Taking the motion detection algorithms described in GB2,424,055 and using them in conjunction with a user-facing camera, which gives surprisingly effective results.

- Combining the measured displacements in four polar directions to produce net optic flow measurements corresponding to both tilting and zoom motions.

- Tuning the algorithm in such a way that the mechanism can be used while the user is in motion.

- Tuning the mechanism by hard-coding a reduced field of view.

- Tuning the mechanism by combining it with a face- or skin-detection mechanism.

- Using the algorithm in combination with a user-facing camera to detect zoom motions in a way that had previously been described in theory, but not practically demonstrated.

Implementations of the algorithm in a hand-held device can be used to scroll across, up or down, or zoom in/out of a map or document displayed on the screen of the hand-held device. Also, movement of the device can be used to move across a graphical user interface, to move a cursor, or to move between selected items shown on the screen. The screen may show letters that can be selected by moving the device as described above. In this way text can be input by selecting the letters needed. The method of letter selection may be a 'Dasher' type of text input method. The method of letter selection may utilise some form of text disambiguation or text input prediction.
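As a sketch of how such scrolling and zooming might be wired up (the gain constants and the viewport methods scroll and set_zoom are illustrative assumptions):

```python
SCROLL_GAIN = 12.0  # display pixels of scroll per unit of measured displacement
ZOOM_GAIN = 0.05    # fractional zoom change per unit of measured zoom

def apply_motion_to_view(view, dx, dy, zoom):
    """Pan and zoom a map/document viewport from the derived motion data:
    tilting the device scrolls the view; moving it towards or away from
    the user zooms in or out."""
    view.scroll(int(dx * SCROLL_GAIN), int(dy * SCROLL_GAIN))
    view.set_zoom(view.zoom * (1.0 + zoom * ZOOM_GAIN))
```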

The present invention has many benefits and advantages, as will be apparent from the description and explanation herein. In particular, a hand-held device is provided which is simple and intuitive for a user to operate. The image-derived control operations avoid or reduce the need for the user to actuate a keypad or other manipulable input. Moving substantially the whole device reduces the level of user dexterity required to operate the device. In some embodiments of the invention, this may allow people with movement difficulties, such as due to illness or injury, to better operate a device.

Although a few preferred embodiments have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims.

Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.

All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.

Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.

The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (24)

1. A hand-held device, comprising: a computing application of the hand-held device which responds to directional commands of a user; a user-facing image registering unit operable to register a series of images of a user operating the hand-held device; an image processing unit to derive motion data from the series of images corresponding to translational and/or rotational movement of the hand-held device in free space; and a direction control unit to convert the motion data into a directional command and to supply the directional command to the computing application.
2. The hand-held device of claim 1, wherein the image registering unit is a camera fixedly carried by the hand-held device.
3. The hand-held device of claim 1 or 2, comprising a radio frequency communication unit, and an aerial for wireless communication with other devices.
4. The hand-held device of claim 3, wherein the communication unit performs wireless communication in a cellular telecommunications network.
5. The hand-held device of any preceding claim, wherein the hand-held device is a mobile telephone.
6. The hand-held device of any preceding claim, further comprising: a display screen to provide a graphical user interface; and wherein the computing application controls the graphical user interface of the display screen in response to the directional command of the direction control unit.
7. The hand-held device of any preceding claim, further comprising: an audio output unit; and wherein the computing application controls an audio signal of the audio output in response to the direction command of the direction control unit.
8. The hand-held device of any preceding claim, wherein the computing application controls an internal function of the hand-held device in response to the direction command of the direction control unit.
9. The hand-held device of claim 6, wherein the image registering unit is adapted, whilst a user is viewing the display screen, to register an image of a user's face.
10. The hand-held device of any preceding claim, wherein the image processing unit is adapted to derive motion data from the series of images corresponding to translational and/or rotational movement of the hand-held device in free space whilst the user is moving.
11. The hand-held device of any preceding claim, wherein the image registering unit is operable to detect tilting and/or zooming movements of the device with respect to an object located at a distance of generally in the range of to 60cm from the user.
12. The hand-held device of any preceding claim, wherein the image processing unit is operable to derive the motion data using at least three linear arrays extending from substantially a centre of the images.
13. The hand-held device of claim 12, wherein the image processing unit uses measured displacements in four linear arrays, dUp, dDown, dLeft and dRight, to derive the motion data using the following equations: (1): dX = (dRight - dLeft) / 2; (2): dY = (dUp - dDown) / 2; (3): zoom = (dUp + dDown + dLeft + dRight) / 4, where dUp, dDown, dLeft and dRight are the image displacements measured in substantially orthogonal linear arrays extending from substantially the centre of the image, and where dX is a lateral translation of an identified object, dY is a vertical translation of the identified object and zoom is a movement of the identified object towards or away from the image registering unit.
14. The hand-held device of claim 12 or claim 13, wherein the hand-held device is operable to smooth the image perpendicular to a major axis of the linear array.
15. The hand-held device of claim 14, wherein the smoothing is Gaussian smoothing.
16. The hand-held device of any preceding claim wherein the device is usable to scroll around, and/or change the magnification of, a map, web page and/or a text or other document being shown on the device.
17. The hand-held device of any preceding claim wherein the device is usable to provide text input to an application running on the device wherein sequences of letters are chosen through motions of the device.
18. The hand-held device of claim 17 in which the method of letter selection is a 'Dasher' type of text input method.
19. The hand-held device of claim 17 in which the method of letter selection utilises some form of text disambiguation or text input prediction.
20. A method of controlling a hand-held device, comprising: registering a series of images taken from the hand-held device by a user-facing image registering unit thereof; deriving motion data from the series of images corresponding to translational or rotational movement of the hand-held device in free space; and converting the motion data into a direction command to control a computing application of the hand-held device.
21. The method of claim 20, in which the direction command is a command to scroll across an image on a display of the hand-held device, and/or a command to zoom into and out of the image.
22. A computer readable storage medium having computer executable instructions stored thereon to cause a hand-held device to perform the method of claim 20 or claim 21.
23. A hand-held device substantially as hereinbefore described with reference to the accompanying drawings.
24. A method of controlling a hand-held device, substantially as hereinbefore described with reference to the accompanying drawings.
GB0706846A 2007-04-10 2007-04-10 User Control of a Hand-Held Device Withdrawn GB2448319A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0706846A GB2448319A (en) 2007-04-10 2007-04-10 User Control of a Hand-Held Device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0706846A GB2448319A (en) 2007-04-10 2007-04-10 User Control of a Hand-Held Device

Publications (2)

Publication Number Publication Date
GB0706846D0 GB0706846D0 (en) 2007-05-16
GB2448319A true GB2448319A (en) 2008-10-15

Family ID=38091060

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0706846A Withdrawn GB2448319A (en) 2007-04-10 2007-04-10 User Control of a Hand-Held Device

Country Status (1)

Country Link
GB (1) GB2448319A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002366272A (en) * 2001-06-05 2002-12-20 Kyocera Corp Portable terminal device and method for scrolling display image
GB2387755A (en) * 2002-03-28 2003-10-22 Nec Corp Portable apparatus including improved pointing device using image shift
US20040180690A1 (en) * 2002-12-16 2004-09-16 Lg Electronics Inc. Apparatus for operating a mobile communication terminal with integrated photographic apparatus and method thereof
JP2004318793A (en) * 2003-04-17 2004-11-11 Kenichi Horie Information terminal based on operator's head position
FR2859800A1 (en) * 2003-09-12 2005-03-18 Wavecom Portable electronic device e.g. wireless telephone, has user interface that associates events with movements applied by user and analyzes principal motion vector that is determined by motion detection unit

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE44103E1 (en) 1997-10-28 2013-03-26 Apple Inc. Portable computers
USRE44855E1 (en) 1997-10-28 2014-04-22 Apple Inc. Multi-functional cellular telephone
USRE45559E1 (en) 1997-10-28 2015-06-09 Apple Inc. Portable computers
USRE46548E1 (en) 1997-10-28 2017-09-12 Apple Inc. Portable computers

Also Published As

Publication number Publication date
GB0706846D0 (en) 2007-05-16

Similar Documents

Publication Publication Date Title
US8581938B2 (en) Information processing apparatus, information processing method and program for magnifying a screen and moving a displayed content
KR101651975B1 (en) Multi-functional hand-held device
US9507431B2 (en) Viewing images with tilt-control on a hand-held device
US8693732B2 (en) Computer vision gesture based control of a device
EP2207342B1 (en) Mobile terminal and camera image control method thereof
US9325967B2 (en) Imaging apparatus
EP1513049B1 (en) Input key and input apparatus
US8854433B1 (en) Method and system enabling natural user interface gestures with an electronic system
US7069057B2 (en) Cellular phone including a display revealed by removing a removable operation unit
RU2242043C2 (en) Method for operation of user interface of portable data processing device
US20070019000A1 (en) Display control device, display control method, program, and portable apparatus
US20080284729A1 (en) Three dimensional volumetric display input and output configurations
US8689145B2 (en) 3D remote control system employing absolute and relative position detection
US20130154811A1 (en) Remote control device
EP2821879A1 (en) Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
JP4699955B2 (en) The information processing apparatus
US8624927B2 (en) Display apparatus, display control method, and display control program
KR101505198B1 (en) A wireless terminal and a driving method thereof
US20060164382A1 (en) Image manipulation in response to a movement of a display
EP2040156A2 (en) Image processing
US8279182B2 (en) User input device and method using fingerprint recognition sensor
Kratz et al. HoverFlow: expanding the design space of around-device interaction
US8542188B2 (en) Pointing input device, pointing control device, pointing control system, and pointing control method
US8434015B2 (en) Information processing apparatus, information processing method, and information processing program
US7774075B2 (en) Audio-visual three-dimensional input/output

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)