GB2458881A - Interface control using motion of a mobile device - Google Patents
- Publication number
- GB2458881A GB2458881A GB0805093A GB0805093A GB2458881A GB 2458881 A GB2458881 A GB 2458881A GB 0805093 A GB0805093 A GB 0805093A GB 0805093 A GB0805093 A GB 0805093A GB 2458881 A GB2458881 A GB 2458881A
- Authority
- GB
- United Kingdom
- Prior art keywords
- user
- desktop
- computing device
- pointer
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 claims abstract description 43
- 230000015654 memory Effects 0.000 claims description 6
- 238000004590 computer program Methods 0.000 claims description 5
- 238000010295 mobile communication Methods 0.000 claims description 2
- 230000004044 response Effects 0.000 claims description 2
- 238000010586 diagram Methods 0.000 description 4
- 230000000694 effects Effects 0.000 description 3
- 230000001815 facial effect Effects 0.000 description 3
- 210000003128 head Anatomy 0.000 description 3
- 238000001514 detection method Methods 0.000 description 2
- 238000012545 processing Methods 0.000 description 2
- 230000035945 sensitivity Effects 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 230000008901 benefit Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
- 230000003936 working memory Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Geometry (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention relates to improved user interfaces for hand held computing devices such as mobile phones. It describes a method of controlling the location of a pointer on a two or three dimensional desktop using a handheld mobile computing device having a camera facing the user. The camera captures an image of the user's face, detects a relative orientation of the handheld mobile computing device and the user's face, and controls the pointer location on the two or three dimensional desktop responsive to the detected orientation. Preferably the pointer is represented on a user-facing display together with a portion of the desktop at the pointer location. The portion displayed is changed together with the detected orientation to give the impression that the desktop is located behind the display, such that the display forms a portal through which the user looks at the desktop.
Description
Mobile Computing Device Control Systems
FIELD OF THE INVENTION
This invention relates generally to techniques for controlling hand held mobile computing devices such as mobile phones, and in particular to improved user interfaces for such devices.
BACKGROUND TO THE INVENTION
In general most current mobile phones use a keyboard and/or joystick type interface.
The recently introduced Apple iPhone (Registered Trade Mark) is touch sensitive and to a degree easier to use than conventional devices. However, there remains a need for improved techniques for interacting with hand held mobile computing devices, in particular because these devices are by their nature relatively small.
SUMMARY OF THE INVENTION
According to a first aspect of the invention there is therefore provided a method of displaying a virtual space to a user of a handheld computing device, said computing device having a display and a camera facing said user, the method comprising: using the camera to capture an image of said user's face; detecting a relative orientation of said handheld mobile computing device and said user's face; and displaying a selected portion of said virtual space on said display in response to said detected orientation.
Embodiments of the invention address the problem of limited display size by employing the user-facing camera, together with image processing techniques, to determine a position of the user relative to the device and thus to enable the user to control what is displayed, providing an intuitive user interface. Thus in preferred embodiments the selected portion of the virtual space which is displayed changes with the user's viewing angle to give the impression to the user that the virtual space is being viewed through a window formed by the display of the handheld computing device.
In embodiments the virtual space may comprise a substantially two-dimensional surface (including, for example, a 2D surface on which virtual 3D objects are placed). In some preferred implementations the virtual space is used to display one or more documents, in a broad sense including, for example, text and/or images, spreadsheets, web pages, computer program code, and in general any other material which may be displayed in a window on a computer system. The virtual space may also be used to display icons, menus/menu options and other (graphical) user interface components. Thus in preferred embodiments the virtual space provides a virtual workspace or desktop. The detection of the user's position/orientation relative to the device may also be used to control a pointing device such as a cursor.
According to a related aspect of the invention there is therefore provided a method of controlling the location of a pointer on a two or three dimensional desktop, the method using a handheld mobile computing device having a camera facing the user, the method comprising: using the camera to capture an image of said user's face; detecting a relative orientation of said handheld mobile computing device and said user's face; and controlling said pointer location on said two or three dimensional desktop responsive to said detected orientation.
Preferably the method further comprises representing the pointer position on a user-facing display of the mobile computing device together with a portion of the desktop at the pointer location. It will be understood that the pointer may comprise any convenient visual indication mechanism such as a cursor. The pointer itself may be invisible or only temporarily visible; in such a case the location of the pointer may be conveyed to the user by, for example, highlighting an object on the desktop at the pointer location. It will be appreciated that the desktop will in general be displaying one or more documents (including web pages) or other windows, and/or icons and the like, in two or three dimensional view, over which the pointer may be moved.
Thus preferred embodiments of the technique additionally, or alternatively, include changing the portion of the desktop represented on the display together with the detected orientation to give the impression to the user that the (virtual) desktop is located behind the display such that the display forms a portal or window onto the desktop through which the desktop is viewed. In embodiments of aspects of the invention described herein this may be performed instead of controlling the location of a visible pointer, effectively to provide an enlarged display space.
In preferred embodiments the display forms a portal or window onto a larger virtual desktop behind the display, and as the user tilts the mobile computing device (MCD) about one or two axes the portion of the visual display presented to the user changes as though the user were moving his or her head to either side of a window to look through the window onto a larger region beyond. It will be appreciated that there need not be a "linear" relationship between the tilt of the device and the portion of the virtual desktop behind which is viewed: a relatively small tilt may, for example, move a significant (larger than linear) distance across the desktop behind. It will be appreciated that the sensitivity of motion of the pointer to tilt about either or both axes may be varied and, optionally, set by the user.
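The larger-than-linear tilt-to-distance relationship and user-adjustable sensitivity described above might be sketched as follows (a minimal illustration in Python; the function name, default gain and exponent values are assumptions, not taken from the patent):

```python
def tilt_to_offset(tilt_rad, sensitivity=2.0, exponent=1.5, max_offset=1.0):
    """Map a tilt angle (radians) to a normalised offset across the
    virtual desktop.

    An exponent above 1 gives the larger-than-linear response the text
    describes: a small tilt sweeps a proportionally greater distance.
    `sensitivity` stands in for the user-adjustable gain; the result is
    clamped so the viewport cannot leave the desktop.
    """
    sign = 1.0 if tilt_rad >= 0 else -1.0
    offset = sign * sensitivity * abs(tilt_rad) ** exponent
    return max(-max_offset, min(max_offset, offset))
```

With the exponent at 1.0 the response becomes linear, so the same function also covers the simpler case.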
In preferred embodiments of the technique, the user-facing camera captures an image of the face of the user and then performs feature detection to identify features, such as eyes, in the face. The relative orientation of the mobile computing device and the user's face may then be detected by detecting, for example, a change in the spacing between the eyes and/or other facial features. Preferably orientation (tilt) in two different axes is detected; optionally an overall scaling of the face may additionally or alternatively be detected to provide a third variable, for example for controlling a position of the pointer in 3D space. However, even where tilt in only two directions is determined this does not preclude the use of a three dimensional desktop; for example the pointer may be moved over a tilted pseudo-three dimensional plane.
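The feature-spacing approach described above can be illustrated with a short sketch (Python; the helper name, the cosine foreshortening model and the reference-spacing parameters are assumptions for illustration). Apparent spacings shrink to roughly the head-on spacing times the cosine of the tilt, so each tilt can be recovered with an arccosine:

```python
import math

def estimate_orientation(eye_left, eye_right, mouth,
                         ref_eye_gap, ref_eye_mouth_gap):
    """Estimate relative tilt from facial feature spacings.

    Feature positions are (x, y) pixel coordinates from the user-facing
    camera; the ref_* spacings are measured with the face head-on.
    Axis assignment follows Figure 1: eye spacing gives rotation about Z,
    eye-to-mouth spacing gives rotation about Y, and the overall scale
    of the face gives the optional third variable (X translation).
    """
    eye_gap = math.dist(eye_left, eye_right)
    eye_mid = ((eye_left[0] + eye_right[0]) / 2.0,
               (eye_left[1] + eye_right[1]) / 2.0)
    eye_mouth_gap = math.dist(eye_mid, mouth)

    # Foreshortening: apparent spacing ~= ref * cos(tilt).
    tilt_z = math.acos(min(1.0, eye_gap / ref_eye_gap))
    tilt_y = math.acos(min(1.0, eye_mouth_gap / ref_eye_mouth_gap))
    scale = eye_gap / ref_eye_gap   # optional third variable
    return tilt_z, tilt_y, scale
```

Locating the eyes and mouth in the captured image is a standard face-detection task and is treated as a given here.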
It will be appreciated that one advantage of embodiments of the method is that the virtual desktop provides a much greater region over which the user may interact than would appear to be available simply from the (relatively small) physical size of a handheld device. A 3D virtual desktop potentially gives an even greater volume of space over which the user may interact. Providing a 2D or 3D user interface in this way helps to overcome the limitations of the small physical size of a hand held device, which can make interaction using, say, a human finger relatively awkward.
In a related aspect the invention provides a graphical user interface (GUI) for a handheld mobile computing device, the graphical user interface comprising: means to detect a relative orientation of said mobile computing device and a portion of said user; and means to control the location of a pointer responsive to said detected orientation such that said user is able to control said pointer location by manipulating said orientation of said handheld mobile computing device relative to him or herself.
Preferred embodiments of the graphical user interface, as described above, employ facial image capture and processing. However it will be understood that, potentially, other techniques may also be employed to detect a relative orientation of the mobile computing device and a portion of the user (a pseudo-static portion of the user, for example the user's face or body rather than the user's hand or arm). These include, but are not limited to, a miniature gyroscope or similar arrangement. Preferably, as described above, the user-facing display of the device operates as a portal through which a desktop larger than the portion of the desktop represented on the display may be viewed, apparently behind the display and changing with the detected orientation.
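The gyroscope alternative mentioned above could slot in behind the same interface as the camera-based detector. A minimal sketch (Python; `read_rate`, the class name and the Euler-integration scheme are all assumptions for illustration):

```python
class GyroOrientationSource:
    """Tilt from a miniature gyroscope instead of the user-facing camera.

    `read_rate` is a hypothetical stand-in for the device's gyro API,
    returning angular rates (rate_z, rate_y) in radians per second;
    tilt is accumulated by simple Euler integration at interval `dt`.
    """

    def __init__(self, read_rate, dt):
        self.read_rate = read_rate
        self.dt = dt
        self._tilt = [0.0, 0.0]

    def tilt(self):
        rate_z, rate_y = self.read_rate()
        self._tilt[0] += rate_z * self.dt
        self._tilt[1] += rate_y * self.dt
        return tuple(self._tilt)
```

Because the GUI only ever asks for a pair of tilt angles, either source (camera-based or gyro-based) can be swapped in without changing the rest of the interface.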
In the case of an invisible pointer the location of the pointer may be conveyed to the user by selectively displaying a portion of the desktop to the user to, in effect, expand the available display space.
Features of the above described aspects and embodiments of the invention may be substituted or combined in any permutation.
The invention also provides a mobile computing device, for example a mobile communications device or mobile phone, with a user-facing display and preferably (depending upon the implementation) a user-facing camera, configured to implement a method or graphical user interface of the type described above.
The invention further provides a carrier carrying computer program code to, when running, implement a method or graphical user interface as described above. The carrier may comprise, for example, a disk, CD or DVD-ROM, or programmed memory (firmware) such as read-only memory. Code (and/or data) to implement embodiments of the invention may comprise source, object or executable code in a conventional programming language (interpreted or compiled), or code for setting up or controlling an ASIC, FPGA or similar, or code for a hardware description language.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will now be further described, with reference to the accompanying figures in which: Figure 1 shows, schematically, a captured image of a face illustrating the extraction of tilt and related data; Figures 2a to 2d show examples of a display of portions of a virtual desktop and a pointer location thereon according to mobile computing device (MCD) tilt angle; Figure 3 shows a flow diagram of a procedure to implement an embodiment of a graphical user interface according to the invention; and Figures 4a and 4b show a block diagram of a mobile computing device (MCD) and an example physical configuration of the device.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Referring to figure 1, this shows an image of a user's face which may be captured by a user-facing camera on a mobile phone. As illustrated, rotation about the Z axis may be determined by spacing between the eyes and rotation about the Y axis may be determined, for example, from a spacing between the eyes and mouth. Translation in an X direction may be determined, for example, by overall scaling of the face.
Figure 2a illustrates, conceptually, a pair of user's eyes (the two black dots) looking in different directions into the "window" formed by the display of the mobile computing device. Figures 2b to 2d show examples of tilting the mobile computing device giving a similar effect to that shown in figure 2a (for convenience of illustration the tilt is shown as rotation about the Z direction). As the MCD rotates about (in this example) the Z-axis the position of a point on the virtual desktop (VD) translates in the Y direction and, preferably, the portion of the virtual desktop displayed on the display of the MCD is determined by projecting the display of the MCD onto the virtual desktop in the direction indicated by the angle of tilt. It will be understood that the sensitivity to tilt and scaling of the virtual desktop (in effect the distance between the virtual desktop and the MCD in the illustrations) may be varied, optionally by the user.
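The projection illustrated in figures 2b to 2d might be sketched as follows (Python; the parameter names, the tangent-based geometry and the clamping behaviour are illustrative assumptions). The desktop is modelled as a plane a fixed distance behind the display, so a tilt slides the projected viewport across it:

```python
import math

def project_viewport(tilt_a, tilt_b, desktop_size, view_size, distance=200.0):
    """Return the (x, y, w, h) portion of the virtual desktop visible
    through the display for a given pair of tilt angles (radians).

    The desktop lies `distance` units behind the display, so a tilt of
    theta about one axis slides the viewport distance * tan(theta)
    across the desktop; `distance` doubles as the sensitivity/scaling
    which, as the text notes, may be varied by the user.
    """
    dw, dh = desktop_size
    vw, vh = view_size
    cx = dw / 2.0 + distance * math.tan(tilt_a)   # head-on => centred
    cy = dh / 2.0 + distance * math.tan(tilt_b)
    # Clamp so the viewport never leaves the desktop.
    x = max(0.0, min(dw - vw, cx - vw / 2.0))
    y = max(0.0, min(dh - vh, cy - vh / 2.0))
    return (x, y, vw, vh)
```

The display then renders the desktop contents falling inside the returned rectangle, giving the portal-onto-a-larger-desktop effect.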
Referring now to figure 3, this shows a flow diagram of a procedure to implement an embodiment of the user interface. At step S300 the GUI procedure initialises and, optionally, captures an image of the user's face head on (90°) in order to normalise subsequent variations. Alternatively, a default starting position, such as head on, may be assumed and variations from this detected.
At step S302 the procedure captures an image of the user and processes this to locate the face and facial features such as eyes, mouth and the like. Then (S304) the procedure determines the spacings of these features and hence the tilt angles in the Y and Z directions and, optionally, the distance in the X direction. These are then translated to a position on the virtual desktop (S306) as illustrated in figure 2, and the pointer is displayed together with the selected portion of the virtual desktop, as if being viewed through the MCD display as a portal or window (S308). The procedure then loops back to step S302 to capture a further image of the user unless some further user input, for example operation of a select button, is detected (S310).
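The S302 to S310 flow reads naturally as a loop. A sketch (Python), with every device-specific piece (camera capture, feature extraction, rendering, button polling) abstracted behind hypothetical callables:

```python
def run_interface(frames, estimate, to_pointer, render, is_select):
    """One pass of the figure-3 control flow over a stream of frames.

    `frames` yields captured camera images; `estimate` maps a frame to
    (tilt_z, tilt_y); `to_pointer` maps tilts to a virtual-desktop
    location; `render` draws the pointer with the desktop portion at
    that location; `is_select` reports a press of the select button.
    All five are stand-ins for the device's real APIs.
    """
    pointer = None
    for frame in frames:                      # S302: capture an image
        tilt_z, tilt_y = estimate(frame)      # S304: features -> tilt angles
        pointer = to_pointer(tilt_z, tilt_y)  # S306: tilt -> desktop position
        render(pointer)                       # S308: display pointer + portion
        if is_select():                       # S310: further input ends the loop
            break
    return pointer
```

The S300 normalisation step would run once before this loop, supplying the reference spacings that `estimate` needs.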
Referring now to figure 4, this shows a simplified block diagram of a mobile phone 400 configured to implement the graphical user interface described above. Thus the phone 400 includes a display 402, a user-facing camera 404, and a physical user interface device 406 such as one or more buttons or a keyboard or joystick or, optionally, a touch sensitive portion of display 402. The phone also includes an RF module 408. These elements of the phone are controlled by a controller 410 comprising a processor, working memory, and program memory, the program memory storing computer program code to implement a graphical user interface as described above. Optionally the code in the program memory may be provided on a carrier 412, illustratively shown by a floppy disk. Figure 4b shows a physical configuration of the phone from the user's perspective.
It will be understood that it is not essential to display a pointer and that embodiments of these techniques may be employed to provide a user with an expanded virtual workspace in conjunction, for example, with an alternative method of controlling a pointer location such as a joystick-type control, an up/down/left/right control or a touch/pressure sensitive screen control to select a direction and/or rate of motion of a pointer within the workspace.
No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.
Claims (17)
- CLAIMS: 1. A method of displaying a virtual space to a user of a handheld computing device, said computing device having a display and a camera facing said user, the method comprising: using the camera to capture an image of said user's face; detecting a relative orientation of said handheld mobile computing device and said user's face; and displaying a selected portion of said virtual space on said display in response to said detected orientation.
- 2. A method as claimed in claim 1 wherein said displayed selected portion of said virtual space changes with a viewing angle of said display by said user to give said user the impression that said virtual space is viewed through a window comprising said display of said handheld computing device.
- 3. A method as claimed in claim 1 or 2 wherein said virtual space comprises a substantially two-dimensional surface.
- 4. A method as claimed in claim 1, 2 or 3 further comprising displaying a pointer in said virtual space at a location determined by said detected relative orientation.
- 5. A method as claimed in any preceding claim wherein said virtual space comprises a desktop.
- 6. A method of controlling the location of a pointer on a two or three dimensional desktop, the method using a handheld mobile computing device having a camera facing the user, the method comprising: using the camera to capture an image of said user's face; detecting a relative orientation of said handheld mobile computing device and said user's face; and controlling said pointer location on said two or three dimensional desktop responsive to said detected orientation.
- 7. A method as claimed in claim 5 or 6 further comprising representing said pointer position on a user-facing display of said mobile computing device together with a portion of said desktop at said pointer location.
- 8. A method as claimed in claim 7 further comprising changing said portion of said desktop together with said detected orientation to give the impression to said user that said desktop is located behind said display, such that said display forms a portal onto said desktop through which said user views said desktop.
- 9. A method as claimed in any preceding claim wherein said orientation detecting comprises detecting changes in spacing between features of said captured image of said face.
- 10. A method as claimed in one of claims 4 to 9 wherein said pointer location is controllable in at least two dimensions in said virtual space or on said desktop.
- 11. A method as claimed in claim 10 wherein said pointer location is controllable in three dimensions in said virtual space or on said desktop.
- 12. A method as claimed in one of claims 4 to 11 wherein said pointer is invisible or only temporarily visible on said display, and wherein said location of said pointer is conveyed to said user by highlighting an object on said desktop at a said pointer location.
- 13. A graphical user interface (GUI) for a handheld mobile computing device, the graphical user interface comprising: means to detect a relative orientation of said mobile computing device and a portion of said user; and means to control the location of a pointer responsive to said detected orientation such that said user is able to control said pointer location by manipulating said orientation of said handheld mobile computing device relative to him or herself.
- 14. A graphical user interface as claimed in claim 13 wherein said location comprises a location on a virtual desktop displayed to said user on a user-facing display of said mobile computing device as if said user-facing display were a window through which said desktop is viewed, and wherein a portion of said displayed desktop changes with changes in said detected orientation.
- 15. A mobile computing device having a user-facing display and a user-facing camera, said mobile computing device including a graphical user interface as claimed in claim 13 or 14 or said mobile computing device including a processor and program memory storing computer program code to implement the method of any one of claims 1 to 12.
- 16. A carrier carrying computer program code to implement the method of any one of claims 1 to 12 or the graphical user interface of any one of claims 13 and 14.
- 17. A method, graphical user interface, mobile computing device or carrier as claimed in any preceding claim wherein said computing device comprises a mobile communication device or mobile phone.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0805093A GB2458881A (en) | 2008-03-19 | 2008-03-19 | Interface control using motion of a mobile device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0805093A GB2458881A (en) | 2008-03-19 | 2008-03-19 | Interface control using motion of a mobile device |
Publications (2)
Publication Number | Publication Date |
---|---|
GB0805093D0 GB0805093D0 (en) | 2008-04-23 |
GB2458881A true GB2458881A (en) | 2009-10-07 |
Family
ID=39356738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0805093A Withdrawn GB2458881A (en) | 2008-03-19 | 2008-03-19 | Interface control using motion of a mobile device |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2458881A (en) |
-
2008
- 2008-03-19 GB GB0805093A patent/GB2458881A/en not_active Withdrawn
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6288704B1 (en) * | 1999-06-08 | 2001-09-11 | Vega, Vista, Inc. | Motion detection and tracking system to control navigation and display of object viewers |
JP2007310424A (en) * | 2004-07-07 | 2007-11-29 | Nec Corp | 3d coordinate input system, apparatus, method, and program |
WO2006026021A2 (en) * | 2004-08-27 | 2006-03-09 | Motorola, Inc. | Device orientation based input signal generation |
WO2007012768A2 (en) * | 2005-07-29 | 2007-02-01 | Realeyes3D | Method for controlling an interface using a camera equipping a communication terminal |
US20070071425A1 (en) * | 2005-09-27 | 2007-03-29 | Sharp Kabushiki Kaisha | Sensing apparatus, program execution apparatus and image-taking system |
EP1770487A2 (en) * | 2005-09-29 | 2007-04-04 | LG Electronics Inc. | Graphic signal display apparatus and method for hand-held terminal |
EP1785819A2 (en) * | 2005-10-24 | 2007-05-16 | Sony Ericsson Mobile Communications Japan, Inc. | Mobile terminal, mouse application program, and method for utilizing mobile terminal as wireless mouse device |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2684103A1 (en) * | 2011-03-08 | 2014-01-15 | Nokia Corp. | An apparatus and associated methods for a tilt-based user interface |
EP2684103A4 (en) * | 2011-03-08 | 2014-11-26 | Nokia Corp | An apparatus and associated methods for a tilt-based user interface |
Also Published As
Publication number | Publication date |
---|---|
GB0805093D0 (en) | 2008-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11410276B2 (en) | Determination of an operation | |
US20190384450A1 (en) | Touch gesture detection on a surface with movable artifacts | |
AU2014201578B2 (en) | Method and apparatus for operating electronic device with cover | |
US9733752B2 (en) | Mobile terminal and control method thereof | |
US8854320B2 (en) | Mobile type image display device, method for controlling the same and information memory medium | |
US10579247B2 (en) | Motion-based view scrolling with augmented tilt control | |
US9740297B2 (en) | Motion-based character selection | |
JP4093823B2 (en) | View movement operation method | |
KR100871099B1 (en) | A method and device for changing an orientation of a user interface | |
US20090167702A1 (en) | Pointing device detection | |
KR20150090840A (en) | Device and mathod for shielding region of display screen of device | |
KR20140137996A (en) | Method and apparatus for displaying picture on portable devices | |
KR20230007515A (en) | Method and system for processing detected gestures on a display screen of a foldable device | |
KR20100136289A (en) | A display controlling method for a mobile terminal | |
US8384692B2 (en) | Menu selection method and apparatus using pointing device | |
GB2458881A (en) | Interface control using motion of a mobile device | |
CN108700958B (en) | Wearable information terminal | |
KR20110066545A (en) | Method and terminal for displaying of image using touchscreen | |
KR101165388B1 (en) | Method for controlling screen using different kind of input devices and terminal unit thereof | |
CN111247506A (en) | Screen control method and screen control system based on intelligent terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |