GB2378878A - Control of display by movement of handheld device - Google Patents


Info

Publication number
GB2378878A
GB2378878A (application GB0115869A; granted publication GB2378878B)
Authority
GB
United Kingdom
Prior art keywords
sensor
image
processor
display device
handheld display
Prior art date
Legal status
Granted
Application number
GB0115869A
Other versions
GB0115869D0 (en)
GB2378878B (en)
Inventor
Thomas Albert Gaskell
Current Assignee
Aeroflex Cambridge Ltd
Original Assignee
Ubinetics Ltd
Priority date
Filing date
Publication date
Application filed by Ubinetics Ltd filed Critical Ubinetics Ltd
Priority to GB0115869A priority Critical patent/GB2378878B/en
Publication of GB0115869D0 publication Critical patent/GB0115869D0/en
Publication of GB2378878A publication Critical patent/GB2378878A/en
Application granted granted Critical
Publication of GB2378878B publication Critical patent/GB2378878B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • GPHYSICS; G06 COMPUTING; CALCULATING OR COUNTING; G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals
    • G06F1/1694 Constructional details or arrangements where the I/O peripheral is a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer


Abstract

A handheld display device in the form of a PDA 15 is provided, the PDA comprising a liquid-crystal display (LCD) screen 3, a processor 5, a local memory 9, an input port 11 and first and second sensors 17, 19. Image data received from a remote source is transferred to the processor 5 by means of the input port 11. The processor 5 outputs the image data to the screen 3, with the updating of the image being controlled by signals generated by the first and second sensors 17, 19. Movement of the PDA 15 in a substantially vertical plane is detected by sensor 17, which causes the processor to update the image in a zooming effect, depending on the direction of movement. A tilting movement of the PDA 15 causes the processor to update the image in a scrolling effect, the direction of scrolling corresponding to the direction of tilt.

Description

A Handheld Display Device

This invention relates to a handheld display device, and particularly to a handheld display device for displaying images on a screen.
Handheld display devices come in a variety of forms. A mobile telephone is a popular example, as is a personal digital assistant (PDA) or a handheld computer. This type of device generally includes a display screen for showing text or graphical information.
With the recent introduction of the Wireless Application Protocol (WAP) and other types of software for mobile telephones and PDAs, such devices can be used to display Internet web pages.
A disadvantage associated with a conventional handheld display device is that, since its display screen is limited in physical size, the information which can be conveniently displayed is also limited. For example, if a user wishes to view a document in the form of an image, it is generally only possible to view part of the image at any one time. In order to view a different part, the image has to be moved, for example by using an associated keyboard, which is cumbersome and repeatedly draws the user's attention away from the document.
A known display device comprises a handheld body which includes a sensor surface situated beneath the body, the sensor surface being movable over a flat reference surface (e.g. a table). Control of the image part being displayed requires movement of the display device over the reference surface. This is clearly not desirable for the majority of handheld display devices, such as mobile telephones or PDAs.
According to a first aspect of the present invention, there is provided a handheld display device comprising: a screen; a processor for outputting an image to the screen using image data from an image data source; and a sensor connected to the processor, the sensor being arranged to generate a varying sensor signal according to the free-space
position of the sensor, the processor updating the image data being displayed in accordance with the varying sensor signal.
Such a display device can update displayed image data according to the free-space position of the sensor. Accordingly, by orienting or moving the display device in a particular way, the image being displayed is updated to show some other set of image data. There is no requirement for a sensor surface, nor a reference surface for contacting the device. This is particularly convenient for devices such as mobile telephones or PDAs.
Preferably, the processor is arranged to output a sub-set of the image data thereby to display only part of an image to the screen, the image part being displayed changing in response to the varying sensor signal. In this sense, the image data being displayed relates to a sub-set of an overall image, e.g. a large document or web page. The varying sensor signal causes the processor to update the displayed image data.
As an example, movement of the sensor in a particular direction may cause the processor to update the displayed image data by showing an image part partially adjacent to that previously displayed. Thus, movement of the display device has the effect of scrolling the displayed image part. Preferably, this is achieved by providing a tilt sensor which causes the displayed image part to scroll in accordance with the direction of tilt. The degree of tilt may also be utilised to determine the rate of update, which has the effect of increasing or decreasing the degree of perceived scrolling. Alternatively, this scrolling effect may be achieved by providing sensors, for example velocity sensors or accelerometers, that allow the horizontal position of the display device to be determined. Such a sensor would cause the displayed image part to scroll in proportion to the free-space position of the sensor, without requiring a sensor surface or reference surface over which the sensor moves.
As a further example, movement of the sensor in first and second directions may cause the processor to update the displayed image data to show, respectively, a zoomed-in and zoomed-out version of the previously displayed image part. Thus, a user is able to
select a close-up view of a particular part of the image, or to zoom-out and select a different part of the image to zoom-in to. The first and second directions may correspond to upwards and downwards movements, respectively, along a generally vertical plane.
According to a second aspect of the present invention, there is provided a handheld display device comprising: a screen; a processor for outputting an image part to the screen using image data from an image data source; and first and second sensors connected to the processor, the first and second sensors being arranged to generate, respectively, a first sensor signal which varies according to the free-space position of the handheld display device in a substantially vertical plane, and a second sensor signal which varies according to movement of the handheld display device in a further plane, the processor updating the image data being displayed in accordance with any one of the varying sensor signals.
By providing such a display device having two sensors, the image part being displayed can be adjusted in one of two planes, or indeed, in both, at a given time. Preferably, the first sensor signal has the effect of zooming-in or zooming-out of the image part being displayed, whilst the second sensor signal has the effect of scrolling the image part being displayed to an adjacent region of the previously displayed image.
Various sensors may be employed, examples of which include infra-red or ultrasonic detectors, pickup coils and remote transmitters, mercury switches, velocity sensors and accelerometers.
The invention will now be described, by way of example, with reference to the drawings, in which:

Figure 1 is a block diagram showing the interrelationship of various elements of a handheld display device in the form of a PDA;
Figure 2 is an end-face view of the PDA, the PDA being shown in two different operating configurations;

Figure 3 shows the screen of the PDA, an image being shown on the screen in normal form and in zoomed-in form;

Figure 4 is an end-face view of the PDA, the PDA being shown in a tilted position;

Figure 5 shows the screen of the PDA, with an image being shown at three different subsequent time periods;

Figure 6 is a block diagram showing the interrelationship of various elements of a PDA having two sensors;

Figure 7 shows the screen of the PDA, with an image being shown at two different levels of zoom and two different horizontal positions;

Figure 8 is a block circuit diagram of an ultrasonic or infrared position-sensing arrangement forming part of the PDA;

Figure 9 shows schematically the screen of the PDA and its relationship with image data stored in memory for two different positions of the PDA; and

Figure 10 shows schematically the screen of the PDA with an image being shown at two different levels of zoom.
Referring to Figure 1, a PDA 1 comprises a liquid-crystal display (LCD) screen 3, a processor 5, a sensor 7, a local memory unit 9, and a remote data input 11. The remote data input 11 comprises a conventional r.f. receiver. The processor 5 is arranged to receive remote data from an external image data source, by means of the remote data input 11. The external image data source may, for example, be a remote data source which transmits image data over a wireless (e.g. cellular) link. The image data can be
stored in the memory unit 9 by the processor 5, and then output to the screen 3 by the processor 5; or, if the remote data link is fast enough, and the processor 5 controls the remote data source, the image data could be fed by the processor from the remote data input 11 to the screen 3. As will be explained in detail below, the displayed image data is updated in accordance with sensor signals received from the sensor 7.
The processor 5 is provided with software for controlling various aspects of the signal processing which is required for updating the image in response to the received sensor signals.
The local memory unit 9 can be used for temporarily storing data received from the external data source, or for storing other image information that is to be displayed on the display 3. Alternatively, remote data can be 'streamed' over a very fast data link between the external source and the remote input 11.
In use, the PDA 1 is used to display a variety of image documents which are accessible over a wireless link, or stored locally in the memory unit 9. These may comprise conventional Internet web-pages, maps, engineering drawings etc. Whatever their form, it will be appreciated that these forms of digitised image will generally be of a much higher resolution and/or size than that displayable by the screen 3. It follows that the PDA 1 will only display a small part of the image at a given time, in order that the perceivable image is of a good quality. In order that a user may navigate around the entire image, i.e. by viewing only parts of the image in turn, the sensor 7 is arranged to update the image data being displayed according to the free-space position of the sensor. This may be done in a number of ways, as explained with reference to the following first, second and third embodiments.
In the first embodiment, free-space movement of a sensor 7' (as shown in Figure 2) causes the screen 3 (under the control of the processor 5) to show zoomed-in and zoomed-out versions of the image part previously displayed. Thus, when a user wishes to navigate to a particular part of an image, he/she may zoom out to show the entire image (albeit in a substantially unreadable form) and then may zoom in to the image part of interest. The central point of the image part to be zoomed in on may be adjusted
by conventional means (such as navigation keys), or by using a further free-space sensor type, as will be explained later.
Figures 2a and 2b show two different arrangements wherein the sensor is employed to cause the screen 3 to zoom in and zoom out over an image part. In both cases, movement of the PDA 1, and so the sensor 7', in a substantially vertical plane causes the zooming-in or zooming-out effect, depending on whether the movement is in an upwards or downwards direction. If the sensor 7' moves upwards, the processor 5 causes a zooming-out effect on the screen 3, whilst downwards movement of the sensor results in a zooming-in effect.
Figure 2a shows an end-face of the PDA 1 positioned above a reference surface 13, such as a desk or floor surface. The sensor 7' can be in the form of an ultrasonic or infrared detector which is located towards the rear of the PDA 1. In this case, the sensor 7' determines its position in free space by measuring its distance from the reference surface 13. If the distance increases over a particular time period, then it follows that the PDA 1 has moved upwards, and a suitably modulated sensor signal is output to the processor 5 for causing the displayed image to be updated so as to give a perceivable zooming-out effect. If the distance decreases, then the PDA 1 has moved downwards, and the sensor signal is modulated accordingly and output to the processor 5, causing the updated image to give a perceivable zooming-in effect.
Referring to Figure 2b, a similar arrangement is shown. In this case, however, the sensor 7' is in the form of an accelerometer or a velocity sensor. It will be appreciated that such an accelerometer can determine whether the PDA 1 is moving upwards or downwards according to the instantaneous movement of the sensor. Indeed, any detector or sensor capable of detecting an instantaneous movement may be employed. If a velocity sensor is used, its output is integrated by the processor 5 to give a measurement of distance; and, if an accelerometer is used, its output is double integrated by the processor to give a measurement of distance. Again, if the sensor 7' determines that an upwards movement has occurred, a suitably modulated sensor signal is output to the processor 5 for causing the displayed image to be updated so as to give
the zooming-out effect. If the sensor 7' is moved downwards, the sensor signal is modulated accordingly and output to the processor 5, causing the updated image to give the zooming-in effect.
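The vertical-movement-to-zoom mapping just described can be sketched as follows. This is an illustrative Python sketch, not part of the patent: the function name, the exponential mapping and the mm_per_octave tuning constant are assumptions; the patent specifies only that downward movement zooms in and upward movement zooms out.

```python
def zoom_scale(displacement_mm, mm_per_octave=50.0, base_scale=1.0):
    """Map a vertical displacement of the device to a zoom scale factor.

    Downward movement (negative displacement) zooms in (scale > base),
    upward movement zooms out, as in Figures 2a/2b.  mm_per_octave is an
    assumed tuning parameter: lowering the device by that distance
    doubles the magnification.
    """
    return base_scale * 2.0 ** (-displacement_mm / mm_per_octave)
```

With these assumed constants, lowering the device 50 mm doubles the magnification and raising it 50 mm halves it.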
Referring to Figure 3, the zooming-in effect is shown. Figure 3a shows a simple scene displayed on the screen 3 of the PDA, whilst Figure 3b shows a zoomed-in version of the same scene.
A second embodiment according to the present invention is shown in Figure 4, which shows an end-face of the PDA 1. In this embodiment, free-space movement of a sensor 7" causes the processor to show an adjacent part of the image previously displayed, i.e. to scroll the image in a particular direction. To facilitate this, the sensor 7" comprises a tilt sensor for detecting when the PDA 1 is tilted, and in which direction. The sensor 7" is capable of determining a tilting effect in four directions, i.e. left, right, forwards, and backwards, although for ease of explanation, only the effect of a left and right tilt is mentioned below.
When the PDA 1, and so the sensor 7", is tilted to the left (as shown in Figure 4) the processor 5 updates the screen 3 to show an adjacent region to the left-hand side of the previously displayed image. This process is repeated at a particular update rate until the PDA 1 is tilted to be horizontal. It follows that a tilt movement to the right will result in the image being updated to show an adjacent region to the right-hand side of the previously-displayed image. In use, the sensor arrangement enables a user to navigate over different parts of a large image, simply by orienting (in this case, by tilting) the PDA 1 in free-space.
Figure 5 illustrates the abovementioned updating effect caused by the tilting motion
shown in Figure 4. The updated image is shown at three consecutive time periods, t1, t2, and t3. It will be seen that the updated image at t2 comprises a large quantity of image data from the previous image at t1, the amount of actual 'shift' being small. This shift amount is determined by the software in the processor 5, and helps provide a smooth
scrolling effect rather than a 'jerky' movement of the image. The updated image at t3 shows further movement in the same direction.
In relation to both the first and second embodiments, the software in the processor 5 handles various signal processing operations for controlling the updating of image data. These signal processing operations control not only the respective zooming and scrolling effects, but also the rate at which the updating occurs. Indeed, in order for the user to have the maximum control over the image navigation process, it is preferable that the degree of movement of the PDA 1 be associated with the rate of image data update. Accordingly, in the case of the first embodiment, the speed of movement of the sensor 7' in the vertical plane determines the speed of the perceivable zooming effect. The faster the speed of movement, the faster the zooming effect. In the case of the second embodiment, the degree of tilt of the PDA 1 determines the speed of the perceivable scrolling. The higher the degree of incline, the faster the scrolling effect.
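The tilt-controlled scrolling rate can be sketched in Python. The function name and the px_per_deg constant are illustrative assumptions; the patent requires only that a steeper tilt produces faster scrolling and that each per-tick shift is small enough to keep the motion smooth.

```python
def scroll_step(offset_px, tilt_deg, px_per_deg=0.5):
    """Advance the viewport offset by a small amount each update tick.

    The per-tick shift grows with the degree of tilt (px_per_deg is an
    assumed tuning constant), so a steeper incline scrolls faster.
    Negative tilt scrolls one way, positive tilt the other; zero tilt
    leaves the image still, as with the horizontal PDA.
    """
    return offset_px + tilt_deg * px_per_deg
```

Calling this once per update period reproduces the gradual t1 to t3 progression of Figure 5 rather than a single large jump.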
A third embodiment according to the present invention is shown in Figure 6. This embodiment effectively combines the operation of the first and second embodiments, and so like elements between Figures 1 and 6 are referred to by the same reference numerals. In this embodiment, a PDA 15 is provided with first and second sensors 17, 19, the first sensor being an accelerometer, and the second sensor being a tilt sensor.
The operation of the processor 5 is similar to that described in relation to the first and second embodiments, in that the image is updated according to a sensor signal. However, in this case, two sensor signals are provided from the first and second sensors 17, 19. The sensors 17, 19 are positioned and function in the same way as described previously. The processor 5 updates the image displayed in accordance with these two sensor signals, i.e. so that movement of the first sensor 17 causes a zooming effect on the image, whilst tilting of the second sensor 19 causes a scrolling effect on the image. Hence, the image navigation provided for by the first and second embodiments is improved further by using both navigation methods.
Accordingly, the overall effect of using both sensors 17, 19 is that the user can 'fly over' the entire image by means of both moving the PDA 15 in a vertical plane, and also by tilting the PDA. Hence, the simple, intuitive act of raising the PDA 15, tilting it, and then lowering it results in the image being zoomed-out, scrolled to a selected part, and then zoomed-in to show a detailed view of that selected image part. In effect, the PDA 15 acts in a manner similar to that of a magnifying glass tilted and moved over a virtual document. Again, the speed of movement and the degree of tilt will determine the image update rate to control the speed of zoom and/or the scrolling speed, the appropriate signal processing being performed by software in the processor 5. This software should provide for a maximum degree of realism, i.e. by closely matching the real-time scrolling and zooming of the image with the actual movement of the PDA 15.
Ultimately, the user should be convinced that he is 'gliding' in real-time above the surface of a virtual document.
As an alternative to the tilt sensors 7" and 19, position sensors, velocity sensors or accelerometers may be used. These would be mounted at 90° to each other, both in the nominally horizontal plane, such that the first (for example) detected position, velocity or acceleration is in the left-to-right direction of the PDA, and the second detected position, velocity or acceleration is in the forwards-and-backwards direction. The image on the display would be scrolled in accordance with the lateral position of the display as detected by the sensors. If the display is "zoomed in" by moving it downwards (see Figure 2b) or towards a surface (see Figure 2a), then a lateral movement of 10 mm in the position of the display would result in a similar movement of the image on the display, i.e. 10 mm. If the display is "zoomed out" by moving it upwards (see Figure 2b) or away from a surface (see Figure 2a), then a lateral movement of 10 mm in the position of the display would result in a larger movement of any specific feature of the image on the display. However, as that image is shown on the display in "zoomed out" form, i.e. at a lower resolution, the actual image seen would again appear to move in proportion to the movement of the display in free space.
For example, as shown in Figure 7, a horizontal movement of the display of 10 mm to the left when "zoomed in" may result in the image shown in Figure 7a changing to the
image shown in Figure 7b. However, when "zoomed out", as shown in Figure 7c, the same movement of the display, i.e. 10 mm to the left, may then show the image shown in Figure 7d. In Figure 7d, the image has moved, compared to that shown in Figure 7c, by a similar amount but, being "zoomed out", the user can see much more "new" information on the display than when moving from Figure 7a to Figure 7b.
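The relationship in the Figure 7 example can be sketched numerically. The function name and the use of a single zoom_scale factor (1.0 meaning "zoomed in" at actual size) are illustrative assumptions, not terms from the patent.

```python
def document_shift_mm(device_shift_mm, zoom_scale):
    """How far across the underlying document the view travels for a
    given lateral movement of the device.

    A feature on screen moves by roughly device_shift_mm at any zoom
    level, but when "zoomed out" (zoom_scale < 1) the same movement
    traverses more of the document, revealing more "new" information,
    as when moving from Figure 7c to Figure 7d.
    """
    return device_shift_mm / zoom_scale
```

With these assumptions, a 10 mm movement covers 10 mm of document when zoomed in, but 20 mm of document at half scale.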
Figure 8 illustrates one way in which the PDA can manage the signal processing required to translate sensor signals into a zooming and/or scrolling effect. This signal processing consists of processing the sensor signal in order to derive the sensor's position, and using that result to affect the displayed image.
As mentioned above, different types of sensor can be used for this application, principally those that measure position, velocity, acceleration, or tilt (i.e. orientation).
The use of non-contacting position or distance-measuring sensors is well established in electronics, for example in infrared or ultrasonic rangefinders for cameras, or in ultrasonic 'tape measures' for measuring the size of rooms. These sensors use a variety of measurement techniques: optical, ultrasonic, inductive or capacitive coupling, etc.
In the arrangement shown in Figure 8, a position-sensing technique will be described. This will be described as using an ultrasonic signal. However, it will be appreciated that the use of an infrared signal would involve a very similar technique. A pulse generator 21 produces frequent repetitive pulses, e.g. 1000 times per second. These are fed to an ultrasonic transmitter 22, which transmits a short ultrasonic pulse towards a reference surface 23. The pulse generator 21 also resets a timing circuit 24, and starts a new timing measurement tT. The timing circuit 24 proceeds to count the number of pulses received at a clock input 25 from a high speed clock 26 whose frequency is fc.
The transmitted ultrasonic signal is reflected off the reference surface 23, and detected by a receiver 27. This amplifies the received pulse tR, and feeds it to the timing circuit
24. The pulse stops the counting of the high speed clock 26, and is fed to the processor 5 (not shown in Figure 8) to tell it that a new measurement is ready.
The timing circuit 24, therefore, measures the time tD between the transmit pulse and the receive pulse (tD = tR - tT) with a resolution of 1/fc, and the output of the timing circuit is a counter value Co which equals the high speed clock frequency fc multiplied by the time interval tD. The counter value Co is read by the processor 5 after receiving the received pulse tR.
The count value Co is proportional to the distance D between the transmitter/receiver 22/27 and the reference surface 23. As the transmitter/receiver 22/27 move closer to the reference surface 23, the value of Co decreases; and, as the transmitter/receiver move away from the reference surface the value of Co increases. Co is, therefore, a measure of distance of the sensors from a given reference surface.
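As a sketch of the arithmetic: Co = fc x tD, so tD = Co / fc, and because the ultrasonic pulse makes a round trip to the reference surface and back, the distance is half the total path. The conversion below is an illustrative Python sketch; the speed-of-sound constant and the function name are assumptions, since the patent works directly with the count Co rather than an absolute distance.

```python
SPEED_OF_SOUND_MM_PER_S = 343_000.0  # approximate speed of sound in air

def distance_mm(count, clock_hz, speed=SPEED_OF_SOUND_MM_PER_S):
    """Convert the timing-circuit count Co into a distance D.

    tD = Co / fc is the measured round-trip time; dividing the path
    length by two gives the one-way distance to the reference surface.
    """
    t_d = count / clock_hz
    return speed * t_d / 2.0
```

For example, a count of 1000 with a 1 MHz clock corresponds to tD = 1 ms, i.e. roughly 171.5 mm under the assumed speed of sound.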
Clearly, if two such sensors were mounted orthogonally, they would enable the position in a plane (with respect to reference surfaces) to be derived; and, if three were mounted and used orthogonally, they would enable the position in three-dimensional space to be derived.
It would also be possible to use other configurations, e.g. a fixed transmitter in place of the reference surface 23, and simply measure the distance from this to the receiver 27 in a similar way to that indicated, although it would have to feed a start pulse tT to the timing circuit 24 by an alternative means that was not significantly distance dependent.
However, additional signal processing would then be required.
It will be apparent that the position sensor described above could be modified in a number of ways. In particular, the transmitter/receiver pair 22/27 could operate with infrared pulses rather than ultrasonic pulses. It would also be possible to use a velocity sensor or an accelerometer in place of the position sensor described. The signal processing requirements of velocity and accelerometer sensors are very similar. Both types of sensor require amplification before integration.
In the case of a velocity sensor, a single stage of integration is required to derive distance; and, for an accelerometer, two stages of integration (i.e. double integration) are required to derive distance. Although the integration can be performed using analogue electronics, it is preferable to use the processor 5 to perform the integration, as this is most easily, reproducibly, and flexibly done via a software program.
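The single and double integration described above can be sketched numerically. A simple rectangular-rule accumulation over evenly spaced samples is assumed here; the function names and sample spacing are illustrative, not taken from the patent.

```python
def integrate(samples, dt):
    """Rectangular-rule integration of evenly spaced samples."""
    total, out = 0.0, []
    for s in samples:
        total += s * dt
        out.append(total)
    return out

def distance_from_velocity(v_samples, dt):
    # one stage of integration: velocity -> distance
    return integrate(v_samples, dt)

def distance_from_acceleration(a_samples, dt):
    # double integration: acceleration -> velocity -> distance
    return integrate(integrate(a_samples, dt), dt)
```

In practice the processor 5 would run such an accumulation on each new sensor sample, which is why software integration is easier to tune than its analogue equivalent.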
The result of the single (or double) integration will be a value that is proportional to the position of the sensor in that particular axis with respect to its starting point. Clearly, if two such sensors are mounted orthogonally, they will enable the position in a plane to be derived; and, if three are mounted orthogonally, they will enable the position in three dimensional space to be derived.
When using the tilt sensor 7" of Figure 4, this can be in proportional form, e.g. a potentiometer or encoder with a pendulum attached; or in switched form, e.g. a mercury switch or a ball bearing running over curved switch contacts. A single tilt sensor can determine the orientation of a device in one axis. In the case of a proportional sensor, this will give an output proportional to the amount by which it is tilted, e.g. from the horizontal. A switched sensor usually indicates one of a limited number of states, e.g. tilted left/horizontal/tilted right (three-position switch), or tilted very left/tilted slightly left/horizontal/tilted slightly right/tilted very right (five-position switch).
Clearly, if two tilt sensors are mounted orthogonally, they will enable the orientation in two orthogonal axes, i. e. in a plane, to be determined.
The orientation of the tilt sensors can be used to derive the apparent position of the device as follows (note, only one axis will be described, but the same principle would be applied to both axes, i.e. to the two orthogonally-mounted sensors). The output of the tilt sensor 7" is fed to the processor 5. The processor 5 takes no action if the sensor indicates horizontal. If the sensor 7" indicates that it is tilted slightly to the left, the processor 5 starts a counter counting slowly in a negative direction. The greater the
angle of tilt, the faster the negative counting. Clearly, for a simple three-position tilt switch there is only one fixed rate of counting in each direction; whereas, for a five-position tilt switch, there can be two different rates of counting in each direction.
If the sensor 7" is returned to horizontal, the counting stops.
If the sensor 7" indicates that it is tilted slightly to the right, the processor 5 starts the counter counting slowly in a positive direction. The greater the angle of tilt, the faster the positive counting.
The value of the counter is proportional to the apparent position of the device in that axis, and it is this counter value that is used by the processor 5 to affect the displayed image as described below.
Hence, tilting the sensor 7" left by 30° for two seconds, returning it to the horizontal, then tilting it to the right by 30° for two seconds, and then returning it to the horizontal would result in the device's apparent position starting in the centre, then moving to the left at a fixed rate for two seconds, then stopping, then moving to the right at a fixed rate for two seconds back to its starting position, and then stopping again.
More sophistication can be added by altering the counting rate depending not only on angle of tilt but also on duration of tilt (time), and in a non-linear way (e.g. by altering the apparent nature of the horizontal 'dead band' in which the moving stops), to make it easier for the user to move the apparent position of the device wherever chosen.
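The counter-based integration of the tilt-switch states described above can be sketched as follows. This is a purely illustrative example, not part of the original specification; the state names and counting rates are assumptions chosen for the five-position switch case.

```python
# Illustrative sketch: integrating five-position tilt-switch readings into an
# apparent position, as described for the tilt sensor 7". The state names and
# rates per tick are assumed for illustration only.

# Counting rate (counts per sampling tick) for each switch state; a greater
# angle of tilt gives a faster counting rate, and horizontal stops counting.
TILT_RATES = {
    "very_left": -2,
    "slightly_left": -1,
    "horizontal": 0,
    "slightly_right": 1,
    "very_right": 2,
}

def integrate_tilt(readings):
    """Return the counter value after processing a sequence of tilt readings."""
    counter = 0
    for state in readings:
        counter += TILT_RATES[state]
    return counter

# Tilt left for four ticks, return to horizontal, tilt right for four ticks:
# the apparent position ends back at its starting point, as in the 30° example.
readings = ["slightly_left"] * 4 + ["horizontal"] * 2 + ["slightly_right"] * 4
assert integrate_tilt(readings) == 0
```

The counter value plays the role of the apparent position in one axis; a second, identical counter driven by the orthogonally-mounted sensor would give the other axis.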
To describe the manner in which signal processing is used to affect the displayed image, let us assume that the image is a large star shape, as shown in Figure 9a, and let us assume that the image of the star is too large to be shown in its entirety in the display (the screen 3) at maximum resolution.
Data corresponding to the complete image of the star will be held either in the memory unit 9, or in a remote data source connected via the remote data input 11. The processor 5 selects the portion of the image to be displayed at any one time, and feeds the appropriate image data to the display 3.
The exact method by which an image is represented in memory can vary; but, for this example, let us assume the most common method, i.e. a direct 'bit-mapping' between the memory location and the image, as shown in Figure 9b, with each memory location corresponding to one pixel on the display 3. The memory location could be locally in the memory unit 9, or accessed remotely via the remote data input 11. (Note that the approach proposed in this example is equally applicable, with suitable amendments of the specific details of operation, to many other ways of representing an image within a data storage device.)

In Figure 9b, the memory is 64 kbytes, and is represented as being arranged in 256 'lines' of 256 memory addresses per line, i.e. a 256 x 256 array. The display 3 is 128 pixels x 128 pixels. The processor 5 takes the contents of the memory at various addresses, and feeds these to the appropriate pixels of the display 3. By choosing memory addresses whose positions in the 256 x 256 array correspond to the required positions on the display 3, a direct mapping of part or all of the image stored in memory to the image shown on the display can be achieved.
For the image shown in the display 3 of Figure 9a, the processor 5 would send the data from memory locations 16,191 to (16,191+128) to the first line of pixels on the display, memory locations (16,191+256) to (16,191+256+128) to the second line of pixels on the display, and so on, through to the final pixel on the final line of the display, 49,087, as shown in Figure 9b.
If the sensors, after suitable signal processing as described above, indicate that they had been moved by, say, one quarter of the width (equivalent to 32 memory locations) and one quarter of the height (equivalent to 32 memory locations) of the display 3 to the bottom right, we would expect the image shown on the display to be as shown in Figure 9c.
For the image shown in the display 3 of Figure 9c, the processor 5 would send the data from memory locations 24,415 to (24,415+128) to the first line of pixels on the display, memory locations (24,415+256) to (24,415+256+128) to the second line of pixels on the display, and so on, through to the final pixel on the final line of the display, 57,311, as shown in Figure 9d.
It can, therefore, be seen that, in each axis, the portion of the memory to be displayed has a start point and a subsequent pattern of memory locations that are used to drive the display with pixel data, and it is this start point that is varied according to the processed sensor information. The greater the value from the sensors, i.e. the greater the movement of the sensors in that direction, the greater the off-set between the 'normal' (initial state of the display) start point, e.g. 16,191 in Figure 9b, and the new start point, 24,415, as shown in Figure 9d.
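The start-point arithmetic above can be sketched in code. This is an illustrative example only, not part of the original specification; it assumes the patent's worked figures of a 64-kbyte store arranged as 256 lines of 256 addresses and a 128 x 128 pixel display.

```python
# Illustrative sketch of the bit-mapped window of Figures 9b and 9d: a 256x256
# byte array holds the image, and a 128x128 window starting at a variable
# start address is fed to the display, one line at a time.
LINE = 256      # memory addresses per line (assumed, per Figure 9b)
DISPLAY = 128   # display width and height in pixels (assumed, per the text)

def display_lines(memory, start):
    """Yield one 128-byte slice of memory per display line, as the
    processor 5 would feed them to the display 3."""
    for row in range(DISPLAY):
        base = start + row * LINE
        yield memory[base:base + DISPLAY]

memory = bytes(range(256)) * 256                # dummy 64-kbyte image store
centred = list(display_lines(memory, 16191))    # Figure 9b start point
shifted = list(display_lines(memory, 24415))    # Figure 9d start point

# Scrolling 32 locations right and 32 lines down moves the start point by
# 32*256 + 32 addresses, which is exactly the difference between the two
# start points given in the text.
assert 24415 - 16191 == 32 * LINE + 32
```

Varying only `start` in proportion to the processed sensor signal reproduces the scrolling behaviour: the pattern of memory locations is unchanged, and just the window origin moves.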
It should be understood that the image would not usually move from the position shown in Figure 9a to the position shown in Figure 9c instantaneously in one step, as the sensors would not move so quickly. The processor 5 will monitor the sensors frequently (e.g. thousands of times per second); and, as the values from the sensors change, it will vary the memory locations it sends to the display 3 at the same rate, in proportion to the sensor variations, giving the effect of the image moving in the display in relationship to the display and sensor movement in space.
So, even if the sensors are moved quickly, the high repetition rate at which the sensors are monitored and the display 3 updated means that the difference in position of the image on each update of the display would only be small. The result would, therefore, appear as a smooth scrolling action, i. e. the movement of the image from the position shown in Figure 9a to that shown in Figure 9c would take place smoothly in a number of stages at a substantially similar rate to the speed of movement of the sensors.
By matching the movement of the image in the display 3 very accurately to the exact physical movement of the display and sensor in space, the impression will be given that the image is fixed in space, and the display is merely passing over it, allowing the viewer to see a portion of the fixed image through a 'moving window'. It is important that the apparent movement of the image matches the action taken by the user of the device very closely, as the illusion of viewing a fixed document through a moving window is easily destroyed if the matching is not accurate.
Figures 9 and 10 illustrate how the PDA manages the signal processing required to translate sensor signals into a zooming effect. The signal processing required to produce a zooming effect consists of processing the sensor signal in order to derive the sensor's height (vertical position), and using that signal to affect the displayed image.
Different types of sensor can be used for this application, principally those that measure position (e.g. distance from a fixed surface), velocity, or acceleration.
The signal processing requirements of the position, velocity and accelerometer sensors are as described above for the scrolling effect. In order to describe the manner in which signal processing is used to affect the displayed image, let us assume that the image is a large star, as shown in Figure 9a, and let us assume that the star is too large to be completely shown in the display 3. The complete image of the star will be held either in the memory unit 9, or in a remote data source connected via the remote data input 11. The processor 5 selects the portion of the image to be displayed at any one time, and feeds the appropriate image data to the display 3.
The exact method by which an image is represented in memory can vary; but, for this example, let us assume the most common method, i.e. a direct mapping between memory location and image, with one memory location representing one pixel on the display 3, as shown in Figure 9b. For the nominally-sized image shown in the display 3 of Figure 9a, the processor 5 would send the data from memory locations 16,191 to (16,191+128) to the first line of pixels on the display, memory locations (16,191+256) to (16,191+256+128) to the second line of pixels on the display, and so on, through to the final memory address on the final line, 49,087.
If the sensors, after suitable signal processing as described above, indicate that they had been moved upwards by, say, 50 mm, we would expect the image to zoom out, and be shown on the display 3 as indicated in Figure 10a. For the image shown in the display of Figure 10a, the processor 5 would have to send a wider range of data, from the very first memory location 0 to the final memory address on the final line, 65,535, to the display, as shown in Figure 10b; and, since the display could not display all this data (there are 65,536 memory locations but only 16,384 pixels in the display), the processor 5 would only send alternate x-axis memory locations to the display, and would omit alternate rows to compress the image in the y-axis, so that only one in four memory locations is shown on the display.
If the sensors were moved even further upwards, then an even smaller image would be displayed by having the processor 5 show even fewer memory locations, e.g. sending only one out of every four x-axis memory locations to the display 3, and only one in every four rows, to compress the image so that only one in sixteen memory locations is shown on the display.
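The zoom-out decimation just described can be sketched as follows. Again this is an illustrative example rather than part of the original specification, and it assumes the patent's 64-kbyte, 256-addresses-per-line memory layout.

```python
# Illustrative sketch of the zoom-out decimation: send only every Nth x-axis
# memory location of every Nth row to the display, so that only one in N*N
# memory locations is shown.
def zoom_out(memory, line_len, factor):
    """Return the decimated image: every `factor`-th byte of every
    `factor`-th line of the bit-mapped memory."""
    rows = [memory[i:i + line_len] for i in range(0, len(memory), line_len)]
    return [row[::factor] for row in rows[::factor]]

memory = bytes(65536)                 # dummy 64-kbyte, 256x256 image store
half = zoom_out(memory, 256, 2)       # alternate locations/rows: 1 in 4 shown
quarter = zoom_out(memory, 256, 4)    # 1 in 16 shown, as in the further zoom

assert len(half) == 128 and len(half[0]) == 128
assert len(quarter) == 64 and len(quarter[0]) == 64
```

With `factor` 2 the whole 256 x 256 store compresses exactly onto the 128 x 128 display, matching the Figure 10b case.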
Enlargement of the image, i.e. zooming in, as shown in Figure 10c, can be achieved in two ways, namely:

1. By having the 'nominal-sized' image, e.g. as shown in Figure 9a, at lower resolution than the image data stored in memory (or remotely), so that the processor 5 is already showing only one in four memory locations, for example. In this instance, 'zooming in' would actually mean showing the image at its full resolution.
2. If the 'nominal-sized' image, e.g. as shown in Figure 9a, is already at maximum resolution, i.e. one pixel is represented by one memory location, then various well-known interpolation algorithms may be applied by the processor 5 to increase the apparent size of the image. For example, the simplest is for each x-axis memory location to be fed to two adjacent pixels, and for each y-axis line to be repeated. This doubles the apparent image size in both axes (albeit with no increase in actual image resolution).
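The simple interpolation of option 2 can be sketched as follows; this is an illustrative example only, not part of the original specification.

```python
# Illustrative sketch of the simplest zoom-in interpolation: feed each x-axis
# memory location to two adjacent pixels and repeat each y-axis line, giving a
# nearest-neighbour 2x enlargement with no increase in actual resolution.
def zoom_in(rows):
    """Return a list of pixel rows enlarged 2x in both axes."""
    doubled = []
    for row in rows:
        wide = bytes(b for px in row for b in (px, px))  # duplicate columns
        doubled.append(wide)
        doubled.append(wide)                             # repeat the line
    return doubled

src = [bytes([1, 2]), bytes([3, 4])]
big = zoom_in(src)
assert big == [bytes([1, 1, 2, 2]), bytes([1, 1, 2, 2]),
               bytes([3, 3, 4, 4]), bytes([3, 3, 4, 4])]
```

A 64 x 64 window of memory locations enlarged this way would fill the 128 x 128 display, giving the Figure 10c effect.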
The processor 5 will, therefore, monitor the sensor frequently (e.g. hundreds of times per second); and, as the signal-processed value from the sensor changes, it will vary not just the start and end points of the memory locations it sends to the display 3, but also how many of the memory locations between the start and end points it sends to the display, in proportion to the sensor variations, giving the effect of the image zooming in and out in relationship to the display and sensor movement in space.
This zooming effect can, of course, be combined with the scrolling effect.

Claims (17)

Claims
1. A handheld display device comprising: a screen; a processor for outputting an image to the screen using image data from an image data source; and a sensor connected to the processor, the sensor being arranged to generate a varying sensor signal according to the free-space position of the sensor, the processor updating the image data being displayed in accordance with the varying sensor signal.
2. A handheld display device according to claim 1, wherein the processor is arranged to output a sub-set of the image data thereby to display only part of an image to the screen, the image part being displayed changing in response to the varying sensor signal.
3. A handheld display device according to claim 2, wherein movement of the sensor in first and second directions causes the processor to update the displayed image data to show, respectively, a zoomed-in and a zoomed-out version of the previously displayed image part.
4. A handheld display device according to claim 3, wherein the sensor is arranged such that variation of the sensor signal depends on the distance of the sensor from a reference surface.
5. A handheld display device according to claim 3, wherein the sensor is arranged such that variation of the sensor signal depends on the instantaneous movement of the sensor.
6. A handheld display device according to claim 2, wherein the sensor is arranged such that movement of the sensor in a particular direction causes the processor to update the displayed image data by showing an image part overlapping that previously displayed.
7. A handheld display device according to claim 6, wherein the sensor is a tilt sensor.
8. A handheld display device according to claim 7, wherein the rate at which the processor updates the displayed image depends on the degree of tilt of the tilt sensor.
9. A handheld display device according to claim 6, wherein the sensor is a velocity sensor, a position sensor or an accelerometer.
10. A handheld display device according to claim 9, wherein the processor is such that the rate at which it updates the displayed image depends upon the rate of change of the position of the device as indicated by the sensor.
11. A handheld display device comprising: a screen; a processor for outputting an image part to the screen using image data from an image data source; and first and second sensors connected to the processor, the first and second sensors being arranged to generate, respectively, a first sensor signal which varies according to the free-space position of the handheld display device in a substantially vertical plane, and a second sensor signal which varies according to movement of the handheld display device in a further plane, the processor updating the image data being displayed in accordance with any one of the varying sensor signals.
12. A handheld display device according to claim 11, wherein the sensor is arranged such that variation of the first sensor signal causes the processor to update the image data being displayed by means of displaying a zoomed-in or a zoomed-out version of the previously displayed image part, depending on the direction of movement in the substantially vertical plane.
13. A handheld display device according to claim 11 or claim 12, wherein the sensor is arranged such that variation of the second sensor signal causes the processor
to update the image data being displayed by means of displaying an image part which is adjacent to the previously displayed image part.
14. A handheld display device according to any preceding claim, wherein the image data source is an external data source.
15. A handheld display device according to any preceding claim, wherein the device is in the form of a portable telephone.
16. A handheld display device according to any of claims 1 to 15, wherein the device is in the form of a PDA.
17. A handheld display device, substantially as hereinbefore shown and described with reference to the accompanying drawings.
GB0115869A 2001-06-28 2001-06-28 A handheld display device Expired - Fee Related GB2378878B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0115869A GB2378878B (en) 2001-06-28 2001-06-28 A handheld display device


Publications (3)

Publication Number Publication Date
GB0115869D0 GB0115869D0 (en) 2001-08-22
GB2378878A true GB2378878A (en) 2003-02-19
GB2378878B GB2378878B (en) 2005-10-05

Family

ID=9917566

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0115869A Expired - Fee Related GB2378878B (en) 2001-06-28 2001-06-28 A handheld display device

Country Status (1)

Country Link
GB (1) GB2378878B (en)

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005027550A1 (en) * 2003-09-17 2005-03-24 Nokia Corporation A mobile cellular telephone with a display that is controlled partly by an incline sensor.
WO2005043332A2 (en) 2003-10-31 2005-05-12 Iota Wireless Llc Concurrent data entry for a portable device
GB2412048A (en) * 2004-03-09 2005-09-14 Jitendra Jayantilal Ranpura Viewing an image larger than the display device
FR2868902A1 (en) * 2004-04-07 2005-10-14 Eastman Kodak Co VIEWING METHOD AND APPARATUS FOR SIMULATING THE OPTICAL EFFECT OF A LENTICULAR NETWORK TEST
WO2005103863A3 (en) * 2004-03-23 2006-01-26 Fujitsu Ltd Distinguishing tilt and translation motion components in handheld devices
WO2005093550A3 (en) * 2004-03-01 2006-04-13 Apple Computer Methods and apparatuses for operating a portable device based on an accelerometer
EP1647875A2 (en) * 2004-10-15 2006-04-19 Nec Corporation Mobile terminal and display control method thereof
EP1686450A2 (en) 2004-12-30 2006-08-02 LG Electronics Inc. Image navigation in a mobile station
US7173604B2 (en) 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US7176886B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Spatial signatures
US7176887B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Environmental modeling for motion controlled handheld devices
US7176888B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Selective engagement of motion detection
US7180502B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited Handheld device with preferred motion selection
US7180500B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited User definable gestures for motion controlled handheld devices
US7180501B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited Gesture based navigation of a handheld user interface
EP1818768A2 (en) * 2006-01-19 2007-08-15 High Tech Computer Corp. Display controller with a motion sensor on a portable electronic device
US7280096B2 (en) 2004-03-23 2007-10-09 Fujitsu Limited Motion sensor engagement for a handheld device
WO2007133257A1 (en) * 2006-05-17 2007-11-22 Sony Ericsson Mobile Communications Ab Electronic equipment with screen pan and zoom functions using motion
US7301527B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Feedback based user interface for motion controlled handheld devices
US7301526B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Dynamic adaptation of gestures for motion controlled handheld devices
US7301528B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Distinguishing tilt and translation motion components in handheld devices
US7301529B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Context dependent gesture response
CN100351760C (en) * 2005-01-12 2007-11-28 宏达国际电子股份有限公司 Hand-held device
US7351925B2 (en) 2000-10-02 2008-04-01 Apple Inc. Method and apparatus for detecting free fall
EP1909473A1 (en) * 2006-10-02 2008-04-09 Koninklijke Philips Electronics N.V. Digital photo frame
US7365737B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Non-uniform gesture precision
US7365736B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Customizable gesture mappings for motion controlled handheld devices
US7365735B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Translation controlled cursor
US7505784B2 (en) 2005-09-26 2009-03-17 Barbera Melvin A Safety features for portable electronic device
CN101008881B (en) * 2006-01-25 2010-05-19 三星电子株式会社 Apparatus and method of scrolling screen in portable device and recording medium
EP2216703A2 (en) * 2009-02-06 2010-08-11 Sony Corporation Handheld electronic device
US7903084B2 (en) 2004-03-23 2011-03-08 Fujitsu Limited Selective engagement of motion input modes
CN102053775A (en) * 2009-10-29 2011-05-11 鸿富锦精密工业(深圳)有限公司 Image display system and method thereof
US7990365B2 (en) 2004-03-23 2011-08-02 Fujitsu Limited Motion controlled remote controller
US8230610B2 (en) 2005-05-17 2012-07-31 Qualcomm Incorporated Orientation-sensitive signal output
US8392340B2 (en) 2009-03-13 2013-03-05 Apple Inc. Method and apparatus for detecting conditions of a peripheral device including motion, and determining/predicting temperature(S) wherein at least one temperature is weighted based on detected conditions
US8692764B2 (en) 2004-03-23 2014-04-08 Fujitsu Limited Gesture based user interface supporting preexisting symbols
US9298362B2 (en) 2011-02-11 2016-03-29 Nokia Technologies Oy Method and apparatus for sharing media in a multi-device environment
US9563202B1 (en) 2012-06-29 2017-02-07 Monkeymedia, Inc. Remote controlled vehicle with a head-mounted display apparatus
US9579586B2 (en) 2012-06-29 2017-02-28 Monkeymedia, Inc. Remote controlled vehicle with a handheld display device
US10051298B2 (en) 1999-04-23 2018-08-14 Monkeymedia, Inc. Wireless seamless expansion and video advertising player
USRE48400E1 (en) 2005-09-26 2021-01-19 Tamiras Per Pte. Ltd., Llc Safety features for portable electronic device
US11266919B2 (en) 2012-06-29 2022-03-08 Monkeymedia, Inc. Head-mounted display for navigating virtual and augmented reality

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103885685B (en) * 2012-12-24 2017-12-15 腾讯科技(深圳)有限公司 page processing method and device
CN103150098B (en) * 2013-03-26 2016-01-06 锤子科技(北京)有限公司 A kind of method and device representing the image of mobile terminal shooting
US10055009B2 (en) 2014-05-30 2018-08-21 Apple Inc. Dynamic display refresh rate based on device motion
CN104636040B (en) * 2015-02-05 2017-12-12 惠州Tcl移动通信有限公司 A kind of image display processing method and device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5602566A (en) * 1993-08-24 1997-02-11 Hitachi, Ltd. Small-sized information processor capable of scrolling screen in accordance with tilt, and scrolling method therefor
EP0805389A2 (en) * 1996-04-30 1997-11-05 Sun Microsystems, Inc. Tilt-scrolling on the sunpad
EP0825514A2 (en) * 1996-08-05 1998-02-25 Sony Corporation Information processing device and method for inputting information by operating the overall device with a hand
WO1998014863A2 (en) * 1996-10-01 1998-04-09 Philips Electronics N.V. Hand-held image display device
GB2336747A (en) * 1998-04-22 1999-10-27 Nec Corp Hand held communication terminal and method of scrolling display screen of the same.
GB2336749A (en) * 1998-04-24 1999-10-27 Nec Corp Scrolling display of portable display device by shaking
WO2001027727A2 (en) * 1999-10-13 2001-04-19 Gateway, Inc. A system and method utilizing motion input for manipulating a display of data
GB2357684A (en) * 1999-12-21 2001-06-27 Motorola Ltd Hand-held terminal having a display screen which is controlled by movement of the terminal


Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10051298B2 (en) 1999-04-23 2018-08-14 Monkeymedia, Inc. Wireless seamless expansion and video advertising player
US9575569B2 (en) 2000-10-02 2017-02-21 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US9921666B2 (en) 2000-10-02 2018-03-20 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US8698744B2 (en) 2000-10-02 2014-04-15 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US7351925B2 (en) 2000-10-02 2008-04-01 Apple Inc. Method and apparatus for detecting free fall
US9829999B2 (en) 2000-10-02 2017-11-28 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US7541551B2 (en) 2000-10-02 2009-06-02 Apple Inc. Method and apparatus for detecting free fall
KR100908647B1 (en) * 2003-09-17 2009-07-21 노키아 코포레이션 A mobile cellular telephone with a display that is controlled partly by an incline sensor
US10372313B2 (en) 2003-09-17 2019-08-06 Conversant Wireless Licensing S.A R.L. Mobile cellular telephone with a display that is controlled partly by an incline sensor
WO2005027550A1 (en) * 2003-09-17 2005-03-24 Nokia Corporation A mobile cellular telephone with a display that is controlled partly by an incline sensor.
EP1678654A2 (en) * 2003-10-31 2006-07-12 Iota Wireless LLC Concurrent data entry for a portable device
US7721968B2 (en) 2003-10-31 2010-05-25 Iota Wireless, Llc Concurrent data entry for a portable device
WO2005043332A2 (en) 2003-10-31 2005-05-12 Iota Wireless Llc Concurrent data entry for a portable device
EP1678654A4 (en) * 2003-10-31 2008-07-02 Iota Wireless Llc Concurrent data entry for a portable device
WO2005093550A3 (en) * 2004-03-01 2006-04-13 Apple Computer Methods and apparatuses for operating a portable device based on an accelerometer
GB2412048A (en) * 2004-03-09 2005-09-14 Jitendra Jayantilal Ranpura Viewing an image larger than the display device
US7176887B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Environmental modeling for motion controlled handheld devices
WO2005103863A3 (en) * 2004-03-23 2006-01-26 Fujitsu Ltd Distinguishing tilt and translation motion components in handheld devices
US7280096B2 (en) 2004-03-23 2007-10-09 Fujitsu Limited Motion sensor engagement for a handheld device
US11119575B2 (en) 2004-03-23 2021-09-14 Fujitsu Limited Gesture based user interface supporting preexisting symbols
US7301527B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Feedback based user interface for motion controlled handheld devices
US7301526B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Dynamic adaptation of gestures for motion controlled handheld devices
US7301528B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Distinguishing tilt and translation motion components in handheld devices
US7301529B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Context dependent gesture response
US7990365B2 (en) 2004-03-23 2011-08-02 Fujitsu Limited Motion controlled remote controller
US7903084B2 (en) 2004-03-23 2011-03-08 Fujitsu Limited Selective engagement of motion input modes
US7180500B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited User definable gestures for motion controlled handheld devices
US7180501B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited Gesture based navigation of a handheld user interface
US7365737B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Non-uniform gesture precision
US7365736B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Customizable gesture mappings for motion controlled handheld devices
US7365735B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Translation controlled cursor
US7180502B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited Handheld device with preferred motion selection
KR100853605B1 (en) 2004-03-23 2008-08-22 후지쯔 가부시끼가이샤 Distinguishing tilt and translation motion components in handheld devices
US7176888B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Selective engagement of motion detection
US7173604B2 (en) 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US8692764B2 (en) 2004-03-23 2014-04-08 Fujitsu Limited Gesture based user interface supporting preexisting symbols
US7176886B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Spatial signatures
WO2005099248A3 (en) * 2004-04-07 2008-03-13 Eastman Kodak Co Method and camera for simulating the optical effect of a lenticular grid hardcopy
WO2005099248A2 (en) * 2004-04-07 2005-10-20 Eastman Kodak Company Method and camera for simulating the optical effect of a lenticular grid hardcopy
FR2868902A1 (en) * 2004-04-07 2005-10-14 Eastman Kodak Co VIEWING METHOD AND APPARATUS FOR SIMULATING THE OPTICAL EFFECT OF A LENTICULAR NETWORK TEST
EP1647875A3 (en) * 2004-10-15 2010-02-03 Nec Corporation Mobile terminal and display control method thereof
EP1647875A2 (en) * 2004-10-15 2006-04-19 Nec Corporation Mobile terminal and display control method thereof
EP1686450A2 (en) 2004-12-30 2006-08-02 LG Electronics Inc. Image navigation in a mobile station
EP1686450A3 (en) * 2004-12-30 2012-08-15 LG Electronics Inc. Image navigation in a mobile station
CN100351760C (en) * 2005-01-12 2007-11-28 宏达国际电子股份有限公司 Hand-held device
US8230610B2 (en) 2005-05-17 2012-07-31 Qualcomm Incorporated Orientation-sensitive signal output
US7505784B2 (en) 2005-09-26 2009-03-17 Barbera Melvin A Safety features for portable electronic device
USRE48400E1 (en) 2005-09-26 2021-01-19 Tamiras Per Pte. Ltd., Llc Safety features for portable electronic device
US8280438B2 (en) 2005-09-26 2012-10-02 Zoomsafer, Inc. Safety features for portable electronic device
EP1818768A2 (en) * 2006-01-19 2007-08-15 High Tech Computer Corp. Display controller with a motion sensor on a portable electronic device
CN100429610C (en) * 2006-01-19 2008-10-29 宏达国际电子股份有限公司 Intuition type screen controller
EP1818768A3 (en) * 2006-01-19 2009-04-01 High Tech Computer Corp. Display controller with a motion sensor on a portable electronic device
US8081157B2 (en) 2006-01-25 2011-12-20 Samsung Electronics Co., Ltd. Apparatus and method of scrolling screen in portable device and recording medium storing program for performing the method
CN101008881B (en) * 2006-01-25 2010-05-19 三星电子株式会社 Apparatus and method of scrolling screen in portable device and recording medium
WO2007133257A1 (en) * 2006-05-17 2007-11-22 Sony Ericsson Mobile Communications Ab Electronic equipment with screen pan and zoom functions using motion
EP1909473A1 (en) * 2006-10-02 2008-04-09 Koninklijke Philips Electronics N.V. Digital photo frame
EP2216703A2 (en) * 2009-02-06 2010-08-11 Sony Corporation Handheld electronic device
EP2216703A3 (en) * 2009-02-06 2014-08-20 Sony Corporation Handheld electronic device
US8392340B2 (en) 2009-03-13 2013-03-05 Apple Inc. Method and apparatus for detecting conditions of a peripheral device including motion, and determining/predicting temperature(S) wherein at least one temperature is weighted based on detected conditions
CN102053775A (en) * 2009-10-29 2011-05-11 鸿富锦精密工业(深圳)有限公司 Image display system and method thereof
US9298362B2 (en) 2011-02-11 2016-03-29 Nokia Technologies Oy Method and apparatus for sharing media in a multi-device environment
US9782684B2 (en) 2012-06-29 2017-10-10 Monkeymedia, Inc. Remote controlled vehicle with a handheld display device
US9791897B2 (en) 2012-06-29 2017-10-17 Monkeymedia, Inc. Handheld display device for navigating a virtual environment
US9656168B1 (en) 2012-06-29 2017-05-23 Monkeymedia, Inc. Head-mounted display for navigating a virtual environment
US9658617B1 (en) 2012-06-29 2017-05-23 Monkeymedia, Inc. Remote controlled vehicle with a head-mounted display
US9919233B2 (en) 2012-06-29 2018-03-20 Monkeymedia, Inc. Remote controlled vehicle with augmented reality overlay
US9612627B2 (en) 2012-06-29 2017-04-04 Monkeymedia, Inc. Head-mounted display apparatus for navigating a virtual environment
US9579586B2 (en) 2012-06-29 2017-02-28 Monkeymedia, Inc. Remote controlled vehicle with a handheld display device
US10596478B2 (en) 2012-06-29 2020-03-24 Monkeymedia, Inc. Head-mounted display for navigating a virtual environment
US9563202B1 (en) 2012-06-29 2017-02-07 Monkeymedia, Inc. Remote controlled vehicle with a head-mounted display apparatus
US11266919B2 (en) 2012-06-29 2022-03-08 Monkeymedia, Inc. Head-mounted display for navigating virtual and augmented reality
US11969666B2 (en) 2012-06-29 2024-04-30 Monkeymedia, Inc. Head-mounted display for navigating virtual and augmented reality

Also Published As

Publication number Publication date
GB0115869D0 (en) 2001-08-22
GB2378878B (en) 2005-10-05

Similar Documents

Publication Publication Date Title
GB2378878A (en) Control of display by movement of handheld device
US10318017B2 (en) Viewing images with tilt control on a hand-held device
US9798395B2 (en) Electronic control apparatus and method for responsively controlling media content displayed on portable electronic device
KR100671585B1 (en) Method and device for browsing information on a display
US7203911B2 (en) Altering a display on a viewing device based upon a user proximity to the viewing device
US7570275B2 (en) Image information displaying apparatus
US5714972A (en) Display apparatus and display method
TWI571790B (en) Method and electronic device for changing coordinate values of icons according to a sensing signal
CN100458910C (en) Image display device and image display method
US6624824B1 (en) Tilt-scrolling on the sunpad
US5376947A (en) Touch-type input terminal apparatus for pointing or specifying position on display device
EP1686450B1 (en) Image navigation in a mobile station
EP1255186A2 (en) Web browser user interface for low-resolution displays
JP2012514786A (en) User interface for mobile devices
EP1779226B1 (en) Method and system for controlling a display
JP2004062774A (en) Presentation display device
US20110102466A1 (en) System and method for zooming images
WO2004019199A1 (en) Display device for presentation
KR20060031730A (en) Length measuring method for mobile communication terminal
CN1332330C (en) Palmtop with zooming display function
JP2002090152A (en) Input device and input system
WO2005041017A2 (en) Handheld device for navigating and displaying data
JP3533029B2 (en) Virtual object display device
KR20050094037A (en) Image control
TWI463481B (en) Image displaying system and method

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20070628