WO2005041018A1 - A handheld device for displaying information - Google Patents

A handheld device for displaying information

Info

Publication number
WO2005041018A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
display
displayed
orientation
movement
Prior art date
Application number
PCT/EP2004/011987
Other languages
French (fr)
Inventor
Stefan Rapp
Original Assignee
Conante
Priority date
Filing date
Publication date
Application filed by Conante filed Critical Conante
Priority to DE112004002015T priority Critical patent/DE112004002015T5/en
Priority to GB0610143A priority patent/GB2423137B/en
Publication of WO2005041018A1 publication Critical patent/WO2005041018A1/en

Classifications

    • G06F 1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1637: Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1639: Details related to the display arrangement, the display being based on projection
    • G06F 1/1686: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
    • G06F 1/1694: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 2200/1637: Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer
    • H04N 23/62: Control of parameters via user interfaces
    • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • the device 9 may also be used to set up a connection to an internet server and thereby to view internet information contents. For inputting the web addresses, again the input means can be used.
  • the present invention thus provides a completely new solution for navigating through data sets and displaying at least a part of these data.
  • the navigation commands correspond to the motion of the user's hand, which allows a very convenient and intuitive navigation within the data.

Abstract

The present invention relates to a handheld device for displaying information, e.g. images or data, comprising a sensing unit (15, 18) for detecting the position, orientation and/or movement of the device (9) and a display means (10) for displaying information, whereby the information is displayed depending on the detected position, orientation and/or movement of the device (9). The present invention further relates to a corresponding method for displaying information.

Description

A handheld device for displaying information
The present invention relates to a handheld device for displaying information according to claim 1 and to a method for displaying information according to claim 10.
Access to information has in recent times been addressed for various application scenarios and hardware components. A common feature of almost all known approaches is the notion of having a fixed display medium and dedicated input devices, e.g. a mouse, a pen, etc., that the user operates in order to change the viewport on the information space or data set. Once the new viewport has been selected, the information space is correspondingly rendered onto the screen.
The most commonly used system for displaying and navigating through data sets is a computer screen or monitor combined with a mouse as input device. By moving the mouse in different directions, the user can pan the available viewport around the data set (which is, e.g., visualised as a map) until the area the user is interested in is located. With every motion of the mouse, the viewport within the data set is changed and the data shown on the display or monitor are updated correspondingly. This panning operation is the basic operation for navigating in data sets. Often, additional navigation commands can be entered in order to, e.g., zoom into the selected area of interest or to change the visualisation of the displayed data.
The classical display and input devices mentioned above sometimes suffer from the drawback that an operation or manipulation of the input device cannot be intuitively linked to the navigation operation it initiates or triggers. Often, a user first has to learn several specific navigation procedures in order to change the visualisation of the data as desired.
It is therefore an object of the present invention to provide a device and a method for displaying information which allows an easy and intuitive access to and navigation through information.
The above-mentioned object is achieved by a handheld device for displaying information according to claim 1. The present invention reveals a handheld device for displaying information comprising a sensing unit for detecting the position, orientation and/or movement of the device and a display unit for displaying information, whereby the information is displayed depending on the detected position, orientation and/or movement.
Hereby, the display unit could be anything capable of showing any kind of information, e.g. an image, data or the like, to the user's eye when looking into the device, such as a projection unit projecting an image onto a display inside the device, a near-to-eye display or any other suitable device. Examples of the display means are:
• a backlit transmissive liquid crystal micro display (t-LCD) or an array of OLED (Organic Light Emitting Diodes) that is seen through an optic, or
• a retinal display that directly stimulates the eye's retina through a modulated (laser) light beam, or
• a MEMS (Micro Electro-Mechanical System) based laser projection unit with a projection surface inside the device that is illuminated by a modulated laser scanner and viewed through an optic, or
• any technology that is suitable to fit inside the device and generates a perceivable image on the user's retina.
The above-mentioned object is further achieved by a method for a handheld device for displaying information according to claim 10.
The present invention reveals a method for displaying information comprising the steps of detecting the position, orientation and/or movement of the device and displaying information, whereby the information is displayed depending on the detected position, orientation and/or movement of the device.
By detecting the actual position, orientation and/or movement of the device and at the same time adapting the displayed image to the position, orientation and/or movement, the panning motion triggered by the natural hand gesture of turning the device into the direction of interest conforms to the natural gesture of turning the head in order to change the viewport. Thus, the device according to the present invention can be intuitively used to pan through a large image or data set and to visualise the area or the areas of interest.
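The described mapping from a detected orientation change to a pan of the viewport can be sketched as follows. This is a minimal illustration only; the class name and the gain constant are assumptions, as the description does not prescribe a concrete mapping.

```python
# Minimal sketch of the orientation-to-viewport mapping: turning the
# device pans the visible window over a larger data set, clamped at the
# edges. The class name and the gain constant are illustrative
# assumptions, not taken from the patent.

class Viewport:
    def __init__(self, data_width, data_height, view_width, view_height):
        self.data_width, self.data_height = data_width, data_height
        self.view_width, self.view_height = view_width, view_height
        self.x = 0.0  # top-left corner of the viewport within the data set
        self.y = 0.0

    def pan(self, d_yaw_deg, d_pitch_deg, gain=10.0):
        # Turning right pans right; tilting the device up pans up.
        self.x = min(max(self.x + d_yaw_deg * gain, 0.0),
                     self.data_width - self.view_width)
        self.y = min(max(self.y - d_pitch_deg * gain, 0.0),
                     self.data_height - self.view_height)
        return self.x, self.y

vp = Viewport(4000, 3000, 640, 480)
print(vp.pan(5.0, 0.0))    # turn 5 degrees right: viewport moves to (50.0, 0.0)
print(vp.pan(-100.0, 0.0)) # a large left turn is clamped at the edge: (0.0, 0.0)
```

Clamping at the data-set edges mirrors the behaviour of turning one's head: the viewport simply stops at the border of the information space.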
Advantageously, the display means displays the information on a display inside the device. Further advantageously, the display can be seen through an eyepiece of the device.
Alternatively, the display means may display the information on or near to a user's eye or eyes.
In a preferred embodiment the displayed information is a website.
The displayed information can be an image from a remote camera.
Advantageously, the device transmits and receives information signals from a server for providing programs and data within a network.
Preferably, the sensing unit is a camera module.
Further preferably, the sensing unit is a motion detection unit comprising an accelerometer and a gyroscope and/or a magnetic field sensor.
In the following description, a preferred embodiment of the present invention is explained in more detail in relation to the enclosed drawings, in which
Fig. 1 shows a schematic view of a portable device according to the present invention,
Fig. 2 is a block diagram showing the elements of a preferred embodiment of a device in accordance with the present invention,
Fig. 3 shows the use of the device with an integrated display means,
Fig. 4 is a block diagram showing a first method of using the device,
Fig. 5 is a block diagram showing a second method of using the device and
Fig. 6 is a block diagram showing a third method of using the device.
Fig. 1 shows a schematic view of an embodiment of the handheld device 9 according to the present invention. The device 9 consists of a body 1 which is adapted to be held by a user. The body 1 hereby has an elongated, e.g. a cylindrical or conical, form and a circular, oval, rectangular or any other cross-section. An external display 4 can be located in the outer shell of the body 1 for displaying symbols, pictures, menus, names, phone numbers or the like, but this display can also be omitted. Further, the device 9 comprises an input means, e.g. buttons, a keypad or the like, for inputting data or information.
The device 9 according to the present invention further comprises an eyepiece 5 at one of the two ends of the device 9. The eyepiece 5 may hereby be a conventional eyepiece such as is used in a telescope or a camera, enabling the user to look into the device 9 in order to view information, such as images or the like displayed inside the device 9, or any other suitable eyepiece enabling a user to view images.
An essential feature of the device 9 according to the present invention is that the device is able to detect or recognise a change in its orientation and/or position in order to translate these changes into a change of the displayed information, that is, the image seen by the user, and/or to enable a user to navigate through displayed data by moving the device 9. Two possibilities for realising this motion detection are explained below with reference to the block diagram of Fig. 2.
The device 9 hereby comprises a display means 10 which displays information, such as an image or data. Hereby, a display 20 can be integrated into the device 9 and thereby seen through the eyepiece 5. Alternatively, the display means 10 may display an image or data directly in or on the eye or eyes of a user, or near to the eye or eyes of a user.
In a first alternative of the present invention, the device 9 includes a motion detector unit 15 which comprises an accelerometer 17 and a gyroscope 16 and/or a magnetic field sensor 19. The accelerometer 17 and gyroscope 16 and/or the magnetic field sensor 19 in combination are able to sense every motion the device 9 undergoes.
For example, with a two-axis MEMS accelerometer, the up-down direction and the rotation direction can be measured. If floor and ceiling are also to be considered, or a rotation of the device 9 by 90°, a third acceleration axis orthogonal to the other two is advisable, as left-right movements are not measured appropriately by the accelerometer 17. For this purpose, a gyroscope 16 for measuring dynamic changes or a magnetic field sensor 19 for measuring the static earth magnetic field is used.
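The description does not specify how the dynamic gyroscope readings and the drift-free static reference (accelerometer or magnetic field sensor) are combined; a classical option is a complementary filter, sketched below. The filter coefficient alpha and the function name are illustrative assumptions.

```python
# Sketch of a complementary filter: the gyroscope 16 provides fast
# dynamic changes, while the accelerometer 17 or magnetic field sensor
# 19 provides an absolute, drift-free reference. The coefficient alpha
# is an illustrative assumption, not taken from the patent.

def complementary_filter(angle, gyro_rate, reference_angle, dt, alpha=0.98):
    # Integrate the gyro rate for fast response, and pull the estimate
    # slowly towards the drift-free reference to cancel gyro drift.
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * reference_angle

angle = 0.0
# Device held still: gyro reads 0 deg/s, static reference says 10 degrees.
for _ in range(200):
    angle = complementary_filter(angle, 0.0, 10.0, dt=0.01)
# After two seconds the estimate has converged close to the reference.
```

A high alpha trusts the gyroscope for quick movements while the small (1 - alpha) share slowly removes the accumulated drift.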
A second alternative for sensing a movement of the device 9 uses an image-based motion vector extraction mechanism. In this solution, the device 9 comprises a camera module 18 which contains a camera that repeatedly takes pictures of the area the device 9 is directed at. By analysing the pictures, in particular by analysing the differences between two consecutive pictures, the camera module 18 is likewise able to recognise a change in the position or orientation of the device 9.
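The analysis of differences between consecutive pictures can be sketched as a small exhaustive block-matching search: every candidate displacement is tried and the one with the smallest pixel difference wins. The frame representation (rows of grey values) and the search range are illustrative assumptions.

```python
# Sketch of image-based motion vector extraction: the global shift
# between two consecutive frames is found by testing small
# displacements and keeping the one with the smallest sum of absolute
# differences (SAD). Frame format (rows of grey values) and search
# range are illustrative assumptions.

def estimate_shift(prev, curr, max_shift=2):
    # Returns (dx, dy): the displacement of the scene between frames.
    h, w = len(prev), len(prev[0])
    best_sad, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            sad = 0
            for y in range(max_shift, h - max_shift):
                for x in range(max_shift, w - max_shift):
                    sad += abs(prev[y][x] - curr[y + dy][x + dx])
            if best_sad is None or sad < best_sad:
                best_sad, best_shift = sad, (dx, dy)
    return best_shift

# A gradient test frame, and the same frame shifted one pixel right:
prev = [[x + 10 * y for x in range(8)] for y in range(8)]
curr = [[prev[y][max(x - 1, 0)] for x in range(8)] for y in range(8)]
print(estimate_shift(prev, curr))  # (1, 0): the scene moved one pixel right
```

A real camera module would apply this to sub-blocks of the frame and at a coarser-to-finer scale, but the principle of picking the displacement with the smallest difference is the same.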
The motion detector unit 15 or the camera module 18 senses a change in the position and/or orientation of the device 9 and forwards this information to a processing unit 11, which together with the camera module 18 or the motion detector unit 15 forms a unit enabling a user to navigate through displayed data or images by moving the device 9. The processing unit 11 then selects the corresponding data, which are forwarded to the display means 10 for displaying or projecting. The display means 10 could be a MEMS (Micro Electro-Mechanical System) based laser projection unit or another display means 10 suitable for displaying the data on a display or in the user's eye. The whole information space, i.e. the larger data set from which the actually projected data are selected, could either be stored locally in a memory 12 contained within the device 9. Alternatively, these data could also be stored in an external memory 13, in which case the processing unit 11 downloads the required data over a wireless communication link, e.g. via Bluetooth.
Fig. 3 shows the use of the device 9 with an integrated display 20, e.g. a projection surface. Hereby, the display 20 can be seen inside the device 9 through an eyepiece 5 situated at one end of the device 9. The internal display 20 can e.g. be realised in such a way that the user, when looking into the eyepiece, has the impression that the image or the data are projected onto a wall of the room in which the user is located.
Fig. 4 shows a first example of the device 9 in a simple form. Hereby, the processing unit 11, by controlling the display means 10, displays on the display 20 a picture that the camera module 18 is taking of the part of the surroundings lying directly in the line of sight of the device 9. A further possibility is to display virtual worlds, such as data spaces, traditional sources such as web pages, electronic versions of bulletin boards and the like, on the display 20.
Fig. 5 shows a second example of the device 9, for the control of a remote camera 33. Hereby, the device 9 is linked to a remote pan and tilt camera 33 by a bidirectional telecommunication mechanism in such a way that every movement of the device 9 is accounted for by a corresponding movement of the pan and tilt camera 33. Hereby, the movements of the device 9, sensed either by the motion detection unit 15 or by the camera module 18, are submitted to the processing unit 11, which submits this movement information of the device 9 to a remote pan and tilt mechanism 31. The pan and tilt mechanism 31 is connected to the remote camera 33, and via the physical connection 32 the remote camera 33 is moved according to the movement information the pan and tilt mechanism 31 has received from the processing unit 11. The remote camera 33 itself is connected to the processing unit 11 and submits the pictures taken of the environment surrounding the remote camera 33. The processing unit 11 then displays these pictures on the display 20. Hereby, the user, by moving the device 9, has the impression of moving the remote camera 33 and looking around in environments remote from his actual position.
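The forwarding of device movements to the pan and tilt mechanism 31 can be sketched as a small controller that accumulates movement deltas and clamps them to the mechanical range. The command format and the limit values are illustrative assumptions; the description only requires that every movement of the device is mirrored by the remote camera.

```python
# Sketch of how the processing unit 11 could forward device movements
# to the remote pan and tilt mechanism 31. The command format and the
# mechanical limits (+/-170 deg pan, +/-90 deg tilt) are illustrative
# assumptions.

class PanTiltController:
    PAN_LIMIT = 170.0   # assumed mechanical range of the mechanism
    TILT_LIMIT = 90.0

    def __init__(self, send):
        self.send = send  # callable transmitting one command to the mechanism
        self.pan = 0.0
        self.tilt = 0.0

    def on_device_motion(self, d_yaw, d_pitch):
        # Accumulate the sensed movement and clamp to the mechanical range.
        self.pan = max(-self.PAN_LIMIT, min(self.PAN_LIMIT, self.pan + d_yaw))
        self.tilt = max(-self.TILT_LIMIT,
                        min(self.TILT_LIMIT, self.tilt + d_pitch))
        self.send({"pan": self.pan, "tilt": self.tilt})

sent = []
ctrl = PanTiltController(sent.append)
ctrl.on_device_motion(30.0, -10.0)   # device turned right and tilted down
ctrl.on_device_motion(200.0, 0.0)    # large turn: clamped at the pan limit
print(sent[-1])                      # {'pan': 170.0, 'tilt': -10.0}
```

Sending absolute clamped angles rather than raw deltas keeps the remote camera consistent even if individual commands are lost on the bidirectional link.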
Fig. 6 shows a third example of the device 9. Hereby, a remote panoramic camera 36 constantly submits wide-angle images or video streams over a connection 37 to an image or video server 38. The orientation and movement of the device 9, sensed by the motion detection unit 15 or the camera module 18, are submitted by the processing unit 11 to the image or video server 38. The image or video server 38 processes the movement information of the device 9 together with the images or video streams received from the panoramic camera 36 and submits a partial image or video stream over a connection 39 back to the processing unit 11 in such a way that the user has the impression of looking around in the panoramic space recorded by the panoramic camera 36. Hereby, it is proposed to stream two video signals: one 360 degree low resolution stream and one high resolution stream covering mainly the current view. The low resolution stream is always streamed, so that the device 9 can render the surroundings even in case of rapid movements and give the user the possibility of orientation. If the user, for example, looks at one spot for a predefined time, e.g. 0.5 seconds, a high resolution stream for this part is generated by the image or video server 38 and transmitted to the processing unit 11. By doing so, latencies of the transmitting network do not have such a strong adverse effect on the usability of the device.
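The dwell-time logic of the two-stream scheme above can be sketched as follows. The 0.5 second dwell time comes from the text; the stillness tolerance, class name and return values are illustrative assumptions.

```python
# Sketch of the two-stream scheme (names and tolerance assumed): the
# low-resolution 360-degree stream is always rendered; once the view has
# stayed within a small angular tolerance for the dwell time (0.5 s in
# the text), the high-resolution stream for that region is requested.

DWELL_SECONDS = 0.5
STILL_TOLERANCE_DEG = 2.0

class StreamSelector:
    def __init__(self):
        self.last_view = None     # (yaw, pitch) of the stable view
        self.still_since = None   # timestamp when the view became stable

    def update(self, yaw, pitch, now):
        """Return 'high' when the high-resolution stream should be
        requested for the current view, otherwise 'low'."""
        if (self.last_view is None or
                abs(yaw - self.last_view[0]) > STILL_TOLERANCE_DEG or
                abs(pitch - self.last_view[1]) > STILL_TOLERANCE_DEG):
            # View moved: fall back to the always-available low-res stream.
            self.last_view = (yaw, pitch)
            self.still_since = now
            return 'low'
        if now - self.still_since >= DWELL_SECONDS:
            return 'high'
        return 'low'
```

During rapid movements the selector keeps returning the low resolution stream, which masks network latency exactly as described above.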
In the examples shown in Figs. 5 and 6, the transfer of information from the camera module 18 to the processing unit 11 can be performed by video streaming, in which case the information is processed and analysed in the processing unit 11; alternatively, the information can be processed and analysed already in the camera module 18, in which case only the analysing result is forwarded to the processing unit 11.
It is also possible to combine the device 9 with the functionality of a mobile phone for wireless communication systems, such as GSM, UMTS or the like. For this purpose an antenna, a transceiver, a microphone, loudspeakers and other components necessary for the functionality of a mobile phone are integrated into the device 9. Dialling a phone number is accomplished either by inputting the phone number via the input means or by scrolling through a list of already stored numbers. Another possibility for making a call is to display the functionalities, phone numbers and the like on the display means, thereby enabling the user to browse over a graphical arrangement of images. By zooming into a picture or an accompanying phone number or symbol, a connection to that number is established.
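The zoom-to-dial interaction above can be sketched as follows. The zoom threshold, data layout and names are illustrative assumptions, and the phone number used below is obviously fictitious.

```python
# Minimal sketch of the zoom-to-dial idea (all names assumed): contacts
# are laid out as rectangles on a virtual surface, and zooming far enough
# into one of them selects its number for dialling.

ZOOM_DIAL_THRESHOLD = 4.0   # assumed zoom factor at which dialling triggers

def contact_to_dial(zoom_factor, focus_point, contacts):
    """Return the phone number under focus_point once the user has zoomed
    in past the threshold; otherwise return None (keep browsing).

    contacts: list of (x, y, w, h, phone_number) screen rectangles.
    """
    if zoom_factor < ZOOM_DIAL_THRESHOLD:
        return None
    fx, fy = focus_point
    for x, y, w, h, number in contacts:
        if x <= fx < x + w and y <= fy < y + h:
            return number
    return None
```

Below the threshold the user merely browses the graphical arrangement; crossing it while focused on a contact establishes the connection.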
The device 9 may also be used to set up a connection to an internet server and thereby to view internet information contents. For inputting the web addresses, the input means can again be used.
The present invention thus provides a completely new solution for navigating through data sets and displaying at least a part of these data. The navigation commands correspond to the motion of the user's hand, which allows a very convenient and intuitive navigation within the data.

Claims
1. Handheld device (9) for displaying information comprising a sensing unit (15, 18) for detecting the position, orientation and/or movement of the device (9) and a display means (10) for displaying information, whereby the information is displayed depending on the detected position, orientation and/or movement of the device (9).
2. Device according to claim 1, characterised in that the display means (10) displays the information on a display (20) inside the device (9).
3. Device according to claim 2, characterised in that the display (20) can be seen through an eyepiece (5) of the device (9).
4. Device according to claim 1, characterised in that the display means displays the information on or near to a user's eye.
5. Device according to any of claims 1 to 4, characterised in that the displayed information is a website.
6. Device according to any of claims 1 to 4, characterised in that the displayed information is an image of a remote camera (33, 36).
7. Device according to any of claims 1 to 6, characterised in that two or more information signals are received from a server and, depending on the position, orientation or movement of the device (9), the device (9) chooses which of the information signals or which parts of the information signals are displayed.
8. Device according to any of claims 1 to 6, characterised in that the device (9) transmits and receives information signals to and from a server (38) for providing programs and data within a network.
9. Device according to any of the claims 1 to 8, characterised in that the sensing unit is a camera module (18).
10. Device according to any of the claims 1 to 8, characterised in that the sensing unit is a motion detection unit (15) comprising an accelerometer (17) and a gyroscope (16) or a magnetic field sensor (19).
11. Method for displaying information comprising the steps of detecting the position, orientation and/or movement of the device (9) and displaying information on a display means (10), whereby the information is displayed depending on the detected position, orientation and/or movement of the device (9).
12. Method according to claim 11, characterised in that the information is displayed on a display (20).
13. Method according to claim 12, characterised by providing an eyepiece (5) through which the display (20) can be seen.
14. Method according to claim 11, characterised in that the information is displayed on or near a user's eye.
15. Method according to any of claims 11 to 14, characterised in that two or more information signals are received from a server and, depending on the position, orientation or movement of the device (9), the device (9) chooses which of the information signals or which parts of the information signals are displayed.
16. Method according to any of claims 11 to 15, characterised by transmitting and receiving information signals to and from a server (38) for providing programs and data within a network.
17. Method according to any of the claims 11 to 16, characterised by providing a camera module (18) as sensing unit.
18. Method according to any of the claims 11 to 17, characterised by providing as sensing unit a motion detection unit (15) comprising an accelerometer (17) and a gyroscope (16) or a magnetic field sensor (19).
PCT/EP2004/011987 2003-10-22 2004-10-22 A handheld device for displaying information WO2005041018A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112004002015T DE112004002015T5 (en) 2003-10-22 2004-10-22 A hand-held device for displaying information
GB0610143A GB2423137B (en) 2003-10-22 2004-10-22 A handheld device for displaying information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP03023975.0 2003-10-22
EP03023975 2003-10-22

Publications (1)

Publication Number Publication Date
WO2005041018A1 true WO2005041018A1 (en) 2005-05-06

Family

ID=34486092

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2004/011987 WO2005041018A1 (en) 2003-10-22 2004-10-22 A handheld device for displaying information

Country Status (3)

Country Link
DE (2) DE602004009333T2 (en)
GB (1) GB2423137B (en)
WO (1) WO2005041018A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0773494A1 (en) * 1995-11-13 1997-05-14 Motorola, Inc. Motion responsive cursor for controlling movement in a virtual image apparatus
US20020140667A1 (en) * 2001-04-02 2002-10-03 Toshio Horiki Portable communication terminal, information display device, control input device and control input method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0764754A (en) * 1993-08-24 1995-03-10 Hitachi Ltd Compact information processor
JP3120779B2 (en) * 1998-04-24 2000-12-25 日本電気株式会社 Display scrolling method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2437497A1 (en) * 2010-10-04 2012-04-04 Mobotix AG Position-dependent camera switching
US8644884B2 (en) 2011-08-04 2014-02-04 Qualcomm Incorporated Sensor-based user interface control
US9357043B2 (en) 2011-08-04 2016-05-31 Qualcomm Incorporated Sensor-based user interface control
WO2013151855A1 (en) * 2012-04-03 2013-10-10 Cisco Technology, Inc. Motion responsive video capture during a video conference
US8854415B2 (en) 2012-04-03 2014-10-07 Cisco Technology, Inc. Motion responsive video capture during a video conference

Also Published As

Publication number Publication date
DE602004009333D1 (en) 2007-11-15
GB2423137B (en) 2008-03-19
GB2423137A (en) 2006-08-16
GB0610143D0 (en) 2006-06-28
DE112004002015T5 (en) 2006-10-05
DE602004009333T2 (en) 2008-07-10

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1120040020158

Country of ref document: DE

WWE Wipo information: entry into national phase

Ref document number: 0610143.0

Country of ref document: GB

Ref document number: 0610143

Country of ref document: GB

122 Ep: pct application non-entry in european phase