WO2009141497A1 - Device and method for displaying and updating graphical objects according to movement of a device - Google Patents


Info

Publication number
WO2009141497A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
motion
light
updated
user interface
Application number
PCT/FI2009/050394
Other languages
French (fr)
Inventor
Turo Keski-Jaskari
Pauli Laitinen
Mikko Nurmi
Original Assignee
Nokia Corporation
Application filed by Nokia Corporation filed Critical Nokia Corporation
Publication of WO2009141497A1 publication Critical patent/WO2009141497A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/60 Shadow generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2215/00 Indexing scheme for image rendering
    • G06T 2215/16 Using real world measurements to influence rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the disclosed embodiments relate to a device and a method for displaying three dimensional objects as shadowed on a display and in particular to a device and a method for displaying three dimensional objects as shadowed taking into account the direction of the ambient light.
  • These objects can be control objects such as applications, organizational objects such as files and folders, or functional objects such as objects used in games or other applications.
  • US Patent Publication No. US 20070236485 discloses methods and systems for displaying an image as a virtual representation of an object based upon detected external light.
  • An illustrative computer-implemented method includes steps of detecting the ambient light of an environment, such as a room, at a display surface. Data representative of the detected ambient light is processed to determine a direction of the detected light with respect to the display surface.
  • An image is displayed on the display surface as a virtual representation of an object based upon the detected ambient light. Shadowing effects may be displayed to create the appearance that the virtual representation of the object casts a shadow on the display surface. Physical objects placed against or near to the surface of the display surface may also have images displayed on the display surface corresponding to shadows created by the ambient light.
  • the step of processing data representative of detected ambient light is performed by image processing of a captured image, which is usually computationally very demanding, requiring a lot of processing power and memory space.
  • Portable devices such as mobile or cellular phones and personal digital assistants (PDAs) commonly have lower computational resources than stationary devices such as desktop computers. As these portable devices are often moved around, the ambient light changes frequently and frequent updates are necessary. The computationally demanding methods disclosed above are thus unsuitable for such portable devices.
  • a user interface is provided that is adapted to determine a direction of the ambient light and thereafter update the direction according to detected movements of the device incorporating the user interface.
  • the disclosed embodiments provide a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, and a controller configured to receive light data generated by said light sensor, determine an initial direction of an external light source according to said light data, receive motion data from said motion detector, determine a motion of said portable device based on said motion data and to determine an updated direction according to the determined motion and update said display according to the updated determined direction.
  • the update of the rendering and the display of the objects can be achieved using less computational power as it is, from a computational resource point of view, less demanding to determine a movement or motion than it is to fully analyze a captured image for detecting the ambient light.
  • Another advantage is that the direction of the ambient light can also be determined when the light sensor is covered.
  • the determination of the movement can either be achieved using a built-in camera or using motion sensors such as accelerometers or gyroscopes or other commonly known motion detectors.
  • controller is further configured to determine said initial direction of an external light source according to said motion data.
  • the controller is further configured to determine whether an update criterion is fulfilled and if said update criterion is fulfilled receive updated light data generated by said light sensor and determine a second initial direction of an external light source from said updated light data, receive further motion data and determine a second updated direction according to said further motion data and to update said display according to the second updated direction.
  • the direction of the ambient light should be updated using the more resource demanding step of determining the initial direction. This should be done regularly enough to ensure a real-life like rendering, but not too often as this would consume too much computational power. This can be achieved by using certain update criteria as below.
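The two-tier scheme described above — cheap motion-based updates of the light direction, with the expensive initial-direction detection repeated only when an update criterion is met — can be sketched in Python. This is an illustrative sketch, not code from the patent: the class name, the 2D direction representation and `detect_initial_direction` (a stand-in for the light-sensor/image-processing step) are all assumptions.

```python
import math

def rotate(vec, theta):
    """Rotate a 2D vector by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * vec[0] - s * vec[1], s * vec[0] + c * vec[1])

class LightDirectionTracker:
    """Track the ambient light direction: one expensive initial detection,
    then cheap updates from motion data, with a full re-detection once the
    accumulated rotation (the update criterion) exceeds a threshold."""

    def __init__(self, detect_initial_direction, max_drift_rad=0.5):
        self.detect = detect_initial_direction  # expensive light-sensor step
        self.max_drift = max_drift_rad          # update criterion threshold
        self.direction = self.detect()          # initial direction, unit vector
        self.accumulated_rotation = 0.0

    def on_motion(self, delta_theta):
        """Apply a detected device rotation of delta_theta radians."""
        # The device rotated by delta_theta, so in device coordinates the
        # light direction rotates the opposite way.
        self.direction = rotate(self.direction, -delta_theta)
        self.accumulated_rotation += abs(delta_theta)
        if self.accumulated_rotation > self.max_drift:
            # Update criterion fulfilled: redo the expensive detection
            # to correct any accumulated error.
            self.direction = self.detect()
            self.accumulated_rotation = 0.0
        return self.direction
```

A caller would feed gyroscope deltas into `on_motion` each frame and re-render the shadows from the returned direction.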
  • the aspects of the disclosed embodiments are also directed to providing a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, and a controller configured to receive light data generated by said light sensor, receive motion data from said motion detector, determine a motion of said portable device based on said motion data, determine a direction of an external light source according to light data generated by said light sensor and according to the determined motion, and update said display according to the determined direction.
  • the controller is further configured to determine whether an update criterion is fulfilled and if said update criterion is fulfilled receive updated light data generated by said light sensor, receive further motion data and determine a second updated direction and then determine a second direction of an external light source according to said updated light data and according to said further motion data and to update said display according to the second updated direction.
  • the update criterion is based on said received motion data.
  • the update criterion is related to any of or a combination of features taken from the group comprising: calculated travelled distance, acceleration, shock, rotation.
  • the update criterion is temporal.
  • the update criterion is temporal and based on motion.
  • the update criterion is related to any of or a combination of features taken from the group comprising: rotation during a period of time, a series of detected accelerations.
  • the light sensor is an Ambient Light Sensor configured to generate said light data comprising an intensity level.
  • the light sensor is configured to generate said light data consisting of an intensity level.
  • the light sensor is an Ambient Light Sensor implemented through a phototransistor, a photodiode or an integrated chip.
  • the initial direction is determined to be a direction substantially 90 degrees from the display plane, i.e. along the normal to the display.
  • the intensity level is monitored. If it grows, the device is being turned towards the light source and the shadows should be decreased; if the intensity is fading, the device is being turned away from the light source and the shadows should grow in the direction the device is being turned.
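As an illustrative sketch (not from the patent), adjusting shadow length from the monitored intensity could look like this in Python; the `gain` exponent and the clamping limits are tuning assumptions:

```python
def update_shadow_length(prev_intensity, new_intensity, shadow_length,
                         gain=0.5, min_len=0.0, max_len=2.0):
    """Shrink the displayed shadow as measured intensity grows (device
    turning towards the light source) and grow it as intensity fades
    (device turning away).  gain and the clamping limits are illustrative
    assumptions, not values from the patent."""
    if prev_intensity <= 0:
        return shadow_length          # no usable previous reading
    ratio = new_intensity / prev_intensity
    # ratio > 1: brighter, shorter shadow; ratio < 1: dimmer, longer shadow.
    new_length = shadow_length / (ratio ** gain)
    return max(min_len, min(max_len, new_length))
```

For example, a fourfold jump in intensity halves the shadow length with the default `gain`, and a fourfold drop doubles it (up to the clamp).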
  • the light sensor is a camera and said controller is configured to determine said initial direction through image processing of said received light data.
  • the processor can determine the position of the virtual shadows.
  • the display is configured to display virtual shadows of said graphical object, said virtual shadows virtually resulting from said external light source.
  • the display is configured to display moving shadows according to the determined motion and updated initial light direction.
  • the display is configured to display a second graphical object according to said updated light direction.
  • the first graphical object is different from said second graphical object.
  • the first graphical object carries a first data set and said second graphical object carries a second data set, wherein said first data set is different from said second data set.
  • first data set and said second data set are chosen from a group comprising: date, time, calendar view, graphical object, internet-related information, news, weather and application specific data.
  • graphical object is a two dimensional rendering of a three dimensional image.
  • the aspects of the disclosed embodiments are also directed to providing a user interface comprising a controller, a light sensor and a display configured to display a graphical object, wherein said light sensor is configured to generate a non-directional intensity reading and said controller is configured to determine a direction from this non-directional intensity reading.
  • the controller is configured to receive motion data from a motion sensor and to determine said direction based on said motion data and said non- directional intensity reading.
  • the light sensor is an Ambient Light Sensor implemented through a phototransistor, a photodiode or an integrated chip.
  • the aspects of the disclosed embodiments are also directed to providing a user interface comprising a light sensor, a motion detector, a display configured to display a first graphical object, and a controller operatively coupled to said light sensor and said motion detector, wherein said controller is operatively coupled to said display and configured to determine an initial direction of an external light source from light data generated by said light sensor, receive motion data from said motion detector, determine a motion of said portable device based on said motion data and to determine an updated direction according to the determined motion and update said display according to the updated determined direction.
  • the controller is further configured to determine whether an update criterion is fulfilled and if said update criterion is fulfilled receive updated light data generated by said light sensor and determine a second initial direction of an external light source from said updated light data, receive further motion data and determine a second updated direction according to said further motion data.
  • the aspects of the disclosed embodiments are also directed to providing a user interface comprising light sensor means for generating light data, motion detector means for generating motion data, display means for displaying a first graphical object, and controller means for receiving light data generated by said light sensor means, determining an initial direction of an external light source according to said light data, receiving motion data from said motion detector means, determining an updated direction according to the motion data and the initial direction, and updating said display according to the updated direction.
  • the user interface further comprises controller means for determining whether an update criterion is fulfilled and if said update criterion is fulfilled receiving updated light data from said light sensor means and determining a second initial direction of an external light source according to said updated light data, and for receiving further motion data and determining a second updated direction according to said further motion data and second initial direction and updating said display according to the second updated direction.
  • the aspects of the disclosed embodiments are also directed to providing a portable device comprising a user interface according to above.
  • the device is a mobile communication terminal, a laptop computer, a drawing pad or a personal digital assistant. It should also be noted that the device above has the same advantages and alternatives as the user interfaces described above.
  • the aspects of the disclosed embodiments are also directed to providing a method for displaying graphical objects in a device taking into account an ambient light source comprising receiving light data generated by a light sensor, determining an initial direction of an external light source according to said light data, receiving motion data from a motion detector, determining a motion of said portable device based on said motion data and determining an updated direction according to the determined motion and updating said display according to the updated determined direction.
  • the method is for displaying graphical objects in a portable device.
  • the method further comprises determining said initial direction of an external light source according to said motion data.
  • the method further comprises determining whether an update criterion is fulfilled and if said update criterion is fulfilled receiving updated light data generated by said light sensor and determining a second initial direction of an external light source from said updated light data, receiving further motion data and determining a second updated direction according to said further motion data and updating said display according to the second updated direction.
  • the aspects of the disclosed embodiments are also directed to providing a computer readable medium including at least computer program code for controlling a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, said computer readable medium comprising software code for receiving light data generated by said light sensor, software code for receiving motion data from said motion detector, software code for determining a motion of said portable device based on said motion data, software code for determining a direction of an external light source according to light data generated by said light sensor and according to the determined motion and software code for updating said display according to the determined direction.
  • the computer readable medium further comprises software code for determining whether an update criterion is fulfilled and if said update criterion is fulfilled receiving updated light data generated by said light sensor, receiving further motion data and determining a second updated direction and then determining a second direction of an external light source according to said updated light data and according to said further motion data and updating said display according to the second updated direction.
  • the aspects of the disclosed embodiments are also directed to providing a device incorporating and implementing or adapted to incorporate and to implement a computer readable medium according to above.
  • the aspects of the disclosed embodiments are also directed to providing a device comprising a user interface according to above.
  • the device is a mobile communications terminal.
  • the device is a laptop computer.
  • the device is a drawing pad. In one embodiment the device is a personal digital assistant.
  • the aspects of the disclosed embodiments are also directed to providing a device incorporating and implementing or adapted to incorporate and to implement a computer readable medium according to a computer readable medium including at least computer program code for controlling a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, said computer readable medium comprising software code for receiving light data generated by said light sensor, software code for receiving motion data from said motion detector, software code for determining a motion of said portable device based on said motion data, software code for determining a direction of an external light source according to light data generated by said light sensor and according to the determined motion and software code for updating said display according to the determined direction.
  • the aspects of the disclosed embodiments are also directed to providing a user interface comprising a motion detector configured to detect a movement pattern of a device and a display configured to display a first graphical object, wherein said display is operatively coupled to said motion detector and configured to display a second graphical object upon detection of said movement pattern by said motion detector.
  • the aspects of the disclosed embodiments are also directed to providing a user interface comprising a display configured to display a first data set and a controller configured to determine a direction of said device and display a second data set according to the determined direction wherein said first data set is different from said second set.
  • the first graphical object is different from said second graphical object.
  • said movement pattern corresponds to a position change of said device.
  • said first and second graphical objects are each associated with a position. These positions can be any of side up, face down, face up, upright, upside-down etc.
  • said graphical object is associated with any of the following taken from the group comprising applications, functions, files, folders, modes.
  • the first graphical object carries a first data set and said second graphical object carries a second data set, wherein said first data set is different from said second data set.
  • the aspects of the disclosed embodiments are also directed to providing a user interface comprising a display configured to display a first data set and a controller configured to determine a direction of said device and display a second data set according to the determined direction wherein said first data set is different from said second set.
  • first data set and said second data set are chosen from a group comprising: date, time, calendar view, image data, internet-related information, news, weather and application specific data.
  • the aspects of the disclosed embodiments are also directed to providing a device comprising a user interface according to above.
  • the light sensor is replaced by a determination of the sun's position, for example through a database query.
  • the light sensor can be replaced by a determination of the position of any moving object, for example other planets, stars, the moon etc.
  • Fig. 1 is an overview of a telecommunications system in which a device according to the present application is used according to an embodiment.
  • Fig. 2 is a plane front view of a device according to an embodiment.
  • Fig. 3 is a block diagram illustrating the general architecture of the device of Fig. 1 in accordance with the present application.
  • Figs. 4a and 4b are views of a device according to an embodiment.
  • Figs. 5a and 5b are flow charts describing a method according to an embodiment.
  • Fig. 6 shows intensity level diagrams for a device with different pivot axes according to the teachings herein.
  • Figs. 7a and 7b show display views of an alternative embodiment.
  • Fig. 8 shows a flowchart according to an embodiment.
  • the device, the method and the software product according to the teachings of this application in the form of a cellular/mobile phone will be described by the embodiments. It should be noted that although only an exemplary mobile phone is described herein with reference to the aspects of the disclosed embodiments, the teachings of this application can also be used in any portable electronic device such as a laptop, PDA, mobile communication terminal, drawing pad, electronic book and notepad and other portable electronic devices designed to use a three dimensional graphical user interface.
  • FIG. 1 illustrates an example of a cellular telecommunications system in which the teachings of the present application may be applied.
  • various telecommunications services such as cellular voice calls, www/wap browsing, cellular video calls, data calls, facsimile transmissions, music transmissions, still image transmissions, video transmissions, electronic message transmissions and electronic commerce may be performed between a mobile terminal 100 according to the teachings of the present application and other devices, such as another mobile terminal 106 or a stationary telephone 132.
  • different ones of the telecommunications services referred to above may or may not be available; the teachings of the present application are not limited to any particular set of services in this respect.
  • the mobile terminals 100, 106 are connected to a mobile telecommunications network 110 through RF links 102, 108 via base stations 104, 109.
  • the mobile telecommunications network 110 may be in compliance with any commercially available mobile telecommunications standard, such as GSM, UMTS, D- AMPS, CDMA2000, FOMA and TD-SCDMA.
  • the mobile telecommunications network 110 is operatively connected to a wide area network 120, which may be the Internet or a part thereof.
  • An Internet server 122 has a data storage 124 and is connected to the wide area network 120, as is an Internet client computer 126.
  • the server 122 may host a www/wap server capable of serving www/wap content to the mobile terminal 100.
  • a public switched telephone network (PSTN) 130 is connected to the mobile telecommunications network 110 in a familiar manner.
  • Various telephone terminals, including the stationary telephone 132, are connected to the PSTN 130.
  • the mobile terminal 100 is also capable of communicating locally via a local link 101 to one or more local devices 103.
  • the local link can be any type of link with a limited range, such as Bluetooth, a Universal Serial Bus (USB) link, a Wireless Universal Serial Bus (WUSB) link, an IEEE 802.11 wireless local area network link, an RS-232 serial link, etc.
  • the local devices 103 can for example be various sensors that can communicate measurement values to the mobile terminal 100 over the local link 101.
  • the mobile terminal 200 comprises a speaker or earphone 202, a microphone 206, a touch display 203 and a set of keys 204 which may include virtual keys 204a, soft keys 204b, 204c and a joystick 205 or other type of navigational input device.
  • the mobile terminal 200 is also provided with a light sensor 207.
  • this light sensor is an Ambient Light Sensor (ALS) implemented as a photoresistor.
  • this light sensor is a camera.
  • the mobile terminal 200 is also provided with a motion detector 208 which can be an accelerometer, a gyroscope or any other commonly known motion sensor.
  • the mobile terminal has a controller 300 which is responsible for the overall operation of the mobile terminal and may be implemented by any commercially available CPU ("Central Processing Unit"), DSP ("Digital Signal Processor") or any other electronic programmable logic device.
  • the controller 300 has associated electronic memory 302 such as RAM memory, ROM memory, EEPROM memory, flash memory, or any combination thereof.
  • the memory 302 is used for various purposes by the controller 300, one of them being for storing data used by and program instructions for various software in the mobile terminal.
  • the software includes a real-time operating system 320, drivers for a man-machine interface (MMI) 334, an application handler 332 as well as various applications.
  • the applications can include a message text editor 350, a gaming application 360, as well as various other applications 370, such as applications for voice calling, video calling, sending and receiving Short Message Service (SMS) messages, Multimedia Message Service (MMS) messages or email, web browsing, an instant messaging application, a phone book application, a calendar application, a control panel application, a camera application, a notepad application, etc. It should be noted that two or more of the applications listed above may be executed as the same application.
  • the MMI 334 also includes one or more hardware controllers, which together with the MMI drivers cooperate with the first display 336/203, and the keypad 338/204 as well as various other I/O devices such as microphone, speaker, vibrator, ringtone generator, LED indicator, camera, radio, motion detector etc. As is commonly known, the user may operate the mobile terminal through the man-machine interface thus formed.
  • the software also includes various modules, protocol stacks, drivers, etc., which are commonly designated as 330 and which provide communication services (such as transport, network and connectivity) for an RF interface 306, and optionally a Bluetooth interface 308 and/or an IrDA interface 310 for local connectivity.
  • the RF interface 306 comprises an internal or external antenna as well as appropriate radio circuitry for establishing and maintaining a wireless link to a base station (e.g. the link 102 and base station 104 in Fig. 1).
  • the radio circuitry comprises a series of analogue and digital electronic components, together forming a radio receiver and transmitter. These components include band pass filters, amplifiers, mixers, local oscillators, low pass filters, AD/DA converters, etc.
  • the mobile terminal also has a SIM card 304 and an associated reader.
  • the SIM card 304 comprises a processor as well as local work and data memory.
  • the various aspects of what is described above can be used alone or in various combinations.
  • the teaching of this application may be implemented by a combination of hardware and software, but can also be implemented in hardware or software.
  • the teaching of this application can also be embodied as computer readable code on a computer readable medium. It should be noted that the teaching of this application is not limited to the use in mobile communication terminals such as mobile phones, but can be equally well applied in Personal Digital Assistants (PDAs), laptops, drawing pads, personal organizers or any other device designed for rendering and providing a user with three dimensional objects on a two dimensional display.
  • Fig 4a shows a device according to the teachings herein which will be described with simultaneous reference to figure 5 which shows a flow chart of a method according to the teachings herein.
  • the device 400 is, in this exemplary embodiment, a mobile phone such as is described with reference to figure 2.
  • an ambient light source or light source 411 is also shown. This light source can be any external light source such as incoming light from a window, a lamp, the sun etc.
  • the light sensor 407 receives the light from the light source 411 and generates light data which is then processed by the controller 300 to determine a first initial direction 409 of the light source 411 (step 520).
  • the light data generated depends on the type of light sensor 407 being used.
  • the direction 409 indicates from which direction light from the external or ambient light source falls on the display. As a direction 409 is determined, a shadow 410 of an object 412 being displayed on the display 403 is displayed corresponding to the incoming direction of the light of the external light source 411.
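A minimal sketch of how a shadow position can be derived from such a light direction, assuming a directional light and the display plane at z = 0 (the function name and conventions are illustrative, not from the patent):

```python
def shadow_offset(light_dir, object_height):
    """Offset on the display plane (z = 0) of the shadow cast by the top of
    an object of the given height, lit by a directional light.  light_dir
    points from the light source towards the display, so its z component
    must be negative."""
    lx, ly, lz = light_dir
    if lz >= 0:
        raise ValueError("light must shine onto the display (lz < 0)")
    # Follow the ray from the object's top until it reaches z = 0.
    t = object_height / -lz
    return (lx * t, ly * t)
```

Light arriving along the display normal gives a shadow directly under the object, while light at 45 degrees offsets the shadow by the object's height, matching the everyday intuition the rendered shadows are meant to evoke.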
  • the motion detector 408 generates motion data (step 530) that is processed by the controller to determine a movement of the mobile phone 400 and how the light direction 409 changes according to the movement of the mobile phone 400.
  • in Fig. 4b the mobile phone 400 has been rotated anticlockwise in relation to the light source 411, as indicated by the arrow 412.
  • although the light source 411 appears to have moved between figures 4a and 4b, it is to be understood that it is the device 400 that has been moved, not the light source 411 itself.
  • the movement of the light source 411 in the figures is only for illustrative purposes, to show the changed relative direction 409 between the device 400 and the light source 411.
  • as an updated direction 409' has been determined, based on the initial direction 409 and the movement of the mobile phone 400 determined according to the motion data, a new or updated shadow 410' is displayed (step 504).
  • the user will thus perceive a graphical user interface where three dimensional objects are provided with shadows that move and follow the ambient light, thereby creating the perception that the objects are real and part of the user's reality, thus furthering the intuitive understanding of how the object functions and relates to other objects.
  • One advantage of using motion sensors to determine an updated direction of the light source is that it provides a possibility of updating the display even if the light sensor 407 is blocked or covered, such as by a user's finger.
  • the direction 409 will also change, and as the updated direction 409' based on the movement is a calculated direction, it may be wrong. This error will, in some cases such as under advanced movement patterns or when the mobile device is stationary in a moving vehicle, grow over time until the displayed shadow 410 is no longer in line with the real direction of the light source.
  • an update criterion is checked (step 505). If the update criterion is fulfilled, the steps of determining an initial direction 409 from light data generated by the light sensor are repeated, which ensures that an updated and possibly corrected direction is used.
  • This update criterion can be based on a calculated traveled distance, which can be determined from a measured acceleration and lapsed time.
  • the update can then be executed as the device stops shaking or performing the rapid or sudden movements. If the device is shaking hard it will most likely be difficult for a user to accurately perceive the shadows, and thus it matters little whether there is a small error or not; an update can be held off until the movements stop without lessening the perception of the objects.
  • the update criterion can also be temporal, in that the direction is updated at certain intervals, or a combination of motion based and temporal criteria can be used, such as above for the criterion of traveled distance.
  • in one embodiment the update criterion is related to rotations of the device. As a rotation generates a high angular speed, which leads to a fast directional change of the ambient light source 411, rotations generally require more updates than linear motions.
  • the movement sensor may also comprise a Global Positioning System (GPS) device.
  • the shadows 410 can either be updated instantaneously or by allowing the displayed shadow 410 to wander to its correct position. Both embodiments have their advantages when it comes to user perception and aesthetic effects.
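The "wandering" variant can be sketched as a per-frame interpolation that moves the displayed shadow a fraction of the way toward its correct position each frame, so it glides rather than jumps. The rate constant and function name are assumptions for illustration:

```python
def wander_toward(current, target, rate=0.2):
    """Move the displayed shadow position a fraction of the remaining
    distance toward its correct (target) position each frame."""
    return tuple(c + rate * (t - c) for c, t in zip(current, target))

# Animate a shadow from the origin toward its correct position.
pos = (0.0, 0.0)
for _ in range(30):                # ~30 frames of animation
    pos = wander_toward(pos, (10.0, 5.0))
```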
  • in one embodiment the light sensor 407 is an Ambient Light Sensor, ALS, and the light data generated is simply an intensity reading.
  • Figure 6 shows a graph 620, 620' of how the light intensity measured by an ambient light sensor 607 changes according to how a mobile phone 600 is rotated around an axis 630, 630'.
  • Figure 6a shows a device 600 that is pivoted around a pivot axis 630 and a graph 620 of how the intensity level measured by the ambient light sensor 607 changes accordingly.
  • Figure 6b shows a device 600 that is pivoted around a pivot axis 630' and a graph 620' of how the intensity level measured by the ambient light sensor 607 changes accordingly. From this combination of light data and motion data a controller 300 can easily determine the direction of the ambient light without advanced image processing requiring a great deal of computational resources.
  • a simple light sensor designed only to measure one intensity level irrespective of color is also much cheaper than a camera.
  • the initial direction is taken to be a direction substantially 90 degrees from the display plane, i.e. the normal to the display.
  • the intensity level is monitored as the device is moved. If it grows, the device is being turned towards the light source and the shadows should be decreased; if the intensity is fading, the device is being turned away from the light source and the shadows should grow in the direction the device is being turned.
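This intensity heuristic can be sketched as a tiny update rule driven only by successive readings of a single non-directional sensor (the step size and function name are illustrative assumptions; the turning direction would come from the motion sensor):

```python
def adjust_shadow_length(prev_intensity, intensity, shadow_len, step=0.1):
    """Update the shadow length from two successive readings of a
    non-directional ambient light sensor. Rising intensity means the
    device is turning toward the light, so shadows shrink; falling
    intensity means it is turning away, so shadows grow (in the
    turning direction reported by the motion sensor)."""
    if intensity > prev_intensity:
        return max(0.0, shadow_len - step)
    if intensity < prev_intensity:
        return shadow_len + step
    return shadow_len
```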
  • in one embodiment the light sensor 407 is a camera and the light data generated is an image of the surroundings. From this image it is possible for a controller to determine the direction 409 of the light source 408 using image processing. Although this requires somewhat more computational power, it is only done at certain intervals or times when the initial direction is determined, and not while the updated directions 409' are determined based on the movement of the mobile phone 400.
  • in one embodiment the direction of the ambient light source 411 is not measured using the light sensor, but is determined based on calculations. These calculations are based on the position of the device, which can be obtained through a GPS device, cellular triangulation or another location determining technique, in combination with calendar and time of day data to determine the position of the sun in the sky, and from this the direction to the sun relative to the device, taking into account the direction and position of the device. The direction of the device can be obtained using a built-in compass.
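Such a calculation can be sketched with a standard first-order solar-position approximation from latitude, longitude, day of year and time of day. These formulas are a common textbook approximation, not taken from the patent; they ignore the equation of time and atmospheric refraction, which is good enough to place a virtual shadow but not for navigation:

```python
import math

def sun_direction(lat_deg, lon_deg, day_of_year, utc_hour):
    """First-order solar position estimate: returns
    (elevation_deg, azimuth_deg measured from south)."""
    # Solar declination, approximated by a cosine over the year.
    decl = math.radians(
        -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    solar_hour = utc_hour + lon_deg / 15.0          # crude local solar time
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(lat_deg)
    elev = math.asin(math.sin(decl) * math.sin(lat) +
                     math.cos(decl) * math.cos(lat) * math.cos(hour_angle))
    azim = math.atan2(math.sin(hour_angle),
                      math.cos(hour_angle) * math.sin(lat) -
                      math.tan(decl) * math.cos(lat))
    return math.degrees(elev), math.degrees(azim)
```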
  • FIG. 7 shows a method of such an embodiment where it is determined whether a user interface's graphical representation needs to be drawn (as when going from an inactive or idle mode, where the screen or display is inactive, to an active mode where the display is activated), step 710. If it is determined that such a graphical representation needs to be drawn, the sun's position (or the position of any other object whose movement can be measured or is known) is determined in step 720, possibly by querying a database or as has been described above, and by determining the position of the device through the use of a directional finding means such as a compass.
  • in step 740 virtual shadows corresponding to the sun's position are defined for any graphical object that is comprised in the graphical representation of the user interface.
  • the graphical representation is then displayed on a display in step 750. Should further graphical objects need to be changed, such as when a view of the user interface has been changed, for example when a new application is started or a subfolder has been opened (step 760), new shadows are defined for these new objects by returning to step 740 and displayed in step 750.
  • in step 770 the device's new position is checked by returning to step 720, and an updated direction to the sun is determined (730), shadows are redefined (740) and displayed (750).
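The loop of steps 710 through 770 can be sketched as follows. All class and method names here are hypothetical; in the described embodiment the sun azimuth would come from a database query combined with compass data (steps 720-730):

```python
class ShadowUI:
    """Minimal stand-in for the FIG. 7 flow."""

    def __init__(self, objects, sun_azimuth_deg):
        self.objects = objects              # graphical objects in the view
        self.sun_azimuth_deg = sun_azimuth_deg
        self.shadows = {}

    def define_shadows(self):               # step 740 (then displayed, 750)
        # A shadow falls on the side opposite the light's azimuth.
        for name in self.objects:
            self.shadows[name] = (self.sun_azimuth_deg + 180.0) % 360.0

    def on_view_change(self, new_objects):  # step 760: new app / subfolder
        self.objects.extend(new_objects)
        self.define_shadows()               # back to step 740

    def on_device_move(self, new_azimuth):  # step 770: position changed
        self.sun_azimuth_deg = new_azimuth  # re-run steps 720-730
        self.define_shadows()

ui = ShadowUI(["icon"], sun_azimuth_deg=90.0)
ui.define_shadows()
```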
  • This provides for the possibility of determining the sun's position without using a camera or other light sensor, as these may be covered by a finger, a skin or a carrying case.
  • the database query replaces the light sensor for generating the light data and the motion sensor is the compass.
  • shadows can be rendered and displayed as shortened or lengthened based on the sun's position (or the position of another planet or stellar body) in the sky. For example, in the evening shadows are longer, and during the daytime the shadows are shorter as the sun is shining almost directly towards the earth.
  • tilting the device is one possible movement that is determined through the use of the accelerometer or movement sensor. Tilting or turning the device may also be taken into account when determining the length of the shadows: if the sun's angle is wide in relation to the device screen, the shadow is rendered to be longer.
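The relation between shadow length and light elevation is standard trigonometry, so the rendering step can be sketched as follows (the clamp value and function name are assumptions; tilting the device can be modelled by reducing the effective elevation before calling this):

```python
import math

def shadow_length(object_height, sun_elevation_deg):
    """Length of the virtual shadow cast by an object of the given
    virtual height: a low sun (evening) gives long shadows, a high
    sun gives short ones. The elevation is clamped near the horizon
    to avoid an infinitely long shadow."""
    elevation = math.radians(max(1.0, sun_elevation_deg))
    return object_height / math.tan(elevation)
```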
  • Figure 8 shows a screen view of an alternative device according to the teachings herein.
  • in a first initial state the device is at rest and the display 803 shows an object 812.
  • In one embodiment the object 812 carries information content or data 813 which is made visible to a user.
  • the data 813 carried may be related to calendar entries, images, upcoming tasks, time of day, content fetched over an internet or Wireless Application Protocol connection, application specific data, news, weather information, slideshows etc.
  • the object 812 carries the data of the time of day 813.
  • upon detection of a movement by a motion detector 408, the display 803 is updated to display a second object 812'.
  • the information 813' carried by the second object 812' indicates a sun and a temperature of 90 degrees Fahrenheit.
  • this second object 812' is the same as the first object 812, but the information 813 carried by the object 812 is changed.
  • the object 812 to be displayed is associated with one direction or position of the device. For example, when the device is lying on its right side, the time of day and data related to an alarm clock application is shown. When the device is lying on its left side data relating to a media player application is shown.
  • the objects 812 and/or the carried information 813 is not associated with a direction or position but is simply scrolled through by a user moving the device or making a predetermined movement pattern such as a shake, possibly in a specific direction, a flip, a rotation, a turn, a twist etc. This provides the possibility of having a number of applications and data relating to these to be activated and displayed simply by turning or moving the device. For example, a device having four applications (media player, calendar, text and message editor, and a voice call application) can be configured to allow a user to switch application, or at least to view data or information relating to the various applications, by flipping the phone.
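The two variants above can be sketched as a lookup table for the orientation-bound embodiment and a cyclic index for the gesture-scrolled one. The mapping entries and names are hypothetical examples, not taken from the patent:

```python
# Orientation-bound embodiment: each device position selects a data set.
ORIENTATION_DATA = {
    "right_side": "alarm clock / time of day",
    "left_side":  "media player",
    "face_up":    "calendar",
    "face_down":  "weather",
}

# Gesture-scrolled embodiment: a flip advances to the next application.
APPS = ["media player", "calendar", "text and message editor", "voice call"]

def on_flip(current_index):
    """Return the index of the next application's data, wrapping
    around after the last application."""
    return (current_index + 1) % len(APPS)
```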
  • the user interface is configured to detect a plurality of light sources and combine their effect so that each light source will give rise to a shadow.
  • An illustrative example is in some football games where a player can be seen with four different shadows because of the four different light sources at the stadium.
  • the determination of multiple light sources can be implemented both when determining the light sources through the use of a light sensor and when determining them through a database query.
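Combining several detected sources can be sketched by casting one shadow per source, each pointing away from its source through the object (a minimal 2D sketch with assumed coordinates; real rendering would also weight shadow darkness by source intensity):

```python
import math

def shadows_for(sources, object_pos):
    """One shadow direction per light source: the unit vector from
    each source through the object. 'sources' holds 2D source
    positions in display coordinates."""
    result = []
    for sx, sy in sources:
        dx, dy = object_pos[0] - sx, object_pos[1] - sy
        norm = math.hypot(dx, dy) or 1.0
        result.append((dx / norm, dy / norm))
    return result

# Four stadium-style sources around an object at the origin give
# four shadows, each pointing away from one source.
dirs = shadows_for([(1, 0), (-1, 0), (0, 1), (0, -1)], (0, 0))
```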
  • in one embodiment the shadows are only to be rendered and displayed on the display.
  • the user interface could also change the color of other components, such as keys, the cover etc., depending on the user interface's and corresponding device's capabilities.
  • the shadows are applicable to two dimensional objects as well as three dimensional objects. In the two dimensional case the shadows are replaced by gloss on the 2D icons.
  • although the teaching of the present application has been described in terms of a mobile phone, it should be appreciated that the teachings of the present application may also be applied to other types of electronic devices, such as music players, palmtop computers and the like. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the teachings of the present application.
  • the term “comprising” as used in the claims does not exclude other elements or steps.
  • the term “a” or “an” as used in the claims does not exclude a plurality.
  • a unit or other means may fulfill the functions of several units or means recited in the claims.

Abstract

A user interface includes a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, and a controller. The controller is configured to receive light data generated by the light sensor and to determine an initial direction of an external light source according to the light data. The controller is also configured to receive motion data from the motion detector and determine a motion of the portable device based on the motion data and to determine an updated direction according to the determined motion and update the display according to the updated determined direction.

Description

DEVICE AND METHOD FOR DISPLAYING AND UPDATING GRAPHICAL OBJECTS ACCORDING TO MOVEMENT OF A DEVICE.
FIELD
The disclosed embodiments relate to a device and a method for displaying three dimensional objects as shadowed on a display and in particular to a device and a method for displaying three dimensional objects as shadowed taking into account the direction of the ambient light.
BACKGROUND
In the field of graphical user interfaces (GUI) it is commonly known to present user interface (UI) objects in three dimensions to add a level of perception to the two dimensional display. Using this level it is possible to provide a user with more information regarding a certain object and its relation to other objects.
These objects can be control objects such as applications, organizational objects such as files and folders, or functional objects such as objects used in games or other applications.
To further improve the perception of these objects it is known to apply a virtual shadow to the objects, as they are then more easily perceived by a user, requiring less cognitive effort to deduce their relation to other objects and their position in the virtual space presented on the display; a shadowed object also has a more realistic look that the user is more accustomed to and is therefore easier to understand.

US Patent Publication No. US 20070236485 discloses methods and systems for displaying an image as a virtual representation of an object based upon detected external light. An illustrative computer-implemented method includes steps of detecting the ambient light of an environment, such as a room, at a display surface. Data representative of the detected ambient light is processed to determine a direction of the detected light with respect to the display surface. An image is displayed on the display surface as a virtual representation of an object based upon the detected ambient light. Shadowing effects may be displayed to create the appearance that the virtual representation of the object casts a shadow on the display surface. Physical objects placed against or near to the surface of the display surface may also have images displayed on the display surface corresponding to shadows created by the ambient light.
The step of processing data representative of detected ambient light is done by image processing of a captured image, which is usually computationally very demanding, requiring a great deal of processing power and memory space.
Portable devices, such as mobile or cellular phones and personal digital assistants (PDAs), commonly have lower computational resources than stationary devices such as desktop computers. As these portable devices are often moved around, the ambient light changes often and frequent updates are necessary. The computationally demanding methods disclosed above are thus unsuitable for such portable devices.
SUMMARY
On this background, it would be advantageous to provide a device and a method that overcome or at least reduce the drawbacks indicated above. In one embodiment a user interface is provided that is adapted to determine a direction of the ambient light and thereafter update the direction according to detected movements of the device incorporating the user interface. The disclosed embodiments provide a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, and a controller configured to receive light data generated by said light sensor, determine an initial direction of an external light source according to said light data, receive motion data from said motion detector, determine a motion of said portable device based on said motion data and to determine an updated direction according to the determined motion and update said display according to the updated determined direction.
By determining a movement of a device incorporating a user interface and from this calculating a new direction of the ambient light, the update of the rendering and display of the objects can be achieved using less computational power, as it is, from a computational resource point of view, less demanding to determine a movement or motion than to fully analyze a captured image for detecting the ambient light.
Another advantage is that the direction of the ambient light can also be determined when the light sensor is covered.
The determination of the movement can either be achieved using a built-in camera or using motion sensors such as accelerometers or gyroscopes or other commonly known motion detectors .
In one embodiment the controller is further configured to determine said initial direction of an external light source according to said motion data.
By making use of the motion data already when determining the first initial direction, less image data needs to be gathered and processed. Such a motion-based determination can easily be implemented by measuring the light intensity in various directions and from this determining the light source. For example, should the light intensity increase as the device is rotated to the left (from a user's perspective), the direction of the incoming light is from the left. In one embodiment the controller is further configured to determine whether an update criterion is fulfilled and if said update criterion is fulfilled receive updated light data generated by said light sensor and determine a second initial direction of an external light source from said updated light data, receive further motion data and determine a second updated direction according to said further motion data and to update said display according to the second updated direction.
To ensure that the rendered or virtual shadows are properly rendered and displayed in a manner which corresponds to a real-life shadow, the direction of the ambient light should be updated using the more resource demanding step of determining the initial direction. This should be done regularly enough to ensure a real-life like rendering, but not so often that it consumes too much computational power. This can be achieved by using certain update criteria as described below.
The aspects of the disclosed embodiments are also directed to providing a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, and a controller configured to receive light data generated by said light sensor, receive motion data from said motion detector, determine a motion of said portable device based on said motion data, determine a direction of an external light source according to light data generated by said light sensor and according to the determined motion update said display according to the determined direction. In one embodiment the controller is further configured to determine whether an update criterion is fulfilled and if said update criterion is fulfilled receive updated light data generated by said light sensor, receive further motion data and determine a second updated direction and then determine a second direction of an external light source according to said updated light data and according to said further motion data and to update said display according to the second updated direction. In one embodiment the update criterion is based on said received motion data.
In one embodiment the update criterion is related to any of or a combination of features taken from the group comprising: calculated travelled distance, acceleration, shock, rotation.
In one embodiment the update criterion is temporal.
In one embodiment the update criterion is temporal and based on motion.
In one embodiment the update criterion is related to any of or a combination of features taken from the group comprising rotation during period of time, series of accelerations detected.
In one embodiment the light sensor is an Ambient Light Sensor configured to generate said light data comprising an intensity level.
In one embodiment the light sensor is configured to generate said light data consisting of an intensity level.
In one embodiment the light sensor is an Ambient Light Sensor implemented through a phototransistor, a photodiode or an integrated chip.
In this embodiment the initial direction is determined to be a direction substantially 90 degrees from the display plane, i.e. the normal to the display. As the device is moved the intensity level is monitored. If it grows, the device is being turned towards the light source and the shadows should be decreased; if the intensity is fading, the device is being turned away from the light source and the shadows should grow in the direction the device is being turned.
Using this initial direction there will be no erroneous virtual shadows in the beginning, and the shadows will become more and more correct as the device is moved around.
In one embodiment the light sensor is a camera and said controller is configured to determine said initial direction through image processing of said received light data.
Through the image processing shadows in the image of the surroundings can be identified and from the location of the camera the processor can determine the position of the virtual shadows.
In one embodiment the display is configured to display virtual shadows of said graphical object, said virtual shadows virtually resulting from said external light source.
In one embodiment the display is configured to display moving shadows according to the determined motion and updated initial light direction.
In one embodiment the display is configured to display a second graphical object according to said updated light direction. In one embodiment the first graphical object is different from said second graphical object.
In one embodiment the first graphical object carries a first data set and said second graphical object carries a second data set, wherein said first data set is different from said second data set.
In one embodiment the first data set and said second data set are chosen from a group comprising: date, time, calendar view, graphical object, internet-related information, news, weather and application specific data. In one embodiment the graphical object is a two dimensional rendering of a three dimensional image.
The aspects of the disclosed embodiments are also directed to providing a user interface comprising a controller, a light sensor and a display configured to display a graphical object, wherein said light sensor is configured to generate a non-directional intensity reading and said controller is configured to determine a direction from this non-directional intensity reading. In one embodiment the controller is configured to receive motion data from a motion sensor and to determine said direction based on said motion data and said non-directional intensity reading.
In one embodiment the light sensor is an Ambient Light Sensor implemented through a phototransistor, a photodiode or an integrated chip.
The aspects of the disclosed embodiments are also directed to providing a user interface comprising a light sensor, a motion detector, a display configured to display a first graphical object, and a controller operatively coupled to said light sensor and said motion detector, wherein said controller is operatively coupled to said display and configured to determine an initial direction of an external light source from light data generated by said light sensor, receive motion data from said motion detector, determine a motion of said portable device based on said motion data and to determine an updated direction according to the determined motion and update said display according to the updated determined direction. In one embodiment the controller is further configured to determine whether an update criterion is fulfilled and if said update criterion is fulfilled receive updated light data generated by said light sensor and determine a second initial direction of an external light source from said updated light data, receive further motion data and determine a second updated direction according to said further motion data.
The aspects of the disclosed embodiments are also directed to providing a user interface comprising light sensor means for generating light data, motion detector means for generating motion data, display means for displaying a first graphical object, and controller means for receiving light data generated by said light sensor means, determining an initial direction of an external light source according to said light data, receiving motion data from said motion detector means, determining an updated direction according to the motion data and the initial direction, and updating said display according to the updated direction.
In one embodiment the user interface further comprises controller means for determining whether an update criterion is fulfilled and if said update criterion is fulfilled receiving updated light data from said light sensor means and determining a second initial direction of an external light source according to said updated light data, and for receiving further motion data and determining a second updated direction according to said further motion data and second initial direction and updating said display according to the second updated direction.
It should also be noted that the user interfaces above have the same advantages and alternatives as described above.
The aspects of the disclosed embodiments are also directed to providing a portable device comprising a user interface according to above. In one embodiment the device is a mobile communication terminal or a laptop computer or a drawing pad or a personal digital assistant. It should also be noted that the device above has the same advantages and alternatives as the user interfaces described above.
The aspects of the disclosed embodiments are also directed to providing a method for displaying graphical objects in a device taking into account an ambient light source, comprising receiving light data generated by a light sensor, determining an initial direction of an external light source according to said light data, receiving motion data from a motion detector, determining a motion of said device based on said motion data and determining an updated direction according to the determined motion and updating said display according to the updated determined direction.
This method and the embodiments below have the same advantages as stated above.
In one embodiment the method is for displaying graphical objects in a portable device.
In one embodiment the method further comprises determining said initial direction of an external light source according to said motion data.
In one embodiment the method further comprises determining whether an update criterion is fulfilled and if said update criterion is fulfilled receiving updated light data generated by said light sensor and determining a second initial direction of an external light source from said updated light data, receiving further motion data and determining a second updated direction according to said further motion data and updating said display according to the second updated direction.

The aspects of the disclosed embodiments are also directed to providing a computer readable medium including at least computer program code for controlling a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, said computer readable medium comprising software code for receiving light data generated by said light sensor, software code for receiving motion data from said motion detector, software code for determining a motion of said portable device based on said motion data, software code for determining a direction of an external light source according to light data generated by said light sensor and according to the determined motion and software code for updating said display according to the determined direction.
This computer readable medium and the embodiments below have the same advantages as stated above.
In one embodiment the computer readable medium further comprises software code for determining whether an update criterion is fulfilled and if said update criterion is fulfilled receiving updated light data generated by said light sensor, receiving further motion data and determining a second updated direction and then determining a second direction of an external light source according to said updated light data and according to said further motion data and updating said display according to the second updated direction .
The aspects of the disclosed embodiments are also directed to providing a device incorporating and implementing or adapted to incorporate and to implement a computer readable medium according to above.
The aspects of the disclosed embodiments are also directed to providing a device comprising a user interface according to above. In one embodiment the device is a mobile communications terminal.
In one embodiment the device is a laptop computer.
In one embodiment the device is a drawing pad. In one embodiment the device is a personal digital assistant.
The aspects of the disclosed embodiments are also directed to providing a device incorporating and implementing or adapted to incorporate and to implement a computer readable medium according to above, i.e. a computer readable medium including at least computer program code for controlling a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, said computer readable medium comprising software code for receiving light data generated by said light sensor, software code for receiving motion data from said motion detector, software code for determining a motion of said portable device based on said motion data, software code for determining a direction of an external light source according to light data generated by said light sensor and according to the determined motion and software code for updating said display according to the determined direction. The aspects of the disclosed embodiments are also directed to providing a user interface comprising a motion detector configured to detect a movement pattern of a device and a display configured to display a first graphical object, wherein said display is operatively coupled to said motion detector and configured to display a second graphical object upon detection of said movement pattern by said motion detector.
The aspects of the disclosed embodiments are also directed to providing a user interface comprising a display configured to display a first data set and a controller configured to determine a direction of said device and display a second data set according to the determined direction wherein said first data set is different from said second set. In one embodiment the first graphical object is different from said second graphical object.
In one embodiment said movement pattern corresponds to a position change of said device. In one embodiment said first and second graphical objects are each associated with a position. These positions can be any of side up, face down, face up, upright, upside-down etc.
In one embodiment said graphical object is associated with any of the following taken from the group comprising applications, functions, files, folders, modes.
In one embodiment the first graphical object carries a first data set and said second graphical object carries a second data set, wherein said first data set is different from said second data set.
The aspects of the disclosed embodiments are also directed to providing a user interface comprising a display configured to display a first data set and a controller configured to determine a direction of said device and display a second data set according to the determined direction wherein said first data set is different from said second set.
In one embodiment of any of the above the first data set and said second data set are chosen from a group comprising: date, time, calendar view, image data, internet-related information, news, weather and application specific data.
The aspects of the disclosed embodiments are also directed to providing a device comprising a user interface according to above. As an alternative to the embodiment above, the light sensor is replaced by a determination of the sun's position, for example through a database query. The light sensor can also be replaced by a determination of the position of any moving object, for example other planets, stars, the moon etc. Further aspects, features, advantages and properties of the device, method and computer readable medium according to the present application will become apparent from the detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following detailed portion of the present description, the teachings of the present application will be explained in more detail with reference to the example embodiments shown in the drawings, in which:
Fig. 1 is an overview of a telecommunications system in which a device according to the present application is used according to an embodiment,
Fig. 2 is a plane front view of a device according to an embodiment,
Fig. 3 is a block diagram illustrating the general architecture of a device of Fig. 1 in accordance with the present application,
Fig. 4a and 4b are views of a device according to an embodiment,
Figs. 5a and 5b are flow charts describing a method according to an embodiment,
Fig. 6 shows intensity level diagrams for a device with different pivot axes according to the teachings herein,
Fig. 7 shows a flowchart according to an embodiment and
Fig. 8 shows a screen view of an alternative embodiment.
DETAILED DESCRIPTION
In the following detailed description, the device, the method and the software product according to the teachings of this application in the form of a cellular/mobile phone will be described by the embodiments. It should be noted that although only an exemplary mobile phone is described herein with reference to the aspects of the disclosed embodiments, the teachings of this application can also be used in any portable electronic device such as a laptop, PDA, mobile communication terminal, drawing pad, electronic book and notepad and other portable electronic devices designed to use a three dimensional graphical user interface.
FIG. 1 illustrates an example of a cellular telecommunications system in which the teachings of the present application may be applied. In the telecommunication system of FIG. 1, various telecommunications services such as cellular voice calls, www/wap browsing, cellular video calls, data calls, facsimile transmissions, music transmissions, still image transmissions, video transmissions, electronic message transmissions and electronic commerce may be performed between a mobile terminal 100 according to the teachings of the present application and other devices, such as another mobile terminal 106 or a stationary telephone 132. It is to be noted that for different embodiments of the mobile terminal 100 and in different situations, different ones of the telecommunications services referred to above may or may not be available; the teachings of the present application are not limited to any particular set of services in this respect. The mobile terminals 100, 106 are connected to a mobile telecommunications network 110 through RF links 102, 108 via base stations 104, 109. The mobile telecommunications network 110 may be in compliance with any commercially available mobile telecommunications standard, such as GSM, UMTS, D- AMPS, CDMA2000, FOMA and TD-SCDMA.
The mobile telecommunications network 110 is operatively connected to a wide area network 120, which may be Internet or a part thereof. An Internet server 122 has a data storage 124 and is connected to the wide area network 120, as is an Internet client computer 126. The server 122 may host a www/wap server capable of serving www/wap content to the mobile terminal 100.
A public switched telephone network (PSTN) 130 is connected to the mobile telecommunications network 110 in a familiar manner. Various telephone terminals, including the stationary telephone 132, are connected to the PSTN 130.
The mobile terminal 100 is also capable of communicating locally via a local link 101 to one or more local devices 103. The local link can be any type of link with a limited range, such as Bluetooth, a Universal Serial Bus (USB) link, a Wireless Universal Serial Bus (WUSB) link, an IEEE 802.11 wireless local area network link, an RS-232 serial link, etc.
The local devices 103 can for example be various sensors that can communicate measurement values to the mobile terminal 100 over the local link 101.
An embodiment 200 of the mobile terminal 100 is illustrated in more detail in FIG. 2. The mobile terminal 200 comprises a speaker or earphone 202, a microphone 206, a touch display 203 and a set of keys 204 which may include virtual keys 204a, soft keys 204b, 204c and a joystick 205 or other type of navigational input device.
The mobile terminal 200 is also provided with a light sensor 207. In one embodiment this light sensor is an Ambient Light Sensor (ALS) implemented as a photoresistor.
In one embodiment this light sensor is a camera.
The mobile terminal 200 is also provided with a motion detector 208 which can be any of an accelerometer, a gyroscope or another commonly known motion sensor.
The internal components, software and protocol structure of the mobile terminal 200 will now be described with reference to FIG. 3. The mobile terminal has a controller 300 which is responsible for the overall operation of the mobile terminal and may be implemented by any commercially available CPU ("Central Processing Unit"), DSP ("Digital Signal Processor") or any other electronic programmable logic device. The controller 300 has associated electronic memory 302 such as RAM memory, ROM memory, EEPROM memory, flash memory, or any combination thereof. The memory 302 is used for various purposes by the controller 300, one of them being for storing data used by and program instructions for various software in the mobile terminal. The software includes a real-time operating system 320, drivers for a man-machine interface (MMI) 334, an application handler 332 as well as various applications. The applications can include a message text editor 350, a gaming application 360, as well as various other applications 370, such as applications for voice calling, video calling, sending and receiving Short Message Service (SMS) messages, Multimedia Message Service (MMS) messages or email, web browsing, an instant messaging application, a phone book application, a calendar application, a control panel application, a camera application, a notepad application, etc. It should be noted that two or more of the applications listed above may be executed as the same application.
The MMI 334 also includes one or more hardware controllers, which together with the MMI drivers cooperate with the first display 336/203, and the keypad 338/204 as well as various other I/O devices such as microphone, speaker, vibrator, ringtone generator, LED indicator, camera, radio, motion detector etc. As is commonly known, the user may operate the mobile terminal through the man-machine interface thus formed.
The software also includes various modules, protocol stacks, drivers, etc., which are commonly designated as 330 and which provide communication services (such as transport, network and connectivity) for an RF interface 306, and optionally a Bluetooth interface 308 and/or an IrDA interface 310 for local connectivity. The RF interface 306 comprises an internal or external antenna as well as appropriate radio circuitry for establishing and maintaining a wireless link to a base station (e.g. the link 102 and base station 104 in FIG. 1). As is well known to a person skilled in the art, the radio circuitry comprises a series of analogue and digital electronic components, together forming a radio receiver and transmitter. These components include band pass filters, amplifiers, mixers, local oscillators, low pass filters, AD/DA converters, etc.
The mobile terminal also has a SIM card 304 and an associated reader. As is commonly known, the SIM card 304 comprises a processor as well as local work and data memory. The various aspects of what is described above can be used alone or in various combinations. The teaching of this application may be implemented by a combination of hardware and software, but can also be implemented in hardware or software. The teaching of this application can also be embodied as computer readable code on a computer readable medium. It should be noted that the teaching of this application is not limited to use in mobile communication terminals such as mobile phones, but can equally well be applied in Personal Digital Assistants (PDAs), laptops, drawing pads, personal organizers or any other device designed for rendering and providing a user with three dimensional objects on a two dimensional display.
Fig. 4a shows a device according to the teachings herein, which will be described with simultaneous reference to figure 5, which shows a flow chart of a method according to the teachings herein. The device 400 is, in this exemplary embodiment, a mobile phone such as is described with reference to figure 2. In the environment surrounding the device 400 there is an ambient light source or light source 411. This light source can be any external light source, such as incoming light from a window, a lamp, the sun etc. First (step 510), the light sensor 407 receives the light from the light source 411 and generates light data which is then processed by the controller 300 to determine a first initial direction 409 of the light source 411 (step 520). The light data generated depends on the type of light sensor 407 being used. The direction 409 indicates from which direction light from the external or ambient light source falls on the display. As a direction 409 is determined, a shadow 410 of an object 412 being displayed on the display 403 is displayed corresponding to the incoming direction of the light of the external light source 411.
Secondly or simultaneously, the motion detector 408 generates motion data (step 530) that is processed by the controller to determine a movement of the mobile phone 400 and how the light direction 409 changes according to the movement of the mobile phone 400.
In figure 4b the mobile phone 400 has been rotated anticlockwise relative to the light source 411 as indicated by the arrow 412. It should be noted that although the light source 411 has been moved in figure 4b relative to figure 4a, it is to be understood that it is the device 400 that has been moved, not the light source 411 itself. The movement of the light source 411 in the figures is only for illustrative purposes to show the changed relative direction 409 between the device 400 and the light source 411. As an updated direction 409' has been determined based on the initial direction 409 and the movement of the mobile phone 400 determined according to the motion data, a new or updated shadow 410' is displayed (step 540). The user will thus perceive a graphical user interface where three dimensional objects are provided with shadows that move and follow the ambient light, thereby creating the perception that the objects are real and part of the user's reality, thus furthering the intuitive understanding of how the object functions and relates to other objects.
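The determine-then-update flow described above can be sketched in code. The following Python sketch is illustrative only and is not taken from the application: it reduces the light direction to a single angle in the display plane, and the angular-rate compensation and the fixed shadow length are assumptions.

```python
import math

def update_light_direction(direction_deg, angular_rate_deg_s, dt_s):
    """Compensate the stored light-source direction 409 for a device
    rotation about the display normal: rotating the device makes the
    light appear to come from a correspondingly rotated direction."""
    return (direction_deg - angular_rate_deg_s * dt_s) % 360.0

def shadow_offset(direction_deg, length_px=8.0):
    """Place a drop shadow opposite the incoming light direction;
    returns (dx, dy) pixel offsets for the shadow of an object."""
    opposite = math.radians(direction_deg + 180.0)
    return (length_px * math.cos(opposite), length_px * math.sin(opposite))

# Device rotated 90 degrees anticlockwise over one second: the
# apparent light direction changes by the same amount.
initial = 0.0                                        # light from the "east"
updated = update_light_direction(initial, -90.0, 1.0)
print(updated)   # 90.0
```

A renderer would call `shadow_offset` each frame with the current direction to redraw the shadow, giving the moving-shadow effect described in the text.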
One advantage of using motion sensors to determine an updated direction of the light source is that it provides a possibility of updating the display even if the light sensor 407 is blocked or covered, such as by a user's finger.
As the mobile phone 400 is moved, the direction 409 will also change. Because the updated direction 409' based on the movement is a calculated direction, it may be wrong, and in some cases, such as under advanced movement patterns or when the mobile device is stationary in a moving vehicle, this error will grow over time until the displayed shadow 410 is no longer in line with the real direction of the light source. To accommodate this, an update criterion is checked (step 550). When the update criterion is fulfilled, the step of determining an initial direction 409 from light data generated by the light sensor is repeated, which ensures that an updated and possibly corrected direction is used. This update criterion can be based on a calculated travelled distance, which can be determined from a measured acceleration and lapsed time. It can also be based on a series of sudden movements, in which case the determined motion in relation to the light source is likely to have a greater error. To save battery power and computational power the update can then be executed as the device stops shaking or performing the rapid or sudden movements. If the device is shaking hard it will most likely be difficult for a user to accurately perceive the shadows, so it matters little whether there is a small error, and an update can be held off until the movements stop without lessening the perception of the objects.
The update criterion can also be temporal in that at certain intervals the direction is updated or a combination of motion based and temporal criteria can be used such as above for the criterion of traveled distance.
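A combined motion-based and temporal update criterion of the kind described above can be sketched as follows. The distance estimate from measured acceleration and lapsed time follows the text; the numeric thresholds are invented for illustration.

```python
def update_needed(accel_m_s2, elapsed_s, last_fix_age_s,
                  distance_limit_m=2.0, interval_s=30.0):
    """Return True when the light sensor should be re-read: either the
    distance estimated from a measured acceleration over the lapsed
    time exceeds a limit, or the last light fix is simply too old."""
    travelled_m = 0.5 * accel_m_s2 * elapsed_s ** 2   # s = a * t^2 / 2
    return travelled_m > distance_limit_m or last_fix_age_s > interval_s
```

A shake detector could additionally suppress the update until the rapid or sudden movements stop, as the text suggests.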
In one embodiment the update criterion is related to rotations of the device. As a rotation generates a high angular speed which will lead to a fast directional change of the ambient light source 411, rotations generally require more updates than linear motions.
It is also possible to store data, relating to both movement and light, so that a more advanced model can be used to determine the updated direction 409'. For example, if an initial direction 409 has been determined and the mobile phone 400 is then rotated 360 degrees, this would constitute a large movement and possibly fulfill an update criterion.
However, as the rotation is a full turn, it is easy to determine that the updated light direction 409' is the same as the initial direction 409 if data regarding the initial direction and motion history has been stored. Thus, a resource-demanding update is not necessary.
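The stored motion history makes this shortcut easy to express. The sketch below is illustrative; the 5-degree tolerance is an assumption.

```python
def can_skip_light_update(rotation_history_deg, tolerance_deg=5.0):
    """If the incremental rotations recorded since the last light fix
    sum to (nearly) a whole number of turns, the device is back at its
    starting orientation, the stored initial direction is still valid,
    and the resource-demanding sensor update can be skipped."""
    net = sum(rotation_history_deg) % 360.0
    return min(net, 360.0 - net) < tolerance_deg
```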
The movement sensor may also comprise a Global Positioning System (GPS) device, which can resolve a situation where the mobile phone is stationary within a moving environment, such as a moving vehicle, in which case updates may be necessary.
After an update criterion has been fulfilled and an updated direction 409' has been determined, the shadows 410 can either be updated instantaneously or be allowed to wander to their correct positions. Both embodiments have their advantages when it comes to user perception and aesthetic effects. In one embodiment the light sensor 407 is an Ambient Light Sensor (ALS) and the light data generated is simply an intensity reading.
Figure 6 shows graphs 620, 620' of how the light intensity measured by an ambient light sensor 607 changes according to how a mobile phone 600 is rotated around an axis 630, 630'. Figure 6a shows a device 600 that is pivoted around a pivot axis 630 and a graph 620 of how the intensity level measured by the ambient light sensor 607 changes accordingly. Figure 6b shows a device 600 that is pivoted around a pivot axis 630' and a graph 620' of how the intensity level measured by the ambient light sensor 607 changes accordingly. From this combination of light data and motion data a controller 300 can easily determine the direction of the ambient light without advanced image processing requiring a great deal of computational resources. Also, a simple light sensor designed only to measure one intensity level irrespective of color is much cheaper than a camera. When determining the initial and updated directions of the light source and the resulting shadows using an ALS as above, the initial direction is taken to be a direction substantially at 90 degrees to the display plane, i.e. along the normal to the display. As the device is moved the intensity level is monitored. If the intensity grows, the device is being turned towards the light source and the shadows should be decreased. If the intensity fades, the device is being turned away from the light source and the shadows should grow in the direction the device is being turned.
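The intensity-driven behaviour described above can be sketched as follows; the gain constant and the clamping at zero length are illustrative assumptions, not values from the application.

```python
def adjust_shadow_length(length_px, intensity_prev, intensity_now, gain=0.05):
    """Grow or shrink the displayed shadow from two successive ALS
    readings: rising intensity means the display is turning towards
    the light source (shorter shadow), fading intensity means it is
    turning away (longer shadow, grown in the direction of the turn)."""
    delta = gain * (intensity_prev - intensity_now)
    return max(0.0, length_px + delta)   # shadow length never negative
```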
In one embodiment the light sensor 407 is a camera and the light data generated is an image of the surroundings. From this image it is possible for a controller to determine the direction 409 of the light source 411 using image processing. Although this requires somewhat more computational power, it is only done at certain intervals or times when the initial direction is determined, and not while the updated directions 409' are determined based on the movement of the mobile phone 400.
In one alternative embodiment the direction of the ambient light source 411 is not measured using the light sensor, but is determined based on calculations. These calculations are based on the position of the device, which can be obtained through a GPS device, cellular triangulation or another location determining technique, in combination with calendar and time of day data, to determine the position of the sun in the sky and from this the direction to the sun relative to the device, taking into account the orientation and position of the device. The orientation can be obtained by using a built-in compass.
The data relating to the sun's position can be gathered from a meteorological database, possibly accessed over an internet connection. Figure 7 shows a method of such an embodiment. First it is determined whether a graphical representation of a user interface needs to be drawn, as when going from an inactive or idle mode, where the screen or display is inactive, to an active mode where the display is activated (step 710). If such a graphical representation needs to be drawn, the position of the sun (or of any other object whose movement can be measured or is known) is determined in step 720, possibly by querying a database as has been described above, and the orientation of the device is determined through the use of a direction finding means such as a compass. Thereafter, in step 740, virtual shadows corresponding to the sun's position are defined for any graphical object that is comprised in the graphical representation of the user interface. The graphical representation is then displayed on a display in step 750. Should further graphical objects need to be changed, such as when a view of the user interface has been changed, when a new application is started or when a subfolder has been opened (step 760), new shadows are defined for these new objects by returning to step 740 and displayed in step 750. Should a change in the device's position or orientation be detected, possibly through the use of a motion detector, a GPS device or a compass (step 770), the device's new position is checked by returning to step 720, an updated direction to the sun is determined (step 730), and shadows are redefined (step 740) and displayed (step 750).
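The sun-position determination of step 720 need not involve a database at all. A common textbook simplification (Cooper's declination approximation plus an hour angle from local solar time) is sketched below; it is an assumption for illustration, accurate enough for orienting virtual shadows but not for astronomy.

```python
import math

def sun_elevation_deg(latitude_deg, day_of_year, solar_hour):
    """Approximate solar elevation from device latitude, calendar
    date and local solar time, as an alternative to a database query."""
    # Cooper's approximation of the solar declination.
    decl_deg = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    hour_angle_deg = 15.0 * (solar_hour - 12.0)    # 15 degrees per hour
    lat, decl, h = (math.radians(x) for x in
                    (latitude_deg, decl_deg, hour_angle_deg))
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(h))
    return math.degrees(math.asin(max(-1.0, min(1.0, sin_el))))
```

At the equinox the sun passes directly overhead at the equator at solar noon, and a negative elevation means the sun is below the horizon, in which case no sun shadow would be drawn.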
This provides for the possibility of determining the sun's position without using a camera or other light sensor, as these may be covered by a finger, a skin or a carrying case.
In this embodiment the database query replaces the light sensor for generating the light data, and the motion sensor is the compass. It should be noted that shadows can be rendered and displayed as being shorter or longer based on the position of the sun (or of another planet or stellar body) in the sky. For example, in the evening shadows are longer, and during the daytime the shadows are shorter as the sun is shining almost directly down towards the earth.
It should also be noted that tilting the device is one possible movement that is determined through the use of the accelerometer or movement sensor. Tilting or turning the device may also be taken into account when determining the length of the shadows: if the sun's angle is wide in relation to the device screen, the shadow is rendered longer.
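The relation between the light's elevation above the display plane (possibly corrected for device tilt) and the rendered shadow length follows simple trigonometry. The clamp values in this sketch are illustrative assumptions.

```python
import math

def shadow_length_px(object_height_px, light_elevation_deg):
    """length = height / tan(elevation): a low (evening) sun gives a
    long shadow, a high (midday) sun a short one.  The elevation is
    clamped so on-screen shadows stay finite."""
    elevation = max(1.0, min(89.0, light_elevation_deg))
    return object_height_px / math.tan(math.radians(elevation))
```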
Figure 8 shows a screen view of an alternative device according to the teachings herein. In a first initial state the device is at rest and the display 803 shows an object 812. In one embodiment the object 812 carries information content or data 813 which is made visible to a user.
The data 813 carried may be related to calendar entries, images, upcoming tasks, time of day, content fetched over an internet or Wireless Application Protocol connection, application specific data, news, weather information, slideshows etc. In this embodiment the object 812 carries the data of the time of day 813. As the device is moved or rotated as indicated by arrow 812 this motion is detected by a motion detector (408) and the display 803 is updated to display a second object 812' . In this embodiment the information 813' carried by the second object 812' indicates a sun and a temperature of 90 degrees Fahrenheit.
In one embodiment this second object 812' is the same as the first object 812, but the information 813 carried by the object 812 is changed.
In one embodiment the object 812 to be displayed is associated with one direction or position of the device. For example, when the device is lying on its right side, the time of day and data related to an alarm clock application are shown. When the device is lying on its left side, data relating to a media player application are shown. As an alternative to this embodiment, the objects 812 and/or the carried information 813 are not associated with a direction or position but are simply scrolled through by a user moving the device or making a predetermined movement pattern such as a shake, possibly in a specific direction, a flip, a rotation, a turn, a twist etc. This provides the possibility of having a number of applications, and data relating to these, activated and displayed simply by turning or moving the device. For example, a device having four applications (media player, calendar, text and message editor, and a voice call application) can be configured to allow a user to switch application, or at least to view data or information relating to the various applications, by flipping the phone.
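Both schemes — a fixed orientation-to-data mapping and a gesture-driven carousel — are easy to sketch. All names below are hypothetical; the alarm clock and media player pairing follows the example in the text.

```python
def view_for_orientation(orientation):
    """Fixed mapping from a detected resting orientation to the data
    set shown, as in the right-side / left-side example above."""
    views = {"right_side": "alarm_clock",
             "left_side": "media_player",
             "face_up": "time_of_day",
             "face_down": "weather"}
    return views.get(orientation, "time_of_day")

CAROUSEL = ("media_player", "calendar", "editor", "voice_call")

def next_view(current, gesture):
    """Alternative scheme: a predetermined movement pattern (shake or
    flip) simply steps through the available application views."""
    if gesture not in ("shake", "flip"):
        return current
    return CAROUSEL[(CAROUSEL.index(current) + 1) % len(CAROUSEL)]
```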
In one embodiment the user interface is configured to detect a plurality of light sources and combine their effect so that each light source will give rise to a shadow. An illustrative example is in some football games where a player can be seen having four different shadows because of four different light sources at the stadium. The determination of multiple light sources can be implemented both when determining the light sources through the use of a light sensor and through a database query.
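Rendering one shadow per detected source is a small extension of the single-source case. This sketch reduces each source to a direction angle in the display plane; the fixed shadow length is an assumption.

```python
import math

def shadows_for_sources(source_directions_deg, length_px=6.0):
    """Return one (dx, dy) shadow offset per light source, each shadow
    falling opposite its own source — like the four shadows of a
    player under four stadium floodlights."""
    offsets = []
    for direction in source_directions_deg:
        opposite = math.radians(direction + 180.0)
        offsets.append((round(length_px * math.cos(opposite), 3),
                        round(length_px * math.sin(opposite), 3)))
    return offsets
```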
In one embodiment the shadows are not only rendered and displayed on the display. The user interface could also change the color of other components, such as keys, the cover etc., depending on the user interface's and corresponding device's capabilities.
It should be understood that the shadows are also applicable to two dimensional objects as well as three dimensional objects. In this case the shadows are replaced by gloss on the 2D icons.
The teaching of the present application has numerous advantages. Different embodiments or implementations may yield one or more of the following advantages. It should be noted that this is not an exhaustive list and there may be other advantages which are not described herein. One advantage of the teaching of this application is that the update of the direction requires less computational power, which provides for cheaper devices. Another advantage of the teaching of the present application is that the less demanding computations require less battery power, which leads to devices having longer battery life. Although the teaching of the present application has been described in detail for purpose of illustration, it is understood that such detail is solely for that purpose, and variations can be made therein by those skilled in the art without departing from the scope of the teaching of this application.
For example, although the teaching of the present application has been described in terms of a mobile phone, it should be appreciated that the teachings of the present application may also be applied to other types of electronic devices, such as music players, palmtop computers and the like. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the teachings of the present application. The term "comprising" as used in the claims does not exclude other elements or steps. The term "a" or "an" as used in the claims does not exclude a plurality. A unit or other means may fulfill the functions of several units or means recited in the claims.

CLAIMS:
1. A user interface comprising a motion detector configured to generate motion data, a display configured to display a first graphical object, and a controller configured to:
receive light data,
determine an initial direction of an external light source according to said light data,
receive motion data from said motion detector,
determine a motion of said portable device based on said motion data and to determine an updated direction according to the determined motion and
update said display according to the updated determined direction.
2. A user interface according to claim 1, further comprising a light sensor configured to generate said light data.
3. A user interface according to claim 1, wherein said controller is further configured to determine said initial direction of an external light source according to said motion data.
4. A user interface according to claim 1, wherein said controller is further configured to determine whether an update criterion is fulfilled and, if said update criterion is fulfilled, receive updated light data generated by said light sensor, determine a second initial direction of an external light source from said updated light data, receive further motion data, determine a second updated direction according to said further motion data and update said display according to the second updated direction.
5. A user interface according to claim 4, wherein said update criterion is based on said received motion data.
6. A user interface according to claim 5, wherein said update criterion is related to any of or a combination of features taken from the group comprising: calculated travelled distance, acceleration, shock, rotation.
7. A user interface according to claim 4, wherein said update criterion is temporal.
8. A user interface according to claim 4, wherein said update criterion is temporal and based on motion.
9. A user interface according to claim 8, wherein said update criterion is related to any of or a combination of features taken from the group comprising: rotation during period of time, series of accelerations detected.
10. A user interface according to claim 1, wherein said light sensor is an Ambient Light Sensor configured to generate said light data comprising an intensity level.
11. A user interface according to claim 1, wherein said light sensor is configured to generate said light data solely consisting of an intensity level.
12. A user interface according to claim 11, wherein said light sensor is an Ambient Light Sensor.
13. A user interface according to claim 1, wherein said light sensor is a camera and said controller is configured to determine said initial direction through image processing of said received light data.
14. A user interface according to claim 1, wherein said display is configured to display virtual shadows of said graphical object, said virtual shadows virtually resulting from said external light source.
15. A user interface according to claim 4, wherein said display is configured to display moving shadows according to the determined motion and updated initial light direction.
16. A user interface according to claim 4, wherein said display is configured to display a second graphical object according to said updated light direction.
17. A user interface according to claim 16, wherein said first graphical object is different from said second graphical object.
18. A user interface according to claim 16, wherein said first graphical object carries a first data set and said second graphical object carries a second data set, wherein said first data set is different from said second data set.
19. A user interface according to claim 18, wherein said first data set and said second data set are chosen from a group comprising: date, time, calendar view, graphical object, internet related information, news, weather and application specific data.
20. A user interface according to claim 1, wherein said graphical object is a two dimensional rendering of a three dimensional image.
21. A portable device comprising a user interface according to claim 1.
22. A device according to claim 21, wherein said device is a mobile communications terminal or a laptop computer or a drawing pad or a personal digital assistant.
23. A user interface comprising a controller, a light sensor and a display configured to display a graphical object, wherein said light sensor is configured to generate a non- directional intensity reading and said controller is configured to determine a direction from this non-directional intensity reading.
24. A user interface according to claim 23, wherein said controller is configured to receive motion data from a motion sensor and to determine said direction based on said motion data and said non-directional intensity reading.
25. A user interface according to claim 23, wherein said light sensor is an Ambient Light Sensor.
26. A portable device comprising a user interface according to claim 23.
27. A device according to claim 26, wherein said device is a mobile communication terminal or a laptop computer or a drawing pad or a personal digital assistant.
28. A user interface comprising a display configured to display a first data set and a controller configured to: determine a direction of said device and display a second data set according to the determined direction, wherein said first data set is different from said second data set.
29. A user interface comprising a light sensor, a motion detector, a display configured to display a first graphical object, and a controller operatively coupled to said light sensor and said motion detector, wherein said controller is operatively coupled to said display and configured to
determine an initial direction of an external light source from light data generated by said light sensor,
receive motion data from said motion detector,
determine a motion of said portable device based on said motion data and to determine an updated direction according to the determined motion and
update said display according to the updated determined direction.
30. A user interface according to claim 29, wherein said controller is further configured to determine whether an update criterion is fulfilled and, if said update criterion is fulfilled, receive updated light data generated by said light sensor, determine a second initial direction of an external light source from said updated light data, receive further motion data and determine a second updated direction according to said further motion data.
31. A portable device comprising a user interface according to claim 29.
32. A device according to claim 31, wherein said device is a mobile communication terminal, a laptop computer, a drawing pad or a personal digital assistant.
33. A user interface comprising light sensor means for generating light data, motion detector means for generating motion data, display means for displaying a first graphical object, and controller means for:
receiving light data generated by said light sensor means,
determining an initial direction of an external light source according to said light data,
receiving motion data from said motion detector means,
determining an updated direction according to the motion data and the initial direction, and
updating said display according to the updated direction.
34. A user interface according to claim 33, further comprising controller means for determining whether an update criterion is fulfilled and, if said update criterion is fulfilled, receiving updated light data from said light sensor means, determining a second initial direction of an external light source according to said updated light data, receiving further motion data, determining a second updated direction according to said further motion data and the second initial direction, and updating said display according to the second updated direction.
35. A portable device comprising a user interface according to claim 33.
36. A device according to claim 35, wherein said device is a mobile communication terminal or a laptop computer or a drawing pad or a personal digital assistant.
37. A method for displaying graphical objects in a device taking into account an ambient light source comprising
receiving light data generated by a light sensor,
determining an initial direction of an external light source according to said light data,
receiving motion data from a motion detector,
determining a motion of said portable device based on said motion data and determining an updated direction according to the determined motion and
updating said display according to the updated determined direction .
38. A method according to claim 37, further comprising determining said initial direction of an external light source according to said motion data.
39. A method according to claim 37, further comprising determining whether an update criterion is fulfilled and, if said update criterion is fulfilled, receiving updated light data generated by said light sensor, determining a second initial direction of an external light source from said updated light data, receiving further motion data, determining a second updated direction according to said further motion data and updating said display according to the second updated direction.
40. A method according to claim 38, wherein said update criterion is based on said received motion data.
41. A method according to claim 38, wherein said update criterion is related to any of or a combination of features taken from the group comprising: calculated travelled distance, acceleration, shock, rotation.
42. A method according to claim 38, wherein said update criterion is temporal.
43. A method according to claim 38, wherein said update criterion is temporal and based on motion.
44. A method according to claim 38, wherein said update criterion is related to any of or a combination of features taken from the group comprising: rotation during period of time, series of accelerations detected.
45. A method according to claim 37, further comprising displaying virtual shadows of said graphical object, said virtual shadows virtually resulting from said external light source.
46. A method according to claim 38, further comprising displaying moving virtual shadows according to the determined motion and updated initial light direction.
47. A method according to claim 37, further comprising displaying a second graphical object according to said updated light direction.
48. A method according to claim 47, wherein said first graphical object is different from said second graphical object.
49. A method according to claim 47, wherein said first graphical object carries a first data set and said second graphical object carries a second data set, wherein said first data set is different from said second data set.
50. A method according to claim 49, wherein said first data set and said second data set are chosen from a group comprising: date, time, calendar view, graphical object, internet related information, news, weather and application specific data.
51. A method according to claim 47, wherein said first graphical object is a two dimensional rendering of a three dimensional image.
52. A computer readable medium including at least computer program code for controlling a user interface comprising a light sensor configured to generate light data, a motion detector configured to generate motion data, a display configured to display a first graphical object, said computer readable medium comprising:
software code for receiving light data generated by said light sensor,
software code for receiving motion data from said motion detector,
software code for determining a motion of the device based on said motion data,
software code for determining a direction of an external light source according to the light data generated by said light sensor and according to the determined motion, and
software code for updating said display according to the determined direction.
53. A computer readable medium as in claim 52, further comprising software code for determining whether an update criterion is fulfilled and, if said update criterion is fulfilled, receiving updated light data generated by said light sensor, receiving further motion data, determining a second direction of an external light source according to said updated light data and said further motion data, and updating said display according to the second direction.
54. A device incorporating and implementing or adapted to incorporate and to implement a computer readable medium according to claim 52.
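The method recited in claims 37–39 and 45–46 can be sketched as a minimal simulation: estimate the light direction once from the light sensor, then rotate that direction in the device frame as motion data arrives, and project a virtual shadow onto the display plane. All names here (`LightDirectionTracker`, `rotate_z`, `shadow_offset`) are illustrative assumptions, not taken from the patent; the sketch assumes rotation about the screen normal only and represents the light direction as a 3-vector in device coordinates.

```python
import math

def rotate_z(v, angle):
    """Rotate a 3-vector about the device's z-axis (the screen normal)."""
    c, s = math.cos(angle), math.sin(angle)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

class LightDirectionTracker:
    """Tracks an external light source's direction in the device frame
    (hypothetical sketch of the claimed method, not the patent's code)."""

    def __init__(self, initial_direction):
        # Initial direction, as would be estimated from light-sensor data
        self.direction = initial_direction

    def on_motion(self, yaw_delta):
        # If the device rotates by +yaw_delta, the fixed external light
        # appears to rotate by -yaw_delta in the device's own frame.
        self.direction = rotate_z(self.direction, -yaw_delta)

    def shadow_offset(self, height=1.0):
        # Virtual shadow of an object floating `height` above the display
        # plane: the shadow falls on the side opposite the light.
        x, y, z = self.direction
        if z <= 0:  # light at or below the display plane: no shadow cast
            return (0.0, 0.0)
        return (-x / z * height, -y / z * height)
```

A renderer would call `on_motion` with each gyroscope sample and redraw the graphical object's shadow at `shadow_offset`, re-running the light-sensor estimate whenever an update criterion (claims 40–44) such as elapsed time or accumulated rotation is met.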
PCT/FI2009/050394 2008-05-22 2009-05-13 Device and method for displaying and updating graphical objects according to movement of a device WO2009141497A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US5527808P 2008-05-22 2008-05-22
US61/055,278 2008-05-22

Publications (1)

Publication Number Publication Date
WO2009141497A1 true WO2009141497A1 (en) 2009-11-26

Family

ID=41339812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2009/050394 WO2009141497A1 (en) 2008-05-22 2009-05-13 Device and method for displaying and updating graphical objects according to movement of a device

Country Status (1)

Country Link
WO (1) WO2009141497A1 (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001027735A1 (en) * 1999-10-12 2001-04-19 Myorigo Oy Operation method of user interface of hand-held device
US20020123841A1 (en) * 2000-09-19 2002-09-05 Hiroyuki Satoh Map display apparatus and display method therefor
US6452593B1 (en) * 1999-02-19 2002-09-17 International Business Machines Corporation Method and system for rendering a virtual three-dimensional graphical display
US20040070565A1 (en) * 2001-12-05 2004-04-15 Nayar Shree K Method and apparatus for displaying images
US20050219211A1 (en) * 2004-03-31 2005-10-06 Kotzin Michael D Method and apparatus for content management and control
WO2006040200A1 (en) * 2004-10-13 2006-04-20 Siemens Aktiengesellschaft Device and method for light and shade simulation in an augmented-reality system
US20060132675A1 (en) * 2003-07-01 2006-06-22 Domotion Ltd., Republic Of Korea Hand-held device having three-dimensional viewing function with tilt sensor and display system using the same
US20070236485A1 (en) * 2006-03-31 2007-10-11 Microsoft Corporation Object Illumination in a Virtual Environment
US20080108340A1 (en) * 2006-11-06 2008-05-08 Christopher Kent Karstens Environmental function changing
WO2008053533A1 (en) * 2006-10-31 2008-05-08 Pioneer Corporation Map display device, map display method, map display program, and recording medium


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012074756A1 (en) * 2010-11-29 2012-06-07 Google Inc. Mobile device image feedback
EP2732436B1 (en) * 2011-07-12 2019-09-18 Amazon Technologies, Inc. Simulating three-dimensional features
US9075451B2 (en) 2012-02-24 2015-07-07 Blackberry Limited Handheld device with notification message viewing
WO2013123599A1 (en) * 2012-02-24 2013-08-29 Research In Motion Limited Handheld device with notification message viewing
US10375220B2 (en) 2012-02-24 2019-08-06 Blackberry Limited Handheld device with notification message viewing
US9866667B2 (en) 2012-02-24 2018-01-09 Blackberry Limited Handheld device with notification message viewing
EP2631743A3 (en) * 2012-02-24 2016-01-27 BlackBerry Limited Handheld device with notification message viewing
US8994653B2 (en) 2012-02-24 2015-03-31 Blackberry Limited Handheld device with notification message viewing
CN103885737A (en) * 2012-12-06 2014-06-25 沃尔沃汽车公司 Method And User Interface System For Adapting A Graphic Visualization Of A Virtual Element
EP2741260A1 (en) * 2012-12-06 2014-06-11 Volvo Car Corporation Method and user interface system for adapting a graphic visualization of a virtual element
EP2783735A3 (en) * 2013-01-31 2016-06-15 Samsung Electronics Co., Ltd Apparatus and method for smart lighting of depicted objects on a display terminal
CN103968816A (en) * 2013-01-31 2014-08-06 三星电子株式会社 Apparatus and method for compass intelligent lighting for user interfaces
KR101474552B1 (en) * 2013-04-24 2014-12-22 주식회사 실리콘아츠 Method of computer running three dimensions contents display, apparatus performing the same and storage media storing the same
WO2014175600A1 (en) * 2013-04-24 2014-10-30 주식회사 실리콘아츠 Computer-executable three-dimensional content display method, three-dimensional content display apparatus for performing same, and recording medium for saving same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 09749981
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 09749981
Country of ref document: EP
Kind code of ref document: A1