WO2012149627A1 - Methods for adjusting a presentation of graphical data displayed on a graphical user interface
- Publication number
- WO2012149627A1 (PCT/CA2011/050270)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- graphical data
- electronic device
- gui
- movement
- existing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/109—Time management, e.g. calendars, reminders, meetings or time accounting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
- G09B29/006—Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes
- G09B29/007—Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes using computer methods
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04804—Transparency, e.g. transparent or translucent windows
Definitions
- the present disclosure relates to adjusting graphical data displayed on graphical user interfaces of portable electronic devices and, more specifically, to overlaying new graphical data onto existing graphical data in response to a predetermined movement of such devices.
- Portable electronic devices include, for example, mobile stations, cellular telephones, smart telephones, wireless personal digital assistants, and laptop computers with wireless capabilities.
- Such devices include displays and operating systems providing graphical user interfaces (GUIs) that impart, among other things, graphical data about GUIs.
- the displayed graphical data may be modified depending on the functions and operations being performed.
- a user may highlight, add, or remove graphical data displayed on the GUIs by, for example, inputting commands and functions via a keypad or the like.
- Fig. 1 is a block diagram of a portable electronic device, consistent with disclosed embodiments;
- Fig. 2 is a top plan view of a portable electronic device, consistent with disclosed embodiments;
- Fig. 3 is a flow diagram of an example process for adjusting a presentation of graphical data displayed on a display of a portable electronic device, consistent with disclosed embodiments;
- Figs. 4 to 5 illustrate an example of adjustment of graphical data displayed on a display of a portable electronic device, consistent with disclosed embodiments;
- Figs. 6 to 7 illustrate another example of adjustment of graphical data displayed on a display of a portable electronic device, consistent with disclosed embodiments;
- Figs. 8 to 10 illustrate another example of adjustment of graphical data displayed on a display of a portable electronic device, consistent with disclosed embodiments;
- Figs. 11 to 13 illustrate another example of adjustment of graphical data displayed on a display of a portable electronic device, consistent with disclosed embodiments;
- Figs. 14 to 16 illustrate another example of adjustment of graphical data displayed on a display of a portable electronic device, consistent with disclosed embodiments.
- the disclosure generally relates to a portable electronic device.
- portable electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, tablets, and wirelessly enabled notebook computers.
- the portable electronic device may also be a portable electronic device without wireless communication capabilities, such as a handheld electronic game device, digital photograph album, digital camera, or other portable device.
- Fig. 1 illustrates a block diagram of an example of a portable electronic device 1.
- Portable electronic device 1 includes multiple components, such as a processor 2 configured to control the overall operation of the portable electronic device 1.
- processor 2 comprises a microprocessor. Communication functions, including data and voice communications, are performed through a communication subsystem 3.
- device 1 is configured to receive compressed data and to decompress and decrypt the data by using a decoder 4.
- Wireless network 5 may be any type of wireless network, including, but not limited to, data wireless networks, voice wireless networks, and networks that support both voice and data communications.
- Power source 6, such as one or more rechargeable batteries or a port to an external power supply, powers portable electronic device 1.
- processor 2 interacts with other components of portable electronic device 1, such as a random access memory (RAM) 7, a display 8, a speaker 9, a keypad 10, auxiliary I/O devices 11, a data port 12, a microphone 13, a flash memory 14, and a clock 15.
- portable electronic device 1 uses a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 16 for communication with a network, such as wireless network 5.
- user identification information may be programmed into flash memory 14.
- Portable electronic device 1 includes an operating system 17 and software programs or applications 18 that are executed by processor 2 and may be stored in a persistent, updatable store such as flash memory 14. Additional applications or programs may be loaded onto portable electronic device 1 through wireless network 5, auxiliary I/O subsystem 11, data port 12, short-range communications subsystem 19, or any other suitable subsystem 20.
- a received signal such as a text message, an e-mail message, or web page download is processed by communication subsystem 3 and input to processor 2.
- Processor 2 processes the received signal for output to display 8 and/or to auxiliary I/O subsystem 11.
- a user may generate data items, for example e-mail or text messages, which may be transmitted over wireless network 5 through communication subsystem 3.
- Speaker 9 outputs audible information converted from electrical signals, and microphone 13 converts audible information into electrical signals for processing.
- Electronic device 1 also includes a location module 21.
- Location module 21 may include, for example, a GPS module, and may be configured to receive and interpret GPS signals from a system of satellites to triangulate the current location of device 1 .
- Electronic device 1 also includes a graphics system 22 configured to manage graphical data displayed on display 8.
- graphics system 22 includes a movement detection module 23 and a GUI adjustment module 24.
- Movement detection module 23 is configured to detect movement of electronic device 1 and input movement data to processor 2.
- movement detection module 23 includes a motion sensor 25 configured to generate a signal responsive to a change in orientation of electronic device 1 .
- motion sensor 25 includes a low-g micro-electromechanical system (MEMS) accelerometer.
- the accelerometer may be of any suitable type, including, for example, a capacitive accelerometer.
- the accelerometer senses and converts an acceleration detected from a motion or a movement of electronic device 1 (e.g., tilt, rotation, inertial, or vibration) or gravity into an electrical signal and may be available in one, two, or three axis configurations.
- the accelerometer produces digital or analog output signals.
- motion sensor 25 includes a tilt, motion or orientation sensor, such as a gyroscope.
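The threshold-style detection implied by the description above might be sketched as follows; the axis conventions, threshold value, movement labels, and function name are illustrative assumptions, not taken from the disclosure:

```python
# Illustrative sketch: classify a 3-axis accelerometer sample (in g)
# into a coarse movement label such as leftward/rightward rotation or
# forward/backward tilt. The axes and threshold are assumptions.

def classify_movement(ax, ay, az, tilt_threshold=0.5):
    """Return a coarse movement label for one accelerometer reading."""
    if ax <= -tilt_threshold:
        return "rotate_left"
    if ax >= tilt_threshold:
        return "rotate_right"
    if ay >= tilt_threshold:
        return "tilt_forward"
    if ay <= -tilt_threshold:
        return "tilt_backward"
    return "baseline"
```

In practice the movement detection module would filter and debounce raw sensor samples before classifying them; this sketch only shows the thresholding step.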
- GUI adjustment module 24 is configured to receive movement data determined by movement detection module 23 and associate new graphical data with existing graphical data displayed on display 8 based on the movement data. For instance, GUI adjustment module 24 is configured to overlay new graphical data onto existing graphical data.
- Graphical data includes, for example, text, icons, graphics, dialog boxes, and any other visual information for a user. Graphical data, and the position and orientation of the graphical data, are stored in flash memory 14 and accessed and modified by GUI adjustment module 24.
- A top plan view of portable electronic device 1 is shown generally in Fig. 2.
- the example portable electronic device 1 illustrated in Fig. 2 includes a housing 200 in which may be disposed various components such as those shown in Fig. 1.
- various input apparatuses and output apparatuses, processor 2, and flash memory 14 for storing at least programs and/or applications 18 are disposed in housing 200.
- Processor 2 is responsive to input signals from input apparatus, such as keypad 10, and provides output signals to output apparatus, such as display 8 or speaker 9.
- Processor 2 also interfaces with flash memory 14 and is capable of executing programs 18.
- a screen image 26 is generated on display 8 and comprises graphical data related to one or more applications and/or programs stored in flash memory 14. For example, applications and/or programs may generate and control graphical data on screen image 26.
- screen image 26 includes graphical data related to a map application 27.
- Map application 27 provides a graphical interface to allow a user to determine the user's location and to navigate around a geographic region near the user's location.
- application 27 includes, for example, images depicting streets, points of interest, and text identifying street names and destinations. Accordingly, the GUI is comprised of the graphical data and user-operated functions related to a program or an application presented on display 8.
- the output apparatus includes display 8 and speaker 9, each of which is responsive to one or more output signals from processor 2.
- the input apparatus includes keypad 10.
- Keypad 10 includes input members 225, such as mechanical keys using, for example, a mechanical dome switch actuator.
- input members 225 on keypad 10 may be part of display 8, with display 8 having a touch-sensitive configuration as is known in the art.
- input members 225 form a QWERTY keyboard, either in reduced or full format. In a reduced keyboard, a number of input members 225 are assigned to two or more characters. In other example embodiments, input members 225 are assigned characters alphabetically.
- handheld electronic device 1 includes other input apparatuses, such as a scroll wheel, an optical trackpad, or a ball located either on the face or side of device 1 .
- These input apparatuses provide additional inputs to processor 2.
- a scroll wheel provides one input to processor 2 when rotated and a second input to processor 2 when actuated.
- An optical trackpad provides one input to processor 2 when swiped and a second input to processor 2 when pressed or tapped.
- Fig. 3 illustrates a flow diagram of an example process for adjusting a presentation of graphical data displayed on display 8 of electronic device 1 .
- the process is carried out by software stored as part of programs 18, and executed by processor 2.
- a user opens an application or program, such as map application 27 discussed above, and processor 2 then generates default graphical data related to the user selected application (e.g., map application 27) on display 8 (step 301).
- the default graphical data includes an initial set of images, graphics, and/or text associated with the user selected application.
- the default graphical data is generated and positioned on display 8 under baseline conditions of electronic device 1 , such as when electronic device 1 is motionless or lying flat relative to the user.
- graphics system 22 monitors movements of electronic device 1 . That is, movement detection module 23 detects movements (e.g., shifts, tilts, rotations, vibrations, and the like) of electronic device 1 and delivers movement data corresponding to a detected movement to processor 2.
- Processor 2 stores the movement data in flash memory 14, and GUI adjustment module 24 receives the movement data from flash memory 14. If a movement has been detected, the movement data is obtained by GUI adjustment module 24 and analyzed to determine whether the detected movement is within the scope of a predetermined movement (step 303).
- the predetermined movement is a threshold motion or movement, which initiates GUI adjustment module 24 to adjust the default graphical data presented on display 8.
- the predetermined movement has particular motion characteristics, such as a direction and a preset acceleration. If the detected movement matches these characteristics, GUI adjustment module 24 proceeds to adjust the default graphical data.
- rotation of electronic device 1 to the left or to the right relative to the user at a preset acceleration constitutes a predetermined movement appropriate for initializing graphical adjustment by GUI adjustment module 24.
- tilting electronic device 1 forwards or backwards at a preset acceleration may be the predetermined movement.
- the predetermined movement may be any other motion or movement, such as shifting or shaking electronic device 1 up or down and/or left or right.
- graphics system 22 continues to monitor movements of electronic device 1 (step 302) if the detected movement is not within the scope of the predetermined movement. If the movement of electronic device 1 is within the scope of the predetermined movement, GUI adjustment module 24 then associates new graphical data with the default graphical data presented on display 8 (step 304).
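The monitor/check/adjust flow of steps 302 to 304 could be sketched roughly as follows; the class, method names, and movement labels are hypothetical, not from the disclosure:

```python
# Minimal sketch of steps 302-304: keep monitoring until a movement
# within the scope of the predetermined movement is detected, then
# associate (overlay) new graphical data. Names are illustrative.

class GuiAdjuster:
    def __init__(self, predetermined="rotate_left"):
        self.predetermined = predetermined
        self.overlay_active = False

    def on_movement(self, movement):
        # Step 303: is this movement within scope of the predetermined one?
        if movement == self.predetermined:
            self.overlay_active = True  # step 304: overlay new data
        return self.overlay_active      # False -> keep monitoring (302)

adjuster = GuiAdjuster()
adjuster.on_movement("shake")        # outside scope: nothing changes
adjuster.on_movement("rotate_left")  # within scope: overlay activated
```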
- GUI adjustment module 24 overlays new graphical data on top of the default graphical data already existing on display 8. For instance, in response to the movement of electronic device 1 , GUI adjustment module 24 adds new images, text, dialog boxes, and/or graphics to the existing default images and graphics on display 8.
- the new graphical data presents additional information to the user pertaining to the user selected application.
- display 8 does not display new graphical data prior to the movement of electronic device 1 .
- the new images, text, and/or graphics are positioned directly on top of the existing images and graphics to highlight information previously presented by the default graphical data.
- the new images, text, and/or graphics are positioned adjacent, proximate, or separate from the existing graphical data on display 8.
- the new graphical data covers or masks at least some of the existing graphical data presented on display 8. Additionally, or alternatively, the new graphical data is positioned directly on top of the existing graphical data and displayed in a transparent mode to expose the existing graphical data. Such an arrangement allows the user to distinguish between the new and existing graphical data yet still identify both sets of graphical data.
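The "transparent mode" described above amounts to alpha blending of the new data over the existing data. A minimal per-pixel sketch, with an assumed 0-1 alpha scale and illustrative pixel values:

```python
# Sketch of displaying new graphical data "in a transparent mode" over
# existing data via per-pixel alpha blending. Values are illustrative.

def blend_pixel(existing, new, alpha=0.5):
    """Blend one RGB pixel of new (overlay) data onto existing data.
    alpha=0 keeps the existing pixel; alpha=1 fully masks it."""
    return tuple(round((1 - alpha) * e + alpha * n)
                 for e, n in zip(existing, new))

# A half-transparent white label over a dark map pixel still exposes
# the underlying graphical data to the user.
blended = blend_pixel((40, 40, 40), (255, 255, 255), alpha=0.5)
```

A real graphics system would blend whole surfaces in the compositor rather than pixel by pixel, but the arithmetic is the same.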
- GUI adjustment module 24 also removes at least some of the default graphical data existing on display 8. For example, in response to an appropriate movement (e.g., movement within the scope of the predetermined movement) of electronic device 1 , GUI adjustment module 24 hides or eliminates existing images, graphics, and/or text from display 8. By removing certain graphical data, display 8 presents only a predetermined selection of graphical data, which may provide graphical clarity and organization to the user. In other words, GUI adjustment module 24 highlights graphical data to the user by displaying only certain images, text, and/or graphics. It should also be appreciated that GUI adjustment module 24 provides a smooth transition when overlaying new graphical data or removing existing graphical data.
- When overlaying new graphical data or removing certain existing graphical data, GUI adjustment module 24 maintains a position and an orientation of the existing graphical data remaining on display 8. In other words, certain existing images, graphics, and/or text retain a spatial arrangement on display 8. For example, the existing images, graphics, and/or text are notionally positioned relative to each other using x, y, and z Cartesian coordinates on display 8. GUI adjustment module 24 fixes the existing images, graphics, and/or text at their corresponding x, y, and z coordinates when new images, graphics, and/or text are added or when certain existing graphical data is removed. That is, the position of the existing graphical data does not move when new graphical data is added to display 8 or when some graphical data is eliminated from display 8.
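Keeping existing elements fixed at their x, y, z coordinates while an overlay is added or elements are removed might look like the following sketch; the dictionary-based data model and element names are assumptions for illustration:

```python
# Sketch: add new items and hide removed ones without moving anything
# that remains on the display. The data model is illustrative.

screen = {
    "street_28":  {"pos": (10, 40, 0), "visible": True},
    "terrain_29": {"pos": (60, 80, 0), "visible": True},
    "marker_30":  {"pos": (30, 20, 1), "visible": True},
}

def overlay_and_prune(screen, new_items, remove_keys):
    """Overlay new_items and hide remove_keys; positions stay fixed."""
    for key in remove_keys:
        screen[key]["visible"] = False   # removed, position untouched
    screen.update(new_items)             # overlay at its own coordinates
    return screen

# Overlay street text at the street's coordinates, remove the marker.
overlay_and_prune(screen,
                  {"text_31": {"pos": (10, 40, 2), "visible": True}},
                  ["marker_30"])
```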
- GUI adjustment module 24 regenerates the default graphical data on display 8 (step 301) if the movement data obtained by module 24 is indicative of baseline conditions (e.g., the initial position and configuration of device 1 prior to the predetermined movement).
- GUI adjustment module 24 regenerates the default graphical data if electronic device 1 is not moved for a set period of time, as indicated by the arrow labeled "timeout". For example, when electronic device 1 moves to a new position from a baseline position, new graphical data is added to the default graphical data on display 8. If electronic device 1 remains at rest (or is not moved) at the new position for a set period of time, the new graphical data is gradually removed from display 8, and display 8 presents the default graphical data as initially displayed under baseline conditions.
- GUI adjustment module 24 obtains the time data and regenerates the default graphical data based on the aforementioned "timeout" conditions.
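The "timeout" behaviour might be sketched with explicit timestamps; the timeout value, timing model, and function name are assumptions for illustration:

```python
# Sketch of the "timeout" revert: once the device has rested in the
# new position long enough, the default graphical data is regenerated.
# The 5-second threshold is illustrative.

TIMEOUT_S = 5.0

def should_revert(last_movement_time, now, timeout=TIMEOUT_S):
    """Return True once the device has been at rest for `timeout` seconds."""
    return (now - last_movement_time) >= timeout

# Device last moved at t=100.0: at t=103.0 the overlay stays;
# at t=106.0 the default graphical data would be regenerated.
```

A real implementation would likely drive the gradual removal with an animation timer rather than a single boolean check.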
- Figs. 4-5 illustrate an example of a process for adjusting a presentation of graphical data displayed on display 8 of electronic device 1 related to map application 27.
- Map application 27 presents graphical data related to the geographic location of device 1 provided by location module 21.
- map application 27 provides images depicting streets 28 and geographic terrain 29 bordered by streets 28.
- Map application 27 also includes destination markers 30, which graphically indicate the geographical locations of a desired destination, such as restaurants, gas stations, and the like.
- Fig. 4 depicts electronic device 1 under baseline conditions. That is, electronic device 1 remains motionless and the plane of display 8 is positioned flat relative to the user.
- the graphical data of map application 27 (e.g., streets 28, geographic terrain 29, and destination markers 30), as shown in Fig. 4, represents the default graphical data under baseline conditions of electronic device 1 .
- Fig. 5 illustrates the electronic device 1 of Fig. 4 rotated to a leftward position relative to the user.
- movement detection module 23 detects the movement
- GUI adjustment module 24 determines that the movement is within the scope of a predetermined movement.
- GUI adjustment module 24 then associates new graphical data to the existing, default graphical data presented on display 8.
- GUI adjustment module 24 overlays new text 31 on top of the existing street images 28.
- the text 31 provides street names to the street images 28, which may have previously been presented on display 8 under the baseline conditions of electronic device 1 .
- the text 31 is positioned directly over the existing street images 28 and provides additional graphical data pertaining to map application 27 simply through a user-controlled motion of device 1 .
- presenting the text 31 in a separate graphical arrangement prevents a cluttered GUI. Movement of electronic device 1 also initiates GUI adjustment module 24 to remove destination markers 30. Destination markers 30 are removed to clarify and highlight the text 31 associated with street images 28. Smaller destination pinpoints 32 replace destination markers 30 to improve GUI clarity.
- the default graphical arrangement presented in Fig. 4 is regenerated by moving electronic device 1 back to the position defined by the baseline conditions, or by waiting a set amount of time in the rotated position such that the above-described "timeout" function gradually reverts display 8 back to the default graphical arrangement. As illustrated in Figs. 4-5, the relative positions of the existing graphical data are maintained on display 8 when new graphical data is overlaid.
- the positions of street images 28 and geographical terrain 29 are fixed as street text 31 is overlaid onto the existing street images 28.
- the x, y, and z coordinates of the existing graphical data do not change when new graphical data (e.g., street text 31) is added to display 8.
- FIG. 4-5 illustrate electronic device 1 being rotated in a leftward direction to initiate graphical adjustment by GUI adjustment module 24, it should be appreciated that any other motion initiates GUI adjustment module 24, such as rightward rotation, shaking, and/or upward and downward tilting.
- Figs. 6-7 illustrate another example of a process for adjusting a presentation of graphical data displayed on display 8 of electronic device 1 related to a video application 33.
- video application 33 presents video data to the user in any known form, such as AVI, MPEG, MP4, and the like.
- a movie stored in RAM 7 or flash memory 14, or downloaded from wireless network 5, can be played on display 8.
- Fig. 6 depicts electronic device 1 under baseline conditions. That is, electronic device 1 remains motionless and the plane of display 8 is positioned flat relative to the user.
- the graphical data of video application 33 (e.g., the scenes of the movie) represents the default graphical data.
- Fig. 7 illustrates the electronic device 1 of Fig. 6 tilted towards the user to a downward position.
- movement detection module 23 detects the movement
- GUI adjustment module 24 determines that the movement is within the scope of a predetermined movement.
- GUI adjustment module 24 then adds a playbar 34 directly on top of the scenes presented by video application 33.
- playbar 34 is overlaid on top of the previously existing scenes.
- Playbar 34 provides new graphical data to the user by indicating the progress and time elapsed of the movie being played.
- the playbar 34 allows for control of video playback, such as pausing, fast forwarding, reverse play, etc.
- the default graphical arrangement presented in Fig. 6 (i.e., the video without playbar 34) is regenerated by moving electronic device 1 back to the baseline position or by waiting for the "timeout" function to initiate.
- It should be appreciated that any other motion may initiate GUI adjustment module 24 to adjust the presentation of graphical data related to video application 33 displayed on display 8.
- Figs. 8-10 illustrate another example of a process for adjusting a presentation of graphical data displayed on display 8 of electronic device 1 related to a calendar application 35.
- Calendar application 35 graphically tracks appointments and other status matters relating to a user and device 1 .
- Calendar application 35 provides a daily, weekly, and/or monthly electronic schedule of appointments, meetings, and events.
- calendar application 35 organizes and displays graphical data relating to particular events, for example, social events 36 (solid bars) and work-related events 37 (cross-hatched bars), on a schedule 38.
- Fig. 8 depicts electronic device 1 under baseline conditions. That is, electronic device 1 remains motionless and the plane of display 8 is positioned flat relative to the user. Accordingly, the graphical data of calendar application 35 (e.g., schedule 38, social events 36, and work-related events 37) represents the default graphical data under the baseline conditions of electronic device 1 .
- Fig. 9 illustrates the electronic device 1 of Fig. 8 rotated to a leftward position relative to the user.
- movement detection module 23 detects the movement
- GUI adjustment module 24 determines that the movement is within the scope of a predetermined movement.
- GUI adjustment module 24 then removes at least some of the existing graphical data presented on display 8. In one embodiment, for example, GUI adjustment module 24 removes the social events 36 images, thereby leaving only the work-related events 37 images on schedule 38.
- Figs. 11-13 illustrate another example of a process for adjusting a presentation of graphical data displayed on display 8 of electronic device 1 related to another map application 39.
- Fig. 11 depicts electronic device 1 under baseline conditions.
- Map application 39 displays a highlighted navigation route 40 (solid lines) and highlighted traffic data 41 (i.e., the degree of congestion on a certain street, depicted as cross-hatched lines) associated with street images 42.
- In response to a predetermined movement, GUI adjustment module 24 removes highlighted traffic data 41 from display 8 and leaves street images 42 and highlighted navigation route 40, as shown in Fig. 12.
- Alternatively, GUI adjustment module 24 removes highlighted navigation route 40 from display 8 and leaves street images 42 and highlighted traffic data 41, as shown in Fig. 13.
- GUI adjustment module 24 maintains the relative positions of the remaining graphical data when certain graphical data is removed from display 8. Moreover, the default graphical arrangement presented in Fig. 11 is regenerated by moving device 1 back to the baseline position or by waiting for the "timeout" function to initiate. It should also be appreciated that any other motion may initiate GUI adjustment module 24 to adjust the presentation of graphical data related to map application 39 displayed on display 8.
- Figs. 14-16 illustrate another example of adjusting a presentation of graphical data displayed on display 8 of electronic device 1 related to a calendar application 350.
- Calendar application 350 graphically tracks appointments and other status matters relating to a user and device 1 .
- Calendar application 350 provides a daily, weekly, and/or monthly electronic schedule of appointments, meetings, and events.
- Calendar application 350 organizes and displays graphical data relating to, for example, the number of events or the duration of one or more events scheduled on a particular day.
- An event bar 370, by its relative length, indicates to a user the number of events or the duration of one or more events scheduled on a particular day relative to the other days.
- The graphical data of calendar application 350 (e.g., the windows representing the days of a month and event bars 370) represents the default graphical data under the baseline conditions of electronic device 1.
- Fig. 15 illustrates the electronic device 1 of Fig. 14 rotated to a leftward position relative to the user.
- Movement detection module 23 detects the movement.
- GUI adjustment module 24 determines that the movement is within the scope of a predetermined movement.
- GUI adjustment module 24 then overlays a three-dimensional block 380 over the windows of days that have scheduled events.
- Each three-dimensional block 380 corresponds to a particular event bar 370.
- A height of each three-dimensional block 380 indicates to the user the number of events or the duration of one or more events scheduled on a particular day relative to the other days.
- The three-dimensional blocks 380 therefore provide an additional view of the events scheduled in calendar application 350.
- Rotating electronic device 1 to a rightward position relative to the user overlays three-dimensional blocks 380 over the windows of days and provides a perspective view of the left side of each block 380.
- The default graphical arrangement presented in Fig. 14 is regenerated by moving device 1 back to the baseline position or by waiting for the "timeout" function to initiate. It should be appreciated that any other motion may initiate GUI adjustment module 24 to adjust the presentation of graphical data related to calendar application 350 displayed on display 8.
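The relationship between a day's scheduled events and the height of its three-dimensional block 380 could be computed as sketched below. The pixel scale and the duration-based weighting are assumptions; the patent only requires that relative block heights track the relative lengths of the event bars 370.

```python
# Hypothetical sketch: derive the height of a three-dimensional block
# 380 from the total scheduled duration of a day's events, so that
# relative heights mirror the relative lengths of event bars 370.
from datetime import datetime, timedelta

def block_height(events, max_height_px=48):
    """Scale a day's total scheduled duration to a block height in pixels.

    `events` is a list of (start, end) datetime pairs; a fully booked
    24-hour day maps to `max_height_px` (an assumed display constant).
    """
    total = sum((end - start for start, end in events), timedelta())
    day = timedelta(hours=24)
    return round(max_height_px * total / day)

meeting = (datetime(2011, 5, 4, 9), datetime(2011, 5, 4, 12))   # 3 h
lunch = (datetime(2011, 5, 4, 12), datetime(2011, 5, 4, 13))    # 1 h
print(block_height([meeting, lunch]))  # 4 of 24 hours booked -> 8 px
```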
- The user may control the angle of rotation that initiates GUI adjustment module 24 to adjust the presentation of graphical data.
- The user may change the degree of leftward or rightward rotation of the electronic device 1 that initiates GUI adjustment module 24.
- The user may input the desired rotation angle as a setting of the electronic device 1.
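The user-configurable rotation threshold described above can be sketched as a dead-zone check in the movement detection path. The class name, default angle, and sign convention are illustrative assumptions.

```python
# Hypothetical sketch of the user-configurable rotation threshold:
# a rotation only counts as a predetermined movement once its angle
# exceeds the value the user saved in the device settings.

class MovementDetector:
    def __init__(self, threshold_deg=30):
        # Assumed default; the user may overwrite it via set_threshold.
        self.threshold_deg = threshold_deg

    def set_threshold(self, degrees):
        """User setting: degree of rotation that triggers adjustment."""
        self.threshold_deg = degrees

    def classify(self, angle_deg):
        """Negative angles = leftward rotation, positive = rightward."""
        if angle_deg <= -self.threshold_deg:
            return "left"
        if angle_deg >= self.threshold_deg:
            return "right"
        return None  # within the dead zone: no adjustment initiated

detector = MovementDetector()
assert detector.classify(-45) == "left"
assert detector.classify(10) is None   # below the 30-degree default
detector.set_threshold(5)              # user lowers the threshold
assert detector.classify(10) == "right"
```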
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Human Computer Interaction (AREA)
- Human Resources & Organizations (AREA)
- Computer Hardware Design (AREA)
- Educational Technology (AREA)
- Entrepreneurship & Innovation (AREA)
- Strategic Management (AREA)
- Educational Administration (AREA)
- Mathematical Physics (AREA)
- Marketing (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Quality & Reliability (AREA)
- Life Sciences & Earth Sciences (AREA)
- Ecology (AREA)
- Operations Research (AREA)
- Economics (AREA)
- Data Mining & Analysis (AREA)
- User Interface Of Digital Computer (AREA)
- Testing And Monitoring For Control Systems (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CA2011/050270 WO2012149627A1 (en) | 2011-05-04 | 2011-05-04 | Methods for adjusting a presentation of graphical data displayed on a graphical user interface |
GB201204199A GB2504256B (en) | 2011-05-04 | 2011-05-04 | Methods for adjusting a presentation of graphical data displayed on a graphical user interface |
CN201180003756.6A CN102870065B (en) | 2011-05-04 | 2011-05-04 | For adjusting the method presented of the graph data that graphic user interface shows |
CA2834914A CA2834914C (en) | 2011-05-04 | 2011-05-04 | Methods for adjusting a presentation of graphical data displayed on a graphical user interface |
DE112011100058T DE112011100058T5 (en) | 2011-05-04 | 2011-05-04 | METHOD FOR ADAPTING A PRESENTATION OF GRAPHICAL DATA DISPLAYED ON A GRAPHIC USER INTERFACE |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CA2011/050270 WO2012149627A1 (en) | 2011-05-04 | 2011-05-04 | Methods for adjusting a presentation of graphical data displayed on a graphical user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012149627A1 true WO2012149627A1 (en) | 2012-11-08 |
Family
ID=46026292
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CA2011/050270 WO2012149627A1 (en) | 2011-05-04 | 2011-05-04 | Methods for adjusting a presentation of graphical data displayed on a graphical user interface |
Country Status (5)
Country | Link |
---|---|
CN (1) | CN102870065B (en) |
CA (1) | CA2834914C (en) |
DE (1) | DE112011100058T5 (en) |
GB (1) | GB2504256B (en) |
WO (1) | WO2012149627A1 (en) |
Families Citing this family (109)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8677377B2 (en) | 2005-09-08 | 2014-03-18 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US9318108B2 (en) | 2010-01-18 | 2016-04-19 | Apple Inc. | Intelligent automated assistant |
US8977255B2 (en) | 2007-04-03 | 2015-03-10 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US8676904B2 (en) | 2008-10-02 | 2014-03-18 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US20120309363A1 (en) | 2011-06-03 | 2012-12-06 | Apple Inc. | Triggering notifications associated with tasks items that represent tasks to perform |
US10276170B2 (en) | 2010-01-18 | 2019-04-30 | Apple Inc. | Intelligent automated assistant |
US8682667B2 (en) | 2010-02-25 | 2014-03-25 | Apple Inc. | User profiling for selecting user specific voice input processing information |
US9262612B2 (en) | 2011-03-21 | 2016-02-16 | Apple Inc. | Device access using voice authentication |
US10057736B2 (en) | 2011-06-03 | 2018-08-21 | Apple Inc. | Active transport based notifications |
US10134385B2 (en) | 2012-03-02 | 2018-11-20 | Apple Inc. | Systems and methods for name pronunciation |
US10417037B2 (en) | 2012-05-15 | 2019-09-17 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
DE212014000045U1 (en) | 2013-02-07 | 2015-09-24 | Apple Inc. | Voice trigger for a digital assistant |
US10652394B2 (en) | 2013-03-14 | 2020-05-12 | Apple Inc. | System and method for processing voicemail |
US10748529B1 (en) | 2013-03-15 | 2020-08-18 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US10176167B2 (en) | 2013-06-09 | 2019-01-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
AU2014278592B2 (en) | 2013-06-09 | 2017-09-07 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
WO2015020942A1 (en) | 2013-08-06 | 2015-02-12 | Apple Inc. | Auto-activating smart responses based on activities from remote devices |
EP3058446B1 (en) | 2013-10-14 | 2024-06-05 | Verizon Patent and Licensing Inc. | Systems and methods for providing context-based user interface |
US10296160B2 (en) | 2013-12-06 | 2019-05-21 | Apple Inc. | Method for extracting salient dialog usage from live data |
US9633004B2 (en) | 2014-05-30 | 2017-04-25 | Apple Inc. | Better resolution when referencing to concepts |
US9430463B2 (en) | 2014-05-30 | 2016-08-30 | Apple Inc. | Exemplar-based natural language processing |
AU2015266863B2 (en) | 2014-05-30 | 2018-03-15 | Apple Inc. | Multi-command single utterance input method |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10127911B2 (en) | 2014-09-30 | 2018-11-13 | Apple Inc. | Speaker identification and unsupervised speaker adaptation techniques |
US10074360B2 (en) | 2014-09-30 | 2018-09-11 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US9668121B2 (en) | 2014-09-30 | 2017-05-30 | Apple Inc. | Social reminders |
US10152299B2 (en) | 2015-03-06 | 2018-12-11 | Apple Inc. | Reducing response latency of intelligent automated assistants |
US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
US9721566B2 (en) | 2015-03-08 | 2017-08-01 | Apple Inc. | Competing devices responding to voice triggers |
US10460227B2 (en) | 2015-05-15 | 2019-10-29 | Apple Inc. | Virtual assistant in a communication session |
US10083688B2 (en) | 2015-05-27 | 2018-09-25 | Apple Inc. | Device voice control for selecting a displayed affordance |
US10200824B2 (en) | 2015-05-27 | 2019-02-05 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device |
US9578173B2 (en) | 2015-06-05 | 2017-02-21 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US20160378747A1 (en) | 2015-06-29 | 2016-12-29 | Apple Inc. | Virtual assistant for media playback |
US10740384B2 (en) | 2015-09-08 | 2020-08-11 | Apple Inc. | Intelligent automated assistant for media search and playback |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10331312B2 (en) | 2015-09-08 | 2019-06-25 | Apple Inc. | Intelligent automated assistant in a media environment |
US10845949B2 (en) | 2015-09-28 | 2020-11-24 | Oath Inc. | Continuity of experience card for index |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
US10521070B2 (en) | 2015-10-23 | 2019-12-31 | Oath Inc. | Method to automatically update a homescreen |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10956666B2 (en) | 2015-11-09 | 2021-03-23 | Apple Inc. | Unconventional virtual assistant interactions |
US10049668B2 (en) | 2015-12-02 | 2018-08-14 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10831766B2 (en) | 2015-12-21 | 2020-11-10 | Oath Inc. | Decentralized cards platform for showing contextual cards in a stream |
US10223066B2 (en) | 2015-12-23 | 2019-03-05 | Apple Inc. | Proactive assistance based on dialog communication between devices |
CN105807952B (en) * | 2016-03-07 | 2020-01-31 | 联想(北京)有限公司 | information processing method and electronic equipment |
US11227589B2 (en) | 2016-06-06 | 2022-01-18 | Apple Inc. | Intelligent list reading |
US10049663B2 (en) | 2016-06-08 | 2018-08-14 | Apple, Inc. | Intelligent automated assistant for media exploration |
US10586535B2 (en) | 2016-06-10 | 2020-03-10 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
DK179415B1 (en) | 2016-06-11 | 2018-06-14 | Apple Inc | Intelligent device arbitration and control |
DK201670540A1 (en) | 2016-06-11 | 2018-01-08 | Apple Inc | Application integration with a digital assistant |
US10474753B2 (en) | 2016-09-07 | 2019-11-12 | Apple Inc. | Language identification using recurrent neural networks |
US10043516B2 (en) | 2016-09-23 | 2018-08-07 | Apple Inc. | Intelligent automated assistant |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
US10417266B2 (en) | 2017-05-09 | 2019-09-17 | Apple Inc. | Context-aware ranking of intelligent response suggestions |
DK201770383A1 (en) | 2017-05-09 | 2018-12-14 | Apple Inc. | User interface for correcting recognition errors |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
US10395654B2 (en) | 2017-05-11 | 2019-08-27 | Apple Inc. | Text normalization based on a data-driven learning network |
DK180048B1 (en) | 2017-05-11 | 2020-02-04 | Apple Inc. | MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION |
DK201770428A1 (en) | 2017-05-12 | 2019-02-18 | Apple Inc. | Low-latency intelligent automated assistant |
DK179745B1 (en) | 2017-05-12 | 2019-05-01 | Apple Inc. | SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT |
US11301477B2 (en) | 2017-05-12 | 2022-04-12 | Apple Inc. | Feedback analysis of a digital assistant |
DK179496B1 (en) | 2017-05-12 | 2019-01-15 | Apple Inc. | USER-SPECIFIC Acoustic Models |
DK201770411A1 (en) | 2017-05-15 | 2018-12-20 | Apple Inc. | Multi-modal interfaces |
US10403278B2 (en) | 2017-05-16 | 2019-09-03 | Apple Inc. | Methods and systems for phonetic matching in digital assistant services |
US10311144B2 (en) | 2017-05-16 | 2019-06-04 | Apple Inc. | Emoji word sense disambiguation |
US20180336892A1 (en) | 2017-05-16 | 2018-11-22 | Apple Inc. | Detecting a trigger of a digital assistant |
US10303715B2 (en) | 2017-05-16 | 2019-05-28 | Apple Inc. | Intelligent automated assistant for media exploration |
DK179560B1 (en) | 2017-05-16 | 2019-02-18 | Apple Inc. | Far-field extension for digital assistant services |
US10733375B2 (en) | 2018-01-31 | 2020-08-04 | Apple Inc. | Knowledge-based framework for improving natural language understanding |
US10592604B2 (en) | 2018-03-12 | 2020-03-17 | Apple Inc. | Inverse text normalization for automatic speech recognition |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US10909331B2 (en) | 2018-03-30 | 2021-02-02 | Apple Inc. | Implicit identification of translation payload with neural machine translation |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US10984780B2 (en) | 2018-05-21 | 2021-04-20 | Apple Inc. | Global semantic word embeddings using bi-directional recurrent neural networks |
DK201870355A1 (en) | 2018-06-01 | 2019-12-16 | Apple Inc. | Virtual assistant operation in multi-device environments |
DK180639B1 (en) | 2018-06-01 | 2021-11-04 | Apple Inc | DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT |
US11386266B2 (en) | 2018-06-01 | 2022-07-12 | Apple Inc. | Text correction |
DK179822B1 (en) | 2018-06-01 | 2019-07-12 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
US10944859B2 (en) | 2018-06-03 | 2021-03-09 | Apple Inc. | Accelerated task performance |
US11010561B2 (en) | 2018-09-27 | 2021-05-18 | Apple Inc. | Sentiment prediction from textual data |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11170166B2 (en) | 2018-09-28 | 2021-11-09 | Apple Inc. | Neural typographical error modeling via generative adversarial networks |
US10839159B2 (en) | 2018-09-28 | 2020-11-17 | Apple Inc. | Named entity normalization in a spoken dialog system |
US11475898B2 (en) | 2018-10-26 | 2022-10-18 | Apple Inc. | Low-latency multi-speaker speech recognition |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US11475884B2 (en) | 2019-05-06 | 2022-10-18 | Apple Inc. | Reducing digital assistant latency when a language is incorrectly determined |
DK201970509A1 (en) | 2019-05-06 | 2021-01-15 | Apple Inc | Spoken notifications |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
DK180129B1 (en) | 2019-05-31 | 2020-06-02 | Apple Inc. | User activity shortcut suggestions |
DK201970510A1 (en) | 2019-05-31 | 2021-02-11 | Apple Inc | Voice identification in digital assistant systems |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11360641B2 (en) | 2019-06-01 | 2022-06-14 | Apple Inc. | Increasing the relevance of new available information |
US11468890B2 (en) | 2019-06-01 | 2022-10-11 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US11061543B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | Providing relevant data items based on context |
US11038934B1 (en) | 2020-05-11 | 2021-06-15 | Apple Inc. | Digital assistant hardware abstraction |
US11755276B2 (en) | 2020-05-12 | 2023-09-12 | Apple Inc. | Reducing description length based on confidence |
US11490204B2 (en) | 2020-07-20 | 2022-11-01 | Apple Inc. | Multi-device audio adjustment coordination |
US11438683B2 (en) | 2020-07-21 | 2022-09-06 | Apple Inc. | User identification using headphones |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030001863A1 (en) * | 2001-06-29 | 2003-01-02 | Brian Davidson | Portable digital devices |
US6834249B2 (en) * | 2001-03-29 | 2004-12-21 | Arraycomm, Inc. | Method and apparatus for controlling a computing system |
US20080030360A1 (en) * | 2006-08-02 | 2008-02-07 | Jason Griffin | System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device |
US20100004031A1 (en) * | 2008-07-07 | 2010-01-07 | Lg Electronics Inc. | Mobile terminal and operation control method thereof |
US20100149094A1 (en) * | 2008-10-24 | 2010-06-17 | Steve Barnes | Snow Globe Interface for Electronic Weather Report |
US20100214211A1 (en) * | 2009-02-24 | 2010-08-26 | Research In Motion Limited | Handheld electronic device having gesture-based control and a method of using same |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9727082B2 (en) * | 2005-04-26 | 2017-08-08 | Apple Inc. | Back-side interface for hand-held devices |
US7667686B2 (en) * | 2006-02-01 | 2010-02-23 | Memsic, Inc. | Air-writing and motion sensing input for portable devices |
US20100014094A1 (en) * | 2008-07-21 | 2010-01-21 | Cole Barrett E | Distributed gas detection |
- 2011
- 2011-05-04 CN CN201180003756.6A patent/CN102870065B/en active Active
- 2011-05-04 DE DE112011100058T patent/DE112011100058T5/en active Pending
- 2011-05-04 WO PCT/CA2011/050270 patent/WO2012149627A1/en active Application Filing
- 2011-05-04 CA CA2834914A patent/CA2834914C/en active Active
- 2011-05-04 GB GB201204199A patent/GB2504256B/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6834249B2 (en) * | 2001-03-29 | 2004-12-21 | Arraycomm, Inc. | Method and apparatus for controlling a computing system |
US20030001863A1 (en) * | 2001-06-29 | 2003-01-02 | Brian Davidson | Portable digital devices |
US20080030360A1 (en) * | 2006-08-02 | 2008-02-07 | Jason Griffin | System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device |
US20100004031A1 (en) * | 2008-07-07 | 2010-01-07 | Lg Electronics Inc. | Mobile terminal and operation control method thereof |
US20100149094A1 (en) * | 2008-10-24 | 2010-06-17 | Steve Barnes | Snow Globe Interface for Electronic Weather Report |
US20100214211A1 (en) * | 2009-02-24 | 2010-08-26 | Research In Motion Limited | Handheld electronic device having gesture-based control and a method of using same |
Also Published As
Publication number | Publication date |
---|---|
GB2504256B (en) | 2019-12-25 |
CN102870065B (en) | 2016-01-20 |
CA2834914A1 (en) | 2012-11-08 |
DE112011100058T5 (en) | 2013-02-07 |
CN102870065A (en) | 2013-01-09 |
CA2834914C (en) | 2016-06-28 |
GB2504256A (en) | 2014-01-29 |
GB201204199D0 (en) | 2012-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2834914C (en) | Methods for adjusting a presentation of graphical data displayed on a graphical user interface | |
EP3258361B1 (en) | Mobile terminal using pressure sensor and method of controlling the mobile terminal | |
US8553016B2 (en) | Mobile terminal and method of controlling operation of the mobile terminal | |
EP2385453B1 (en) | Mobile terminal and method for displaying an image in a mobile terminal | |
EP2410715B1 (en) | Mobile terminal and controlling method thereof | |
EP2400737B1 (en) | A method for providing an augmented reality display on a mobile device | |
US8723812B2 (en) | Mobile terminal and method of controlling the mobile terminal | |
US20160291864A1 (en) | Method of interacting with a portable electronic device | |
US20130169545A1 (en) | Cooperative displays | |
EP2323026A2 (en) | Mobile terminal and image processing method therefor | |
KR20160080036A (en) | User termincal device and methods for controlling the user termincal device thereof | |
KR20090101733A (en) | Mobile terminal and displaying method of display information using face recognition thereof | |
US20110159885A1 (en) | Mobile terminal and method of controlling the operation of the mobile terminal | |
KR20110050248A (en) | Mobile device and method for dividing screen thereof | |
US9584651B2 (en) | Mobile terminal and method for controlling the same | |
EP2800083A1 (en) | Information processing device, information processing method, and program | |
EP2611117B1 (en) | Cooperative displays | |
KR20090070050A (en) | Mobile terminal and its method for controlling of user interface menu | |
KR101186334B1 (en) | Mobile terminal and operation control method thereof | |
WO2012061917A1 (en) | Motion gestures interface for portable electronic device | |
EP2520999A1 (en) | Methods for adjusting a presentation of graphical data displayed on a graphical user interface | |
US9041733B2 (en) | Methods for adjusting a presentation of graphical data displayed on a graphical user interface | |
US20130093667A1 (en) | Methods and devices for managing views displayed on an electronic device | |
KR101791061B1 (en) | Electronic device and audio playback method for electronic device | |
KR101651477B1 (en) | Mobile terminal and operation method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180003756.6 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 1204199 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20110504 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1204199.2 Country of ref document: GB Ref document number: 112011100058 Country of ref document: DE Ref document number: 1120111000588 Country of ref document: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11864832 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2834914 Country of ref document: CA |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11864832 Country of ref document: EP Kind code of ref document: A1 |