EP1290672A1 - View navigation and magnification of a hand-held device with a display - Google Patents
View navigation and magnification of a hand-held device with a display
Info
- Publication number
- EP1290672A1 EP01928361A
- Authority
- EP
- European Patent Office
- Prior art keywords
- orientation
- view
- navigation mode
- view navigation
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/161—Indexing scheme relating to constructional details of the monitor
- G06F2200/1614—Image rotation following screen orientation, e.g. switching from landscape to portrait mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72445—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting Internet browser applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Definitions
- The present invention relates, in general, to the field of mobile computing and communication hand-held devices utilizing an information display, specifically to view navigation and scrolling of a stored virtual display or a magnified image of the display in response to changes of the orientation at which the device is held in one hand.
- Web Clipping is an approach taken by Palm Inc. to allow their commercially available Palm Series to browse the Internet.
- Web Clipping applications were developed for popular web sites, which respond to user's queries by clipping minimal information from the accessed sites in a typically textual form.
- The disadvantage of Web Clipping is that limited information is brought to the user, and not all web sites have the application to create the web clipping for the user. In many cases, even after clipping, the amount of available information is much more than can fit in one display view.
- Optical magnification of a small display has been adopted for virtual reality helmets and other applications where the display is seated at a fixed distance from the user's eye.
- U.S. Patent No. 5,739,955 discloses such a virtual reality helmet with binocular magnifying optics.
- Optical magnifiers may appeal to persons with impaired vision who cannot view detailed information in a small handheld display.
- The main problem with optical magnification when used with a hand-held device is the difficulty of use.
- Such a hand-held device and its associated optics must be placed at a relatively fixed place in front of the user's eyes so that the magnified display can stay in focus.
- U.S. Patent No. 3,976,995 (Reissue Patent No. 32,365) teaches the use of a processing display which moves the message across the display in a continuous fashion so that the display needs to be only large enough to present a relatively small portion of the total message. While this approach may be useful to display simple sentences, it is not practical when displaying complex graphic information. And even with simple character displays, the user needs to wait patiently while the message is scrolling around.
- U.S. Patent 5,311,203 discloses a hand-held viewing apparatus that determines the three-dimensional direction in which it is pointing and automatically presents information to match the features visible in its field of view.
- This device is intended to observe, identify and locate stars or stellar constellations in an observed portion of the night sky. Once the exact direction of the device is measured, the device correlates the viewed objects with information stored in its database and displays identifying annotations near the corresponding objects. Since the device must correlate the database information exactly with the observed objects, an exact spatial angle relative to earth is required, thus making the device prohibitively complex and expensive.
- The present invention seeks to provide user-friendly and convenient navigation of displayed information in a hand-held device, so that a large amount of data can be viewed in the relatively small size of the device's display.
- The present invention allows the operator to perform such navigation of the display view using the same hand that holds the device.
- A hand-held device in accordance with the present invention has a housing with a display, memory means to store a virtual display, processing means, and an orientation sensor responsive to changes in the spatial orientation at which the device is held.
- The display can be set to a view navigation and scrolling mode or to a fixed mode. When the view navigation mode is set, the display scrolls the stored virtual display under the direction of the processing means in response to the changes in the orientation of the device measured by the orientation sensor.
- When set to the fixed mode, the display remains fixed and no longer follows the changes in orientation.
- The display provides a clear visual indication during the view navigation mode in order to alert the operator that it will scroll in response to changes in the orientation of the device.
- The device further comprises a set of two ergonomic switches placed along both sides of the housing so that the operator must press both switches during the view navigation mode.
- Such an arrangement ensures convenient activation of the view navigation mode, as it is natural for the holding hand to press both switches at the same time. Also, the requirement that both switches must be pressed in order to activate the view navigation mode better protects against unintentional change of the displayed view.
- Alternatively, said ergonomic switches are replaced with means to detect a tap by the operator's finger at the bottom of the device as an instruction to set said view navigation mode after a short time delay. Once the view navigation mode is set, it remains in this mode for a preset period of time or for as long as the operator continues to substantially change the orientation of the device.
- Preset hand gestures by the operator communicate special commands to the hand-held device. For example, a quick upward jerk of the hand-held device without substantial change in orientation might be used to set the device to the view navigation mode.
- A built-in voice recognition means accepts a spoken word command from the operator to enter said view navigation mode.
- The present invention is particularly suitable for a display magnification application. Once set to the magnification mode, the current view on the display can be arbitrarily magnified, even by a large factor, and the view navigation mode can then be used to navigate the enlarged view on the display.
- FIG. 1A, FIG. 1B, and FIG. 1C show three subsequent views of a hand-held device incorporating the present invention when it is rolled from left to right while navigating the virtual display of FIG. 1D.
- FIG. 2 indicates the relative axes of roll and pitch along which the hand-held device is rotated in order to navigate the view in accordance with the present invention.
- FIG. 3 illustrates the use of the present invention as a magnifier for the display of a hand-held device in accordance with the present invention.
- FIG. 4 shows an ergonomic embodiment of the present invention that uses two side switches to activate the view navigation mode.
- FIG. 5 is a block diagram of the embodiment in FIG. 4.
- FIG. 6 outlines the software flow diagram for the embodiment of the invention of FIG. 5.
- FIG. 7 is a timing chart showing an example of the time dependent response curve of the system to changes in orientation along one axis.
- FIG. 8 is a block diagram of another embodiment of the present invention that uses a finger tap of the operator to set the hand-held device into view navigation mode.
- FIG. 9 is a perspective view of the hand-held device of FIG. 8 showing a cutaway view of the finger tap sensor.
- FIG. 10 is the block diagram of the embodiment of the present invention in the form of an add-on upgrade to an existing hand-held device.
- FIG. 11 shows yet another embodiment of the present invention that uses different sensors to track the orientation, and uses a speech recognition module to accept spoken commands to enter and exit view navigation mode.
- FIG. 12 is a timing chart illustrating how the view navigation mode is entered and exited in accordance with the present invention.
- FIG. 13A outlines the software flow diagram for the embodiment of FIG. 12C that remains in view navigation mode only for a fixed time period and then exits to the fixed mode.
- FIG. 13B outlines the software flow diagram for the embodiment of FIG. 12D that remains in view navigation mode as long as the operator continues to change the orientation of the device.
- FIG. 14 illustrates a further embodiment of the present invention that uses an additional Z-axis accelerometer to identify the operator's vertical hand gestures to enter the view navigation mode and for other commands.
- FIG. 15 illustrates the placement of the three accelerometers of the embodiment of FIG. 14 on the main PCB of the hand-held device.
- FIG. 16 is a timing chart illustrating the use of the embodiment of the present invention of FIG. 14 with three accelerometers.
- FIG. 17 outlines the software flow diagram for the embodiment of FIG. 14 that uses a vertical movement gesture of the operator to enter and exit the view navigation mode.
- This invention allows hand-held communication or computing devices with a relatively small display to conveniently navigate a large stored virtual display with one hand.
- Such devices may include mobile computers like the commercially available PALM PILOT, Cassiopea, PSION, Newton, and other palmtop computers. They may also include various PDA devices, mobile hand-held terminals, advanced pagers, and a variety of mobile telephones with expanded information displays.
- The hand-held device in accordance with the present invention employs two operational modes, which throughout this document are referred to as the view navigation mode and the fixed mode.
- When set to the view navigation mode, the display view is automatically scrolled to follow the rotational movements of the holding hand.
- When set back to the fixed mode, the display view becomes stationary and no longer follows said movements of the hand.
- FIG. 1 shows an overview of the operation of a hand-held device 10 built in accordance with the present invention when it is set to the view navigation mode.
- The device has a flat display 12 that is typically made of an LCD with optional back lighting, and a plurality of operational keys.
- The display 12 is too small to show the entire virtual display 30 that is stored in the hand-held device and is shown in FIG. 1D.
- The virtual display 30 shows a combination of easily identified graphical objects like a space station 26, a space shuttle 24, and an astronaut with the American flag 22, in addition to a character message 28.
- In FIG. 1A, the navigation process is started when the operator's hand 20 rolls the device 10 to the left so that the display 12 shows the left portion of the stored virtual display 30.
- FIG. 1B shows how the view in the display 12 scrolls to the left, as the space shuttle picture 24 comes into view.
- FIG. 1C further shows how the right portion of the virtual display 30 including the American flag 22 is viewed when the operator's hand 20 continues to roll to the right.
- FIG. 2 indicates the relative axes of orientation along which the hand-held device 10 is rotated in order to navigate the display 12 in accordance with the present invention.
- I will refer to axis 32 as the Y-axis or, with influence from aviation and naval terms, the roll axis.
- I will refer to axis 36 as the X-axis or the pitch axis.
- Changes in the height at which the device 10 is held, which are measured along the Z-axis 40, are used to switch the view navigation mode on. Referring back to the process shown in FIG. 1, it should be clear, even though not shown in this drawing, that the display can be navigated vertically as the device 10 is tilted back and forth along the pitch axis.
- The control software of the present invention smoothes the hand movements to provide convenient navigation even when using relatively coarse orientation sensors.
- FIG. 3 illustrates how the present invention is well adapted to provide display magnification and navigation of said magnified display.
- FIG. 3A shows a hand-held device that displays some information 50, which may not be viewable by a user who suffers from impaired vision.
- The display 12 shows a magnified portion 52 of the original information 50 as shown in FIG. 3B.
- The device 10 can now navigate the magnified view in accordance with the present invention.
- The display preferably includes some visual indication 54 to alert the operator that the display will be scrolled as the operator changes the device orientation.
- The indication shown in the drawing depicts four rotated 'L'-shaped markers 54 at the four corners of the display 12.
- The visual indication 54 greatly benefits those embodiments of the present invention that do not use an actual switch for entering and leaving the view navigation mode. It should be noted that for a magnification application for persons with impaired vision, the display should employ an active TFT LCD or some other display technology that exhibits sharp contrast features.
- FIG. 4 shows an ergonomic embodiment of the present invention that uses two side switches 62 and 64 to activate the view navigation mode.
- The drawing illustrates mechanical switches that utilize springs 66 to provide tactile response to the operator. Other tactile switches, like the sealed membrane type and capacitance strip switches, may be used.
- The hand-held device 10 is set to the view navigation mode only when both switches are pressed by the operator, and it reverts back to the fixed mode when at least one of the switches is disengaged. Alternatively, the switches can be used to signal a command to enter the view navigation mode and allow the program to terminate the mode in accordance with the description below. While the present invention will work with only one switch, experimentation showed that this arrangement of two side switches is more ergonomic and provides better intuitive control to the operator. The use of two switches on both sides of the hand-held device seems to equally benefit both right-handed and left-handed persons.
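As a rough sketch of the two-switch interlock just described, the mode decision reduces to a logical AND of the two switch inputs; releasing either switch drops the display back to the fixed mode. The names and the polling style below are illustrative assumptions, not taken from the patent.

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum { MODE_FIXED, MODE_VIEW_NAVIGATION } display_mode_t;

/* Both side switches (62 and 64) must be held to stay in view navigation
   mode; releasing either one reverts the display to the fixed mode. */
static display_mode_t update_mode(bool right_pressed, bool left_pressed)
{
    return (right_pressed && left_pressed) ? MODE_VIEW_NAVIGATION : MODE_FIXED;
}

int main(void)
{
    printf("%d\n", update_mode(true, true));   /* 1 = view navigation mode */
    printf("%d\n", update_mode(true, false));  /* 0 = fixed mode           */
    return 0;
}
```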
- FIG. 5 is a block diagram of the embodiment in FIG. 4, where the view navigation circuitry of the present invention is fully integrated into the main circuitry of the hand-held device.
- All smart hand-held devices with displays 12 typically employ at least one micro-controller 100 or a micro-processor, memory 102 to store program and display data, and a display controller 104.
- Orientation sensor circuitry 80 includes X-axis 82 and Y-axis 84 accelerometers that are respectively mounted inside the hand-held device to align with axes 36 and 32 of FIG. 2.
- Since the view navigation according to the present invention provides a closed loop between the user's movements and the actual navigation, there is no need for an exact alignment between the sensor and the device axes. In fact, it is enough that the two sensors be generally perpendicular to each other. Also, any misalignment between the sensors' axes of sensitivity and the device's X and Y axes can be corrected by the program.
- Such accelerometers are preferably implemented by a surface micro-machining technique that builds electromechanical structures in silicon. The small size of micro-machined structures makes it economical to place the circuitry and sensors on the same die. For example, a commercially available 1 micrometer CMOS process known as "iMEMS" from Analog Devices Inc. is used in its ADXL202 accelerometer, which incorporates two axes and circuitry to digitally generate a duty cycle modulator (DCM) output.
- Such DCM outputs, or other analog-to-digital conversion means 86 and 88, are used to interface the orientation data to the micro-controller 100.
- The accelerometers provide tilt angle information depending on their inclination relative to earth gravity. When the operator is moving, there are also small acceleration artifacts picked up by the accelerometers. The program can filter out such acceleration artifacts, or the operator may re-navigate the view to correct undesired view movements.
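A minimal sketch of how static tilt could be derived from a two-axis accelerometer and smoothed against the motion artifacts mentioned above; the clamping, the asin() inversion of the gravity projection, and the smoothing constant are illustrative assumptions rather than the patent's method.

```c
#include <math.h>

/* Convert a static acceleration reading (in units of g) on one sensing axis
   into a tilt angle in radians: gravity projects onto the axis as sin(angle),
   so the reading is clamped to [-1, 1] and inverted with asin(). */
static double tilt_from_accel(double a_g)
{
    if (a_g > 1.0)  a_g = 1.0;
    if (a_g < -1.0) a_g = -1.0;
    return asin(a_g);
}

/* Exponential smoothing to suppress the small acceleration artifacts picked
   up while the operator is moving; alpha closer to 0 filters more heavily. */
static double smooth(double previous, double raw, double alpha)
{
    return previous + alpha * (raw - previous);
}
```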
- The right switch 62 and the left switch 64 are connected in series to the VCC potential 70 (or a ground potential if reverse logic is used) so that when both are pressed, they connect an activation signal to the micro-controller 100 via line 72.
- This signal instructs the program of the micro-controller to set to the view navigation mode. If the hand-held device includes a beeper 94, the program can provide a beep to alert the operator that the view navigation mode is set.
- The micro-controller instructs the display controller 104 to provide a visual indication 54, as shown in FIG. 3B, to the operator that the hand-held device is in the view navigation mode.
- The micro-controller 100 translates the changes in pitch and roll orientation, as communicated through lines 90 and 92, into navigation commands that scroll the virtual display stored in memory 102. For example, when the Y-axis accelerometer indicates that the operator has rotated the hand-held device 10 to the right, as in FIG. 1B, the micro-controller 100 controls the virtual display in the memory 102 and the display controller 104 to scroll the view in the display 12 to the right.
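One possible representation of that translation is a viewport whose top-left corner is shifted in proportion to the measured roll and pitch changes and clamped to the stored virtual display. The structure, gain, and pixel units below are assumptions for illustration only.

```c
/* Viewport into the stored virtual display, in pixels.  The virtual display
   is assumed to be at least as large as the physical display 12. */
typedef struct {
    int x, y;            /* top-left corner of the visible window */
    int width, height;   /* size of the physical display          */
    int virt_w, virt_h;  /* size of the stored virtual display    */
} viewport_t;

/* Scroll the viewport in proportion to the change in roll and pitch
   (radians), clamping so the view never leaves the virtual display. */
static void navigate(viewport_t *vp, double d_roll, double d_pitch,
                     double pixels_per_radian)
{
    vp->x += (int)(d_roll * pixels_per_radian);
    vp->y += (int)(d_pitch * pixels_per_radian);

    if (vp->x < 0) vp->x = 0;
    if (vp->y < 0) vp->y = 0;
    if (vp->x > vp->virt_w - vp->width)  vp->x = vp->virt_w - vp->width;
    if (vp->y > vp->virt_h - vp->height) vp->y = vp->virt_h - vp->height;
}
```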
- FIG. 6 outlines the software flow diagram for the embodiment of the invention of FIG. 5.
- The flow from start 120 to end 134 is performed several times a second in a standard polling process of the micro-controller 100.
- In the initialization step at block 122, the current boundary of the display view is marked in comparison to the stored virtual display.
- The status of both navigation switches 62 and 64 is checked at block 124. If both switches are pressed, the system is set to view navigation mode in block 126, providing the visual indication 54 to alert the operator that changes in orientation of the hand-held device will navigate the display.
- At block 128, the pitch and roll data are acquired, stored and compared to the previous reading. If a change in orientation is detected at block 130, the program computes the new boundary for the view at block 132.
- The program can be set with different response curves for computing the new boundary in response to changes in orientation at block 132.
- Fine or coarse modes of response can be set by the operator or can be changed dynamically during the time the system is in view navigation mode. With fine response, the display view navigates the virtual display at a relatively slow rate in response to the orientation changes. With coarse response, the display view changes rapidly in response to the orientation changes.
- FIG. 7 is a timing chart showing an example of the time dependent response curve of the system to changes in orientation along one axis. Similar relations are employed along the other orientation axis, although response figures may be set with different biases to each axis.
- FIG. 7A shows the relative response curve that is set up in the program and may be modified by the operator. The relative response is obtained by dividing the amount of change in view navigation, which is a normalized figure proportional to what percentage of the virtual display has been scrolled, by the change of orientation that caused it. Thus, the operator can achieve fine and slow navigation when the relative response figure is low, and coarse and fast navigation when the relative response figure is high.
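One way such a relative response could be coded is as a gain that starts at a fine value when view navigation mode is entered and ramps toward a coarse value the longer the mode stays active; the new view offset is then the gain multiplied by the orientation change. The ramp shape and all constants below are assumptions, not the actual curve of FIG. 7A.

```c
/* Time-dependent relative response: fine gain for the first t_fine seconds
   in view navigation mode, then a linear ramp up to the coarse gain. */
static double relative_response(double t_in_mode,
                                double fine_gain, double coarse_gain,
                                double t_fine, double t_ramp)
{
    if (t_in_mode <= t_fine)
        return fine_gain;
    if (t_in_mode >= t_fine + t_ramp)
        return coarse_gain;
    return fine_gain +
           (coarse_gain - fine_gain) * (t_in_mode - t_fine) / t_ramp;
}

/* Fraction of the virtual display to scroll for a given orientation change:
   delta_view = relative_response * delta_orientation. */
static double view_delta(double t_in_mode, double d_orientation)
{
    return relative_response(t_in_mode, 0.05, 0.40, 1.0, 2.0) * d_orientation;
}
```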
- FIG. 7B illustrates the reading from the orientation sensor on one axis as represented by the signal on line 90 or 92 of FIG. 5.
- FIG. 7C shows the corresponding changes in orientation over time along the monitored axis, as measured by the micro-controller 100 at block 128 of FIG. 6.
- FIG. 7D illustrates the resulting navigation of the view along the monitored axis as computed by the micro-controller 100 at block 132 in accordance with one mode of operation.
- FIG. 7E illustrates the resulting navigation of the view along the monitored axis as computed by the micro-controller 100 in response to the same stimuli but in accordance with an alternate mode of operation.
- FIG. 7A indicates that the device was switched to the view navigation mode at time t1 140, and switched back to the fixed mode at time t4. During the fixed mode periods 144 and 146 of the drawing, the view remains fixed on the display during the corresponding periods 148 and 150 in FIG.
- A filtering algorithm cleans the orientation data from jerks and other orientation "noises". For example, while the unintentional noise 160 and 162 in FIG. 7B results in corresponding orientation change pulses 164 and 166 in FIG. 7C, the program actually ignores them and does not navigate the view, as shown in FIG. 7D and FIG. 7E.
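A sketch of one such filter: an orientation change is acted on only if it exceeds a dead-band and persists for a minimum number of consecutive samples, so that short pulses such as 164 and 166 are ignored. The thresholds and structure are illustrative assumptions.

```c
#include <math.h>
#include <stdbool.h>

typedef struct {
    double dead_band;    /* radians below which a change counts as noise */
    int    min_samples;  /* consecutive significant samples required     */
    int    run_length;   /* internal: current run of significant samples */
} jerk_filter_t;

/* Returns true only for sustained orientation changes; brief jerks reset
   the run counter and never reach the view navigation code. */
static bool jerk_filter_accept(jerk_filter_t *f, double d_orientation)
{
    if (fabs(d_orientation) < f->dead_band) {
        f->run_length = 0;
        return false;
    }
    if (++f->run_length >= f->min_samples)
        return true;
    return false;
}
```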
- FIG. 7D illustrates the view navigation response when using a mode of operation that changes the navigated view only during a period of orientation changes.
- An orientation change is detected and the view is slowly navigated in response.
- The view navigates at an accelerated rate 176.
- A relatively slow orientation change 182 results in a rapid navigation 184 of the view. In this operation mode the view remains stationary in the absence of orientation changes.
- FIG. 7E illustrates the view navigation response when an alternate continuous mode of operation keeps the view navigating at the rate and direction which was established during the last valid orientation change.
- An orientation change is detected and the view is slowly navigated 186 in response.
- The view continues to navigate in the same direction until a change in orientation occurs again in period 174.
- A new navigation rate 188 is then established, in the same direction but responsive to the reduced orientation change rate in period 174 multiplied by the increased relative response at 154.
- This navigation continues until period 180, which occurs during the coarse navigation.
- A relatively slow orientation change 182 results in a rapid navigation 190 of the view.
- The program employs a minimum response threshold to allow the navigation to stop when the operator slightly reverses the direction of orientation.
- Other modes of operation can be established as variants of those shown in FIG. 7D and FIG. 7E.
- Such a response curve with at least two settings of fine and coarse navigation allows exact view navigation.
- Other response curves may be a fixed value, or may toggle from fine to coarse navigation at each subsequent entry to view navigation mode.
- Another solution is to eliminate the switches altogether and to monitor the pitch and roll changes for user-specific gestures that will activate the view navigation mode.
- The program will keep an ongoing track of the roll and pitch orientation in storage and will analyze it continuously to detect the operator's gestures. It seems that for a more reliable activation of the view navigation mode, an additional sensor is required.
- FIG. 8 shows another embodiment of the present invention that substitutes the switches 62 and 64 with an activation detector 200 that responds to a finger tap of the operator to set the hand-held device into view navigation mode.
- The finger tap detector includes a sound/vibration transducer 202 to sense the finger tap and to output a voltage that represents the sensed tap vibrations.
- The output of the sensor 202 is connected by wire 203 to the input of amplifier 204.
- The amplifier 204 is set up with a certain threshold to ensure that only signals above the threshold are amplified.
- The amplified output of amplifier 204 is filtered by a low-pass filter 206, whose output is connected to the analog-to-digital converter 208 to provide digital data to the microcontroller 100 via connection 212.
- The micro-controller uses the orientation change information from the orientation sensor 80 to navigate the display 12 in a similar way to the discussion of FIG. 5.
- FIG. 9 is a perspective view of the hand-held device 10 in accordance with the embodiment of FIG. 8 showing a cutaway view of the finger tap sensor 202.
- The finger tap sensor 202 is attached to the bottom of the housing of the device 10, or it may be placed in the area just beneath the top side of the display 12 where the operator's finger is likely to tap while the device is held in one hand.
- The sensor 202 is connected by wire 203 to the circuitry described above. Proper amplification can ensure that the device will detect all relevant finger taps at the bottom.
- Such finger taps produce vibrations and sound patterns that are significantly different from the sound that may be created by stylus strikes on the screen or the activation of a key 13.
- The micro-controller can distinguish a finger tap sound from other noises, such as those created when the device is placed on a table.
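A crude sketch of how such a classifier might look: over a short window of rectified, low-pass filtered transducer samples, a finger tap appears as a brief burst above a threshold, whereas putting the device down produces a longer disturbance. The window length and thresholds are assumptions for illustration.

```c
#include <stdbool.h>
#include <stddef.h>

/* Classify a window of envelope samples (already rectified and low-pass
   filtered, as by amplifier 204 and filter 206).  A tap is accepted when at
   least one sample crosses the threshold but the burst stays short. */
static bool looks_like_finger_tap(const double *envelope, size_t n,
                                  double threshold, size_t max_burst_samples)
{
    size_t burst = 0;
    for (size_t i = 0; i < n; i++)
        if (envelope[i] > threshold)
            burst++;
    return burst > 0 && burst <= max_burst_samples;
}
```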
- The finger tap sensor 202 can be surface-mounted to the bottom of the PCB assembly 450 inside the device 10.
- FIG. 10 is the block diagram of the embodiment of the present invention in the form of an add-on upgrade 300 that is attached to an existing handheld device 10.
- The add-on upgrade 300 is built on a miniature PCB that is connected to the mobile device 10 externally or internally, depending on the expansion capability of said hand-held device.
- The add-on upgrade 300 comprises the sensors described in other embodiments of the present invention, as well as a standard micro-controller 302.
- Such a micro-controller typically includes its own stored program and data memory as well as a communication channel 304 like UART, SPI, or I2C.
- The micro-controller 302 receives the orientation change information on lines 90 and 92 as well as the signal 212 to enter or exit the view navigation mode. Processing the orientation change data and the navigation entry command as explained above, the micro-controller computes the desired navigation changes for the device's display and provides commands to the hand-held device via the communication channel 304.
- The commands, which might be in the form of a serial protocol, interface to the hand-held device via the application interface port 310.
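The patent does not spell that protocol out, so the framing below is a purely hypothetical example of what such a serial scroll command could look like: a start byte, an opcode, and little-endian horizontal and vertical scroll steps.

```c
#include <stdint.h>

/* Hypothetical 6-byte scroll command sent from the add-on module 300 to the
   host device through the application interface port 310. */
static void encode_scroll_cmd(uint8_t out[6], int16_t dx, int16_t dy)
{
    uint16_t udx = (uint16_t)dx;    /* two's-complement pass-through */
    uint16_t udy = (uint16_t)dy;

    out[0] = 0xA5;                  /* frame marker (assumed)        */
    out[1] = 0x01;                  /* opcode: scroll view (assumed) */
    out[2] = (uint8_t)(udx & 0xFF);
    out[3] = (uint8_t)(udx >> 8);
    out[4] = (uint8_t)(udy & 0xFF);
    out[5] = (uint8_t)(udy >> 8);
}
```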
- The advantage of using the finger tap detector to command entry to the view navigation mode is that it is less costly to manufacture than the side switches of FIG. 4. It is also easier to add a view navigation system with a finger tap sensor to an existing hand-held device.
- FIG. 11 shows yet another embodiment of the present invention that uses different sensors.
- The orientation sensor assembly 380, which includes a magnetic direction sensor 364 and a tilt sensor 368, replaces the accelerometer-based orientation sensor 80 of FIG. 8.
- Various devices are commercially available for measuring the magnetic direction of the device. For example, magneto-resistors placed along axes 32 and 36 will produce orientation indications as the device is moved relative to the magnetic field of the earth.
- Tilt sensors may comprise a potentiometer and a small weight coupled to the potentiometer that is free to rotate on a fixed axis around which the tilt is measured.
- Other types of commercially available inclinometers are made of small, liquid-filled variable resistors that change their resistance based on their inclination.
- Orientation sensors like laser gyros may be used instead of the accelerometers 82 and 84.
- Yet another embodiment may employ several sets of magneto-resistors to measure orientation changes in two degrees of freedom without the use of an inclinometer or the accelerometers.
- The navigation command unit uses a speech recognition module 400 instead of the finger tap sensor 200 of FIG. 8 to command the micro-controller 100 to enter the view navigation mode.
- The speech recognition module may already be a part of the hand-held device 10 regardless of the view navigation feature, and as such it can be used to allow the operator to enter the view navigation mode verbally, while still holding the device in one hand.
- FIG. 11 omits the display 12, the display controller 104 and the memory 102, which are still connected to the microcontroller 100 as shown in FIG. 8 and FIG. 5.
- The magnetic direction sensor 364 and tilt sensor 368 can replace the accelerometers 82 and 84 in FIG. 5.
- The speech recognition circuitry 400 may be used instead of the finger tap detector 200 of FIG. 8, or the accelerometers 82 and 84 of the embodiment of FIG. 8 can replace the orientation sensor of FIG. 11. Selection of the sensor should be made with consideration of the overall structure of the hand-held device. In most applications, the devices that do not require moving parts and may be integrated into the silicon die of the main unit will most likely prove the most cost-efficient.
- FIG. 12 is a timing chart illustrating how the view navigation mode is entered and exited in the embodiments of the present invention that use the finger tap sensor of FIG. 8.
- The signal received from the sensor 202 on line 203 is shown in FIG. 12A.
- FIG. 12B shows the signal that represents the change of orientation along one axis.
- The signal 240 representing a finger tap is detected between time t1 and t2.
- A sufficiently large time delay of t3-t2 is introduced by the micro-controller to eliminate any wrong orientation change readings that result from the finger tap motion.
- The orientation change signals 242 and 244 are likely an artifact of the tapping by the operator.
- The micro-controller sets the device to the view navigation mode and alerts the operator with the visual indication 54.
- FIG. 12C and FIG. 12D are logic signals indicating with logic "high" that the device is set to the view navigation mode.
- FIG. 12C illustrates a navigation exit method by which the device remains at the view navigation mode 254 for a preset time of t4-t3.
- At time t4, the view navigation mode terminates to fix the displayed view. If more navigation is required, the operator repeats the process by tapping again at the bottom of the device.
- FIG. 12D illustrates another navigation exit method by which the device remains at the view navigation mode 256 as long as the operator changes the orientation of the device.
- During time period 246, the operator rotates the device in one direction, and during time period 248, the operator changes the direction of rotation.
- Since no change in orientation is detected during the time period 250, which is equal to t6-t5, the micro-controller terminates the view navigation mode 256 at time t6.
- The display will then not navigate in response to renewed orientation changes 252.
- The timing diagram of FIG. 12 can be used for other embodiments of the present invention.
- FIG. 13A outlines the software flow diagram for the embodiment of FIG. 12C that remains in view navigation mode only for a fixed time period and then exits to the fixed mode.
- The process starts at block 260 when the micro-controller 100 identifies the finger tapping or the voice command to enter the view navigation mode.
- The current boundary of the display view is marked in comparison to the stored virtual display.
- A delay corresponding to the time length t3-t2 of FIG. 12 is introduced at block 262 to allow the device some time to stabilize after the finger tap.
- The system is set to view navigation mode in block 268 to start the view navigation. It also provides the visual indication 54 to alert the operator that changes in orientation of the hand-held device will navigate the display.
- A variation of the process may activate the navigation indication 54 before block 264 but still enter the view navigation mode after the delay in block 264.
- A process timer is activated at block 268 to limit the stay in view navigation mode to a period of time equal to t4-t3.
- Block 270 monitors the process timer. If the process timer has not expired, the process continues to block 272 to acquire the pitch and roll orientation data, or in other embodiments of the present invention, the azimuth and inclination data. The process at block 272 also stores the orientation data for the next iteration, and compares the new and old orientation data. If a change in orientation is detected at block 274, the program computes the new boundary for the view at block 276.
- The micro-controller may interrupt the process to perform other tasks. Also, at the end of block 276 the micro-controller may allocate time through its operating system to other tasks, or even just wait for a preset time, to limit the number of iterations of the view navigation process in blocks 270 to 276 that are performed each second.
- A proper value of iterations per second depends on the selection of a coarse or fine view navigation mode as well as the response time of the orientation sensors, and it can be in the range of 5 to 30 for most practical applications.
- The advantage of a higher number of iterations per second is that it allows the navigation to match more closely the rotational movements of the hand, at the cost of increased overhead on the micro-controller operating system.
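The FIG. 13A flow can be pictured as the loop below: a short settling delay after the tap, a fixed navigation window governed by the process timer, and sensor polling a handful of times per second. The helper functions are assumed firmware hooks named only for illustration, and the iteration rate is one value inside the 5 to 30 range mentioned above.

```c
#include <stdbool.h>

/* Firmware hooks assumed to exist elsewhere (names are illustrative): */
extern void   delay_seconds(double s);
extern double read_pitch(void);
extern double read_roll(void);
extern void   scroll_view(double d_roll, double d_pitch);
extern void   show_navigation_indication(bool on);

#define ITERATIONS_PER_SECOND 20        /* within the practical 5..30 range */

void run_timed_navigation(double navigation_seconds, double settle_seconds)
{
    delay_seconds(settle_seconds);       /* t3 - t2: let tap artifacts die out */
    show_navigation_indication(true);    /* visual indication 54               */

    double elapsed    = 0.0;
    double prev_pitch = read_pitch();
    double prev_roll  = read_roll();

    while (elapsed < navigation_seconds) {          /* process timer, t4 - t3 */
        double pitch = read_pitch();
        double roll  = read_roll();
        double dp = pitch - prev_pitch, dr = roll - prev_roll;

        if (dp != 0.0 || dr != 0.0)                 /* change in orientation  */
            scroll_view(dr, dp);                    /* compute new boundary   */

        prev_pitch = pitch;
        prev_roll  = roll;

        delay_seconds(1.0 / ITERATIONS_PER_SECOND);
        elapsed += 1.0 / ITERATIONS_PER_SECOND;
    }
    show_navigation_indication(false);   /* exit to the fixed mode            */
}
```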
- FIG. 13B outlines the software flow diagram for the embodiment of FIG. 12D that remains in view navigation mode as long as the operator continues to change the orientation of the device.
- When the operator stops changing the orientation of the device for a preset amount of time, the view navigation mode terminates.
- The process starts in blocks 260, 262 and 264 as described in FIG. 13A.
- The process provides the visual indication 54 to alert the operator that changes in orientation of the hand-held device will navigate the display.
- The acquisition of orientation data in block 284 is similar to the discussion for block 272 of FIG. 13A. If no change is detected in the orientation data at block 286, the process activates the no-action timer of block 290, set to t6-t5. It then continues to acquire orientation data in block 292. If no change in the orientation data is detected at block 294, the process checks whether the no-action timer has expired. If the timer has expired, the view navigation mode ends in blocks 278 and 280 as described in FIG. 13A.
- If a change in orientation is detected at block 286, the program computes the new boundary for the view at block 288 and refreshes the display with the new view. It also saves the new current orientation as the basis for comparison in the next iteration of the process. Similarly, if block 294 detects a change in orientation data, it proceeds to block 288. Block 288 also deactivates the no-action timer of block 290, since a rotational action has just been detected.
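The no-action timer at the heart of the FIG. 13B exit rule can be reduced to a few lines: the timer is reloaded whenever an orientation change is seen and counts down otherwise, and navigation ends when it reaches zero. The names and the tick interface below are illustrative assumptions.

```c
typedef struct {
    double seconds_left;  /* time remaining until automatic exit (t6 - t5) */
    double timeout;       /* value the timer is reloaded with              */
} no_action_timer_t;

/* Call once per polling iteration.  Returns 1 while view navigation should
   stay active, 0 once the device has been held still for the full timeout. */
static int no_action_tick(no_action_timer_t *t, int orientation_changed,
                          double dt_seconds)
{
    if (orientation_changed)
        t->seconds_left = t->timeout;   /* rotation seen: keep navigating */
    else
        t->seconds_left -= dt_seconds;

    return t->seconds_left > 0.0;
}
```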
- FIG. 14 illustrates a further embodiment of the present invention that uses an additional Z-axis accelerometer to identify the operator's vertical hand gestures to enter the view navigation mode and for other commands.
- The orientation and movement sub-section 420 includes a Z-axis accelerometer 432 set to sense movement along the vertical axis 40 of FIG. 2, in addition to the orientation sensors 82 and 84.
- The output of the accelerometer 432 is connected to the analog-to-digital converter 434.
- The analog-to-digital converter provides a digital signal to the microcontroller 100 that is responsive to the device's vertical movements.
- The program of the micro-controller monitors vertical acceleration changes to detect special hand gestures by the operator. For example, a quick up and down gesture of the operator may indicate a command to enter the view navigation mode.
- The program may be adapted to identify additional types of hand gestures as different commands. While special hand gestures may be identified by the set of two X and Y accelerometers, the program needs to isolate a hand gesture which is intended as a command from similar signals that arise from operator navigation of the display.
- The Z-axis accelerometer 432 provides an extra measurement to allow the program to distinguish vertical movements from rotational navigation motion. For clarity purposes, FIG. 14 omits the display 12, the display controller 104 and the memory 102, which are still connected to the micro-controller 100 as shown in FIG. 8 and FIG. 5.
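A sketch of that discrimination, under the assumption that a command gesture shows a strong Z-axis acceleration while the pitch and roll stay comparatively still; the thresholds are invented for illustration.

```c
#include <math.h>
#include <stdbool.h>

/* Treat the current sample as part of a vertical command gesture only when
   the Z-axis accelerometer 432 reports a large excursion while the X/Y
   orientation changes remain small. */
static bool is_vertical_gesture(double z_accel_g, double d_pitch, double d_roll)
{
    const double Z_THRESHOLD    = 0.8;   /* g: strong up/down movement   */
    const double TILT_THRESHOLD = 0.1;   /* radians: "relatively stable" */

    return fabs(z_accel_g) > Z_THRESHOLD &&
           fabs(d_pitch)   < TILT_THRESHOLD &&
           fabs(d_roll)    < TILT_THRESHOLD;
}
```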
- Sensors that provide elevation information can be used instead of the Z-axis accelerometer.
- This may include an air pressure sensor that is sensitive to changes in air pressure, or a distance detector that measures the distance of the device from the floor.
- FIG. 15 illustrates the placement of the three accelerometers of the embodiment of FIG. 14 on the main PCB 450 of the hand-held device. The PCB 450 carries the micro-controller 100, connection means 452 for the LCD display, as well as other components, which are not shown for clarity. While many commercially available accelerometers incorporate two or more accelerometers on the same chip, the drawing illustrates the placement of the accelerometers assuming one accelerometer per chip.
- the X-axis accelerometer 82 and the Y-axis accelerometer 84 are placed at a right angle to each other.
- the Z-axis accelerometer 432 is mounted so that its sensitivity axis is perpendicular to the PCB 450. With high volume manufacturing, all three accelerometers may be incorporated in the same chip.
- FIG. 16 is a timing chart illustrating the use of the embodiment of the present invention of FIG. 14 with three accelerometers.
- FIG. 16A shows the acceleration signal received from the Z-axis accelerometer 432.
- FIG. 16B shows the signal that represents the change of orientation along the X-axis.
- FIG. 16C shows the signal that represents the change of orientation along the Y-axis.
- FIG. 16D is a logic signal indicating with logic "high" when the device is set to the view navigation mode. We assume that the gesture to enter or exit view navigation mode is a rapid vertical up and down motion while the device is at a relatively stable orientation.
- The Z-axis accelerometer exhibits a strong signal 472 that corresponds to a vertical gesture by the operator between time t10 and t11.
- The other accelerometers show signals 474 and 476, since some orientation changes occur even when the operator tries to keep the device at a relatively stable orientation.
- The micro-controller's program determines from this combination of signals that a valid operator gesture has been received to enter view navigation mode. After a preset time delay of t12-t11, the program sets the device to the view navigation mode 490 at time t12. The relatively small time delay ensures that the program ignores artifacts 478 and 480, which tend to follow the entry gesture.
- The program acquires the orientation signals 482 and 484 to navigate the display. The program also keeps in storage a short trail of the accelerometer signals.
- The operator makes another gesture 486.
- The program completes the identification of the gesture and exits the view navigation mode.
- The program then returns the view to its setting just before the initiation of the gesture at time t14, using the stored trails of the accelerometer signals.
- The operator is therefore attempting to stabilize the view at time t13 so that the desired final view is present during the period 488 before the gesture to terminate the view navigation mode. If it is not desired to store the data trail of the accelerometers so that the view can be restored after the exit gesture, the exit method of FIG. 12C or FIG. 12D may be used.
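Keeping the trail that makes this restore possible could be as simple as a small ring buffer of recent view positions, from which the position recorded just before the exit gesture began is read back. Buffer size and interface below are assumptions for illustration.

```c
#include <stddef.h>

#define TRAIL_LEN 32   /* roughly 1-2 s of history at 20 samples per second */

typedef struct {
    int    x[TRAIL_LEN], y[TRAIL_LEN];
    size_t head;       /* index of the most recent sample */
} view_trail_t;

/* Record the current view position once per navigation iteration. */
static void trail_push(view_trail_t *t, int x, int y)
{
    t->head = (t->head + 1) % TRAIL_LEN;
    t->x[t->head] = x;
    t->y[t->head] = y;
}

/* Fetch the view position recorded `samples_back` iterations ago, i.e. from
   just before the exit gesture started, so the display can be restored. */
static void trail_restore(const view_trail_t *t, size_t samples_back,
                          int *x, int *y)
{
    size_t idx = (t->head + TRAIL_LEN - (samples_back % TRAIL_LEN)) % TRAIL_LEN;
    *x = t->x[idx];
    *y = t->y[idx];
}
```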
- FIG. 17 outlines the software flow diagram for the embodiment of FIG. 14 that uses a vertical movement gesture of the operator to enter and exit view navigation mode.
- The process starts at block 500 in accordance with the overall tasking scheme of the operating system of the micro-controller 100.
- The process acquires all sensor data at block 502 and keeps a data trail in memory to allow data shape analysis at block 504. If the analysis currently shows no gesture, the process terminates at block 506.
- If block 504 determines that a gesture was made, the process continues with the initialization block 508, where the current boundary of the display view is marked in comparison to the stored virtual display.
- A delay corresponding to the time length t12-t11 of FIG. 16 is introduced at block 509 to allow the device some time to stabilize after the gesture.
- The system is set to view navigation mode in block 510 to start the view navigation. It also provides the visual indication 54 to alert the operator that changes in orientation of the hand-held device will navigate the display.
- The process acquires all sensor data at block 512 and keeps a data trail in memory to allow gesture shape analysis at block 514. If a change in orientation is detected at block 516, the program computes the new boundary for the view at block 518. It also refreshes the display to show the new view and saves the new current orientation as the basis for comparison in the next iteration of the process. The process then continues with the next iteration at block 512 to continue the navigation of the view. If no change in orientation was detected in block 516, the program continues to monitor the sensors at block 512. If a vertical gesture is detected in block 514, the program uses the data trail from the accelerometers to restore the view to its state just before the gesture (block 520). It then continues to block 522 to turn off the navigation indication 54, and the process ends at block 524.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Finance (AREA)
- Economics (AREA)
- Accounting & Taxation (AREA)
- Marketing (AREA)
- Development Economics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Entrepreneurship & Innovation (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- Navigation (AREA)
- Telephone Set Structure (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US543660 | 1995-10-16 | ||
US09/543,660 US6466198B1 (en) | 1999-11-05 | 2000-04-05 | View navigation and magnification of a hand-held device with a display |
PCT/US2001/010962 WO2001078055A1 (en) | 2000-04-05 | 2001-04-04 | View navigation and magnification of a hand-held device with a display |
Publications (3)
Publication Number | Publication Date |
---|---|
EP1290672A1 true EP1290672A1 (de) | 2003-03-12 |
EP1290672A4 EP1290672A4 (de) | 2005-04-06 |
EP1290672B1 EP1290672B1 (de) | 2008-01-02 |
Family
ID=24169004
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP01928361A Expired - Lifetime EP1290672B1 (de) | 2000-04-05 | 2001-04-04 | Ansichtsnavigation und vergrösserung eines tragbaren geräts mit einer anzeige |
Country Status (6)
Country | Link |
---|---|
US (2) | US6466198B1 (de) |
EP (1) | EP1290672B1 (de) |
AT (1) | ATE382889T1 (de) |
DE (1) | DE60132201T2 (de) |
HK (1) | HK1054610A1 (de) |
WO (1) | WO2001078055A1 (de) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9866667B2 (en) | 2012-02-24 | 2018-01-09 | Blackberry Limited | Handheld device with notification message viewing |
US10346022B2 (en) | 2013-07-24 | 2019-07-09 | Innoventions, Inc. | Tilt-based view scrolling with baseline update for proportional and dynamic modes |
Families Citing this family (471)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8352400B2 (en) | 1991-12-23 | 2013-01-08 | Hoffberg Steven M | Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore |
US6345104B1 (en) | 1994-03-17 | 2002-02-05 | Digimarc Corporation | Digital watermarks and methods for security documents |
JP3338777B2 (ja) * | 1998-04-22 | 2002-10-28 | 日本電気株式会社 | 携帯端末、及びその画面表示方法 |
US7966078B2 (en) | 1999-02-01 | 2011-06-21 | Steven Hoffberg | Network media appliance system and method |
US20060061550A1 (en) * | 1999-02-12 | 2006-03-23 | Sina Fateh | Display size emulation system |
US20060061551A1 (en) * | 1999-02-12 | 2006-03-23 | Vega Vista, Inc. | Motion detection and tracking system to control navigation and display of portable displays including on-chip gesture detection |
US7749089B1 (en) | 1999-02-26 | 2010-07-06 | Creative Kingdoms, Llc | Multi-media interactive play system |
US7406214B2 (en) | 1999-05-19 | 2008-07-29 | Digimarc Corporation | Methods and devices employing optical sensors and/or steganography |
US20020032734A1 (en) | 2000-07-26 | 2002-03-14 | Rhoads Geoffrey B. | Collateral data combined with user characteristics to select web site |
US8874244B2 (en) * | 1999-05-19 | 2014-10-28 | Digimarc Corporation | Methods and systems employing digital content |
US7760905B2 (en) * | 1999-06-29 | 2010-07-20 | Digimarc Corporation | Wireless mobile phone with content processing |
JP3847058B2 (ja) | 1999-10-04 | 2006-11-15 | 任天堂株式会社 | ゲームシステム及びそれに用いられるゲーム情報記憶媒体 |
US8391851B2 (en) | 1999-11-03 | 2013-03-05 | Digimarc Corporation | Gestural techniques with wireless mobile phone devices |
JP2001134382A (ja) * | 1999-11-04 | 2001-05-18 | Sony Corp | 図形処理装置 |
US6466198B1 (en) | 1999-11-05 | 2002-10-15 | Innoventions, Inc. | View navigation and magnification of a hand-held device with a display |
US7187412B1 (en) * | 2000-01-18 | 2007-03-06 | Hewlett-Packard Development Company, L.P. | Pointing device for digital camera display |
US6633314B1 (en) | 2000-02-02 | 2003-10-14 | Raja Tuli | Portable high speed internet device integrating cellular telephone and palm top computer |
US6941382B1 (en) * | 2000-02-07 | 2005-09-06 | Raja Tuli | Portable high speed internet or desktop device |
JP2001306254A (ja) * | 2000-02-17 | 2001-11-02 | Seiko Epson Corp | 打音検出による入力機能 |
US7878905B2 (en) | 2000-02-22 | 2011-02-01 | Creative Kingdoms, Llc | Multi-layered interactive play experience |
US7445550B2 (en) | 2000-02-22 | 2008-11-04 | Creative Kingdoms, Llc | Magical wand and interactive play experience |
US6761637B2 (en) | 2000-02-22 | 2004-07-13 | Creative Kingdoms, Llc | Method of game play using RFID tracking device |
US7302280B2 (en) * | 2000-07-17 | 2007-11-27 | Microsoft Corporation | Mobile phone operation based upon context sensing |
US7289102B2 (en) * | 2000-07-17 | 2007-10-30 | Microsoft Corporation | Method and apparatus using multiple sensors in a device with a display |
US20110267263A1 (en) | 2000-07-17 | 2011-11-03 | Microsoft Corporation | Changing input tolerances based on device movement |
US8120625B2 (en) * | 2000-07-17 | 2012-02-21 | Microsoft Corporation | Method and apparatus using multiple sensors in a device with a display |
US6520013B1 (en) * | 2000-10-02 | 2003-02-18 | Apple Computer, Inc. | Method and apparatus for detecting free fall |
US7688306B2 (en) | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
US7191211B2 (en) | 2000-10-03 | 2007-03-13 | Raja Tuli | Portable high speed internet access device priority protocol |
US7066781B2 (en) | 2000-10-20 | 2006-06-27 | Denise Chapman Weston | Children's toy with wireless tag/transponder |
US8817045B2 (en) * | 2000-11-06 | 2014-08-26 | Nant Holdings Ip, Llc | Interactivity via mobile image recognition |
US8130242B2 (en) * | 2000-11-06 | 2012-03-06 | Nant Holdings Ip, Llc | Interactivity via mobile image recognition |
US6690358B2 (en) * | 2000-11-30 | 2004-02-10 | Alan Edward Kaplan | Display control for hand-held devices |
JP3655824B2 (ja) * | 2000-12-07 | 2005-06-02 | 日本電気株式会社 | 携帯情報端末装置及びその表示方法 |
US7123212B2 (en) * | 2000-12-22 | 2006-10-17 | Harman International Industries, Inc. | Information transmission and display method and system for a handheld computing device |
US20020109673A1 (en) * | 2001-01-04 | 2002-08-15 | Thierry Valet | Method and apparatus employing angled single accelerometer sensing multi-directional motion |
JP2002268622A (ja) * | 2001-03-09 | 2002-09-20 | Denso Corp | 携帯端末装置のユーザインターフェース装置 |
US6834249B2 (en) * | 2001-03-29 | 2004-12-21 | Arraycomm, Inc. | Method and apparatus for controlling a computing system |
US6798429B2 (en) * | 2001-03-29 | 2004-09-28 | Intel Corporation | Intuitive mobile device interface to virtual spaces |
FI117488B (fi) * | 2001-05-16 | 2006-10-31 | Myorigo Sarl | Informaation selaus näytöllä |
DE10125395A1 (de) * | 2001-05-23 | 2002-11-28 | Siemens Ag | Verfahren und Anordnung zum Navigieren innerhalb eines Bildes |
US7177906B2 (en) * | 2001-05-31 | 2007-02-13 | Palmsource, Inc. | Software application launching method and apparatus |
SE523636C2 (sv) * | 2001-07-22 | 2004-05-04 | Tomer Shalit Ab | Portabelt datoriserat handhållet organ och förfarande för hantering av ett på en skärm visat objekt |
USRE47457E1 (en) * | 2001-08-07 | 2019-06-25 | Facebook, Inc. | Control of display content by movement on a fixed spherical space |
US7365734B2 (en) * | 2002-08-06 | 2008-04-29 | Rembrandt Ip Management, Llc | Control of display content by movement on a fixed spherical space |
US6847351B2 (en) * | 2001-08-13 | 2005-01-25 | Siemens Information And Communication Mobile, Llc | Tilt-based pointing for hand-held devices |
FR2828754A1 (fr) * | 2001-08-14 | 2003-02-21 | Koninkl Philips Electronics Nv | Visualisation d'un montage d'une video panoramique par application de commandes de navigation a ladite video panoramique |
SE0103151D0 (en) * | 2001-09-19 | 2001-09-19 | Ericsson Telefon Ab L M | Method for navigation and selection at a terminal device |
US6670947B2 (en) * | 2001-10-22 | 2003-12-30 | Robert William Smyth | SO3 input device |
EP1443385A1 (de) * | 2001-10-24 | 2004-08-04 | Sony Corporation | Bildinformationsanzeigeeinrichtung |
US7714880B2 (en) | 2001-11-16 | 2010-05-11 | Honeywell International Inc. | Method and apparatus for displaying images on a display |
DE60232945D1 (de) * | 2001-11-22 | 2009-08-27 | Yamaha Corp | Elektronisches Gerät |
US7002553B2 (en) * | 2001-12-27 | 2006-02-21 | Mark Shkolnikov | Active keyboard system for handheld electronic devices |
US7843437B1 (en) * | 2002-01-14 | 2010-11-30 | Palm, Inc. | Hand-held browser transcoding |
WO2003073258A2 (en) * | 2002-02-21 | 2003-09-04 | Mobicom Corporation | Article comprising an adaptable input device |
JP3721141B2 (ja) * | 2002-03-25 | 2005-11-30 | 松下電器産業株式会社 | 携帯端末装置 |
US6967566B2 (en) | 2002-04-05 | 2005-11-22 | Creative Kingdoms, Llc | Live-action interactive adventure game |
US20070066396A1 (en) | 2002-04-05 | 2007-03-22 | Denise Chapman Weston | Retail methods for providing an interactive product to a consumer |
US7079452B2 (en) * | 2002-04-16 | 2006-07-18 | Harrison Shelton E | Time display system, method and device |
FI115258B (fi) | 2002-04-23 | 2005-03-31 | Myorigo Oy | Menetelmä ja elektroninen laite graafisessa käyttöliittymässä navigoimiseksi |
JP2003316502A (ja) * | 2002-04-25 | 2003-11-07 | Sony Corp | 端末装置、文字入力方法 |
US7519918B2 (en) * | 2002-05-30 | 2009-04-14 | Intel Corporation | Mobile virtual desktop |
US20030231189A1 (en) * | 2002-05-31 | 2003-12-18 | Microsoft Corporation | Altering a display on a viewing device based upon a user controlled orientation of the viewing device |
US7184025B2 (en) * | 2002-05-31 | 2007-02-27 | Microsoft Corporation | Altering a display on a viewing device based upon a user controlled orientation of the viewing device |
US7055749B2 (en) * | 2002-06-03 | 2006-06-06 | Symbol Technologies, Inc. | Re-configurable trigger assembly |
US7218311B2 (en) * | 2002-06-21 | 2007-05-15 | Akins Randy D | Sequential image advancing system (the S.I.A.S.) |
US7674184B2 (en) | 2002-08-01 | 2010-03-09 | Creative Kingdoms, Llc | Interactive water attraction and quest game |
SG131745A1 (en) * | 2002-08-23 | 2007-05-28 | Sony Corp | Movement-compensated visual display |
TW200407025A (en) * | 2002-08-27 | 2004-05-01 | Vitec Co Ltd | Pocket terminal device |
JP4126045B2 (ja) * | 2002-10-07 | 2008-07-30 | マイオリゴ ソシエテ ア リスポンサビリテ リミテ | Electronic device, method for displaying a cursor, and computer program |
US20060176294A1 (en) * | 2002-10-07 | 2006-08-10 | Johannes Vaananen | Cursor for electronic devices |
US7064502B2 (en) * | 2002-11-22 | 2006-06-20 | Black & Decker Inc. | Power tool with remote stop |
US8176428B2 (en) | 2002-12-03 | 2012-05-08 | Datawind Net Access Corporation | Portable internet access device back page cache |
JP2004198450A (ja) * | 2002-12-16 | 2004-07-15 | Sharp Corp | Image display system |
US20040119684A1 (en) * | 2002-12-18 | 2004-06-24 | Xerox Corporation | System and method for navigating information |
US6977675B2 (en) * | 2002-12-30 | 2005-12-20 | Motorola, Inc. | Method and apparatus for virtually expanding a display |
US20040125073A1 (en) * | 2002-12-30 | 2004-07-01 | Scott Potter | Portable electronic apparatus and method employing motion sensor for function control |
FI20022282A0 (fi) * | 2002-12-30 | 2002-12-30 | Nokia Corp | Method for enabling interaction in an electronic device, and an electronic device |
WO2004066615A1 (en) * | 2003-01-22 | 2004-08-05 | Nokia Corporation | Image control |
US20040145613A1 (en) * | 2003-01-29 | 2004-07-29 | Stavely Donald J. | User Interface using acceleration for input |
US7426329B2 (en) | 2003-03-06 | 2008-09-16 | Microsoft Corporation | Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player |
US20110205056A9 (en) * | 2003-03-24 | 2011-08-25 | Borovoy Richard D | Adding social networking to devices |
US7538745B2 (en) * | 2003-03-24 | 2009-05-26 | Ntag Interactive Corporation | Apparatus and method for enhancing face-to-face communication |
US9446319B2 (en) | 2003-03-25 | 2016-09-20 | Mq Gaming, Llc | Interactive gaming toy |
JP4144555B2 (ja) * | 2003-06-09 | 2008-09-03 | カシオ計算機株式会社 | Electronic apparatus, display control method, and program |
US20040250220A1 (en) * | 2003-06-09 | 2004-12-09 | Mika Kalenius | System, apparatus, and method for navigation in a hypertext document |
DE10326811A1 (de) * | 2003-06-13 | 2005-01-20 | Siemens Ag | Method for displaying graphic objects, and communication device |
US20040259591A1 (en) * | 2003-06-17 | 2004-12-23 | Motorola, Inc. | Gesture-based interface and method for wireless device |
US6880258B2 (en) * | 2003-08-26 | 2005-04-19 | Horizon Hobby | Digital inclinometer and related methods |
US7489299B2 (en) * | 2003-10-23 | 2009-02-10 | Hillcrest Laboratories, Inc. | User interface devices and methods employing accelerometers |
JP3791848B2 (ja) * | 2003-10-28 | 2006-06-28 | 松下電器産業株式会社 | Image display device, image display system, imaging device, image display method, and program |
JP2007510234A (ja) * | 2003-10-31 | 2007-04-19 | イオタ・ワイアレス・エルエルシー | Concurrent data entry for a portable device |
US20080129552A1 (en) * | 2003-10-31 | 2008-06-05 | Iota Wireless Llc | Concurrent data entry for a portable device |
US7401300B2 (en) * | 2004-01-09 | 2008-07-15 | Nokia Corporation | Adaptive user interface input device |
WO2005071636A1 (en) * | 2004-01-20 | 2005-08-04 | Koninklijke Philips Electronics, N.V. | Advanced control device for home entertainment utilizing three dimensional motion technology |
US8442331B2 (en) | 2004-02-15 | 2013-05-14 | Google Inc. | Capturing text from rendered documents using supplemental information |
US7707039B2 (en) | 2004-02-15 | 2010-04-27 | Exbiblio B.V. | Automatic modification of web pages |
US7812860B2 (en) | 2004-04-01 | 2010-10-12 | Exbiblio B.V. | Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device |
US10635723B2 (en) | 2004-02-15 | 2020-04-28 | Google Llc | Search engines and systems with handheld document data capture devices |
US10575376B2 (en) | 2004-02-25 | 2020-02-25 | Lynk Labs, Inc. | AC light emitting diode and AC LED drive methods and apparatus |
WO2011143510A1 (en) | 2010-05-12 | 2011-11-17 | Lynk Labs, Inc. | Led lighting system |
US10499465B2 (en) | 2004-02-25 | 2019-12-03 | Lynk Labs, Inc. | High frequency multi-voltage and multi-brightness LED lighting devices and systems and methods of using same |
KR100631834B1 (ko) * | 2004-03-03 | 2006-10-09 | 삼성전기주식회사 | Mobile phone capable of number entry without button operation, and number entry method for the mobile phone |
TWI255681B (en) * | 2004-03-11 | 2006-05-21 | Giga Byte Tech Co Ltd | Method for controlling cold cathode fluorescent lamp to emit and flicker with digital audio source of main board and device thereof |
DE102004012897B4 (de) * | 2004-03-16 | 2006-01-12 | Siemens Ag | Method for displaying graphic objects, and communication device |
FI20045078A (fi) * | 2004-03-16 | 2005-09-17 | Myorigo Oy | Mobile device equipped with wide-angle optics and a radiation sensor |
US8842070B2 (en) * | 2004-03-17 | 2014-09-23 | Intel Corporation | Integrated tracking for on screen navigation with small hand held devices |
US7176887B2 (en) * | 2004-03-23 | 2007-02-13 | Fujitsu Limited | Environmental modeling for motion controlled handheld devices |
US7365737B2 (en) * | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Non-uniform gesture precision |
US7180502B2 (en) * | 2004-03-23 | 2007-02-20 | Fujitsu Limited | Handheld device with preferred motion selection |
US7903084B2 (en) * | 2004-03-23 | 2011-03-08 | Fujitsu Limited | Selective engagement of motion input modes |
US7280096B2 (en) * | 2004-03-23 | 2007-10-09 | Fujitsu Limited | Motion sensor engagement for a handheld device |
US7173604B2 (en) * | 2004-03-23 | 2007-02-06 | Fujitsu Limited | Gesture identification of controlled devices |
US7301529B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Context dependent gesture response |
US7176888B2 (en) * | 2004-03-23 | 2007-02-13 | Fujitsu Limited | Selective engagement of motion detection |
US20050212753A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Motion controlled remote controller |
KR100853605B1 (ko) * | 2004-03-23 | 2008-08-22 | 후지쯔 가부시끼가이샤 | Distinguishing tilt and translation motion components in handheld devices |
US7365735B2 (en) * | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Translation controlled cursor |
US7365736B2 (en) * | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Customizable gesture mappings for motion controlled handheld devices |
US7176886B2 (en) * | 2004-03-23 | 2007-02-13 | Fujitsu Limited | Spatial signatures |
US7301528B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Distinguishing tilt and translation motion components in handheld devices |
US7180501B2 (en) * | 2004-03-23 | 2007-02-20 | Fujitsu Limited | Gesture based navigation of a handheld user interface |
US20050212760A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture based user interface supporting preexisting symbols |
US7301526B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Dynamic adaptation of gestures for motion controlled handheld devices |
US7180500B2 (en) * | 2004-03-23 | 2007-02-20 | Fujitsu Limited | User definable gestures for motion controlled handheld devices |
US7301527B2 (en) * | 2004-03-23 | 2007-11-27 | Fujitsu Limited | Feedback based user interface for motion controlled handheld devices |
US20050219223A1 (en) * | 2004-03-31 | 2005-10-06 | Kotzin Michael D | Method and apparatus for determining the context of a device |
US7990556B2 (en) | 2004-12-03 | 2011-08-02 | Google Inc. | Association of a portable scanner with input/output and storage devices |
US20060081714A1 (en) | 2004-08-23 | 2006-04-20 | King Martin T | Portable scanning device |
US8146156B2 (en) | 2004-04-01 | 2012-03-27 | Google Inc. | Archive of text captures from rendered documents |
US9143638B2 (en) | 2004-04-01 | 2015-09-22 | Google Inc. | Data capture from rendered documents using handheld device |
US20060098900A1 (en) | 2004-09-27 | 2006-05-11 | King Martin T | Secure data gathering from rendered documents |
US8081849B2 (en) | 2004-12-03 | 2011-12-20 | Google Inc. | Portable scanning and memory device |
US7894670B2 (en) | 2004-04-01 | 2011-02-22 | Exbiblio B.V. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US9008447B2 (en) | 2004-04-01 | 2015-04-14 | Google Inc. | Method and system for character recognition |
WO2008028674A2 (en) | 2006-09-08 | 2008-03-13 | Exbiblio B.V. | Optical scanners, such as hand-held optical scanners |
US9116890B2 (en) | 2004-04-01 | 2015-08-25 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8713418B2 (en) | 2004-04-12 | 2014-04-29 | Google Inc. | Adding value to a rendered document |
US8489624B2 (en) | 2004-05-17 | 2013-07-16 | Google, Inc. | Processing techniques for text capture from a rendered document |
US8620083B2 (en) | 2004-12-03 | 2013-12-31 | Google Inc. | Method and system for character recognition |
US8874504B2 (en) | 2004-12-03 | 2014-10-28 | Google Inc. | Processing techniques for visual capture data from a rendered document |
TWI248043B (en) * | 2004-04-20 | 2006-01-21 | Wistron Corp | Electrical device capable of auto-adjusting display direction as a tilt of a display |
US7339600B2 (en) * | 2004-04-26 | 2008-03-04 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying a picture in a wireless terminal |
KR101192514B1 (ko) * | 2004-04-30 | 2012-10-17 | 힐크레스트 래보래토리스, 인크. | 3D pointing devices with tilt compensation and improved usability |
JP2007535774A (ja) * | 2004-04-30 | 2007-12-06 | ヒルクレスト・ラボラトリーズ・インコーポレイテッド | Methods and devices for removing unintentional movement in free-space pointing devices |
CN102566751B (zh) * | 2004-04-30 | 2016-08-03 | 希尔克瑞斯特实验室公司 | Free-space pointing devices and methods |
PL2337016T3 (pl) * | 2004-04-30 | 2018-07-31 | Idhl Holdings Inc | Free-space pointing devices with tilt compensation and improved usability |
US8629836B2 (en) | 2004-04-30 | 2014-01-14 | Hillcrest Laboratories, Inc. | 3D pointing devices with orientation compensation and improved usability |
JP2005321972A (ja) * | 2004-05-07 | 2005-11-17 | Sony Corp | Information processing device, processing method in the information processing device, and processing program for the information processing device |
US7310086B2 (en) * | 2004-05-12 | 2007-12-18 | Avago Technologies Ecbu Ip (Singapore) Pte Ltd | Finger navigation input device |
WO2005119356A2 (en) | 2004-05-28 | 2005-12-15 | Erik Jan Banning | Interactive direct-pointing system and calibration method |
JP2008502043A (ja) * | 2004-06-04 | 2008-01-24 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Portable device for content navigation by a user |
FI119746B (fi) * | 2004-06-24 | 2009-02-27 | Nokia Corp | Controlling an electronic device |
JP4559140B2 (ja) * | 2004-07-05 | 2010-10-06 | ソフトバンクモバイル株式会社 | Electronic apparatus |
EP1767900A4 (de) * | 2004-07-15 | 2010-01-20 | Amosense Co Ltd | Mobile terminal device |
US8346620B2 (en) | 2004-07-19 | 2013-01-01 | Google Inc. | Automatic modification of web pages |
JP2006031515A (ja) * | 2004-07-20 | 2006-02-02 | Vodafone Kk | Mobile communication terminal, application program, image display control device, and image display control method |
EP1738566A4 (de) * | 2004-08-09 | 2012-11-14 | Rpx Corp | Method for collecting measurement data and portable information device |
US8560972B2 (en) * | 2004-08-10 | 2013-10-15 | Microsoft Corporation | Surface UI for gesture-based interaction |
US7138979B2 (en) * | 2004-08-27 | 2006-11-21 | Motorola, Inc. | Device orientation based input signal generation |
US7446753B2 (en) * | 2004-09-10 | 2008-11-04 | Hand Held Products, Inc. | Hand held computer device |
FR2876470B1 (fr) * | 2004-10-12 | 2006-12-22 | Eastman Kodak Co | Display control method using portable equipment with an image sensor |
US20060097983A1 (en) * | 2004-10-25 | 2006-05-11 | Nokia Corporation | Tapping input on an electronic device |
FR2877451B1 (fr) * | 2004-10-29 | 2008-09-19 | Radiotelephone Sfr | Control system with movable spheres and terminal equipped with such a system |
US7435177B1 (en) | 2004-11-12 | 2008-10-14 | Sprint Spectrum L.P. | Method and system for video-based navigation in an application on a handheld game device |
JP2006145616A (ja) * | 2004-11-16 | 2006-06-08 | Konica Minolta Photo Imaging Inc | Image display device, electronic apparatus, and image display method |
WO2006058129A2 (en) | 2004-11-23 | 2006-06-01 | Hillcrest Laboratories, Inc. | Semantic gaming and application transformation |
KR100641182B1 (ko) * | 2004-12-30 | 2006-11-02 | 엘지전자 주식회사 | Apparatus and method for moving a virtual screen in a portable terminal |
US7532198B2 (en) * | 2005-01-14 | 2009-05-12 | Research In Motion Limited | Handheld electronic device with roller ball input |
US20060164382A1 (en) * | 2005-01-25 | 2006-07-27 | Technology Licensing Company, Inc. | Image manipulation in response to a movement of a display |
US10569134B2 (en) * | 2005-01-26 | 2020-02-25 | K-Motion Interactive, Inc. | Method and system for athletic motion analysis and instruction |
US7539513B2 (en) | 2005-02-02 | 2009-05-26 | National Telephone Products, Inc. | Portable phone with ergonomic image projection system |
DE602005012792D1 (de) * | 2005-02-28 | 2009-04-02 | Research In Motion Ltd | System and method for navigating the user interface of a mobile device using a direction-sensitive sensor |
US20060195354A1 (en) * | 2005-02-28 | 2006-08-31 | Ntag Interactive Corporation | Method of scoring the performance of attendees at a meeting |
US7519468B2 (en) | 2005-02-28 | 2009-04-14 | Research In Motion Limited | System and method for navigating a mobile device user interface with a directional sensing device |
US20090297062A1 (en) * | 2005-03-04 | 2009-12-03 | Molne Anders L | Mobile device with wide-angle optics and a radiation sensor |
US20090305727A1 (en) * | 2005-03-04 | 2009-12-10 | Heikki Pylkko | Mobile device with wide range-angle optics and a radiation sensor |
US7966084B2 (en) * | 2005-03-07 | 2011-06-21 | Sony Ericsson Mobile Communications Ab | Communication terminals with a tap determination circuit |
KR100702055B1 (ko) * | 2005-03-09 | 2007-04-02 | 인피닉스 주식회사 | Digital level measuring instrument |
US20060259205A1 (en) * | 2005-05-13 | 2006-11-16 | Robert Bosch Gmbh | Controlling systems through user tapping |
US20070002018A1 (en) * | 2005-06-30 | 2007-01-04 | Eigo Mori | Control of user interface of electronic device |
US9285897B2 (en) | 2005-07-13 | 2016-03-15 | Ultimate Pointer, L.L.C. | Easily deployable interactive direct-pointing system and calibration method therefor |
AT502228B1 (de) * | 2005-08-11 | 2007-07-15 | Ftw Forschungszentrum Telekomm | Portable navigation device and method for radio navigation |
US7942745B2 (en) | 2005-08-22 | 2011-05-17 | Nintendo Co., Ltd. | Game operating device |
US7927216B2 (en) | 2005-09-15 | 2011-04-19 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
JP4805633B2 (ja) | 2005-08-22 | 2011-11-02 | 任天堂株式会社 | Game operating device |
US8313379B2 (en) | 2005-08-22 | 2012-11-20 | Nintendo Co., Ltd. | Video game system with wireless modular handheld controller |
JP4262726B2 (ja) | 2005-08-24 | 2009-05-13 | 任天堂株式会社 | Game controller and game system |
US8870655B2 (en) | 2005-08-24 | 2014-10-28 | Nintendo Co., Ltd. | Wireless game controllers |
EP2764899A3 (de) * | 2005-08-29 | 2014-12-10 | Nant Holdings IP, LLC | Interactivity via mobile image recognition |
US8308563B2 (en) | 2005-08-30 | 2012-11-13 | Nintendo Co., Ltd. | Game system and storage medium having game program stored thereon |
US7647175B2 (en) * | 2005-09-09 | 2010-01-12 | Rembrandt Technologies, Lp | Discrete inertial display navigation |
KR100677613B1 (ko) * | 2005-09-09 | 2007-02-02 | 삼성전자주식회사 | Method and apparatus for controlling operation of a multimedia device |
US8157651B2 (en) | 2005-09-12 | 2012-04-17 | Nintendo Co., Ltd. | Information processing program |
US20070060216A1 (en) * | 2005-09-12 | 2007-03-15 | Cheng-Wen Huang | Portable communication apparatus |
US20070057911A1 (en) * | 2005-09-12 | 2007-03-15 | Sina Fateh | System and method for wireless network content conversion for intuitively controlled portable displays |
KR100651368B1 (ko) * | 2005-09-15 | 2006-11-29 | 삼성전자주식회사 | Method for controlling an image according to movement of a portable terminal |
KR100746995B1 (ko) * | 2005-09-22 | 2007-08-08 | 한국과학기술원 | System based on intuitive real-space aiming, and identification and communication methods therefor |
FI20055590L (fi) * | 2005-11-03 | 2007-05-04 | Wearfone Oy | Method and device for wirelessly producing sound to a user's ear |
US20070113207A1 (en) * | 2005-11-16 | 2007-05-17 | Hillcrest Laboratories, Inc. | Methods and systems for gesture classification in 3D pointing devices |
TWI291117B (en) * | 2005-12-29 | 2007-12-11 | High Tech Comp Corp | A tapping operation method and a mobile electrical apparatus with tapping operation function |
TW200725566A (en) * | 2005-12-30 | 2007-07-01 | High Tech Comp Corp | Display controller |
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
TWI309034B (en) * | 2005-12-30 | 2009-04-21 | High Tech Comp Corp | Display controller |
US20070159456A1 (en) * | 2006-01-10 | 2007-07-12 | Unkrich Mark A | Navigation system |
CN101009089A (zh) * | 2006-01-26 | 2007-08-01 | 宏达国际电子股份有限公司 | Screen display control device |
WO2007086581A1 (ja) * | 2006-01-30 | 2007-08-02 | Kyocera Corporation | Portable electronic device and orientation display method therefor |
US8139030B2 (en) * | 2006-02-01 | 2012-03-20 | Memsic, Inc. | Magnetic sensor for use with hand-held devices |
US7667686B2 (en) * | 2006-02-01 | 2010-02-23 | Memsic, Inc. | Air-writing and motion sensing input for portable devices |
CN101018241A (zh) * | 2006-02-06 | 2007-08-15 | 宏达国际电子股份有限公司 | Operating method for an electronic device, and portable electronic device with tap-control function |
US20070198324A1 (en) * | 2006-02-22 | 2007-08-23 | Borovoy Richard D | Enabling connections between and events attended by people |
JP2007233753A (ja) * | 2006-03-01 | 2007-09-13 | Fujitsu Ltd | Information processing device provided with an acceleration sensor |
JP4530419B2 (ja) | 2006-03-09 | 2010-08-25 | 任天堂株式会社 | Coordinate calculation device and coordinate calculation program |
TW200734913A (en) * | 2006-03-10 | 2007-09-16 | Inventec Appliances Corp | Electronic device and method using displacement sensor to move position displayed on screen |
JP4151982B2 (ja) | 2006-03-10 | 2008-09-17 | 任天堂株式会社 | Motion determination device and motion determination program |
CN101042848A (zh) * | 2006-03-24 | 2007-09-26 | 宏达国际电子股份有限公司 | Screen display control device and screen display control method thereof |
JP4684147B2 (ja) | 2006-03-28 | 2011-05-18 | 任天堂株式会社 | Inclination calculation device, inclination calculation program, game device, and game program |
US20070229650A1 (en) * | 2006-03-30 | 2007-10-04 | Nokia Corporation | Mobile communications terminal and method therefor |
US20070236334A1 (en) * | 2006-03-31 | 2007-10-11 | Borovoy Richard D | Enhancing face-to-face communication |
US7841967B1 (en) | 2006-04-26 | 2010-11-30 | Dp Technologies, Inc. | Method and apparatus for providing fitness coaching using a mobile device |
US20070268246A1 (en) * | 2006-05-17 | 2007-11-22 | Edward Craig Hyatt | Electronic equipment with screen pan and zoom functions using motion |
JP2009534690A (ja) * | 2006-07-10 | 2009-09-24 | メムシック,インコーポレイテッド | System for sensing yaw using a magnetic field sensor, and portable electronic device using the system |
US8902154B1 (en) | 2006-07-11 | 2014-12-02 | Dp Technologies, Inc. | Method and apparatus for utilizing motion user interface |
US8139026B2 (en) | 2006-08-02 | 2012-03-20 | Research In Motion Limited | System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device |
US8493323B2 (en) * | 2006-08-02 | 2013-07-23 | Research In Motion Limited | System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device |
US7956849B2 (en) | 2006-09-06 | 2011-06-07 | Apple Inc. | Video manager for portable multifunction device |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US8842074B2 (en) | 2006-09-06 | 2014-09-23 | Apple Inc. | Portable electronic device performing similar operations for different gestures |
US7864163B2 (en) | 2006-09-06 | 2011-01-04 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying structured electronic documents |
TWI346494B (en) * | 2006-09-08 | 2011-08-01 | High Tech Comp Corp | Page movement controller and operating method thereof |
JP5173174B2 (ja) * | 2006-09-13 | 2013-03-27 | 任天堂株式会社 | Game device, game program, game system, and game processing method |
US7889173B2 (en) * | 2006-09-14 | 2011-02-15 | Microsoft Corporation | Defining user input fields on a portable media device |
EP1914622A3 (de) * | 2006-10-16 | 2012-11-28 | Samsung Electronics Co., Ltd. | Method and apparatus for moving a list on a screen |
US8965885B2 (en) * | 2006-11-14 | 2015-02-24 | Google Technology Holdings LLC | System and method for browsing web pages on a mobile communication device |
TWI330802B (en) * | 2006-12-13 | 2010-09-21 | Ind Tech Res Inst | Inertial sensing method and system |
US7999797B2 (en) * | 2006-12-26 | 2011-08-16 | Sony Ericsson Mobile Communications Ab | Detecting and locating a touch or a tap on an input surface |
US8214768B2 (en) * | 2007-01-05 | 2012-07-03 | Apple Inc. | Method, system, and graphical user interface for viewing multiple application windows |
US20080165148A1 (en) * | 2007-01-07 | 2008-07-10 | Richard Williamson | Portable Electronic Device, Method, and Graphical User Interface for Displaying Inline Multimedia Content |
US8519964B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
JP5127242B2 (ja) | 2007-01-19 | 2013-01-23 | 任天堂株式会社 | Acceleration data processing program and game program |
WO2008094458A1 (en) | 2007-01-26 | 2008-08-07 | F-Origin, Inc. | Viewing images with tilt control on a hand-held device |
US8620353B1 (en) | 2007-01-26 | 2013-12-31 | Dp Technologies, Inc. | Automatic sharing and publication of multimedia from a mobile device |
US8949070B1 (en) | 2007-02-08 | 2015-02-03 | Dp Technologies, Inc. | Human activity monitoring device with activity identification |
JP5607286B2 (ja) * | 2007-03-27 | 2014-10-15 | 日本電気株式会社 | Information processing terminal, control method for the information processing terminal, and program |
WO2008128087A1 (en) * | 2007-04-13 | 2008-10-23 | Keynetik, Inc. | A force sensing apparatus and method to determine the radius of rotation of a moving object |
US9933937B2 (en) | 2007-06-20 | 2018-04-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for playing online videos |
CN101330811B (zh) * | 2007-06-22 | 2010-12-08 | 鸿富锦精密工业(深圳)有限公司 | Portable electronic device and operating method thereof |
US20090002325A1 (en) * | 2007-06-27 | 2009-01-01 | Think/Thing | System and method for operating an electronic device |
US7860676B2 (en) | 2007-06-28 | 2010-12-28 | Hillcrest Laboratories, Inc. | Real-time dynamic tracking of bias |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
US20090007006A1 (en) * | 2007-06-29 | 2009-01-01 | Palm, Inc. | Automatic scrolling |
CN101355746B (zh) * | 2007-07-27 | 2012-05-16 | 深圳富泰宏精密工业有限公司 | Wireless communication device |
US8555282B1 (en) | 2007-07-27 | 2013-10-08 | Dp Technologies, Inc. | Optimizing preemptive operating system with motion sensing |
US9619143B2 (en) | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
US8619038B2 (en) | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
US8144780B2 (en) * | 2007-09-24 | 2012-03-27 | Microsoft Corporation | Detecting visual gestural patterns |
US20090089705A1 (en) * | 2007-09-27 | 2009-04-02 | Microsoft Corporation | Virtual object navigation |
TW200915960A (en) * | 2007-09-28 | 2009-04-01 | Benq Corp | Sensing module |
US11317495B2 (en) | 2007-10-06 | 2022-04-26 | Lynk Labs, Inc. | LED circuits and assemblies |
US11297705B2 (en) | 2007-10-06 | 2022-04-05 | Lynk Labs, Inc. | Multi-voltage and multi-brightness LED lighting devices and methods of using same |
US20090099812A1 (en) * | 2007-10-11 | 2009-04-16 | Philippe Kahn | Method and Apparatus for Position-Context Based Actions |
KR101542274B1 (ko) * | 2007-10-16 | 2015-08-06 | 힐크레스트 래보래토리스, 인크. | Fast and smooth scrolling of user interfaces operating on thin clients |
US7800044B1 (en) | 2007-11-09 | 2010-09-21 | Dp Technologies, Inc. | High ambient motion environment detection eliminate accidental activation of a device |
US8418083B1 (en) * | 2007-11-26 | 2013-04-09 | Sprint Communications Company L.P. | Applying a navigational mode to a device |
TWI373708B (en) * | 2007-11-27 | 2012-10-01 | Htc Corp | Power management method for handheld electronic device |
US8213999B2 (en) * | 2007-11-27 | 2012-07-03 | Htc Corporation | Controlling method and system for handheld communication device and recording medium using the same |
US8260367B2 (en) * | 2007-12-12 | 2012-09-04 | Sharp Laboratories Of America, Inc. | Motion driven follow-up alerts for mobile electronic device |
US9569086B2 (en) * | 2007-12-12 | 2017-02-14 | Nokia Technologies Oy | User interface having realistic physical effects |
US8203528B2 (en) * | 2007-12-13 | 2012-06-19 | Sony Ericsson Mobile Communications Ab | Motion activated user interface for mobile communications device |
US20090160666A1 (en) * | 2007-12-21 | 2009-06-25 | Think/Thing | System and method for operating and powering an electronic device |
US20090167702A1 (en) * | 2008-01-02 | 2009-07-02 | Nokia Corporation | Pointing device detection |
US8423076B2 (en) * | 2008-02-01 | 2013-04-16 | Lg Electronics Inc. | User interface for a mobile device |
US8195220B2 (en) * | 2008-02-01 | 2012-06-05 | Lg Electronics Inc. | User interface for mobile devices |
WO2009105821A1 (en) * | 2008-02-29 | 2009-09-03 | Hamish Mclennan | A method and system responsive to intentional movement of a device |
US8624844B2 (en) | 2008-04-01 | 2014-01-07 | Litl Llc | Portable computer with multiple display configurations |
US9003315B2 (en) | 2008-04-01 | 2015-04-07 | Litl Llc | System and method for streamlining user interaction with electronic content |
US8612888B2 (en) | 2008-04-01 | 2013-12-17 | Litl, Llc | Method and apparatus for managing digital media content |
WO2009145854A1 (en) * | 2008-04-15 | 2009-12-03 | Hillcrest Laboratories, Inc. | Tracking determination based on intensity angular gradient of a wave |
US9582049B2 (en) * | 2008-04-17 | 2017-02-28 | Lg Electronics Inc. | Method and device for controlling user interface based on user's gesture |
JP4971241B2 (ja) * | 2008-05-09 | 2012-07-11 | 株式会社リコー | Image display device |
US8285344B2 (en) | 2008-05-21 | 2012-10-09 | DP Technlogies, Inc. | Method and apparatus for adjusting audio for a user environment |
JP5537044B2 (ja) * | 2008-05-30 | 2014-07-02 | キヤノン株式会社 | Image display device, control method therefor, and computer program |
CN101598972A (zh) * | 2008-06-04 | 2009-12-09 | 鸿富锦精密工业(深圳)有限公司 | Electronic device and function switching method thereof |
US9253416B2 (en) * | 2008-06-19 | 2016-02-02 | Motorola Solutions, Inc. | Modulation of background substitution based on camera attitude and motion |
US8996332B2 (en) | 2008-06-24 | 2015-03-31 | Dp Technologies, Inc. | Program setting adjustments based on activity identification |
US20090325710A1 (en) * | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Dynamic Selection Of Sensitivity Of Tilt Functionality |
CN101644987A (zh) * | 2008-08-08 | 2010-02-10 | 深圳富泰宏精密工业有限公司 | Mobile terminal and menu selection method thereof |
US8385971B2 (en) * | 2008-08-19 | 2013-02-26 | Digimarc Corporation | Methods and systems for content processing |
CA2734987A1 (en) | 2008-08-22 | 2010-02-25 | Google Inc. | Navigation in a three dimensional environment on a mobile device |
KR101481556B1 (ko) * | 2008-09-10 | 2015-01-13 | 엘지전자 주식회사 | Mobile terminal and object display method using the same |
US20100060667A1 (en) * | 2008-09-10 | 2010-03-11 | Apple Inc. | Angularly dependent display optimized for multiple viewing angles |
US8872646B2 (en) | 2008-10-08 | 2014-10-28 | Dp Technologies, Inc. | Method and system for waking up a device due to motion |
EP2175343A1 (de) * | 2008-10-08 | 2010-04-14 | Research in Motion Limited | Method and handheld electronic device having a graphical user interface which arranges icons dynamically |
US20100095250A1 (en) * | 2008-10-15 | 2010-04-15 | Raytheon Company | Facilitating Interaction With An Application |
JP5280800B2 (ja) * | 2008-10-29 | 2013-09-04 | 京セラ株式会社 | Portable device, operation detection method, and operation detection program |
KR101569176B1 (ko) | 2008-10-30 | 2015-11-20 | 삼성전자주식회사 | Method and apparatus for executing an object |
KR101185589B1 (ko) * | 2008-11-14 | 2012-09-24 | (주)마이크로인피니티 | Method and device for inputting user commands through motion sensing |
US8645871B2 (en) * | 2008-11-21 | 2014-02-04 | Microsoft Corporation | Tiltable user interface |
US8717283B1 (en) * | 2008-11-25 | 2014-05-06 | Sprint Communications Company L.P. | Utilizing motion of a device to manipulate a display screen feature |
KR20100066036A (ko) * | 2008-12-09 | 2010-06-17 | 삼성전자주식회사 | Method and apparatus for operating a portable terminal |
US8248371B2 (en) * | 2008-12-19 | 2012-08-21 | Verizon Patent And Licensing Inc. | Accelerometer sensitive soft input panel |
US20100164756A1 (en) * | 2008-12-30 | 2010-07-01 | Nokia Corporation | Electronic device user input |
TW201025079A (en) * | 2008-12-30 | 2010-07-01 | E Ten Information Sys Co Ltd | Hand-held electronic device and operating method thereof |
US20100188397A1 (en) * | 2009-01-28 | 2010-07-29 | Apple Inc. | Three dimensional navigation using deterministic movement of an electronic device |
US8294766B2 (en) | 2009-01-28 | 2012-10-23 | Apple Inc. | Generating a three-dimensional model using a portable electronic device recording |
US8890898B2 (en) * | 2009-01-28 | 2014-11-18 | Apple Inc. | Systems and methods for navigating a scene using deterministic movement of an electronic device |
US8704767B2 (en) * | 2009-01-29 | 2014-04-22 | Microsoft Corporation | Environmental gesture recognition |
WO2010096193A2 (en) | 2009-02-18 | 2010-08-26 | Exbiblio B.V. | Identifying a document by performing spectral analysis on the contents of the document |
JP4706985B2 (ja) * | 2009-03-04 | 2011-06-22 | コニカミノルタビジネステクノロジーズ株式会社 | Content display device |
KR101549556B1 (ko) * | 2009-03-06 | 2015-09-03 | 엘지전자 주식회사 | Portable terminal and control method thereof |
WO2010105246A2 (en) | 2009-03-12 | 2010-09-16 | Exbiblio B.V. | Accessing resources based on capturing information from a rendered document |
US8447066B2 (en) | 2009-03-12 | 2013-05-21 | Google Inc. | Performing actions based on capturing information from rendered documents, such as documents under copyright |
US8392340B2 (en) | 2009-03-13 | 2013-03-05 | Apple Inc. | Method and apparatus for detecting conditions of a peripheral device including motion, and determining/predicting temperature(S) wherein at least one temperature is weighted based on detected conditions |
US20100248203A1 (en) * | 2009-03-26 | 2010-09-30 | Kuo Hsiing Cho | Portable LED interactive learning device |
US8019903B2 (en) * | 2009-03-27 | 2011-09-13 | Microsoft Corporation | Removable accessory for a computing device |
TWI383315B (zh) * | 2009-03-27 | 2013-01-21 | Wistron Corp | Computer screen display method, computer having an upright display device, recording medium storing a basic input/output system, and computer program product |
US9529437B2 (en) | 2009-05-26 | 2016-12-27 | Dp Technologies, Inc. | Method and apparatus for a motion state aware device |
US20100302277A1 (en) * | 2009-05-27 | 2010-12-02 | International Business Machines Corporation | Image Modification for Web Pages |
US9298336B2 (en) | 2009-05-28 | 2016-03-29 | Apple Inc. | Rotation smoothing of a user interface |
US8265717B2 (en) | 2009-06-26 | 2012-09-11 | Motorola Mobility Llc | Implementation of touchpad on rear surface of single-axis hinged device |
US20100328219A1 (en) * | 2009-06-30 | 2010-12-30 | Motorola, Inc. | Method for Integrating an Imager and Flash into a Keypad on a Portable Device |
US8095191B2 (en) * | 2009-07-06 | 2012-01-10 | Motorola Mobility, Inc. | Detection and function of seven self-supported orientations in a portable device |
US8497884B2 (en) | 2009-07-20 | 2013-07-30 | Motorola Mobility Llc | Electronic device and method for manipulating graphic user interface elements |
US8462126B2 (en) * | 2009-07-20 | 2013-06-11 | Motorola Mobility Llc | Method for implementing zoom functionality on a portable device with opposing touch sensitive surfaces |
US8531571B1 (en) * | 2009-08-05 | 2013-09-10 | Bentley Systems, Incorporated | System and method for browsing a large document on a portable electronic device |
US8675019B1 (en) | 2009-12-03 | 2014-03-18 | Innoventions, Inc. | View navigation guidance system for hand held devices with display |
US8494544B2 (en) * | 2009-12-03 | 2013-07-23 | Osocad Remote Limited Liability Company | Method, apparatus and computer program to perform location specific information retrieval using a gesture-controlled handheld mobile device |
US9081799B2 (en) | 2009-12-04 | 2015-07-14 | Google Inc. | Using gestalt information to identify locations in printed information |
US9323784B2 (en) | 2009-12-09 | 2016-04-26 | Google Inc. | Image search using text-based elements within the contents of images |
JP5454133B2 (ja) * | 2009-12-25 | 2014-03-26 | 富士通株式会社 | Detection information correction device, portable device, detection information correction method, and computer program |
US8736561B2 (en) | 2010-01-06 | 2014-05-27 | Apple Inc. | Device, method, and graphical user interface with content display modes and display rotation heuristics |
US8438504B2 (en) | 2010-01-06 | 2013-05-07 | Apple Inc. | Device, method, and graphical user interface for navigating through multiple viewing areas |
US8339364B2 (en) | 2010-02-03 | 2012-12-25 | Nintendo Co., Ltd. | Spatially-correlated multi-display human-machine interface |
CA2746481C (en) | 2010-02-03 | 2017-06-13 | Nintendo Co., Ltd. | Game system, controller device, and game process method |
US8814686B2 (en) | 2010-02-03 | 2014-08-26 | Nintendo Co., Ltd. | Display device, game system, and game method |
US8913009B2 (en) | 2010-02-03 | 2014-12-16 | Nintendo Co., Ltd. | Spatially-correlated multi-display human-machine interface |
US8947355B1 (en) | 2010-03-25 | 2015-02-03 | Amazon Technologies, Inc. | Motion-based character selection |
US8452260B2 (en) * | 2010-03-25 | 2013-05-28 | Hewlett-Packard Development Company, L.P. | Methods and apparatus for unlocking an electronic device |
US20110250967A1 (en) * | 2010-04-13 | 2011-10-13 | Kulas Charles J | Gamepiece controller using a movable position-sensing display device |
US8123614B2 (en) * | 2010-04-13 | 2012-02-28 | Kulas Charles J | Gamepiece controller using a movable position-sensing display device including a movement currency mode of movement |
US8267788B2 (en) * | 2010-04-13 | 2012-09-18 | Kulas Charles J | Gamepiece controller using a movable position-sensing display device including a movement currency mode of movement |
DE102010028716A1 (de) * | 2010-05-07 | 2011-11-10 | Robert Bosch Gmbh | Device and method for operating a device |
US8581844B2 (en) * | 2010-06-23 | 2013-11-12 | Google Inc. | Switching between a first operational mode and a second operational mode using a natural motion gesture |
US8672837B2 (en) | 2010-06-24 | 2014-03-18 | Hansen Medical, Inc. | Methods and devices for controlling a shapeable medical device |
US20120016641A1 (en) | 2010-07-13 | 2012-01-19 | Giuseppe Raffa | Efficient gesture processing |
JP6243586B2 (ja) | 2010-08-06 | 2017-12-06 | 任天堂株式会社 | Game system, game device, game program, and game processing method |
US10150033B2 (en) | 2010-08-20 | 2018-12-11 | Nintendo Co., Ltd. | Position calculation system, position calculation device, storage medium storing position calculation program, and position calculation method |
CN101938561A (zh) * | 2010-08-30 | 2011-01-05 | 惠州Tcl移动通信有限公司 | Method for hanging up an incoming call, and mobile communication terminal |
JP5840385B2 (ja) | 2010-08-30 | 2016-01-06 | 任天堂株式会社 | Game system, game device, game program, and game processing method |
JP5840386B2 (ja) | 2010-08-30 | 2016-01-06 | 任天堂株式会社 | Game system, game device, game program, and game processing method |
US8688966B2 (en) | 2010-08-31 | 2014-04-01 | Apple Inc. | Systems, methods, and computer-readable media for presenting visual content with a consistent orientation |
US8972467B2 (en) | 2010-08-31 | 2015-03-03 | Sovanta Ag | Method for selecting a data set from a plurality of data sets by means of an input device |
US8767019B2 (en) | 2010-08-31 | 2014-07-01 | Sovanta Ag | Computer-implemented method for specifying a processing operation |
KR101492310B1 (ko) | 2010-11-01 | 2015-02-11 | 닌텐도가부시키가이샤 | Operating device and information processing device |
US8723699B2 (en) * | 2010-11-09 | 2014-05-13 | Motorola Mobility Llc | Method and apparatus for controlling a device |
US9285883B2 (en) * | 2011-03-01 | 2016-03-15 | Qualcomm Incorporated | System and method to display content based on viewing orientation |
US9035940B2 (en) * | 2011-03-08 | 2015-05-19 | Nokia Corporation | Apparatus and associated methods |
US20120249595A1 (en) | 2011-03-31 | 2012-10-04 | Feinstein David Y | Area selection for hand held devices with display |
JP5689014B2 (ja) | 2011-04-07 | 2015-03-25 | 任天堂株式会社 | Input system, information processing device, information processing program, and three-dimensional position calculation method |
US9041733B2 (en) | 2011-05-04 | 2015-05-26 | Blackberry Limited | Methods for adjusting a presentation of graphical data displayed on a graphical user interface |
US20120314899A1 (en) | 2011-06-13 | 2012-12-13 | Microsoft Corporation | Natural user interfaces for mobile image viewing |
WO2013026053A1 (en) | 2011-08-18 | 2013-02-21 | Lynk Labs, Inc. | Devices and systems having ac led circuits and methods of driving the same |
WO2013040498A1 (en) * | 2011-09-16 | 2013-03-21 | Translucent Medical, Inc. | System and method for virtually tracking a surgical tool on a movable display |
WO2013048469A1 (en) | 2011-09-30 | 2013-04-04 | Intel Corporation | Detection of gesture data segmentation in mobile devices |
US9121724B2 (en) * | 2011-09-30 | 2015-09-01 | Apple Inc. | 3D position tracking for panoramic imagery navigation |
US9965107B2 (en) | 2011-10-28 | 2018-05-08 | Atmel Corporation | Authenticating with active stylus |
US9116558B2 (en) | 2011-10-28 | 2015-08-25 | Atmel Corporation | Executing gestures with active stylus |
US9164603B2 (en) | 2011-10-28 | 2015-10-20 | Atmel Corporation | Executing gestures with active stylus |
US9247597B2 (en) | 2011-12-02 | 2016-01-26 | Lynk Labs, Inc. | Color temperature controlled and low THD LED lighting devices and systems and methods of driving the same |
WO2013095602A1 (en) * | 2011-12-23 | 2013-06-27 | Hewlett-Packard Development Company, L.P. | Input command based on hand gesture |
US9223138B2 (en) | 2011-12-23 | 2015-12-29 | Microsoft Technology Licensing, Llc | Pixel opacity for augmented reality |
US9606586B2 (en) | 2012-01-23 | 2017-03-28 | Microsoft Technology Licensing, Llc | Heat transfer device |
JP2013157959A (ja) * | 2012-01-31 | 2013-08-15 | Toshiba Corp | Portable terminal device, speech recognition processing method for the portable terminal device, and program |
US9726887B2 (en) | 2012-02-15 | 2017-08-08 | Microsoft Technology Licensing, Llc | Imaging structure color conversion |
US9368546B2 (en) | 2012-02-15 | 2016-06-14 | Microsoft Technology Licensing, Llc | Imaging structure with embedded light sources |
US9297996B2 (en) | 2012-02-15 | 2016-03-29 | Microsoft Technology Licensing, Llc | Laser illumination scanning |
US9779643B2 (en) | 2012-02-15 | 2017-10-03 | Microsoft Technology Licensing, Llc | Imaging structure emitter configurations |
US9578318B2 (en) | 2012-03-14 | 2017-02-21 | Microsoft Technology Licensing, Llc | Imaging structure emitter calibration |
US11068049B2 (en) * | 2012-03-23 | 2021-07-20 | Microsoft Technology Licensing, Llc | Light guide display and field of view |
US20130254674A1 (en) * | 2012-03-23 | 2013-09-26 | Oracle International Corporation | Development mode activation for a mobile device |
US10191515B2 (en) | 2012-03-28 | 2019-01-29 | Microsoft Technology Licensing, Llc | Mobile device light guide display |
US9558590B2 (en) | 2012-03-28 | 2017-01-31 | Microsoft Technology Licensing, Llc | Augmented reality light guide display |
US9717981B2 (en) | 2012-04-05 | 2017-08-01 | Microsoft Technology Licensing, Llc | Augmented reality and physical games |
US10502876B2 (en) | 2012-05-22 | 2019-12-10 | Microsoft Technology Licensing, Llc | Waveguide optics focus elements |
US8989535B2 (en) | 2012-06-04 | 2015-03-24 | Microsoft Technology Licensing, Llc | Multiple waveguide imaging structure |
TW201403446A (zh) * | 2012-07-09 | 2014-01-16 | Hon Hai Prec Ind Co Ltd | Software interface display system and method |
TW201404133A (zh) * | 2012-07-09 | 2014-01-16 | Wistron Corp | Automatic photographing device and method |
JP2014035562A (ja) * | 2012-08-07 | 2014-02-24 | Sony Corp | Information processing device, information processing method, and computer program |
US9081542B2 (en) | 2012-08-28 | 2015-07-14 | Google Technology Holdings LLC | Systems and methods for a wearable touch-sensitive device |
JP6100497B2 (ja) * | 2012-10-09 | 2017-03-22 | 任天堂株式会社 | Information processing program, information processing device, information processing system, and image display method |
US9245497B2 (en) * | 2012-11-01 | 2016-01-26 | Google Technology Holdings LLC | Systems and methods for configuring the display resolution of an electronic device based on distance and user presbyopia |
US10192358B2 (en) | 2012-12-20 | 2019-01-29 | Microsoft Technology Licensing, Llc | Auto-stereoscopic augmented reality display |
US9160915B1 (en) * | 2013-01-09 | 2015-10-13 | Amazon Technologies, Inc. | Modifying device functionality based on device orientation |
US8769431B1 (en) | 2013-02-28 | 2014-07-01 | Roy Varada Prasad | Method of single-handed software operation of large form factor mobile electronic devices |
US9057600B2 (en) | 2013-03-13 | 2015-06-16 | Hansen Medical, Inc. | Reducing incremental measurement sensor error |
US9566414B2 (en) | 2013-03-13 | 2017-02-14 | Hansen Medical, Inc. | Integrated catheter and guide wire controller |
US9283046B2 (en) | 2013-03-15 | 2016-03-15 | Hansen Medical, Inc. | User interface for active drive apparatus with finite range of motion |
US9014851B2 (en) | 2013-03-15 | 2015-04-21 | Hansen Medical, Inc. | Systems and methods for tracking robotically controlled medical instruments |
US10849702B2 (en) | 2013-03-15 | 2020-12-01 | Auris Health, Inc. | User input devices for controlling manipulation of guidewires and catheters |
US9271663B2 (en) | 2013-03-15 | 2016-03-01 | Hansen Medical, Inc. | Flexible instrument localization from both remote and elongation sensors |
US9629595B2 (en) | 2013-03-15 | 2017-04-25 | Hansen Medical, Inc. | Systems and methods for localizing, tracking and/or controlling medical instruments |
US9459705B2 (en) | 2013-03-18 | 2016-10-04 | Facebook, Inc. | Tilting to scroll |
US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
DE102013214020A1 (de) * | 2013-07-17 | 2015-02-19 | Stabilo International Gmbh | Digital pen |
US10126839B2 (en) | 2013-07-24 | 2018-11-13 | Innoventions, Inc. | Motion-based view scrolling with augmented tilt control |
JP5613314B1 (ja) * | 2013-11-14 | 2014-10-22 | Jfeシステムズ株式会社 | Gesture detection device, gesture detection program, gesture recognition device, and gesture recognition program |
US9134764B2 (en) * | 2013-12-20 | 2015-09-15 | Sony Corporation | Apparatus and method for controlling a display based on a manner of holding the apparatus |
USD750620S1 (en) * | 2014-02-21 | 2016-03-01 | Huawei Device Co., Ltd. | Tablet computer |
TWI547866B (zh) * | 2014-03-05 | 2016-09-01 | 佳世達科技股份有限公司 | Portable electronic device and control method thereof |
EP3243476B1 (de) | 2014-03-24 | 2019-11-06 | Auris Health, Inc. | Systems and devices for catheters that promote instinctive operation |
US9816814B2 (en) * | 2014-06-25 | 2017-11-14 | Intel Corporation | Magnetometer unit for electronic devices |
US9304235B2 (en) | 2014-07-30 | 2016-04-05 | Microsoft Technology Licensing, Llc | Microfabrication |
US10592080B2 (en) | 2014-07-31 | 2020-03-17 | Microsoft Technology Licensing, Llc | Assisted presentation of application windows |
US10678412B2 (en) | 2014-07-31 | 2020-06-09 | Microsoft Technology Licensing, Llc | Dynamic joint dividers for application windows |
US10254942B2 (en) | 2014-07-31 | 2019-04-09 | Microsoft Technology Licensing, Llc | Adaptive sizing and positioning of application windows |
CN104216634A (zh) * | 2014-08-27 | 2014-12-17 | 小米科技有限责任公司 | Method and device for displaying a document |
US9372347B1 (en) | 2015-02-09 | 2016-06-21 | Microsoft Technology Licensing, Llc | Display system |
US9423360B1 (en) | 2015-02-09 | 2016-08-23 | Microsoft Technology Licensing, Llc | Optical components |
US9513480B2 (en) | 2015-02-09 | 2016-12-06 | Microsoft Technology Licensing, Llc | Waveguide |
US11086216B2 (en) | 2015-02-09 | 2021-08-10 | Microsoft Technology Licensing, Llc | Generating electronic components |
US9429692B1 (en) | 2015-02-09 | 2016-08-30 | Microsoft Technology Licensing, Llc | Optical components |
US10018844B2 (en) | 2015-02-09 | 2018-07-10 | Microsoft Technology Licensing, Llc | Wearable image display system |
US9535253B2 (en) | 2015-02-09 | 2017-01-03 | Microsoft Technology Licensing, Llc | Display system |
US10317677B2 (en) | 2015-02-09 | 2019-06-11 | Microsoft Technology Licensing, Llc | Display system |
US9827209B2 (en) | 2015-02-09 | 2017-11-28 | Microsoft Technology Licensing, Llc | Display system |
CN104780258B (zh) * | 2015-03-18 | 2017-12-12 | 北京佳讯飞鸿电气股份有限公司 | Acceleration-sensor-based noise removal method, host processor, and dispatching terminal |
CN104796514B (zh) * | 2015-03-18 | 2017-12-15 | 北京佳讯飞鸿电气股份有限公司 | Dispatching terminal based on an NFC device and noise removal method thereof |
US10310726B2 (en) * | 2015-05-14 | 2019-06-04 | Oath Inc. | Content navigation based upon motion |
CN108778113B (zh) | 2015-09-18 | 2022-04-15 | 奥瑞斯健康公司 | Navigation of tubular networks |
US10143526B2 (en) | 2015-11-30 | 2018-12-04 | Auris Health, Inc. | Robot-assisted driving systems and methods |
KR102509018B1 (ko) | 2016-01-11 | 2023-03-14 | 삼성디스플레이 주식회사 | Display device and driving method thereof |
US20180310759A1 (en) * | 2017-04-27 | 2018-11-01 | Meyer Intellectual Properties Ltd. | Control system for cooking |
CN107346173A (zh) * | 2016-05-06 | 2017-11-14 | 中兴通讯股份有限公司 | Terminal reminding method and device, and terminal |
US11037464B2 (en) | 2016-07-21 | 2021-06-15 | Auris Health, Inc. | System with emulator movement tracking for controlling medical devices |
US10600150B2 (en) * | 2016-10-31 | 2020-03-24 | Adobe Inc. | Utilizing an inertial measurement device to adjust orientation of panorama digital images |
US11507216B2 (en) | 2016-12-23 | 2022-11-22 | Realwear, Inc. | Customizing user interfaces of binary applications |
US10365493B2 (en) | 2016-12-23 | 2019-07-30 | Realwear, Incorporated | Modular components for a head-mounted display |
US10620910B2 (en) | 2016-12-23 | 2020-04-14 | Realwear, Inc. | Hands-free navigation of touch-based operating systems |
US10936872B2 (en) | 2016-12-23 | 2021-03-02 | Realwear, Inc. | Hands-free contextually aware object interaction for wearable display |
US11099716B2 (en) * | 2016-12-23 | 2021-08-24 | Realwear, Inc. | Context based content navigation for wearable display |
US10393312B2 (en) | 2016-12-23 | 2019-08-27 | Realwear, Inc. | Articulating components for a head-mounted display |
US10437070B2 (en) | 2016-12-23 | 2019-10-08 | Realwear, Inc. | Interchangeable optics for a head-mounted display |
US10244926B2 (en) | 2016-12-28 | 2019-04-02 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
CN108268056B (zh) * | 2016-12-30 | 2020-12-15 | 昊翔电能运动科技(昆山)有限公司 | Handheld gimbal calibration method, device, and system |
CN108990412B (zh) | 2017-03-31 | 2022-03-22 | 奥瑞斯健康公司 | Robotic system for luminal network navigation that compensates for physiological noise |
US10022192B1 (en) | 2017-06-23 | 2018-07-17 | Auris Health, Inc. | Automatically-initialized robotic systems for navigation of luminal networks |
EP3644885B1 (de) | 2017-06-28 | 2023-10-11 | Auris Health, Inc. | Alignment of an electromagnetic field generator |
AU2018292281B2 (en) | 2017-06-28 | 2023-03-30 | Auris Health, Inc. | Electromagnetic distortion detection |
US11439839B2 (en) * | 2017-08-09 | 2022-09-13 | Acuity Innovation And Design, Llc | Hand-held treatment device using LED light sources with interchangeable emitters |
US11079077B2 (en) | 2017-08-31 | 2021-08-03 | Lynk Labs, Inc. | LED lighting system and installation methods |
US10555778B2 (en) | 2017-10-13 | 2020-02-11 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
WO2019113391A1 (en) | 2017-12-08 | 2019-06-13 | Auris Health, Inc. | System and method for medical instrument navigation and targeting |
US11510736B2 (en) | 2017-12-14 | 2022-11-29 | Auris Health, Inc. | System and method for estimating instrument location |
EP3684283A4 (de) | 2017-12-18 | 2021-07-14 | Auris Health, Inc. | Methods and systems for instrument tracking and navigation within luminal networks |
JP7214747B2 (ja) | 2018-03-28 | 2023-01-30 | オーリス ヘルス インコーポレイテッド | Systems and methods for registration of position sensors |
JP7225259B2 (ja) | 2018-03-28 | 2023-02-20 | オーリス ヘルス インコーポレイテッド | Systems and methods for indicating an estimated position of an instrument |
WO2019222495A1 (en) | 2018-05-18 | 2019-11-21 | Auris Health, Inc. | Controllers for robotically-enabled teleoperated systems |
CN114601559B (zh) | 2018-05-30 | 2024-05-14 | 奥瑞斯健康公司 | Systems and media for location-sensor-based branch prediction |
EP3801189B1 (de) | 2018-05-31 | 2024-09-11 | Auris Health, Inc. | Path-based navigation of tubular networks |
KR102455671B1 (ko) | 2018-05-31 | 2022-10-20 | 아우리스 헬스, 인코포레이티드 | Image-based airway analysis and mapping |
CN112236083B (zh) | 2018-05-31 | 2024-08-13 | 奥瑞斯健康公司 | Robotic systems and methods for navigating a luminal network while detecting physiological noise |
JP7536752B2 (ja) | 2018-09-28 | 2024-08-20 | オーリス ヘルス インコーポレイテッド | Systems and methods for endoscope-assisted percutaneous medical procedures |
EP3989793A4 (de) | 2019-06-28 | 2023-07-19 | Auris Health, Inc. | Console overlay and methods of using the same |
WO2021038495A1 (en) | 2019-08-30 | 2021-03-04 | Auris Health, Inc. | Instrument image reliability systems and methods |
JP2022546421A (ja) | 2019-08-30 | 2022-11-04 | オーリス ヘルス インコーポレイテッド | Systems and methods for weight-based registration of position sensors |
WO2021044297A1 (en) | 2019-09-03 | 2021-03-11 | Auris Health, Inc. | Electromagnetic distortion detection and compensation |
WO2021137108A1 (en) | 2019-12-31 | 2021-07-08 | Auris Health, Inc. | Alignment interfaces for percutaneous access |
EP4084721A4 (de) | 2019-12-31 | 2024-01-03 | Auris Health, Inc. | Identifizierung eines anatomischen merkmals und anvisierung |
EP4084720A4 (de) | 2019-12-31 | 2024-01-17 | Auris Health, Inc. | Ausrichtungstechniken für perkutanen zugang |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4812831A (en) * | 1987-02-10 | 1989-03-14 | Amp Incorporated | Key switch with controllable illumination |
US5142655A (en) | 1987-10-14 | 1992-08-25 | Wang Laboratories, Inc. | Computer input device using an orientation sensor |
DE68925124T2 (de) * | 1988-11-14 | 1996-07-04 | Wang Laboratories | Press-operated control device for computer display systems |
US5543588A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Touch pad driven handheld computing device |
US5296871A (en) * | 1992-07-27 | 1994-03-22 | Paley W Bradford | Three-dimensional mouse with tactile feedback |
JPH0764754A (ja) * | 1993-08-24 | 1995-03-10 | Hitachi Ltd | Compact information processing device |
JPH0895539A (ja) * | 1994-09-28 | 1996-04-12 | Nec Corp | Presentation support device |
CA2159251C (en) | 1994-12-19 | 2000-10-24 | Alan Edward Kaplan | Interactive pointing device |
JP3990744B2 (ja) * | 1995-09-08 | 2007-10-17 | キヤノン株式会社 | Electronic apparatus and control method therefor |
US5703623A (en) * | 1996-01-24 | 1997-12-30 | Hall; Malcolm G. | Smart orientation sensing circuit for remote control |
CA2264167A1 (en) * | 1996-08-28 | 1998-03-05 | Via, Inc. | Touch screen systems and methods |
US6088023A (en) * | 1996-12-10 | 2000-07-11 | Willow Design, Inc. | Integrated pointing and drawing graphics system for computers |
US6008810A (en) * | 1997-03-07 | 1999-12-28 | International Business Machines Corporation | Mobile client computer programmed for system message display |
US6057554A (en) * | 1997-05-12 | 2000-05-02 | Plesko; George A. | Reflective switch |
US6280327B1 (en) * | 1998-06-05 | 2001-08-28 | Arista Interactive Llc | Wireless game control units |
WO2000017848A1 (en) * | 1998-09-22 | 2000-03-30 | Vega Vista, Inc. | Intuitive control of portable data displays |
US6201554B1 (en) * | 1999-01-12 | 2001-03-13 | Ericsson Inc. | Device control apparatus for hand-held data processing device |
US6466198B1 (en) | 1999-11-05 | 2002-10-15 | Innoventions, Inc. | View navigation and magnification of a hand-held device with a display |
US6245014B1 (en) * | 1999-11-18 | 2001-06-12 | Atlantic Limited Partnership | Fitness for duty testing device and method |
2000
- 2000-04-05 US US09/543,660 patent/US6466198B1/en not_active Expired - Lifetime
2001
- 2001-04-04 DE DE60132201T patent/DE60132201T2/de not_active Expired - Lifetime
- 2001-04-04 AT AT01928361T patent/ATE382889T1/de not_active IP Right Cessation
- 2001-04-04 EP EP01928361A patent/EP1290672B1/de not_active Expired - Lifetime
- 2001-04-04 WO PCT/US2001/010962 patent/WO2001078055A1/en active Search and Examination
2002
- 2002-08-20 US US10/224,073 patent/US6933923B2/en not_active Expired - Lifetime
2003
- 2003-09-10 HK HK03106445.5A patent/HK1054610A1/zh unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0805389A2 (de) * | 1996-04-30 | 1997-11-05 | Sun Microsystems, Inc. | Scrolling on the Sunpad according to its tilt |
WO1998014863A2 (en) * | 1996-10-01 | 1998-04-09 | Philips Electronics N.V. | Hand-held image display device |
GB2336747A (en) * | 1998-04-22 | 1999-10-27 | Nec Corp | Hand held communication terminal and method of scrolling display screen of the same. |
Non-Patent Citations (1)
Title |
---|
See also references of WO0178055A1 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9866667B2 (en) | 2012-02-24 | 2018-01-09 | Blackberry Limited | Handheld device with notification message viewing |
US10375220B2 (en) | 2012-02-24 | 2019-08-06 | Blackberry Limited | Handheld device with notification message viewing |
US10346022B2 (en) | 2013-07-24 | 2019-07-09 | Innoventions, Inc. | Tilt-based view scrolling with baseline update for proportional and dynamic modes |
Also Published As
Publication number | Publication date |
---|---|
ATE382889T1 (de) | 2008-01-15 |
DE60132201D1 (de) | 2008-02-14 |
US6466198B1 (en) | 2002-10-15 |
WO2001078055A1 (en) | 2001-10-18 |
DE60132201T2 (de) | 2008-12-24 |
EP1290672A4 (de) | 2005-04-06 |
EP1290672B1 (de) | 2008-01-02 |
US6933923B2 (en) | 2005-08-23 |
US20020190947A1 (en) | 2002-12-19 |
HK1054610A1 (zh) | 2003-12-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1290672B1 (de) | View navigation and magnification of a hand-held device with a display | |
RU2288512C2 (ru) | Method and device for viewing information on a display |
US6127990A (en) | Wearable display and methods for controlling same | |
US8139030B2 (en) | Magnetic sensor for use with hand-held devices | |
US6400376B1 (en) | Display control for hand-held data processing device | |
US6798429B2 (en) | Intuitive mobile device interface to virtual spaces | |
US6184847B1 (en) | Intuitive control of portable data displays | |
US6151208A (en) | Wearable computing device mounted on superior dorsal aspect of a hand | |
US20070176898A1 (en) | Air-writing and motion sensing input for portable devices | |
US20100171691A1 (en) | Viewing images with tilt control on a hand-held device | |
US20040196400A1 (en) | Digital camera user interface using hand gestures | |
US20100265269A1 (en) | Portable terminal and a display control method for portable terminal | |
EP2350782A1 (de) | Mobile devices with motion gesture recognition |
JP2003271310A (ja) | Information input/output device, control method therefor, and program for implementing the control method |
KR20060035148A (ko) | Motion recognition apparatus for a mobile device and user motion recognition method using the same |
JP2007232772A (ja) | Data display device, data display program, and computer-readable recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20021105 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20050221 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: 7G 06F 1/16 A |
|
17Q | First examination report despatched |
Effective date: 20050415 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REF | Corresponds to: |
Ref document number: 60132201 Country of ref document: DE Date of ref document: 20080214 Kind code of ref document: P |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
|
NLV1 | Nl: lapsed or annulled due to failure to fulfill the requirements of art. 29p and 29m of the patents act |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
Ref country code: CH Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080413 |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
|
ET | Fr: translation filed |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080602 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080402 |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20080430 |
|
26N | No opposition filed |
Effective date: 20081003 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20080404 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
|
REG | Reference to a national code |
Ref country code: HK Ref legal event code: WD Ref document number: 1054610 Country of ref document: HK |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20080404 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080102 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20080403 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 16 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 17 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20170323 Year of fee payment: 17 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20170427 Year of fee payment: 17 |
Ref country code: FR Payment date: 20170426 Year of fee payment: 17 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IT Payment date: 20170421 Year of fee payment: 17 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 60132201 Country of ref document: DE |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20180404 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20181101 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180404 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180430 |
Ref country code: IT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180404 |