US20150160722A1 - Integrated tracking for on screen navigation with small hand held devices - Google Patents
- Publication number
- US20150160722A1 (application US 14/484,213)
- Authority
- US
- United States
- Prior art keywords
- computer system
- movement
- display
- movement sensor
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method for navigating information displayed on a display of a computer system includes determining movement of the computer system. The movement of the computer system is sensed by a movement sensor.
Description
- The present invention generally relates to the field of data processing. More particularly, an embodiment of the present invention relates to enabling documents to be viewed with a portable device.
- Mobile computer systems such as, for example, laptop or notebook computer systems, personal digital assistants (PDAs), and cellular phones are quickly gaining popularity because of their small size, light weight, increasing performance, and decreasing cost. Depending on the type of system, the size of the display may vary. For example, a laptop computer system may have a 15 inch display, whereas a PDA may have a smaller display. One drawback of a small display is the limited ability to view information. Often, information is filtered so that only a limited amount is displayed. When the information cannot be filtered, it may not be possible to display all of the information. A smaller display also makes it difficult to navigate the information being displayed, especially when navigation capability is limited.
- The invention may be best understood by referring to the following description and accompanying drawings that are used to illustrate embodiments of the invention. In the drawings:
- FIG. 1 illustrates one example of a prior art computer system, in accordance with one embodiment.
- FIG. 2 is a diagram illustrating one example of a bottom view of a computer system equipped with a movement sensor, in accordance with one embodiment.
- FIGS. 3A and 3B illustrate an example of controlling location of a cursor in a computer system equipped with a movement sensor, in accordance with one embodiment.
- FIG. 4 illustrates an example of controlling information displayed on a display screen of a computer system equipped with a movement sensor, in accordance with one embodiment.
- FIGS. 5A-C are diagrams illustrating different examples of interactions with a computer system equipped with a movement sensor, in accordance with one embodiment.
- FIG. 6 is a flow diagram illustrating one example of a process of determining information to be displayed on a computer system equipped with a movement sensor, in accordance with one embodiment.
- FIG. 7 illustrates one example of a computer system, in accordance with one embodiment.
- For one embodiment, a method and system for controlling information displayed in a computer system is disclosed. The computer system may be a handheld computer system equipped with a movement sensor. The information displayed on a display screen of the computer system may be associated with a portion of a data entity. Navigating the information to display other portions of the data entity may be performed by sensing movement of the computer system.
- In the following detailed description of embodiments of the present invention numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that embodiments of the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
- Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “for one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- In the following discussion, the phrase computer system may refer to a laptop computer system, a handheld computer system, a micro personal computer system, a tablet computer system, a digital organizer, a cellular phone or any other portable computer systems that may include a display screen to display information.
- FIG. 1 illustrates one example of a prior art computer system, in accordance with one embodiment. Computer system 100 may be a digital organizer such as those manufactured by, for example, palmOne Inc. of Milpitas, Calif. The computer system 100 may include a writing area 115 and a display 105. The display 105 may be a color display, a monochrome display, etc. The display 105 may be small (e.g., a 160×160 pixel display) and may limit the amount of information that is viewable at a time.
- To view information on the display 105, a vertical scroll bar 110 may be provided to enable scrolling. Depending on the type of information, there may also be a horizontal scroll bar 112 to enable displaying information adjacent to the information currently displayed. A pointing device or a stylus (not shown) may be used to interact with the scroll bars 110, 112. Another technique used to enable scrolling includes using a thumb wheel (not shown). To scroll diagonally, a combination of vertical and horizontal scrolling may be required, making the techniques described in this example slow and cumbersome.
- FIG. 2 is a diagram illustrating one example of a bottom view of a computer system equipped with a movement sensor, in accordance with one embodiment. For one embodiment, a bottom side of the computer system 200 may include a movement sensor 205. The movement sensor 205 may be any device that can sense the directions of the movement of the computer system 200 on a surface. The surface may be generally flat. The bottom side of the computer system 200 may be designed with appropriate surface contacts (not shown) made of material that may enable the computer system 200 to be moved across the surface with relative ease while not interfering with the functionality of the movement sensor 205. For one embodiment, the movement sensor 205 may be an optical sensor. Optical sensors are known to one skilled in the art. The movement sensor 205 may also be located at a location other than the one illustrated in FIG. 2 as long as it is able to sense the directions of the movements of the computer system 200. For one embodiment, the bottom side of the computer system 200 may include a second movement sensor (not shown). The combination of the movement sensor 205 and the second movement sensor may enable detection of angular position or rotation of the computer system 200.
- For one embodiment, the computer system 200 may include logic that translates the information sensed by the movement sensor 205. This logic may be referred to herein as translation logic. The translation logic may be implemented in software, hardware, or a combination of both. For example, the translation logic may translate the information sensed by the movement sensor 205 into operations that can be performed by the computer system 200.
- FIGS. 3A and 3B illustrate an example of controlling the location of a cursor in a computer system equipped with a movement sensor, in accordance with one embodiment. Computer system 300 is illustrated with its top side and its display visible. The curved line drawn above the computer system 300 illustrates a pattern that the computer system 300 has moved through in moving from location 305 (FIG. 3A) at time t1 to location 310 (FIG. 3B) at time t2. This pattern may be sensed by the movement sensor 205 and provided to the translation logic. The translation logic may translate information associated with the pattern into operations that may result in moving a cursor along a similar pattern. This example is illustrated with a cursor located at position 406 (FIG. 3A) before the movement of the computer system 300 and at position 411 (FIG. 3B) after the movement of the computer system 300. The curved lines drawn on the display illustrate the similarity between the movement of the cursor and the movement of the computer system 300.
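The cursor-following behavior described for FIGS. 3A and 3B can be sketched in Python. This is an illustrative assumption, not the patent's implementation: the function name, the counts-per-pixel scale factor, and the 160×160 display bound are all hypothetical.

```python
# Hypothetical sketch of translation logic for cursor control: raw 2-D
# displacement counts from the movement sensor are scaled into display
# pixels and the cursor is clamped to the screen. The scale factor and
# the 160x160 display size are assumptions for illustration only.

def translate_motion(cursor_xy, sensor_counts,
                     counts_per_pixel=4, display_size=(160, 160)):
    """Return a new cursor position for one batch of sensor counts."""
    x, y = cursor_xy
    dx, dy = sensor_counts
    # Scale raw counts down to integer pixel deltas.
    x += dx // counts_per_pixel
    y += dy // counts_per_pixel
    # Clamp so the cursor never leaves the display.
    w, h = display_size
    return (max(0, min(w - 1, x)), max(0, min(h - 1, y)))
```

Moving the device through a curved pattern yields a stream of small (dx, dy) batches; feeding each batch through this function moves the cursor along a similar curve, as the figures describe.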
- FIG. 4 illustrates an example of controlling information displayed on a display of a computer system equipped with a movement sensor, in accordance with one embodiment. Computer system 400 is illustrated with its display visible. The display may be viewed as a window onto a large document (e.g., a map) that cannot be displayed in its entirety due to the small size of the display. In this example, the information being displayed is associated with a section or portion 410 of a map 405. The map 405 may be stored in the computer system 400, or it may be accessed via a network. The map 405 in this example is illustrated logically rather than physically.
- For one embodiment, the translation logic may translate the information sensed by the movement sensor 205 into operations that may cause another section of the document to be displayed. The selection of this other section may be consistent with the pattern of movement of the computer system 400. Referring to FIG. 4, when the computer system 400 is moved horizontally toward the right, the display may include a section of the document that is to the right of the section previously included on the display. Other arrows illustrated in FIG. 4 represent different possible directions (e.g., vertical, diagonal, etc.) in which the computer system 400 may be moved to display different sections of the document.
- For one embodiment, the movement sensor 205 may be a mechanical sensor such as, for example, one that is implemented using a trackball. This may enable the movement sensor 205 to be manipulated while the computer system 400 is not placed against a surface. For example, a user may place a finger over the trackball from the bottom side of the computer system 400 and navigate or control the information to be included on the display by turning the trackball. The movement of the trackball may then be sensed by the movement sensor 205. It may be noted that one advantage of using the movement sensor 205 is that scrolling the document in a diagonal direction can be performed easily and intuitively.
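The window-over-a-document behavior of FIG. 4 can be sketched as follows. All names, the document dimensions, and the 160×160 window size are illustrative assumptions rather than details from the patent.

```python
def pan_viewport(origin, motion, doc_size, view_size=(160, 160)):
    """Shift the visible window across a large document.

    `origin` is the top-left corner of the currently displayed section;
    `motion` is the sensed device displacement. Moving the device to the
    right selects the section to the right, matching the FIG. 4 example,
    and the window is clamped so it never leaves the document. This works
    for horizontal, vertical, and diagonal motion alike.
    """
    ox, oy = origin
    mx, my = motion
    dw, dh = doc_size
    vw, vh = view_size
    nx = max(0, min(dw - vw, ox + mx))
    ny = max(0, min(dh - vh, oy + my))
    return (nx, ny)
```

Because both axes are updated in one step, a diagonal device movement pans the window diagonally in a single operation, with no need to combine separate vertical and horizontal scrolls.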
- FIGS. 5A-C are diagrams illustrating different examples of interactions with a computer system equipped with a movement sensor, in accordance with one embodiment. For one embodiment, in addition to having a movement sensor 205, computer system 500 may include an upper section (not shown) and a lower section (not shown). The display may be part of the upper section. The bottom side of the computer system 500 may be part of the lower section. For one embodiment, when pressure is applied to the upper section, the upper section may move slightly toward the lower section. The upper section may then move back to its normal default position when the pressure is removed. This type of movement may be referred to as a clicking motion.
- For one embodiment, depending on where the pressure is applied to the upper section, the computer system 500 may perform different operations. For example, when pressure is applied such that the upper section is almost evenly displaced toward the lower section, a first mouse click may be recognized by the computer system 500 and corresponding actions may be performed. The same mouse click may be recognized when the pressure 505 is applied toward the middle of the upper section (FIG. 5A). When the pressure 510 is applied toward the right side of the upper section (FIG. 5B), a second mouse click may be recognized by the computer system 500. When the pressure 515 is applied toward the left side of the upper section (FIG. 5C), a third mouse click may be recognized by the computer system 500. The translation logic may translate the different types of clicking motions of the upper section relative to the lower section into operations to be performed by the computer system 500. The ability to cause operations to be performed by initiating different clicking motions as described with FIGS. 5A-C may enable a user to use the computer system 500 more efficiently.
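The three click types of FIGS. 5A-C amount to classifying where the press lands on the upper section. A minimal sketch, assuming a 160-pixel-wide device and 25% edge zones (both values are hypothetical tunings, not from the patent):

```python
def classify_click(press_x, display_width=160, edge_fraction=0.25):
    """Map the horizontal press position on the upper section to a click.

    A press toward the middle registers the first click (FIG. 5A), toward
    the right side the second click (FIG. 5B), and toward the left side
    the third click (FIG. 5C). The edge-zone width is an assumption.
    """
    if press_x < display_width * edge_fraction:
        return "third click"    # left side, FIG. 5C
    if press_x > display_width * (1 - edge_fraction):
        return "second click"   # right side, FIG. 5B
    return "first click"        # middle / even displacement, FIG. 5A
```

The translation logic could then dispatch each click type to a different operation, much as left, right, and middle mouse buttons are dispatched on a desktop system.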
- FIG. 6 is a flow diagram illustrating one example of a process of determining information to be displayed on a computer system equipped with a movement sensor, in accordance with one embodiment. The process may be performed by the computer system using the information provided by the movement sensor 205. At block 605, the computer system is displaying information associated with one section of a data entity (e.g., a document). At block 610, a test is made to determine whether the movement sensor 205 has sensed any movement of the computer system.
- When no movement is sensed, the computer system may continue to display the same information. However, when the computer system is moved, the movement sensor 205 senses the movement information, as shown in block 615. This information may then be translated by the translation logic into operations to display another section of the data entity, as shown in block 620. For example, when the computer system is moved in a vertical direction on a surface, the computer system may display a section of the data entity that is above the section that was previously displayed. As another example, when the computer system is equipped with a mechanical sensor such as a trackball, the same result may be accomplished by turning the trackball downward to scroll the document being displayed upward. It may be noted that the process described may be used to scroll a document, move a cursor, or perform any operation that normally requires a mouse or similar controlling device.
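One pass through the FIG. 6 flow can be sketched as a single step function; the names and the section-offset representation are illustrative assumptions.

```python
def navigation_step(section_origin, sensed_motion):
    """One pass through the FIG. 6 flow.

    Block 610: if no movement was sensed, keep the current section.
    Blocks 615/620: otherwise translate the sensed motion into the origin
    of the adjacent section to display. An upward device motion (negative
    dy here) selects the section above the current one.
    """
    dx, dy = sensed_motion
    if dx == 0 and dy == 0:
        return section_origin      # block 610: continue same display
    ox, oy = section_origin
    return (ox + dx, oy + dy)      # block 620: display adjacent section
```

Calling this repeatedly as sensor readings arrive reproduces the loop of blocks 605 through 620: the displayed section only changes when movement is actually sensed.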
FIG. 7 illustrates one example of a computer system, in accordance with one embodiment. Computer system 700 may be a handheld computer system and may include processor 705. The processor 705 may be a processor in the family of Pentium processors manufactured by Intel Corporation of Santa Clara, California. Other processors may also be used. The computer system 700 may include a display controller 710 and memory 715. The display controller 710 may be coupled to a display (not shown), which may be a liquid crystal display (LCD) or a display that uses other suitable display technology. The memory 715 may be a combination of one or more of static random access memory (SRAM), dynamic random access memory (DRAM), read only memory (ROM), etc. - The computer system 700 may also include a movement sensor 720, translation logic 730, and a storage device 725. The movement sensor 720 may be an optical sensor, a mechanical sensor, or any sensor that may be used to detect movements of the computer system 700. The translation logic 730 may include logic to translate movement information sensed by the movement sensor 720 into operations that can be processed by the processor 705. The storage device 725 may be used to store the data entity that may be shown on the display of the computer system 700. Although not shown, the computer system 700 may also include other components to enable it to perform various functions. - It is also to be understood that because embodiments of the present invention may be implemented as one or more software programs, embodiments of the present invention may be implemented or realized upon or within a machine readable medium. For example, the translation logic may be implemented in software, and the instructions associated with the translation logic may be stored in a machine readable medium. A machine readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine readable medium may include read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.
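One way to picture how the movement sensor 720, translation logic 730, and processor 705 cooperate is an event loop in which the translation logic sits between raw sensor events and display operations. This is a minimal sketch under assumed interfaces (the event tuples, class, and function names are all illustrative), not the patented implementation.

```python
class TranslationLogic:
    """Sketch of translation logic 730: turns raw sensor events into
    operations the processor can perform against the display."""

    def translate(self, event):
        kind, payload = event
        if kind == "move":          # displacement sensed by the movement sensor
            dx, dy = payload
            return ("scroll", dx, dy)
        if kind == "click":         # clicking motion of the upper section
            return ("click", payload)
        return ("noop",)            # ignore events the logic does not handle

def run_once(sensor_events, logic):
    """Process one batch of sensor events into display operations."""
    return [logic.translate(e) for e in sensor_events]
```

In this sketch, a batch such as `[("move", (0, -2)), ("click", "first_click")]` becomes a scroll operation followed by a click operation, which the processor would then apply to the displayed data entity.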
- In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Claims (23)
1. A method, comprising:
determining information to be displayed on a display of a computer system by sensing movement of the computer system.
2. The method of claim 1 , wherein said sensing the movement of the computer system is performed using a movement sensor.
3. The method of claim 2 , wherein the movement sensor is an optical sensor.
4. The method of claim 3 , wherein said sensing the movement of the computer system is performed when the computer system is placed on a surface.
5. The method of claim 2 , wherein the movement sensor is a mechanical sensor.
6. The method of claim 5 , wherein said sensing the movement of the computer system is performed when the computer system is on a surface or handheld.
7. The method of claim 1 , wherein said sensing the movement of the computer system comprises sensing direction of the movement of the computer system.
8. The method of claim 7 , wherein said determining the information to be displayed on the display comprises determining the information consistent with said sensed direction of the movement of the computer system.
9. The method of claim 1 , further comprising:
determining an action to be performed by the computer system by sensing a clicking motion of the computer system.
10. The method of claim 9 , wherein the clicking motion of the computer system is initiated by applying pressure on an upper section of the computer system toward a lower section of the computer system.
11. The method of claim 1 , wherein said determining the information to be displayed on the display of the computer system comprises determining a location of a cursor.
12-16. (canceled)
17. A system, comprising:
a processor;
a display coupled to the processor;
a first movement sensor coupled to the processor, the first movement sensor is to sense direction of movement of the system; and
translation logic to translate the direction of movement of the system into a first set of operations to be performed by the processor, wherein the first set of operations includes displaying information on the display consistent with the sensed direction of movement of the system.
18. The system of claim 17 , wherein the translation logic is further to translate clicking motion of the system into a second set of operations to be performed by the processor, wherein the second set of operations corresponds to an action performed when a mouse click is initiated.
19. The system of claim 18 , wherein said clicking motion is initiated by applying pressure to an upper section of the system toward a lower section of the system, wherein said upper section includes the display, and wherein said lower section includes the first movement sensor.
20. (canceled)
21. The system of claim 17 , further comprising a second movement sensor coupled to the first movement sensor.
22. The system of claim 21 , wherein angular rotation is determined by using directional information sensed by the first movement sensor and the second movement sensor.
23. A method, comprising:
navigating information displayed on a display of a computer system by causing a first movement sensor to sense movement of the computer system.
24. The method of claim 23 , further comprising:
controlling position of a cursor displayed on the display of the computer system by causing the first movement sensor to sense movement of the computer system.
25. The method of claim 23 , further comprising:
determining angular rotation of the computer system by causing the first movement sensor and a second movement sensor to sense movement of the computer system.
26. The method of claim 23 , further comprising:
recognizing a mouse click action when an upper section of the computer system is displaced toward a lower section of the computer system.
27. (canceled)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/484,213 US20150160722A1 (en) | 2004-03-17 | 2014-09-11 | Integrated tracking for on screen navigation with small hand held devices |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/803,334 US8842070B2 (en) | 2004-03-17 | 2004-03-17 | Integrated tracking for on screen navigation with small hand held devices |
US14/484,213 US20150160722A1 (en) | 2004-03-17 | 2014-09-11 | Integrated tracking for on screen navigation with small hand held devices |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/803,334 Continuation US8842070B2 (en) | 2004-03-17 | 2004-03-17 | Integrated tracking for on screen navigation with small hand held devices |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150160722A1 true US20150160722A1 (en) | 2015-06-11 |
Family
ID=34985727
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/803,334 Expired - Fee Related US8842070B2 (en) | 2004-03-17 | 2004-03-17 | Integrated tracking for on screen navigation with small hand held devices |
US14/484,213 Abandoned US20150160722A1 (en) | 2004-03-17 | 2014-09-11 | Integrated tracking for on screen navigation with small hand held devices |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/803,334 Expired - Fee Related US8842070B2 (en) | 2004-03-17 | 2004-03-17 | Integrated tracking for on screen navigation with small hand held devices |
Country Status (1)
Country | Link |
---|---|
US (2) | US8842070B2 (en) |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6943678B2 (en) * | 2000-01-24 | 2005-09-13 | Nextreme, L.L.C. | Thermoformed apparatus having a communications device |
CN1316339C (en) | 2001-12-21 | 2007-05-16 | 捷讯研究有限公司 | Handheld electronic device with keyboard |
JP4855654B2 (en) * | 2004-05-31 | 2012-01-18 | ソニー株式会社 | On-vehicle device, on-vehicle device information providing method, on-vehicle device information providing method program, and on-vehicle device information providing method program |
US7986301B2 (en) | 2004-06-21 | 2011-07-26 | Research In Motion Limited | Handheld wireless communication device |
US20070254698A1 (en) * | 2004-06-21 | 2007-11-01 | Griffin Jason T | Handheld wireless communication device |
US7973765B2 (en) * | 2004-06-21 | 2011-07-05 | Research In Motion Limited | Handheld wireless communication device |
US8064946B2 (en) | 2004-06-21 | 2011-11-22 | Research In Motion Limited | Handheld wireless communication device |
US20070254703A1 (en) * | 2004-06-21 | 2007-11-01 | Griffin Jason T | Handheld wireless communication device |
US20070254689A1 (en) * | 2004-06-21 | 2007-11-01 | Griffin Jason T | Handheld wireless communication device |
US20070192711A1 (en) * | 2006-02-13 | 2007-08-16 | Research In Motion Limited | Method and arrangement for providing a primary actions menu on a handheld communication device |
US7982712B2 (en) * | 2004-06-21 | 2011-07-19 | Research In Motion Limited | Handheld wireless communication device |
US8271036B2 (en) * | 2004-06-21 | 2012-09-18 | Research In Motion Limited | Handheld wireless communication device |
US20070254705A1 (en) * | 2004-06-21 | 2007-11-01 | Griffin Jason T | Handheld wireless communication device |
US20070254708A1 (en) * | 2004-06-21 | 2007-11-01 | Griffin Jason T | Handheld wireless communication device |
US20070254700A1 (en) * | 2004-06-21 | 2007-11-01 | Griffin Jason T | Handheld wireless communication device |
US8463315B2 (en) | 2004-06-21 | 2013-06-11 | Research In Motion Limited | Handheld wireless communication device |
US8219158B2 (en) * | 2004-06-21 | 2012-07-10 | Research In Motion Limited | Handheld wireless communication device |
EP1677178A1 (en) * | 2004-12-29 | 2006-07-05 | STMicroelectronics S.r.l. | Pointing device for a computer system with automatic detection of lifting, and relative control method |
US8000741B2 (en) * | 2006-02-13 | 2011-08-16 | Research In Motion Limited | Handheld wireless communication device with chamfer keys |
US8537117B2 (en) | 2006-02-13 | 2013-09-17 | Blackberry Limited | Handheld wireless communication device that selectively generates a menu in response to received commands |
US7770118B2 (en) * | 2006-02-13 | 2010-08-03 | Research In Motion Limited | Navigation tool with audible feedback on a handheld communication device having a full alphabetic keyboard |
JP2015194848A (en) * | 2014-03-31 | 2015-11-05 | ブラザー工業株式会社 | Display program and display device |
EP3130994A4 (en) * | 2014-04-07 | 2018-01-03 | Sony Corporation | Display control device, display control method, and program |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5912660A (en) * | 1997-01-09 | 1999-06-15 | Virtouch Ltd. | Mouse-like input/output device with display screen and method for its use |
US20020082083A1 (en) * | 1998-11-02 | 2002-06-27 | Takeshi Ito | Information communication electronic device and information display method |
US20020093483A1 (en) * | 2000-11-30 | 2002-07-18 | Kaplan Alan Edward | Display control for hand-held devices |
US20040222980A1 (en) * | 2003-04-24 | 2004-11-11 | Byoung-Gon Lee | Apparatus for changing representation contents on sub-display of dual-folder type mobile communication terminal |
US20040227742A1 (en) * | 2002-08-06 | 2004-11-18 | Sina Fateh | Control of display content by movement on a fixed spherical space |
US7499040B2 (en) * | 2003-08-18 | 2009-03-03 | Apple Inc. | Movable touch pad with added functionality |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5825308A (en) * | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
US6750877B2 (en) * | 1995-12-13 | 2004-06-15 | Immersion Corporation | Controlling haptic feedback for enhancing navigation in a graphical environment |
US6037937A (en) * | 1997-12-04 | 2000-03-14 | Nortel Networks Corporation | Navigation tool for graphical user interface |
US6429846B2 (en) * | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
AUPQ439299A0 (en) * | 1999-12-01 | 1999-12-23 | Silverbrook Research Pty Ltd | Interface system |
US6337678B1 (en) * | 1999-07-21 | 2002-01-08 | Tactiva Incorporated | Force feedback computer input and output device with coordinated haptic elements |
US6466198B1 (en) * | 1999-11-05 | 2002-10-15 | Innoventions, Inc. | View navigation and magnification of a hand-held device with a display |
US20020093492A1 (en) * | 2001-01-18 | 2002-07-18 | Baron John M. | System for a navigable display |
US6977645B2 (en) * | 2001-03-16 | 2005-12-20 | Agilent Technologies, Inc. | Portable electronic device with mouse-like capabilities |
US6677929B2 (en) * | 2001-03-21 | 2004-01-13 | Agilent Technologies, Inc. | Optical pseudo trackball controls the operation of an appliance or machine |
US7379053B2 (en) * | 2001-10-27 | 2008-05-27 | Vortant Technologies, Llc | Computer interface for navigating graphical user interface by touch |
WO2003050754A1 (en) * | 2001-12-12 | 2003-06-19 | Koninklijke Philips Electronics N.V. | Display system with tactile guidance |
US7456823B2 (en) * | 2002-06-14 | 2008-11-25 | Sony Corporation | User interface apparatus and portable information apparatus |
US6968508B2 (en) * | 2002-07-30 | 2005-11-22 | Motorola, Inc. | Rotating user interface |
US7305631B1 (en) * | 2002-09-30 | 2007-12-04 | Danger, Inc. | Integrated motion sensor for a data processing device |
FI20022282A0 (en) * | 2002-12-30 | 2002-12-30 | Nokia Corp | Method for enabling interaction in an electronic device and an electronic device |
US7102626B2 (en) * | 2003-04-25 | 2006-09-05 | Hewlett-Packard Development Company, L.P. | Multi-function pointing device |
US8046705B2 (en) * | 2003-05-08 | 2011-10-25 | Hillcrest Laboratories, Inc. | Systems and methods for resolution consistent semantic zooming |
US6986614B2 (en) * | 2003-07-31 | 2006-01-17 | Microsoft Corporation | Dual navigation control computer keyboard |
US7463238B2 (en) * | 2003-08-11 | 2008-12-09 | Virtualblue, Llc | Retractable flexible digital display apparatus |
WO2007136372A1 (en) * | 2006-05-22 | 2007-11-29 | Thomson Licensing | Video system having a touch screen |
- 2004
- 2004-03-17 US US10/803,334 patent/US8842070B2/en not_active Expired - Fee Related
- 2014
- 2014-09-11 US US14/484,213 patent/US20150160722A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20050206620A1 (en) | 2005-09-22 |
US8842070B2 (en) | 2014-09-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150160722A1 (en) | Integrated tracking for on screen navigation with small hand held devices | |
US10387030B2 (en) | Bendable display device and displaying method thereof | |
KR102358110B1 (en) | Display apparatus | |
US7966573B2 (en) | Method and system for improving interaction with a user interface | |
US9317197B2 (en) | Storage medium storing information processing program to be executed by computer of information processor to perform a process according to an input to touch surfaces | |
US10503399B2 (en) | Adjusting the display area of application icons at a device screen | |
US8120625B2 (en) | Method and apparatus using multiple sensors in a device with a display | |
US10705692B2 (en) | Continuous and dynamic scene decomposition for user interface | |
US20110316888A1 (en) | Mobile device user interface combining input from motion sensors and other controls | |
US9201585B1 (en) | User interface navigation gestures | |
JP2010020762A (en) | Touch input on touch sensitive display device | |
CA2658413A1 (en) | Touch screen device, method, and graphical user interface for determining commands by applying heuristics | |
CN103383598A (en) | Terminal and method for controlling the same based on spatial interaction | |
KR20140116434A (en) | Directional control using a touch sensitive device | |
US10558288B2 (en) | Multi-touch display panel and method of controlling the same | |
CN104471513A (en) | Flexible display apparatus and operating method thereof | |
US20110050730A1 (en) | Method of displaying data on a portable electronic device according to detected movement of the portable electronic device | |
TW201423564A (en) | Display device, method of driving a display device and computer | |
US10387017B2 (en) | Electronic device for displaying multiple screens and control method therefor | |
US10585510B2 (en) | Apparatus, systems, and methods for transferring objects between multiple display units | |
US20150033161A1 (en) | Detecting a first and a second touch to associate a data file with a graphical data object | |
US9256360B2 (en) | Single touch process to achieve dual touch user interface | |
US9836082B2 (en) | Wearable electronic apparatus | |
US20120162262A1 (en) | Information processor, information processing method, and computer program product | |
CN101727208B (en) | Mouse with rolling function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |