WO2004021683A1 - User-specified outputs in mobile wireless communication devices and methods therefor - Google Patents

User-specified outputs in mobile wireless communication devices and methods therefor Download PDF

Info

Publication number
WO2004021683A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile wireless
wireless communication
communication device
sensory output
user
Prior art date
Application number
PCT/US2003/024901
Other languages
French (fr)
Inventor
Patrick Cauwels
Steven Herbst
David Roller
Peter Wyatt
Original Assignee
Motorola, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motorola, Inc. filed Critical Motorola, Inc.
Priority to AU2003259693A priority Critical patent/AU2003259693A1/en
Priority to EP03791668A priority patent/EP1535450B1/en
Priority to BR0313842-9A priority patent/BR0313842A/en
Publication of WO2004021683A1 publication Critical patent/WO2004021683A1/en

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 Circuits
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/22 Illumination; Arrangements for improving the visibility of characters on dials
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W52/00 Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W52/02 Power saving arrangements
    • H04W52/0209 Power saving arrangements in terminal devices
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • the present inventions relate generally to mobile wireless communications devices, and more particularly to user enriching events in wireless communications devices, for example in cellular communications handsets, and methods therefor.
  • Cellular handsets are now viewed by many consumers as an apparel item integrated as a part of the individual being. Consumers also increasingly desire the ability to customize and personalize their handsets, for example as a form of self-expression, to reflect changes in mood or psychological disposition, differentiate from others, associate with peers, etc. It is known to generate audio sounds upon the occurrence of specified events on cellular telephone handsets.
  • the Motorola Timeport 280, for example, produces a sound when a charger cable is connected thereto. However, the user has no control over this signal.
  • the Motorola V60 cellular handset enables the association of different user specified audio alerts with different incoming communications including calls and e-mail.
  • FIG. 1 is an exemplary mobile cellular communications handset having a pivoting panel.
  • FIG. 2 is an exemplary cellular handset housing configuration detection switch.
  • FIG. 3 is a schematic electrical block diagram for an exemplary cellular communications handset.
  • FIG. 4 is a process flow diagram for one exemplary cellular handset mode of operation.
  • FIG. 5 is an exemplary process flow diagram for associating a sensory output with an event occurring on a wireless communications handset.
  • FIG. 6 is a process flow diagram for another exemplary cellular handset mode of operation.
  • FIG. 7 is a process flow diagram for yet another exemplary cellular handset mode of operation.
  • an exemplary cellular handset 100 includes a housing with a cover portion, or flip, 110 pivotally coupled to a housing 120.
  • a user interface is exposed upon opening the flip 110.
  • the exemplary user interface includes a display 112 and an audio output 114 on the cover portion, and an input keypad 122 including alphanumeric keys and other controls on the housing portion 120.
  • the housing includes a switch for sensing whether the pivotal cover portion 110 is opened or closed relative to the housing.
  • FIG. 2 is an enlarged view of the housing portion 220 including a cover position-detecting switch 222 disposed near the cover hinge. The switch is actuated upon pivoting the cover 210, which includes a protruding member 212 for engaging and actuating the switch.
  • the switch and its location are only exemplary and are not intended to limit the invention, as many other switches and configurations are suitable for detecting the position of the pivoting cover.
  • the housing may have a portion that rotates, for example, a blade that rotates to cover and expose a user interface.
  • the blade position may be detected by a switch or by a rotary encoder, or by some other position detecting devices.
  • Other handset housings include sliding housing covers or portions, the position of which may also be detected by a sensor or switch.
  • an exemplary schematic block diagram of a mobile wireless communications device 300 includes a processor 310 coupled to memory 320, a display 330, and a radio frequency (RF) transceiver 340.
  • the transceiver is for communicating within service provider network infrastructures.
  • the wireless device also receives and transmits over small area networks, for example Bluetooth and IEEE 802.11b.
  • user inputs 350, for example a microphone, keypad, scrolling input device, joystick, data input jack, infrared signal input, accessory connectors, etc., are also coupled to the processor 310.
  • the processor is coupled to outputs 360, for example a speaker, audio output jack, etc.
  • the exemplary configuration is not intended to limit the invention, as the invention may be implemented in other architectures.
  • a housing actuation input 370 is also coupled to the processor for indicating the position of a mechanically actuatable portion of the mobile wireless communications device, for example a user interface cover or any other actuating portion.
  • the housing actuation input 370 of FIG. 3 corresponds, for example, to the position-detecting switch 222 of FIG. 2, or to any other mechanically actuatable housing portion.
  • the switch is not required in all embodiments of the invention, for example, some embodiments thereof do not include an actuatable user interface cover.
  • a mechanical portion of the wireless device is actuated. This actuation may be the translating or pivoting or rotating action of a housing cover portion, or some other mechanically actuatable portion thereof.
  • the actuation of the mechanical portion may also be the depression of one or more input keys, or the actuation of a switch, the extension of a retractable antenna, or the connection of an accessory, for example a plug-in charger, a camera, earphones, etc.
  • a user-configurable sensory output of the mobile wireless communication device is produced upon actuating a mechanical portion of the mobile wireless communication device.
  • the user selects a sensory output from a plurality of sensory outputs, for example at a user configuration menu.
  • the selected sensory output is associated with a particular event on the wireless communication device.
  • the event selected at block 520 may be the mechanical actuation of a portion of the device, examples of which are discussed above, including the rotation or translation of a cover portion, or the depression of one or more input keys, the extension or retraction of a whip antenna, the opening or removal of a compartment, for example a battery compartment cover or a face plate, or the actuation of some other mechanical portion of the device.
  • the user may select, or re-map, one or more sensory outputs associated with the depression of each input key.
  • the user-configurable sensory output is an audio output, for example a melodic sound, or an audio message, or some other sound clip.
  • the sound produced is related to the action performed, for example, a "Creeeeeak" sound may be produced as the cover pivots open, or a "Zzzzzzzip" sound may be produced as an antenna whip is withdrawn or retracted.
  • the user-configurable sensory output is a tactile sensation, which may be in the form of a buzz or it may be a more melodic or rhythmic tactile sensation.
  • the tactile output is produced in concert with some other sensory output, for example in synchronization with a melodic audio output.
  • the user-configurable sensory output may also be the production of some visual stimulation, for example an image on the display.
  • the visual image may be a still image or a dynamic video image, like a short video clip.
  • the wireless device 100 includes a vanity light 130 disposed along a side thereof, or on some other portions of the device, for emitting light upon the occurrence of a user specified event.
  • the visual sensory output is the illumination of one or more vanity lights upon the occurrence of the event specified at block 520 in FIG. 5.
  • the sensory output may also be the illumination of the display alone or in addition to the illumination of the vanity lights.
  • the lights may be configured to flash or provide steady brightness depending on the user's preferences.
  • the lighting may also be synchronized with other sensory outputs, for example with audio and tactile outputs.
  • the user-sensory output may be a thermal output, for example a change in temperature of the wireless device or a portion thereof, or an olfactory sensory output.
  • one or more of the user-configurable sensory outputs may be produced in combination, either serially or in parallel, in response to actuating the mechanical portion of the wireless device.
  • the user may also configure properties of the sensory output selected, for example the audio volume, or the fade-in and fade-out of the sensory output, among others.
  • the sensory output terminates after a specified time period.
  • the user may specify that the sensory output fade out slowly, for example audio outputs may fade out to an inaudible volume level.
  • the event specified at block 520 in FIG. 5 is the transitioning of the wireless device between a reduced power consumption mode and a relatively higher power consumption mode, for example between sleep and active modes.
  • Wireless handsets generally transition from active mode to sleep mode after some period of inactivity to conserve power.
  • the handset transitions to the active mode in response to some user input, for example upon depressing an input key or upon actuating some other mechanical portion thereof.
  • the user may specify whether the sensory output occurs when the device assumes the active or sleep mode, or both.
  • different events may be associated with the transition depending upon the direction of the change in state.
  • the mobile wireless communication device transitions between a reduced power consumption mode and a relatively higher power consumption mode. Many events prompt the wireless device to transition between modes.
  • the wireless device may transition between a sleep mode and active mode upon actuating a mechanical portion of the mobile wireless communications device, for example by actuating a cover portion, or depressing an input key or other button or switch.
  • a user-configurable sensory output of the mobile wireless communication device is produced upon transitioning the mobile wireless communication device between modes.
  • the event selected at block 520 in FIG. 5 is the transitioning between power-on and power-off modes of operation of the mobile wireless communication device.
  • the user may specify whether the sensory output occurs when the device is turned on and/or when it is turned off, and associate different events depending upon the direction of the transition.
  • one or more user-specified sensory outputs are associated with the transitioning between off and on modes. Thereafter, upon applying or removing power, the associated sensory output is produced, according to the user's selection. In some embodiments, the user-configurable sensory output terminates after a specified time period.
  • the mobile wireless communication device receives information from a communications service provider associated with an occurrence of an event that occurs on the mobile wireless communication device, whereby the occurrence of the event initiates the production of the sensory output on the wireless device.
  • the temporary sensory output thus communicates information received from the communications service provider upon the occurrence of the event.
  • the service provider selects the sensory output and associates it with an event, for example when the mobile wireless communication device transitions between power-off and power-on modes of operation, or some other event.
  • the sensory output that communicates information received from the communications network is the displaying of visual information, for example a still image or a short video clip.
  • corresponding audio and/or tactile information, also received from the service provider, is produced in concert with the visual information.
  • the sensory output is controlled by the network service provider upon the occurrence of the specified event, for example to communicate important service related information to the user from the service provider or from third parties.
  • the service provider may update the information by transmitting new information to the wireless device, for example in a broadcast message or in a point-to-point message.
  • the mobile wireless communication device undergoes a change in reception of a radio signal from a source other than the communications service provider, for example a Bluetooth signal, an IEEE 802.11b signal, an infrared signal, or some other signal.
  • a radio signal for example a Bluetooth signal, an IEEE 802.11b signal, an infrared signal, or some other signal.
  • a user-configurable sensory output of the mobile wireless communication device is produced upon undergoing a change in reception of the radio signal from the source other than the communications service provider.
  • the sensory output may be, for example, an audio signal alerting the user that the wireless device is receiving the signal or no longer receiving the signal.
  • at block 730, the user-configurable sensory output is terminated after a specified time period.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A mobile wireless communication device, and methods therefor, including producing a user-configurable sensory output (620) of the mobile wireless communication device upon the occurrence of some event on the mobile wireless communication device, for example the transition between sleep and active modes, or the mechanical actuation of some portion of the device. In some embodiments, the user-configurable sensory output terminates (630) after a specified time period. In other embodiments, a service provider selects the sensory output and associates it with a particular event that occurs on the device, whereupon the sensory output is produced on the device upon the occurrence of the event, for example to communicate information from the service provider.

Description

USER-SPECIFIED OUTPUTS IN MOBILE WIRELESS COMMUNICATION DEVICES AND METHODS THEREFOR
FIELD OF THE INVENTIONS
The present inventions relate generally to mobile wireless communications devices, and more particularly to user enriching events in wireless communications devices, for example in cellular communications handsets, and methods therefor.
BACKGROUND OF THE INVENTIONS
As consumers in the competitive wireless cellular communications handset market become more sophisticated, the successful marketing of cellular handsets depends upon the ability of manufacturers and network providers to offer more than basic features. Cellular handsets are now viewed by many consumers as an apparel item integrated as a part of the individual being. Consumers also increasingly desire the ability to customize and personalize their handsets, for example as a form of self-expression, to reflect changes in mood or psychological disposition, differentiate from others, associate with peers, etc. It is known to generate audio sounds upon the occurrence of specified events on cellular telephone handsets. The Motorola Timeport 280, for example, produces a sound when a charger cable is connected thereto. However, the user has no control over this signal. The Motorola V60 cellular handset enables the association of different user specified audio alerts with different incoming communications including calls and e-mail. The various aspects, features and advantages of the present invention will become more fully apparent to those having ordinary skill in the art upon careful consideration of the following Detailed Description of the Invention with the accompanying drawings described below.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an exemplary mobile cellular communications handset having a pivoting panel.
FIG. 2 is an exemplary cellular handset housing configuration detection switch.
FIG. 3 is a schematic electrical block diagram for an exemplary cellular communications handset.
FIG. 4 is a process flow diagram for one exemplary cellular handset mode of operation.
FIG. 5 is an exemplary process flow diagram for associating a sensory output with an event occurring on a wireless communications handset.
FIG. 6 is a process flow diagram for another exemplary cellular handset mode of operation.
FIG. 7 is a process flow diagram for yet another exemplary cellular handset mode of operation.
DETAILED DESCRIPTION OF THE INVENTIONS
In FIG. 1, an exemplary cellular handset 100 includes a housing with a cover portion, or flip, 110 pivotally coupled to a housing 120. A user interface is exposed upon opening the flip 110. The exemplary user interface includes a display 112 and an audio output 114 on the cover portion, and an input keypad 122 including alphanumeric keys and other controls on the housing portion 120.
In FIG. 1, the housing includes a switch for sensing whether the pivotal cover portion 110 is opened or closed relative to the housing. FIG. 2 is an enlarged view of the housing portion 220 including a cover position-detecting switch 222 disposed near the cover hinge. The switch is actuated upon pivoting the cover 210, which includes a protruding member 212 for engaging and actuating the switch. The switch and its location are only exemplary and are not intended to limit the invention, as many other switches and configurations are suitable for detecting the position of the pivoting cover.
In other embodiments, the housing may have a portion that rotates, for example, a blade that rotates to cover and expose a user interface. The blade position may be detected by a switch or by a rotary encoder, or by some other position detecting devices. Other handset housings include sliding housing covers or portions, the position of which may also be detected by a sensor or switch.
In FIG. 3, an exemplary schematic block diagram of a mobile wireless communications device 300 includes a processor 310 coupled to memory 320, a display 330, and a radio frequency (RF) transceiver 340. In one embodiment, the transceiver is for communicating within service provider network infrastructures.
In other embodiments the wireless device also receives and transmits over small area networks, for example Bluetooth and IEEE 802.11b. In FIG. 3, user inputs 350, for example a microphone, keypad, scrolling input device, joystick, data input jack, infrared signal input, accessory connectors, etc., are also coupled to the processor 310. The processor is coupled to outputs 360, for example a speaker, audio output jack, etc. The exemplary configuration is not intended to limit the invention, as the invention may be implemented in other architectures.
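To make the FIG. 3 arrangement concrete, the following is a rough, hypothetical Python sketch of the described composition; none of the class or attribute names come from the patent, and the reference numerals are noted only in comments.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MobileWirelessDevice:
    """Illustrative composition of the FIG. 3 block diagram (names are assumptions)."""
    memory: dict = field(default_factory=dict)                        # memory 320
    display: str = "idle screen"                                      # display 330
    rf_transceiver: str = "service-provider network transceiver"      # RF transceiver 340
    user_inputs: List[str] = field(default_factory=lambda: [
        "microphone", "keypad", "joystick", "data input jack"])       # user inputs 350
    outputs: List[str] = field(default_factory=lambda: [
        "speaker", "audio output jack", "vibrator", "vanity light"])  # outputs 360
    housing_actuation_input: str = "cover position switch"            # housing actuation input 370

device = MobileWirelessDevice()
print(device.housing_actuation_input)  # e.g. corresponds to switch 222 of FIG. 2
```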
In FIG. 3, a housing actuation input 370 is also coupled to the processor for indicating the position of a mechanically actuatable portion of the mobile wireless communications device, for example a user interface cover or any other actuating portion. The housing actuation input 370 of FIG. 3 corresponds, for example, to the position-detecting switch 222 of FIG. 2, or to any other mechanically actuatable housing portion. The switch is not required in all embodiments of the invention, for example, some embodiments thereof do not include an actuatable user interface cover. In the process flow diagram of FIG. 4, at block 410, a mechanical portion of the wireless device is actuated. This actuation may be the translating or pivoting or rotating action of a housing cover portion, or some other mechanically actuatable portion thereof. The actuation of the mechanical portion may also be the depression of one or more input keys, or the actuation of a switch, the extension of a retractable antenna, or the connection of an accessory, for example a plug-in charger, a camera, earphones, etc.
In FIG. 4, at block 420, a user-configurable sensory output of the mobile wireless communication device is produced upon actuating a mechanical portion of the mobile wireless communication device. In the process flow diagram 500 of FIG. 5, at block 510, the user selects a sensory output from a plurality of sensory outputs, for example at a user configuration menu. At block 520, the selected sensory output is associated with a particular event on the wireless communication device. The event selected at block 520 may be the mechanical actuation of a portion of the device, examples of which are discussed above, including the rotation or translation of a cover portion, or the depression of one or more input keys, the extension or retraction of a whip antenna, the opening or removal of a compartment, for example a battery compartment cover or a face plate, or the actuation of some other mechanical portion of the device. In another embodiment, the user may select, or re-map, one or more sensory outputs associated with the depression of each input key.
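A minimal sketch of the FIG. 4/FIG. 5 flow is shown below: the user picks an output from a configuration menu (block 510), associates it with a device event (block 520), and the output is produced when that event fires (blocks 410/420). The event names, output names, and registry structure are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class SensoryOutput:
    name: str
    produce: Callable[[], None]  # plays a sound, drives a vibrator, lights an LED, ...

# Outputs the user can choose from at the configuration menu (block 510).
AVAILABLE_OUTPUTS = {
    "creak_sound": SensoryOutput("creak_sound", lambda: print("playing 'Creeeeeak'")),
    "zip_sound": SensoryOutput("zip_sound", lambda: print("playing 'Zzzzzzzip'")),
    "vanity_lights": SensoryOutput("vanity_lights", lambda: print("flashing vanity lights")),
    "buzz": SensoryOutput("buzz", lambda: print("rhythmic tactile buzz")),
}

# Association of device events with user-selected outputs (block 520).
event_map: Dict[str, SensoryOutput] = {}

def associate(event: str, output_name: str) -> None:
    event_map[event] = AVAILABLE_OUTPUTS[output_name]

def on_event(event: str) -> None:
    """Called when a mechanical portion is actuated (block 410)."""
    output = event_map.get(event)
    if output is not None:
        output.produce()  # block 420: produce the user-configurable output

# Example: remap the flip-open and antenna-retract events.
associate("flip_opened", "creak_sound")
associate("antenna_retracted", "zip_sound")
on_event("flip_opened")
```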
In one embodiment, the user-configurable sensory output is an audio output, for example a melodic sound, or an audio message, or some other sound clip. In some embodiments, the sound produced is related to the action performed, for example, a "Creeeeeak" sound may be produced as the cover pivots open, or a "Zzzzzzzip" sound may be produced as an antenna whip is withdrawn or retracted. In other embodiments, the user-configurable sensory output is a tactile sensation, which may be in the form of a buzz or it may be a more melodic or rhythmic tactile sensation. In some embodiments, the tactile output is produced in concert with some other sensory output, for example in synchronization with a melodic audio output. The user-configurable sensory output may also be the production of some visual stimulation, for example an image on the display. The visual image may be a still image or a dynamic video image, like a short video clip.
In FIG. 1, the wireless device 100 includes a vanity light 130 disposed along a side thereof, or on some other portions of the device, for emitting light upon the occurrence of a user specified event. In one embodiment, the visual sensory output is the illumination of one or more vanity lights upon the occurrence of the event specified at block 520 in FIG. 5. The sensory output may also be the illumination of the display alone or in addition to the illumination of the vanity lights. The lights may be configured to flash or provide steady brightness depending on the user's preferences. The lighting may also be synchronized with other sensory outputs, for example with audio and tactile outputs. In other embodiments, the user-sensory output may be a thermal output, for example a change in temperature of the wireless device or a portion thereof, or an olfactory sensory output. Generally, one or more of the user-configurable sensory outputs may be produced in combination, either serially or in parallel, in response to actuating the mechanical portion of the wireless device. In some embodiments, at block 510 of FIG. 5, the user may also configure properties of the sensory output selected, for example the audio volume, or the fade-in and fade-out of the sensory output, among others.
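An illustrative sketch, not the patent's implementation, of producing several selected outputs in combination, either serially or in parallel, with a user-configured property such as audio volume; all function names and values are hypothetical.

```python
import threading

def play_audio(volume: int) -> None:
    print(f"audio output at volume {volume}")

def drive_vibrator() -> None:
    print("tactile output in sync with audio")

def flash_vanity_lights(steady: bool) -> None:
    print("steady vanity lights" if steady else "flashing vanity lights")

def produce_combined(parallel: bool = True, volume: int = 5, steady_lights: bool = False) -> None:
    """Produce the selected outputs serially or in parallel, per user preference."""
    actions = [
        lambda: play_audio(volume),
        drive_vibrator,
        lambda: flash_vanity_lights(steady_lights),
    ]
    if parallel:
        threads = [threading.Thread(target=a) for a in actions]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    else:
        for a in actions:
            a()

produce_combined(parallel=True, volume=3)
```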
In FIG. 4, at block 430, in some embodiments, the sensory output terminates after a specified time period. In one embodiment, the user may specify that the sensory output fade out slowly, for example audio outputs may fade out to an inaudible volume level.
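A minimal sketch of block 430, assuming hypothetical timing values: the output runs for a specified period and then fades to an inaudible level rather than stopping abruptly.

```python
import time

def produce_with_fadeout(duration_s: float = 2.0, fade_steps: int = 5) -> None:
    volume = 10
    print(f"output started at volume {volume}")
    time.sleep(duration_s)            # specified time period before termination
    for _ in range(fade_steps):       # fade out instead of an abrupt stop
        volume = max(0, volume - 10 // fade_steps)
        print(f"fading... volume {volume}")
        time.sleep(0.1)
    print("output terminated")

produce_with_fadeout(duration_s=0.5)
```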
In another embodiment of the invention, the event specified at block 520 in FIG. 5 is the transitioning of the wireless device between a reduced power consumption mode and a relatively higher power consumption mode, for example between sleep and active modes. Wireless handsets generally transition from active mode to sleep mode after some period of inactivity to conserve power. The handset transitions to the active mode in response to some user input, for example upon depressing an input key or upon actuating some other mechanical portion thereof. The user may specify whether the sensory output occurs when the device assumes the active or sleep mode, or both. Also, different events may be associated with the transition depending upon the direction of the change in state.
In the process flow diagram 600 of FIG. 6, at block 610, the mobile wireless communication device transitions between a reduced power consumption mode and a relatively higher power consumption mode. Many events prompt the wireless device to transition between modes. The wireless device may transition between a sleep mode and active mode upon actuating a mechanical portion of the mobile wireless communications device, for example by actuating a cover portion, or depressing an input key or other button or switch.
In FIG. 6, at block 620, a user-configurable sensory output of the mobile wireless communication device is produced upon transitioning the mobile wireless communication device between modes.
In another embodiment, the event selected at block 520 in FIG. 5 is the transitioning between power-on and power-off modes of operation of the mobile wireless communication device. The user may specify whether the sensory output occurs when the device is turned on and/or when it is turned off, and associate different events depending upon the direction of the transition. At block 520, one or more user-specified sensory outputs are associated with the transitioning between off and on modes. Thereafter, upon applying or removing power, the associated sensory output is produced, according to the user's selection. In some embodiments, the user-configurable sensory output terminates after a specified time period.
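A hedged sketch of the FIG. 6 mode: a different user-selected output can be tied to each direction of a power transition (sleep/active, off/on). The mode names and output descriptions below are invented for illustration.

```python
transition_outputs = {
    ("sleep", "active"): "wake-up chime",
    ("active", "sleep"): "soft fade-to-silence tone",
    ("off", "on"): "power-on melody with splash image",
    ("on", "off"): "power-down melody",
}

def on_power_transition(old_mode: str, new_mode: str) -> None:
    """Blocks 610/620: produce the output associated with this direction of change."""
    output = transition_outputs.get((old_mode, new_mode))
    if output:
        print(f"producing: {output}")

on_power_transition("sleep", "active")
on_power_transition("on", "off")
```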
In another embodiment, the mobile wireless communication device receives information from a communications service provider associated with an occurrence of an event that occurs on the mobile wireless communication device, whereby the occurrence of the event initiates the production of the sensory output on the wireless device. The temporary sensory output thus communicates information received from the communications service provider upon the occurrence of the event. In this embodiment, the service provider selects the sensory output and associates it with an event, for example when the mobile wireless communication device transitions between power-off and power-on modes of operation, or some other event. In one embodiment, the sensory output that communicates information received from the communications network is the displaying of visual information, for example a still image or a short video clip. In some embodiments, corresponding audio and/or tactile information, also received from the service provider, is produced in concert with the visual information. According to this embodiment, the sensory output is controlled by the network service provider upon the occurrence of the specified event, for example to communicate important service related information to the user from the service provider or from third parties. The service provider may update the information by transmitting new information to the wireless device, for example in a broadcast message or in a point-to-point message.
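The following is an illustrative sketch, under assumed message and field names rather than the patent's protocol, of a service provider associating pushed content with a device event: the handset stores the most recent broadcast or point-to-point update and presents it temporarily when the event occurs.

```python
# Stored provider content keyed by the event it is associated with (assumed structure).
provider_content = {"power_on": {"image": "promo_splash.png", "audio": "jingle.mid"}}

def receive_provider_update(event: str, payload: dict) -> None:
    """New information from the network replaces the stored content for this event."""
    provider_content[event] = payload

def on_device_event(event: str, display_seconds: float = 3.0) -> None:
    """Produce the temporary output that communicates the provider's information."""
    content = provider_content.get(event)
    if content:
        print(f"displaying {content['image']} with {content['audio']} "
              f"for {display_seconds} s, then returning to the home screen")

receive_provider_update("power_on", {"image": "service_notice.png", "audio": "alert.mid"})
on_device_event("power_on")
```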
In another mode of operation, illustrated in the process flow diagram 700 of FIG. 7, at block 710, the mobile wireless communication device undergoes a change in reception of a radio signal from a source other than the communications service provider, for example a Bluetooth signal, an IEEE 802.11b signal, an infrared signal, or some other signal.
At block 720, a user-configurable sensory output of the mobile wireless communication device is produced upon undergoing a change in reception of the radio signal from the source other than the communications service provider. The sensory output may be, for example, an audio signal alerting the user that the wireless device is receiving the signal or no longer receiving the signal. At block 730, the user-configurable sensory output is terminated after a specified time period.
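A small sketch of the FIG. 7 mode, under the assumption that the firmware exposes a callback when a short-range link (Bluetooth, IEEE 802.11b, infrared) is acquired or lost; the user-selected alert is produced and then terminated after a set period. Function names and durations are hypothetical.

```python
import time

def alert(message: str, duration_s: float = 1.0) -> None:
    print(f"alert: {message}")
    time.sleep(duration_s)  # block 730: terminate after the specified period
    print("alert terminated")

def on_reception_change(source: str, in_range: bool) -> None:
    """Blocks 710/720: a change in reception of a non-provider radio signal."""
    if in_range:
        alert(f"{source} signal acquired")
    else:
        alert(f"{source} signal lost")

on_reception_change("Bluetooth", in_range=True)
on_reception_change("IEEE 802.11b", in_range=False)
```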
While the present inventions and what are considered presently to be the best modes thereof have been described in a manner that establishes possession thereof by the inventors and that enables those of ordinary skill in the art to make and use the inventions, it will be understood and appreciated that there are many equivalents to the exemplary embodiments disclosed herein and that myriad modifications and variations may be made thereto without departing from the scope and spirit of the inventions, which are to be limited not by the exemplary embodiments but by the appended claims. What is claimed is:

Claims

1. A method in a mobile wireless communication device, comprising: transitioning the mobile wireless communication device between a reduced power consumption mode and a relatively higher power consumption mode; producing a user-configurable sensory output of the mobile wireless communication device upon transitioning the mobile wireless communication device; terminating the user-configurable sensory output after a specified time period.
2. The method of Claim 1, producing the user-configurable sensory output by producing an audio output.
3. The method of Claim 1, terminating the user-configurable sensory output by fading-out the user-configurable sensory output.
4. The method of Claim 1, producing the user-configurable sensory output by producing a tactile sensation.
5. The method of Claim 1, producing the user-configurable sensory output by temporarily displaying a visual image on a display of the mobile wireless communication device.
6. The method of Claim 1, producing the user-configurable sensory output by emitting light from the mobile wireless communication device.
7. The method of Claim 1, selecting the user-configurable sensory output from a plurality of sensory outputs.
8. The method of Claim 1, transitioning the mobile wireless communication device between the reduced power consumption mode and the relatively higher power consumption mode by actuating a mechanical portion of the mobile wireless communications device.
9. A method in a mobile wireless communication device, comprising: actuating a mechanical portion of the mobile wireless communication device; producing a user-configurable sensory output of the mobile wireless communication device upon actuating the mechanical portion of the mobile wireless communication device; terminating the user-configurable sensory output after a specified time period.
10. The method of Claim 9, actuating the mechanical portion of the mobile wireless communication device by translating the mechanical portion of the mobile wireless communication device.
11. The method of Claim 9, actuating the mechanical portion of the mobile wireless communication device by rotating the mechanical portion of the mobile wireless communication device.
12. The method of Claim 9, actuating the mechanical portion of the mobile wireless communication device by connecting an accessory device with a connector of the wireless communication device.
13. The method of Claim 9, producing the user-configurable sensory output by producing an audible sound.
14. The method of Claim 9, producing the user-configurable sensory output by emitting light from a vanity light on the mobile wireless communication device.
15. The method of Claim 9, producing the user-configurable sensory output by producing a tactile sensation.
16. The method of Claim 9, producing the user-configurable sensory output by temporarily displaying an image on a display of the mobile wireless communication device before displaying a user interface home menu.
17. The method of Claim 9, selecting the user-configurable sensory output from a plurality of sensory outputs of the mobile wireless communications device, associating the user-configurable output selected with actuation of a mechanical portion of the mobile wireless communication device.
18. A method in a mobile wireless communication device, comprising: selecting at least one sensory output from a plurality of sensory outputs of the mobile wireless communication device; associating the selected sensory output with a mechanical operation on the mobile wireless communications device; producing the selected sensory output upon an occurrence of the associated mechanical operation.
19. The method of Claim 18, selecting a plurality of sensory outputs of the mobile wireless communication device, associating the plurality of selected sensory outputs with a plurality of input keys by associating at least one sensory output with each of the plurality of input keys, producing the at least one sensory output upon depressing the associated input key.
20. The method of Claim 18, selecting more than one sensory output and associating the more than one sensory output selected with a single mechanical operation, producing the more than one sensory output selected upon an occurrence of the associated mechanical operation.
21. A method in a mobile wireless communication device, comprising: selecting one of a plurality of sensory outputs of the mobile wireless communication device; transitioning between power-on and power-off modes of operation of the mobile wireless communication device; producing the sensory output selected upon transitioning between the power-on and power-off modes of operation.
22. The method of Claim 21, terminating the user sensory output after a specified time period.
23. The method of Claim 21, terminating the user sensory output by fading-out the user sensory output.
24. A method in a mobile wireless communication device, comprising: receiving information over an air interface from a communications service provider associated with an occurrence of an event on the mobile wireless communication device; producing a temporary sensory output that communicates information received from the communications service provider upon the occurrence of the event.
25. The method of Claim 24, producing the temporary sensory output by displaying video information received from the communications service provider upon the occurrence of the event.
26. The method of Claim 25, the event is the transitioning from a power-off mode to a power-on mode, producing the temporary sensory output that communicates information received from the communications service provider when the mobile wireless communication device is transitioned to a power-on mode.
27. The method of Claim 24, producing the temporary sensory output by producing audio information received from the communications service provider with the visual information.
28. The method of Claim 24, producing the temporary sensory output by displaying visual information received from the communications service provider upon the occurrence of the event.
29. A method in a mobile wireless communication device, comprising: undergoing a change in reception of a radio signal from a source other than a communications service provider; producing a user-configurable sensory output of the mobile wireless communication device upon the change in reception of the radio signal from the source other than a communications service provider.
30. The method of Claim 29, terminating the sensory output after a specified time period.
PCT/US2003/024901 2002-08-30 2003-08-07 User-specified outputs in mobile wireless communication devices and methods therefor WO2004021683A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2003259693A AU2003259693A1 (en) 2002-08-30 2003-08-07 User-specified outputs in mobile wireless communication devices and methods therefor
EP03791668A EP1535450B1 (en) 2002-08-30 2003-08-07 User-specified outputs in mobile wireless communication devices and methods therefor
BR0313842-9A BR0313842A (en) 2002-08-30 2003-08-07 User-specified outputs and methods for mobile wireless communication devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/233,212 2002-08-30
US10/233,212 US7734316B2 (en) 2002-08-30 2002-08-30 User-specified outputs in mobile wireless communication devices and methods therefor

Publications (1)

Publication Number Publication Date
WO2004021683A1 true WO2004021683A1 (en) 2004-03-11

Family

ID=31977180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2003/024901 WO2004021683A1 (en) 2002-08-30 2003-08-07 User-specified outputs in mobile wireless communication devices and methods therefor

Country Status (8)

Country Link
US (2) US7734316B2 (en)
EP (2) EP2541879A1 (en)
KR (1) KR101099155B1 (en)
CN (2) CN1679305A (en)
AU (1) AU2003259693A1 (en)
BR (1) BR0313842A (en)
RU (1) RU2346405C2 (en)
WO (1) WO2004021683A1 (en)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2768324B1 (en) * 1997-09-12 1999-12-10 Jacques Seguin SURGICAL INSTRUMENT FOR PERCUTANEOUSLY FIXING TWO AREAS OF SOFT TISSUE, NORMALLY MUTUALLY REMOTE, TO ONE ANOTHER
US8216256B2 (en) 1999-04-09 2012-07-10 Evalve, Inc. Detachment mechanism for implantable fixation devices
US6752813B2 (en) * 1999-04-09 2004-06-22 Evalve, Inc. Methods and devices for capturing and fixing leaflets in valve repair
US20040044350A1 (en) * 1999-04-09 2004-03-04 Evalve, Inc. Steerable access sheath and methods of use
US7811296B2 (en) * 1999-04-09 2010-10-12 Evalve, Inc. Fixation devices for variation in engagement of tissue
US10327743B2 (en) 1999-04-09 2019-06-25 Evalve, Inc. Device and methods for endoscopic annuloplasty
WO2000060995A2 (en) * 1999-04-09 2000-10-19 Evalve, Inc. Methods and apparatus for cardiac valve repair
US7226467B2 (en) 1999-04-09 2007-06-05 Evalve, Inc. Fixation device delivery catheter, systems and methods of use
US6575971B2 (en) * 2001-11-15 2003-06-10 Quantum Cor, Inc. Cardiac valve leaflet stapler device and methods thereof
US7048754B2 (en) * 2002-03-01 2006-05-23 Evalve, Inc. Suture fasteners and methods of use
US7734316B2 (en) * 2002-08-30 2010-06-08 Motorola, Inc. User-specified outputs in mobile wireless communication devices and methods therefor
KR100547792B1 (en) * 2002-10-24 2006-01-31 삼성전자주식회사 Wireless communication terminal that can change the image file of the background screen and display method of the image file of the background screen using the same
US7142893B2 (en) * 2003-05-01 2006-11-28 Nokia Corporation Apparatus, and associated method, for facilitating identification of a mobile telephone
US10646229B2 (en) 2003-05-19 2020-05-12 Evalve, Inc. Fixation devices, systems and methods for engaging tissue
JP4774048B2 (en) 2004-05-14 2011-09-14 エヴァルヴ インコーポレイテッド Locking mechanism of fixing device engaged with tissue and tissue engaging method
KR100597738B1 (en) * 2004-05-27 2006-07-07 삼성전자주식회사 Electronic device and control method thereof
WO2006037073A2 (en) 2004-09-27 2006-04-06 Evalve, Inc. Methods and devices for tissue grasping and assessment
US8052592B2 (en) 2005-09-27 2011-11-08 Evalve, Inc. Methods and devices for tissue grasping and assessment
EP3967269A3 (en) 2005-02-07 2022-07-13 Evalve, Inc. Systems and devices for cardiac valve repair
WO2011034628A1 (en) * 2005-02-07 2011-03-24 Evalve, Inc. Methods, systems and devices for cardiac valve repair
KR100832019B1 (en) * 2006-06-30 2008-05-26 주식회사 하이닉스반도체 Method for fabricating storage node contact in semiconductor device
US20100078343A1 (en) * 2008-09-30 2010-04-01 Hoellwarth Quin C Cover for Portable Electronic Device
US20100262436A1 (en) * 2009-04-11 2010-10-14 Chen Ying-Yu Medical information system for cost-effective management of health care
EP2633821B1 (en) 2009-09-15 2016-04-06 Evalve, Inc. Device for cardiac valve repair
US8945177B2 (en) 2011-09-13 2015-02-03 Abbott Cardiovascular Systems Inc. Gripper pusher mechanism for tissue apposition systems
US10390943B2 (en) 2014-03-17 2019-08-27 Evalve, Inc. Double orifice device for transcatheter mitral valve replacement
US9572666B2 (en) 2014-03-17 2017-02-21 Evalve, Inc. Mitral valve fixation device removal devices and methods
US10188392B2 (en) 2014-12-19 2019-01-29 Abbott Cardiovascular Systems, Inc. Grasping for tissue repair
US10524912B2 (en) 2015-04-02 2020-01-07 Abbott Cardiovascular Systems, Inc. Tissue fixation devices and methods
US10376673B2 (en) 2015-06-19 2019-08-13 Evalve, Inc. Catheter guiding system and methods
US10238494B2 (en) 2015-06-29 2019-03-26 Evalve, Inc. Self-aligning radiopaque ring
US10667815B2 (en) 2015-07-21 2020-06-02 Evalve, Inc. Tissue grasping devices and related methods
US10413408B2 (en) 2015-08-06 2019-09-17 Evalve, Inc. Delivery catheter systems, methods, and devices
US10238495B2 (en) 2015-10-09 2019-03-26 Evalve, Inc. Delivery catheter handle and methods of use
US10736632B2 (en) 2016-07-06 2020-08-11 Evalve, Inc. Methods and devices for valve clip excision
US11071564B2 (en) 2016-10-05 2021-07-27 Evalve, Inc. Cardiac valve cutting device
US10363138B2 (en) 2016-11-09 2019-07-30 Evalve, Inc. Devices for adjusting the curvature of cardiac valve structures
US10398553B2 (en) 2016-11-11 2019-09-03 Evalve, Inc. Opposing disk device for grasping cardiac valve tissue
US10426616B2 (en) 2016-11-17 2019-10-01 Evalve, Inc. Cardiac implant delivery system
US10779837B2 (en) 2016-12-08 2020-09-22 Evalve, Inc. Adjustable arm device for grasping tissues
US10314586B2 (en) 2016-12-13 2019-06-11 Evalve, Inc. Rotatable device and method for fixing tricuspid valve tissue
US11065119B2 (en) 2017-05-12 2021-07-20 Evalve, Inc. Long arm valve repair clip
US12102531B2 (en) 2018-10-22 2024-10-01 Evalve, Inc. Tissue cutting systems, devices and methods
JP7543391B2 (en) 2019-07-15 2024-09-02 エバルブ,インコーポレイティド Method of Actuating Individual Proximal Elements
US12048448B2 (en) 2020-05-06 2024-07-30 Evalve, Inc. Leaflet grasping and cutting device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6278887B1 (en) * 1999-02-05 2001-08-21 Neopoint, Inc. System and method for power conservation in a wireless communication handset
US6342738B1 (en) * 1998-06-03 2002-01-29 Telefonaktiebolaget Lm Ericsson (Publ) Mobile electronic device with integrated dual hardware/software power switch
US6385466B1 (en) * 1998-01-19 2002-05-07 Matsushita Electric Industrial Co., Ltd. Portable terminal device
US6426736B1 (en) * 1998-12-28 2002-07-30 Nec Corporation Portable telephone with liquid crystal display

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4860005A (en) * 1988-01-07 1989-08-22 Motorola, Inc. Communication receiver with automatic turn on/off
JPH096508A (en) * 1995-06-16 1997-01-10 Sony Corp Communication terminal equipment and display method
US5767778A (en) * 1996-03-06 1998-06-16 Aspire Corporation Event sensing circuit and alert generator
US5870683A (en) * 1996-09-18 1999-02-09 Nokia Mobile Phones Limited Mobile station having method and apparatus for displaying user-selectable animation sequence
US6018654A (en) * 1996-10-29 2000-01-25 Ericsson Inc Method and apparatus for downloading tones to mobile terminals
GB2334301A (en) 1998-02-17 1999-08-18 Ericsson Telefon Ab L M Slidable and rotatable hinge assembly for a mobile radiotelephone
KR100593996B1 (en) * 1998-12-17 2006-09-27 삼성전자주식회사 How to display wallpaper on a mobile phone
JP2000295170A (en) * 1999-04-09 2000-10-20 Sony Corp Communication system, communication terminal equipment and information distribution device
KR100344786B1 (en) * 1999-07-15 2002-07-19 엘지전자주식회사 Caller Information Providing System and Forwarding Method in Mobile Communication Network
US6671370B1 (en) * 1999-12-21 2003-12-30 Nokia Corporation Method and apparatus enabling a calling telephone handset to choose a ringing indication(s) to be played and/or shown at a receiving telephone handset
DE19962282A1 (en) * 1999-12-23 2001-06-28 Philips Corp Intellectual Pty Mobile telephone with display device; has control device, which switches off at least one part of display device according to each operation state of telephone
JP2001186225A (en) * 1999-12-24 2001-07-06 Nec Corp Portable telephone set
JP3448003B2 (en) * 2000-03-09 2003-09-16 株式会社東芝 Mobile communication terminal
KR100362560B1 (en) * 2000-08-19 2002-11-27 삼성전자 주식회사 Method for controlling driving of backlight part in mobile phone
US6973336B2 (en) * 2000-12-20 2005-12-06 Nokia Corp Method and apparatus for providing a notification of received message
GB0031477D0 (en) * 2000-12-22 2001-02-07 Symbian Ltd Mobile telephone device with idle screen
WO2002065303A1 (en) * 2001-02-13 2002-08-22 Fujitsu Limited Network terminal having power saving mode
EP1374604A2 (en) * 2001-03-20 2004-01-02 Koninklijke Philips Electronics N.V. Beacon infrastructure
KR200260160Y1 (en) * 2001-10-17 2002-01-10 가이아 텔레콤(주) Key tone upgrading/outputting system
US7065382B2 (en) * 2001-12-20 2006-06-20 Nokia Corporation Wireless terminal having a scanner for issuing an alert when within the range of a target wireless terminal
US7218918B1 (en) * 2002-07-15 2007-05-15 Bellsouth Intellectual Property Corporation Systems and methods for a wireless messaging information service
US7734316B2 (en) * 2002-08-30 2010-06-08 Motorola, Inc. User-specified outputs in mobile wireless communication devices and methods therefor

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6385466B1 (en) * 1998-01-19 2002-05-07 Matsushita Electric Industrial Co., Ltd. Portable terminal device
US6342738B1 (en) * 1998-06-03 2002-01-29 Telefonaktiebolaget Lm Ericsson (Publ) Mobile electronic device with integrated dual hardware/software power switch
US6426736B1 (en) * 1998-12-28 2002-07-30 Nec Corporation Portable telephone with liquid crystal display
US6278887B1 (en) * 1999-02-05 2001-08-21 Neopoint, Inc. System and method for power conservation in a wireless communication handset

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1535450A4 *

Also Published As

Publication number Publication date
EP1535450A4 (en) 2010-07-21
RU2346405C2 (en) 2009-02-10
US20040203508A1 (en) 2004-10-14
CN102710822A (en) 2012-10-03
EP2541879A1 (en) 2013-01-02
EP1535450A1 (en) 2005-06-01
CN1679305A (en) 2005-10-05
RU2005108984A (en) 2005-08-27
US20050059351A1 (en) 2005-03-17
EP1535450B1 (en) 2013-01-09
CN102710822B (en) 2015-09-09
KR20050057005A (en) 2005-06-16
KR101099155B1 (en) 2011-12-27
AU2003259693A1 (en) 2004-03-19
BR0313842A (en) 2005-07-12
US7734316B2 (en) 2010-06-08

Similar Documents

Publication Publication Date Title
US7734316B2 (en) User-specified outputs in mobile wireless communication devices and methods therefor
US8958896B2 (en) Dynamic routing of audio among multiple audio devices
KR20100034229A (en) Potable device including earphone circuit and operation method using the same
JP2003125061A (en) Portable terminal
JP4147016B2 (en) Mobile device
JP2001186250A (en) Radio telephone device and its controlling method
EP2284718B1 (en) Apparatus and method for determining type of accessory in a portable terminal
EP1271903B1 (en) Mobile phone monitor and remote control system
JP4577018B2 (en) Mobile communication terminal
JP2003037656A (en) Mobile telephone set and back light control method therefor
CN102984389A (en) Method for depending on called terminal to determine back ringtone of calling terminal and terminal
US20040224722A1 (en) Wireless communication device having an integral laser pointer
JP3886409B2 (en) Mobile phone
JP4265749B2 (en) Electronic device with LCD remote control
JP2002320006A (en) Folded mobile phone
JP2007318363A (en) Mobile communication terminal, control method therefor, mobile communication terminal control program, and recording medium having the same program recorded thereon
KR20050079475A (en) Method for changing power saving mode using power saving mode key in mobile phone
KR100713467B1 (en) Method for changing mode in wireless terminal
JP4256907B2 (en) Mobile device
JP4256908B2 (en) Mobile device
CN109362012A (en) Wire and wireless signal transmits conversion method, loudspeaker and storage medium
KR20040059335A (en) Method for reducing power through set up the always sleep mode in mobile terminal
JP2006129112A (en) Portable telephone set
JP2003188956A (en) Foldable telephone set
KR20060067479A (en) Apparatus and method for outside speaker function and equalizer function in wireless telecommunication terminal

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1020057003373

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2003820455X

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 328/KOLNP/2005

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 2003791668

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2005108984

Country of ref document: RU

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 2003791668

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1020057003373

Country of ref document: KR

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Ref document number: JP