JP5385265B2 - Method and system for providing sensory information to devices and peripherals - Google Patents

Method and system for providing sensory information to devices and peripherals Download PDF

Info

Publication number
JP5385265B2
JP5385265B2 JP2010511156A JP2010511156A JP5385265B2 JP 5385265 B2 JP5385265 B2 JP 5385265B2 JP 2010511156 A JP2010511156 A JP 2010511156A JP 2010511156 A JP2010511156 A JP 2010511156A JP 5385265 B2 JP5385265 B2 JP 5385265B2
Authority
JP
Japan
Prior art keywords
device
display
processing system
orientation
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2010511156A
Other languages
Japanese (ja)
Other versions
JP2010529552A (en
Inventor
ヘルツ,スコット
キーン,ダン
ウェスターマン,ヴァイン・カール
Original Assignee
アップル インコーポレイテッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US11/811,174 priority Critical
Priority to US11/811,174 priority patent/US8004493B2/en
Application filed by アップル インコーポレイテッド filed Critical アップル インコーポレイテッド
Priority to PCT/US2008/005819 priority patent/WO2008153639A1/en
Publication of JP2010529552A publication Critical patent/JP2010529552A/en
Application granted granted Critical
Publication of JP5385265B2 publication Critical patent/JP5385265B2/en
Application status is Active legal-status Critical
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1615Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1616Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1647Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/60Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges including speech amplifiers
    • H04M1/6033Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges including speech amplifiers for providing handsfree use or a loudspeaker mode in telephone sets
    • H04M1/6041Portable telephones adapted for handsfree use
    • H04M1/6058Portable telephones adapted for handsfree use involving the use of a headset accessory device connected to the portable telephone
    • H04M1/6066Portable telephones adapted for handsfree use involving the use of a headset accessory device connected to the portable telephone including a wireless connection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72563Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status with means for adapting by the user the functionality or the communication capability of the terminal under specific circumstances
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161Indexing scheme relating to constructional details of the monitor
    • G06F2200/1614Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/60Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges including speech amplifiers
    • H04M1/6033Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges including speech amplifiers for providing handsfree use or a loudspeaker mode in telephone sets
    • H04M1/6041Portable telephones adapted for handsfree use
    • H04M1/6058Portable telephones adapted for handsfree use involving the use of a headset accessory device connected to the portable telephone
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72522With means for supporting locally a plurality of applications to increase the functionality
    • H04M1/72527With means for supporting locally a plurality of applications to increase the functionality provided by interfacing with an external accessory
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72522With means for supporting locally a plurality of applications to increase the functionality
    • H04M1/72527With means for supporting locally a plurality of applications to increase the functionality provided by interfacing with an external accessory
    • H04M1/7253With means for supporting locally a plurality of applications to increase the functionality provided by interfacing with an external accessory using a two-way short-range wireless interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72563Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status with means for adapting by the user the functionality or the communication capability of the terminal under specific circumstances
    • H04M1/72572Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status with means for adapting by the user the functionality or the communication capability of the terminal under specific circumstances according to a geographic location
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S345/00Computer graphics processing and selective visual display systems
    • Y10S345/901Electronic book with display

Description

  Electronic devices such as computer systems or wireless cellular telephones or other data processing systems can often be used with peripheral devices. Peripherals such as a wired or wireless headset, or a wireless or wired keyboard, or a wired or wireless cursor control device are coupled to an electronic device, sometimes referred to as a host system. Peripherals typically provide input and / or output functions to the electronic device.

  Peripherals can also be configured to operate only with one specific electronic device or host. For example, a wireless headset peripheral is paired with a designated wireless cellular phone, thereby enabling communication with the designated wireless cellular phone other than other wireless cellular phones that are within the wireless area of the wireless headset. . This allows the user to operate the wireless headset with the designated wireless cellular phone even when it may be surrounded by other wireless cellular phones that are within the wireless area of the wireless headset. Thus, although the wireless headset in this case includes some intelligence or data that enables it to selectively operate with the designated host system, the wireless headset has no further processing or sensing capabilities. Bluetooth pairing or cooperation is an example of a relationship created between a peripheral device and a host. This is created by the user to exchange information in a secure manner. Creating a Bluetooth linkage between two devices includes entering the same personal identification number (PIN) or passkey on both devices, and creating such a linkage is a one-time process. Once the cooperation is created, the device can recognize the cooperation and exchange information without entering the PIN again.

US Patent Application Publication No. 2003/0095096 US Pat. No. 6,583,676 US Pat. No. 6,520,013

  Some of the electronic devices described above include sensors for various purposes. However, these sensors (eg, accelerometer sensors, proximity sensors, and ambient light sensors) are suitable to determine whether intentional or unintentional user actions have caused device movement and / or generation of orientation information. It cannot be detected and distinguished. For example, unintentional movement of the device can trigger improper configuration of the device. Unintentional movement can include device movement associated with a jogging user or device movement when the user quickly places the device on a surface that causes a short movement of the device. The sensor also cannot determine the interaction between the associated peripheral device and the device.

  At least some embodiments of the present disclosure relate to a peripheral device that includes at least one sensor that detects a state of the peripheral device. In these embodiments, the peripheral device and / or the host coupled to the peripheral device is responsive to data from at least one sensor to configure one or more configurations of the peripheral device and / or the host. Can be changed.

  In at least some embodiments, a method for detecting device motion and orientation information includes receiving a motion event from at least one sensor positioned within the device. The method further includes determining the orientation of the device. The method further includes determining whether the device is currently moving. The method further includes determining whether the device has moved within an angle relative to the ground reference during the first time period. The method further includes switching the orientation of the display of the device when the device moves beyond that angle. The method further includes switching the orientation when the device moves within that angle at least during the first time period. The method further includes determining whether the currently moving device has moved in the second time period, and if the device has not moved or if the device has moved at least in the second time period. , Determining whether the azimuth is vertical, and switching the azimuth when the azimuth is not vertical.

  In at least some embodiments, a method for detecting an orientation between a device and an associated peripheral device includes determining a device vector associated with the device. The device vector indicates the orientation of the device with respect to the ground reference. The method further includes determining a peripheral vector associated with the device peripheral. The peripheral device vector indicates the orientation of the peripheral device with respect to the ground reference. The method further includes generating an audio signal associated with the event from the device. The method further includes determining whether the peripheral vector points toward the device vector in response to the audio signal. The method further includes silencing the audio signal when the peripheral vector points to the device vector in response to the audio signal.

  In at least some embodiments, a peripheral device and an associated data processing system, which can be considered a host system, cooperate to based on sensor data from at least one sensor in the peripheral device and / or the host. The user's intention or action can be determined. For example, a set of sensors on a peripheral device (eg, accelerometer sensor, proximity sensor, ambient light sensor, etc.) provides data indicating that the peripheral device is not in close proximity to the user and at the same time another sensor on the host The set of data can provide data indicating that the host is near the user's ear. The peripheral device is coupled to a peripheral device interface for coupling the peripheral device to the data processing system, at least one peripheral device sensor for detecting the peripheral device user, and the peripheral device interface and the at least one peripheral device sensor. Peripheral processors can be included. The peripheral processor is configured to determine a peripheral device vector that indicates the orientation of the peripheral device relative to the ground reference. The device can include an interface for coupling the device to a peripheral device. The device can further include at least one sensor for sensing the user, and an interface and a processor coupled to the at least one sensor. The processor determines a device vector that indicates the orientation of the device relative to the ground reference, activates an audio signal associated with the event, and further determines whether the peripheral vector points towards the device vector in response to the audio signal. It is configured as follows.

Other systems and methods are also described, and machine-readable media containing executable instructions for operating the machine as described herein are also described.
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings in which like reference numerals indicate similar elements.

1 is a diagram illustrating an embodiment of a system including an embodiment of a peripheral device and an embodiment of a data processing system used with the peripheral device.

4 is a flowchart of an embodiment of the disclosed method described herein.

FIG. 2 illustrates a data processing system (eg, wireless mobile cellular phone) for ground reference in the disclosed embodiments described herein.

FIG. 7 illustrates a data processing system (eg, a wireless mobile cellular phone) for ground reference in another embodiment of the disclosure described herein.

4 is a flowchart of an embodiment of the disclosed method described herein.

FIG. 5 is a device vector versus peripheral device vector in the disclosed embodiment described herein.

FIG. 4 is a device vector versus peripheral vector in another embodiment of the disclosure described herein.

1 is a perspective view of a portable data processing system according to one embodiment of the disclosure described herein. FIG.

1 is a perspective view of a portable data processing system according to one embodiment of the disclosure described herein. FIG.

1 is a perspective view of a portable data processing system in a first configuration (eg, in an open configuration) according to one embodiment of the disclosure described herein. FIG.

2 is a perspective view of a portable data processing system in a second configuration (eg, in a closed configuration) according to one embodiment of the disclosure described herein. FIG.

FIG. 2 is a block diagram of a data processing system in which the disclosed embodiments can be implemented.

1 is a schematic side view of a proximity sensor according to one embodiment of the disclosure described herein. FIG.

FIG. 6 is a schematic side view of another proximity sensor according to one embodiment of the disclosure described herein.

FIG. 2 is a block diagram of an example data processing system that can be used with one or more embodiments described herein.

1 is a block diagram illustrating a data processing system with two peripheral devices and a dock or other connector that couples the peripheral devices to the data processing system. FIG.

  Various embodiments and aspects of the disclosure are described with reference to details discussed below, and the accompanying drawings illustrate various embodiments. The following description and drawings are illustrative of the invention and are not to be construed as limiting the invention. Numerous specific details are set forth in order to provide a thorough understanding of various embodiments of the invention. However, in some instances, well-known or conventional details are not described in order to simplify the description of the embodiments of the present disclosure.

  Some portions of the detailed descriptions that follow are presented in terms of algorithms that include operations involving data stored in a computer memory. An algorithm is generally a self-consistent sequence of operations that leads to a desired result. Operations typically require or involve physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, expressions, numbers, or the like.

  It should be noted, however, that all of these and similar expressions will be associated with the appropriate physical quantities and are merely convenient labels attached to these quantities. As will be apparent from the following description, unless otherwise specified, “processing” or “computing” or “calculating” or “determining” is determined throughout the specification. ) "Or" displaying "or similar terms are used to describe data represented as physical (electrical) quantities in a system's registers and memory, in the system's memory or registers or other It is understood that it can refer to the operation and process of a data processing system or similar electronic device that manipulates and converts to other data that is also represented as physical quantities in such information storage, transmission devices or display devices. I want.

  The invention can relate to an apparatus for performing one or more of the operations described herein. The apparatus can be specially configured for the intended purpose or can comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program can include instructions for performing the operations described herein, and includes, but is not limited to, any type including floppy disks, optical disks, CD-ROMs, and magnetic optical disks. Disc or read-only memory (ROM), random access memory (RAM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), magnetic or optical card, or store electronic instructions Any type of medium suitable for doing so can be stored in a machine (eg, computer) readable storage medium, each of which is coupled to a bus.

  A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (eg, a computer). For example, machine-readable media include read only memory (“ROM”), random access memory (“RAM”), magnetic disk storage media, optical storage media, flash memory devices, electrical, optical, acoustic or other forms of propagation. Signals (eg, carrier wave, infrared signal, digital signal, etc.), etc. are included.

  FIG. 1 shows an example of a system 200 that includes a peripheral device 201 that can be referred to as an accessory, and a data processing system 203 that is designed to exchange data with the peripheral device 201. In the example of FIG. 1, the peripheral device 201 can be a wireless headset that communicates with the data processing system 203 via a wireless personal area network (WPAN) interface, such as a Bluetooth interface. It can be a general purpose computer system such as a mobile cellular phone, or a personal digital assistant (PDA) that includes the wireless mobile cellular phone, or a handheld computer that includes a wireless mobile cellular phone. Although specific types of peripheral devices and specific types of data processing systems are illustrated in FIG. 1, it is understood that other types of peripheral devices and data processing systems may be used in other embodiments. It will be. For example, in another embodiment, the peripheral device can be a wired headset, or a wired or wireless keyboard, or a wired or wireless cursor control device, or other wired or wireless input or output device, in other cases. Peripherals can be thought of as data processing devices similar to PDAs or cellular phones or general purpose computer systems. In another embodiment, the data processing system can be a general purpose computer system, or a special purpose computer system, or an entertainment system, or a PDA, or an embedded device in another device, or a media player, etc. Peripheral 201 includes a peripheral processor 205, an audio transducer 213 (which can be a speaker), a microphone 209, and a wireless transceiver 207 coupled to one or more sensors 211. 
Peripheral processor 205 operates by wireless transceiver 207, which can be, for example, a Bluetooth or WiFi transceiver or other type of transceiver used to create a wireless local area network (WLAN) or WPAN, and wireless The operation of peripheral device 201 is controlled by operating microphone 209 and audio transducer 213 in response to signals from transceivers and / or sensors and / or processes executed on peripheral processor 205. Peripheral processor 205 may be coupled to an audio codec (not shown) or other device to drive or receive input from the audio transducer and microphone, respectively. When the peripheral device 201 is a wireless headset for a telephone, the wireless transceiver 207 establishes a wireless communication link with the telephone, functions as a host data processing system, and is played by a speaker (audio transducer 213). The audio data to be transmitted is transmitted, and the audio data is received from the microphone 209. Thus, the wireless headset functions in the same way as a wired headset on a telephone. The sensor 211 may be one or more sensors on the peripheral device 201 that are designed to detect or measure user activity or device context. The sensor 211 can include, for example, proximity sensors and / or ambient light sensors and / or accelerometers and / or other sensors described herein. The sensor 211 provides sensor data (eg, proximity data) to the peripheral processor 205, which processes this data or, as described below, to a data processing system to process the sensor data. Can be sent.

  Data processing system 203 includes a processing system 221, such as a set of one or more microprocessors, that is coupled to a wireless mobile phone transceiver 223, which is at least in part. It may be a wireless mobile cellular telephone transceiver controlled by the processing system 221. In one embodiment, the data processing system 203 may be a handheld PDA or handheld general purpose computer that includes a wireless cellular telephone. In this case, the RF circuit required for the wireless cellular phone can be provided by the wireless mobile phone transceiver 223. Data processing system 203 also includes one or more sensors 227, memory 229, I / O device 231, and at least one additional wireless transceiver 225, each of which is coupled to processing system 221. . The processing system 221 can include a set of one or more microprocessors, which are coupled to the other of the data processing system 203 via one or more buses. One or more sensors 227 can be located on the data processing system 203 and, as further described in US patent application Ser. No. 11 / 638,251, incorporated herein by reference, It can be designed to detect or measure device context. The one or more sensors 227 can include, for example, proximity sensors and / or ambient light sensors and / or accelerometers and / or other sensors described herein. Sensor data from these one or more sensors 227 is provided to a processing system 221 that can process this data as described herein, or the sensor data Can be transmitted to the peripheral device for processing, or both the peripheral device and the processing system 221 may process the sensor data. 
The I / O (input / output) device 231 includes (a) a keyboard, (b) a touch input panel, (c) a cursor control device (for example, a joystick or a trackpad), (d) a speaker, (e) a microphone, (F) includes one or more of buttons (eg, cellular phone “send” and “end” or other buttons), (g) a display device, and (h) other known input / output devices. be able to. In one embodiment, the touch input panel can be integrated with the display device to provide both input and output functions on the same surface of the display device, as further described below. These I / O devices allow a user to enter commands or commands or data into the processing system 221 to operate the system in the manner desired by the user. The memory 229 can be DRAM or flash memory, or any combination of other types of memory including, for example, a magnetic hard drive, and the memory 229 can be a processing system via one or more memory controllers. The memory 229 includes computer program instructions including a computer operating system (OS) and users such as web browser applications, email applications, calendar programs, address book applications, and other possible applications. Application programs can be stored. Memory 229 may also include, for example, address and / or contact information, calendar information (eg, events and tasks), bookmarks / favorites (eg, “URL”) and other user data (eg, word processing documents, spreadsheets, presentations). , Etc.) can be stored. Processing system 221 may retrieve and store computer program instructions and data from memory 229 to allow a user to operate data processing system 203. In addition, the memory 229 can store music and / or other media for playback on the data processing system 203 so that a user can use a speaker (eg, an earphone) of a peripheral device such as the peripheral device 201. Or, music and / or other media can be displayed and selectable for playback on a wireless headset. 
Wireless transceiver 225 may include one or more wireless transceivers that provide wireless connectivity to other devices, such as the peripheral device 201, or to a wireless network (e.g., a WiFi network or other wireless local area network (WLAN) or wireless personal area network (WPAN), etc.). The wireless transceiver 225 is coupled to the processing system 221 to provide data to the data processing system 203. In one embodiment, the wireless transceiver 225 includes a Bluetooth compliant transceiver to wirelessly couple the data processing system 203 to the peripheral device 201 and optionally other peripheral devices (e.g., a wireless keyboard), and a WiFi compliant transceiver (e.g., an IEEE 802.11a/g compliant transceiver) to wirelessly couple the system 203 to a wireless network and/or other devices. The peripheral device 201 and the data processing system 203 can be paired with each other using known techniques, such as the techniques described herein, to create a Bluetooth partnership. Alternatively, pairing can include other techniques for registering one device with another to provide a secure, authenticated communication channel between the peripheral device 201 and the data processing system 203.

  In one embodiment, the peripheral device 201 and the data processing system 203 can work collaboratively to make decisions about user intent or actions or the system context based on sensor data from at least one sensor of the peripheral device 201 and/or the data processing system 203. For example, a set of sensors, such as a proximity sensor and an ambient light sensor on the peripheral device, can provide data indicating that the peripheral device is not in proximity to the user, while another set of sensors on the host can provide data indicating that the host is near the user's ear; in this situation, the peripheral device and the host can exchange sensor data and automatically change the configuration of the peripheral device and/or the host in response to that data. In this embodiment, if the peripheral device is a wireless headset and the host is a wireless cellular phone, the peripheral device can send its sensor data to the host, and the host can process this sensor data together with sensor data from the host to determine various configurations for the host and/or the peripheral device. For example, the appropriate orientation (e.g., landscape or portrait) for the wireless cellular phone can be determined based on the peripheral device's detection that the user is lying down while looking at the wireless cellular phone.

  In some embodiments, the peripheral device 201 can include a peripheral device interface 207 that couples the peripheral device 201 to a device, such as the data processing system 203, and at least one peripheral device sensor 211 for sensing a user of the peripheral device 201. The peripheral device 201 may further include a peripheral processor 205 coupled to the peripheral device interface 207 and the at least one peripheral device sensor 211. The peripheral processor 205 is configured to determine a peripheral device vector that indicates the orientation of the peripheral device 201 relative to a ground reference while the peripheral device is worn by the user. The device can include an interface 225 for coupling the device to the peripheral device 201. The device can further include at least one sensor 227 for sensing a user and a processor 221 coupled to the interface 225 and the at least one sensor 227. The processor 221 is configured to determine a device vector that indicates the orientation of the device relative to the ground reference, to generate an audio signal in response to one or more events (e.g., calendar events, phone calls, alarm events, to-do events, email events, and/or reminder events), and to determine whether the peripheral vector points toward the device vector in response to the audio signal. The peripheral vector points toward the device vector in response to the audio signal if it did not point toward the device vector before the audio signal was generated.

  In at least some embodiments, the processor 221 is further configured to ignore the audio signal if the peripheral vector does not point toward the device vector in response to the audio signal. For example, the peripheral vector does not point toward the device vector in response to the audio signal if the peripheral vector already pointed toward the device vector both before and after the audio signal was generated. In that case, the change in direction of the peripheral device vector did not occur in response to the audio signal.

  In some embodiments, the peripheral interface 207 includes a wireless transceiver that wirelessly couples the device to the peripheral 201. Peripheral device 201 further includes a speaker or audio transducer 213 coupled to peripheral device interface 207 and a microphone 209 coupled to peripheral device interface 207. The wireless transceiver transmits the first audio data from the microphone 209 to the device. The wireless transceiver receives second audio data from the device and transfers the second audio data to the speaker. The device includes a wireless mobile phone transceiver 223.

  In one embodiment, at least one of the peripheral processor 205 and the processor 221 receives data from at least one of the at least one peripheral device sensor 211 and the at least one sensor 227 and, based on that data, determines whether to use the speaker and the microphone 209 for a telephone call transmitted via the wireless mobile phone transceiver 223. The at least one peripheral device sensor 211 includes at least one of (a) a proximity sensor; (b) an ambient light sensor; (c) a temperature sensor; (d) an accelerometer; (e) a position sensor; (f) an orientation sensor; and (g) an audio sensor, and the at least one sensor 227 likewise includes at least one of (a) a proximity sensor; (b) an ambient light sensor; (c) a temperature sensor; (d) an accelerometer; (e) a position sensor; (f) an orientation sensor; and (g) an audio sensor. The peripheral processor 205 and/or the processor 221 can automatically configure the speaker and the microphone in response to outputs from the at least one peripheral device sensor 211 and the at least one sensor 227.
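The configuration decision described above can be sketched as follows. This illustrative Python fragment is not part of the patent; the function name and the boolean inputs (derived, for instance, from proximity and ambient light sensor data) are assumptions made for the example only:

```python
def route_call_audio(headset_worn, phone_at_ear):
    """Sketch of the sensor-driven configuration decision: choose where
    a telephone call's audio should go, based on readings from the
    peripheral (headset) sensors and the device (phone) sensors.

    headset_worn: True when the peripheral's sensors indicate it is
                  in proximity to (worn by) the user.
    phone_at_ear: True when the device's sensors indicate the phone
                  is near the user's ear.
    """
    if headset_worn:
        return "headset"       # use the peripheral's speaker and microphone
    if phone_at_ear:
        return "handset"       # use the phone's earpiece and microphone
    return "speakerphone"      # neither is near the user
```

A real implementation would derive the two booleans from the sensor outputs exchanged between the peripheral processor 205 and the processor 221; the routing labels here merely stand in for that configuration step.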

  At least some embodiments of the present disclosure can include a digital media player, such as a portable music and/or video media player, which can include a media processing system for presenting the media and a storage device for storing the media, and can further include a radio frequency (RF) transceiver (e.g., an RF transceiver for a cellular telephone) coupled to an antenna system and the media processing system. In some embodiments, media stored on a remote storage device can be transmitted to the media player via the RF transceiver. The media can be, for example, one or more of music or other audio, still images, or movies.

  A portable media player can include a media selection device, such as a click wheel input device, a touch screen input device, a push button device, a movable pointing input device, or another input device, as in the iPod® or iPod Nano® media players manufactured by Apple Computer, Inc. of Cupertino, California. The media selection device may be used to select media stored on the storage device and/or a remote storage device. In at least some embodiments, the portable media player can include a display device coupled to the media processing system to display titles or other indicia of media selected via the input device and presented either through a speaker or earphones, or on the display device, or both on the display device and on the speaker or earphones. Examples of portable media players are described in US Patent Application Publication Nos. 2003/0095096 and 2004/0224638, both of which are incorporated herein by reference.

  Embodiments of the disclosure described herein can be part of other types of data processing systems, such as, for example, an entertainment system or a personal digital assistant (PDA), or a general purpose computer system, or a special purpose computer system, or an embedded device within another device, or a cellular phone that does not include a media player, or a device that combines aspects or functions of these devices (e.g., a media player such as an iPod® combined with a PDA, an entertainment system, and a cellular phone in one portable device).

  FIG. 2A is a flowchart illustrating one embodiment of the disclosed method described herein. In at least some embodiments, the method 250 determines the orientation of a device's display based on sensed motion information. For example, a user can browse the Internet using the device; determining the appropriate orientation of the display, such as landscape or portrait, ensures that the content being browsed is displayed according to the aspect ratio of the display. The method 250 includes, at block 252, receiving a motion event from at least one sensor located in the device. For example, an accelerometer sensor can detect movement along the X, Y, and/or Z axes. The method 250 further includes, at block 254, determining the current orientation of the device. The method 250 further includes, at block 256, determining whether the device is currently moving. The accelerometer can provide the previous X, Y, and Z information along with the current X, Y, and/or Z information to a processor, and the processor can compare this information against a threshold to determine whether the device is moving.
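The threshold comparison of block 256 might be sketched as follows. This Python fragment is illustrative only; the patent does not specify threshold values, units, or function names, so all of those are assumptions:

```python
def is_moving(prev, curr, threshold=0.05):
    """Block 256 (sketch): compare the previous and current
    accelerometer samples, each an (x, y, z) tuple (e.g., in g's),
    and treat a change above the threshold on any axis as the
    device currently moving."""
    return any(abs(c - p) > threshold for p, c in zip(prev, curr))
```

For example, `is_moving((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))` reports no movement for a device resting flat, while a sample whose X reading jumps by 0.2 g is reported as moving.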

  In one embodiment, if the device is not currently moving at block 256, then at block 258 the current orientation is compared with the previous orientation or a default orientation (e.g., portrait, landscape, face up, face down, or ambiguous). For example, portrait can be the default orientation. If the current orientation has changed compared with the previous or default orientation, then at block 264 the software application that the device is using receives a message or call to switch to the current orientation of the device.

  In another embodiment, if the orientation of the device has changed at block 258, the method 250 further includes, at block 260, determining whether the device has moved, based on the event at block 252, within a shallow angle with respect to the ground reference (e.g., a shallow angle of approximately 20-30 degrees). If the device is not within the shallow angle, it is presumed that an intentional movement by the user caused a motion event that forms an angle greater than the shallow angle with respect to the ground reference, and so at block 264 the software application receives a message or call to switch the orientation of the device.

  In some embodiments, if it is determined at block 262 that the device has formed a shallow angle with respect to the ground reference for at least a first time period, the software application receives a message or call to switch orientation at block 264, because it is presumed that an intentional movement by the user caused the motion event that holds the device at a shallow angle with respect to the ground reference for at least the first time period. Conversely, a device that spends less time than the first time period within a shallow angle with respect to the ground reference is likely to have been placed in that position accidentally. In this case, the method 250 returns to block 252 to wait for a new motion event.

  Returning to block 256, the method 250 further includes determining, at block 266, whether the currently moving device (block 256) has moved for a second time period. If the movement lasts for at least the second time period, the method 250 proceeds to block 258, because a movement lasting longer than the second time period is presumed likely to be an intentional user action. Otherwise, the method 250 returns to block 252 and waits for a new motion event.

  In at least some embodiments, the method 250 determines the appropriate orientation of the display of the device as described above. The method 250 also prevents accidental switching from one orientation to another. For example, an accidental or unintentional switch could otherwise occur when the user drops the device, slides the device across a table, or runs while carrying the device, because a sensor in the device could detect the resulting motion information and erroneously determine a change in orientation.

  FIG. 2B shows a diagram of a data processing system (e.g., a device such as a wireless mobile cellular phone) with respect to a ground reference in the disclosed embodiments described herein. A data processing system 245 having a virtual axis 249 forms an angle 247 with respect to a ground reference 240 (e.g., ground, floor, table, shelf, or other horizontal plane). In the method 250, a shallow angle 242 is formed between the virtual axis 249 and the ground reference 240. For example, at block 260, the method 250 determines whether the device has moved, based on the event at block 252, within a shallow angle with respect to the ground reference (e.g., a shallow angle of 20-30 degrees). In FIG. 2B, the device 245 forms an angle 247 that exceeds the shallow angle 242. In this embodiment, the orientation is switched because the event that caused the angle 247 is likely to be an intentional action. FIG. 2C, however, shows a potentially accidental operation.

  FIG. 2C shows a diagram of a data processing system (e.g., a wireless mobile cellular phone) relative to a ground reference in another embodiment of the disclosure described herein. A data processing system 280 having a virtual axis 284 forms an angle 282 with respect to a ground reference 270 (e.g., ground, floor, table, shelf, or other horizontal plane). In the method 250, a shallow angle 272 is formed between a phantom line 274 and the ground reference 270. The device 280 forms an angle 282 within the range of the shallow angle 272. In this example, the orientation will switch only if the device has spent a sufficient amount of time (the first time period of block 262) at this angle, so that the event generating the angle 282 can be presumed to be a deliberate action.
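The angle between the device's virtual axis and the ground reference (angle 247 in FIG. 2B, angle 282 in FIG. 2C) can be estimated from a static accelerometer reading. This sketch is not from the patent; it assumes the device's long axis is the accelerometer's Y axis and that the reading is dominated by gravity:

```python
import math

def axis_angle_deg(ax, ay, az):
    """Estimate the angle between the device's long (Y) axis and the
    horizontal ground plane from one accelerometer sample in g's.
    Lying flat, gravity falls entirely on Z and the angle is 0 degrees;
    held upright, gravity falls on Y and the angle is 90 degrees."""
    horizontal = math.hypot(ax, az)       # gravity component off the Y axis
    return math.degrees(math.atan2(abs(ay), horizontal))
```

A reading of `(0.0, 0.0, 1.0)` (device flat on a table) yields 0 degrees, well within a 20-30 degree shallow-angle threshold, whereas `(0.0, 1.0, 0.0)` (device upright) yields 90 degrees and would be treated as an intentional pose under block 260.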

  FIG. 3A is a flowchart of an embodiment of the disclosed method described herein. The method 300 includes, at block 302, determining a device vector associated with a device. The device vector indicates the orientation of the device with respect to a ground reference. The method 300 further includes, at block 304, determining a peripheral vector associated with a peripheral device of the device. The peripheral device vector indicates the orientation of the peripheral device with respect to the ground reference. The method 300 further includes, at block 306, generating an audio signal associated with an event of the device (e.g., a calendar event, phone call, alarm event, to-do event, email event, and/or reminder event). The method 300 further includes, at block 308, determining whether the peripheral vector points toward the device vector in response to the audio signal. The method 300 further includes, at block 310, muting the audio signal if the peripheral vector points toward the device vector in response to the audio signal. For example, the peripheral vector points toward the device vector in response to the audio signal if it did not point toward the device vector before the audio signal was generated but does point toward the device vector during generation of the audio signal.

  The method 300 further includes, at block 312, ignoring the audio signal if the peripheral vector does not point toward the device vector in response to the audio signal. For example, the peripheral vector does not point toward the device vector in response to the audio signal if it already pointed toward the device vector both before and during generation of the audio signal. Likewise, if the peripheral vector points away from the device vector both before and during generation of the audio signal, the audio signal is not muted.
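Whether the peripheral vector "points toward" the device vector can be judged from the angle between the two orientation vectors. The patent does not specify how the comparison is performed, so the cosine threshold and function names in this sketch are assumptions:

```python
def points_toward(peripheral_vec, device_vec, cos_threshold=0.7):
    """True when the angle between the two orientation vectors is small,
    i.e., the peripheral vector points toward the device vector."""
    dot = sum(p * d for p, d in zip(peripheral_vec, device_vec))
    norm = (sum(p * p for p in peripheral_vec) ** 0.5 *
            sum(d * d for d in device_vec) ** 0.5)
    return dot / norm >= cos_threshold

def respond_to_audio_signal(before, during, device_vec):
    """Blocks 308-312 (sketch): mute the audio signal only if the
    peripheral vector turned toward the device vector in response to
    the signal (block 310); otherwise ignore it (block 312)."""
    turned = (not points_toward(before, device_vec) and
              points_toward(during, device_vec))
    return "mute" if turned else "ignore"
```

With the vectors of FIG. 3B before the signal and those of FIG. 3C during it, the function reports `"mute"`; if the headset already faced the device before the signal, the change cannot be attributed to the signal and the result is `"ignore"`.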

  FIG. 3B shows a diagram of device vectors versus peripheral vectors in the disclosed embodiments described herein. The device vector 320 points away from the peripheral device vector 322.

  FIG. 3C shows a diagram of device vectors versus peripheral vectors in another embodiment of the disclosure described herein. Device vector 324 points towards peripheral device vector 326.

  In some embodiments, as described at block 308, the peripheral vector can point toward the device vector in response to the audio signal. For example, FIG. 3B can represent an initial time period in which the device vector and the peripheral vector point away from each other. An audio signal associated with an event on the device is then generated, as described at block 306. In response to the audio signal, the user wearing the peripheral device turns toward the device, as indicated by the vectors in FIG. 3C. The audio signal generated by the device is then muted, as described at block 310.

  In one embodiment, the audio signal described in FIG. 3A represents a voice command. The user can acknowledge the voice command by nodding his or her head up and down, which moves the peripheral vector in a direction perpendicular to the ground reference. Alternatively, the user can reject the voice command by shaking his or her head left and right, which moves the peripheral device vector horizontally with respect to the ground reference.
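The nod-versus-shake distinction can be sketched by comparing the accumulated vertical and horizontal motion of the peripheral vector. The axis convention and names here are assumptions for illustration, not details from the patent:

```python
def classify_head_gesture(displacements):
    """Classify a sequence of peripheral-vector displacement samples
    (dx, dy), where dy is motion perpendicular to the ground reference
    and dx is motion horizontal to it. A mostly vertical swing is a
    nod (acknowledge the voice command); a mostly horizontal swing is
    a shake (reject it)."""
    vertical = sum(abs(dy) for _, dy in displacements)
    horizontal = sum(abs(dx) for dx, _ in displacements)
    return "acknowledge" if vertical > horizontal else "reject"
```

A practical recognizer would also require the swing to repeat and to exceed a minimum amplitude, so that ordinary head movement is not mistaken for a response.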

  In another embodiment, the user browses the Internet using a software application. The combination of the device vector and the peripheral vector can enable the device to recognize that the user is currently viewing content, such as a web page from the Internet. In this example, the device may be configured with a default time period before the device is locked and/or before the device display is dimmed. Based on the device's recognition that the user is currently viewing a web page, the device can alter that time period without requiring the user to change the default setting, which can provide a more satisfying user experience.

  In another embodiment, the user is lying on a horizontal surface (e.g., a couch or the floor) while interacting with and/or looking at the device. The device vector can indicate that the device axis 249 is parallel to the ground reference, which would ordinarily suggest that the device display should be in landscape orientation. However, if the device recognizes, based on the peripheral device vector, that the user is lying on a horizontal surface, it is preferable to keep the display in portrait orientation.

  FIG. 4A illustrates a portable device 50 according to one embodiment of the present invention. The portable device 50 can include a housing 52, a display/input device 54, a speaker 56, a microphone 58, and an optional antenna 60 (which can be visible outside the housing or hidden within the housing). The portable device 50 can also include a proximity sensor 62 and an accelerometer 64 and optionally other sensors (e.g., an ambient light sensor). The portable device 50 can be a cellular phone integrated with a PDA, or a cellular phone integrated with a media player, or a device that is both an entertainment system (e.g., for playing games) and a cellular phone, or the portable device 50 can be another type of device described herein. In one particular embodiment, the portable device 50 can include a cellular phone and a media player and a general purpose computer, all contained within the housing 52. The portable device 50 can be implemented like the embodiment of the data processing system 203 shown in FIG. 1 and can operate with a peripheral device in the manner shown in FIG. 1 and described in this disclosure. The portable device 50 can have a form factor that is small enough to fit in the hand of a normal adult and light enough that an adult can carry it in one hand. It will be understood that the term "portable" means a device that can be easily held in the hand or hands of an adult user; for example, laptop computers and iPods are portable devices.

  In one embodiment, as shown in FIG. 4A, the display/input device 54 occupies a majority of one surface (e.g., the top surface) of the housing 52 of the portable device 50. In one embodiment, the display/input device 54 occupies substantially the entire front surface of the portable device 50. In another embodiment, the display/input device 54 occupies, for example, at least 75% of the front surface of the housing 52 of the portable device 50. In an alternative embodiment, the portable device 50 can include a display that has no input capability, with the display still occupying most of one surface of the portable device 50. In this case, the portable device 50 can include other types of input devices, such as a QWERTY keyboard or another type of keyboard that slides or swings out from a portion of the portable device 50.

  FIG. 4B illustrates a data processing system according to one embodiment of the present invention, which can be implemented like the embodiment of the data processing system 203 shown in FIG. FIG. 4B shows a wireless device in a telephone configuration having a “candy bar” style. In FIG. 4B, the wireless device 30 may include a housing 32, a display device 34, an input device 36, which may be an alphanumeric keypad, a speaker 38, a microphone 40, and an antenna 42. The wireless device 30 can also include a proximity sensor 44 and an accelerometer 46. It will be appreciated that the embodiment of FIG. 4B can use more or fewer sensors and can have a different form factor than that shown in FIG. 4B.

  Display device 34 is shown positioned in the upper portion of housing 32, and input device 36 is shown positioned in the lower portion of housing 32. The antenna 42 is shown extending from the housing 32 at the upper portion of the housing 32. A speaker 38 is also shown in the upper portion of the housing 32 above the display device 34. Microphone 40 is shown in the lower portion of housing 32 below input device 36. It will be appreciated that the speaker 38 and microphone 40 may be positioned anywhere on the housing, but are typically positioned according to the user's ear and mouth, respectively. Proximity sensor 44 is shown at least partially at or near the location of speaker 38 and within housing 32. The accelerometer 46 is shown in the lower portion of the housing 32 and within the housing 32. It will be appreciated that the specific location of the features described above can be varied in alternative embodiments.

  The display device 34 may be, for example, a liquid crystal display (LCD) that does not accept input, or a touch input screen that includes an LCD. The input device 36 may include, for example, buttons, switches, dials, sliders, keys or a keypad, a navigation pad, a touchpad, a touch screen, and the like.

  Any known speaker, microphone and antenna can be used for speaker 38, microphone 40 and antenna 42, respectively.

  The proximity sensor 44 can detect, for example, the position (e.g., the distance from the wireless device 30), direction, and speed of an object relative to the wireless device 30. In at least some embodiments, the position of the object relative to the wireless device can be expressed as a distance. The proximity sensor can generate position and/or movement data that can be used to determine the position of the object relative to the wireless device 30 and/or the proximity sensor 44. An embodiment of a proximity sensor is shown in FIG.

  Further, a processing device (not shown) is coupled to the proximity sensor 44. The processing device can be used to determine the position of an object relative to the wireless device 30 and/or the proximity sensor 44 based on the position and/or movement data provided by the proximity sensor 44. The proximity sensor can monitor the position of the object continuously or periodically. The proximity sensor can also determine the type of object being detected.

  Additional information about proximity sensors can be found in US patent application Ser. No. 11/241,839, entitled "Proximity Detector in Handheld Device," US patent application Ser. No. 11/240,788, entitled "Proximity Detector in Handheld Device," US patent application Ser. No. 11/165,958, entitled "Method and Apparatus for Remote Presence Detection," filed June 23, 2005, and US Pat. No. 6,583,676, entitled "Proximity/Touch Detector and Calibration Circuit," issued June 24, 2003, all of which are incorporated herein by reference.

  According to one embodiment, the accelerometer 46 can detect movement including acceleration or deceleration of the wireless device. The accelerometer 46 can generate movement data for multiple dimensions that can be used to determine the direction of movement of the wireless device. For example, the accelerometer 46 can generate X, Y and Z axis acceleration information when the accelerometer 46 detects that the portable device has moved. In one embodiment, the accelerometer 46 can be implemented as described in US Pat. No. 6,520,013, which is hereby incorporated by reference in its entirety. Alternatively, the accelerometer 46 may be a KGF01 accelerometer provided by Kionix, or an ADXL311 accelerometer provided by Analog Devices, or other accelerometers known in the art.

  In addition, a processing device (not shown) is coupled to the accelerometer 46. The processing device can be used to calculate the movement direction, also called the movement vector of the wireless device 30. The movement vector can be determined according to one or more predetermined equations based on movement data provided by accelerometer 46 (eg, movement in X, Y, and Z). The processing device may be integrated with the accelerometer 46, or may be integrated with other components such as, for example, a portable device microprocessor chipset.
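The "predetermined equations" are not given in the text. As one illustration only, a movement vector could be estimated by integrating successive accelerometer samples; the function name is an assumption, and a real implementation would first subtract gravity and filter sensor noise:

```python
def movement_vector(samples, dt):
    """Estimate a movement (velocity) vector for the device by
    integrating accelerometer samples (ax, ay, az, e.g., in m/s^2)
    taken at a fixed interval dt, in seconds. Crude sketch only:
    gravity compensation and filtering are omitted."""
    vx = vy = vz = 0.0
    for ax, ay, az in samples:
        vx += ax * dt       # accumulate velocity change on each axis
        vy += ay * dt
        vz += az * dt
    return (vx, vy, vz)
```

For instance, two samples of 1 m/s^2 along X taken 0.5 s apart integrate to a movement vector of (1.0, 0.0, 0.0), i.e., motion purely along the X axis.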

  The accelerometer 46 can monitor the movement of the portable device continuously or periodically. As a result, the orientation of the portable device before and after movement can be determined based on movement data provided by an accelerometer attached to the portable device.

  Additional information about accelerometers can be found in co-pending US patent application Ser. No. 10 / 986,730, filed Nov. 12, 2004, which is incorporated herein by reference in its entirety.

  Data obtained from the proximity sensor 44 and the accelerometer 46 can be combined, or used alone, to gather information about the user's activities. Data from the proximity sensor 44, the accelerometer 46, or both can be used, for example, to activate or deactivate the display backlight, to initiate commands, to make selections, to control scrolling or other movement in a display, to change input device settings, or to make other changes to one or more settings of the device. For example, the orientation of the display 34 can change based on one or more sensors of the device 30. As shown in FIG. 1, information from a peripheral device can also change settings of the device 30, such as muting an alarm sound generated by the device 30 when the peripheral device vector changes direction in response to the alarm.

  FIGS. 4C and 4D illustrate a portable device 70 according to one embodiment of the present invention. The portable device 70 can be implemented like the embodiment of the data processing system 203 shown in FIG. 1, and can operate with a peripheral device in the manner shown in FIGS. 3A-3C and further described with respect to FIGS. 3A-3C and this disclosure. The portable device 70 may be a cellular phone that includes a hinge 87 that couples a display housing 89 to a keypad housing 91. The hinge 87 allows the user to open and close the cellular phone between the two different configurations shown in FIGS. 4C and 4D. In one particular embodiment, the hinge 87 can rotatably couple the display housing to the keypad housing. Specifically, the user can open the cellular phone into the open configuration shown in FIG. 4C and close it into the closed configuration shown in FIG. 4D. The keypad housing 91 can include a keypad 95 that receives input from the user (e.g., telephone number input or other alphanumeric input) and a microphone 97 that receives voice input from the user. The display housing 89 can include, on its inner surface, a display 93 (e.g., an LCD), a speaker 98, and a proximity sensor 84, and, on its outer surface, a speaker 96, a temperature sensor 94, a display 88 (e.g., another LCD), an ambient light sensor 92, and a proximity sensor 84A. Thus, in this embodiment, the display housing 89 can include a first proximity sensor on its inner surface and a second proximity sensor on its outer surface. The first proximity sensor can be used to detect a user's head or ear within a certain distance of the first proximity sensor and to automatically change the illumination settings of the displays 93 and 88 in response to this detection (e.g., both displays are turned off or otherwise set to a reduced power state).
The data from the second proximity sensor, along with the data from the ambient light sensor 92 and the data from the temperature sensor 94, can be used to detect that the cellular phone is in the user's pocket.

  In at least some embodiments, the portable device 70 can include components that provide one or more of the functions of a wireless communication device, such as a cellular phone, a media player, an entertainment system, a PDA, or other types of devices described herein. In one implementation of an embodiment, the portable device 70 may be a cellular phone integrated with a media player that plays MP3 files, such as MP3 music files.

  Each of the devices shown in FIGS. 4A, 4B, 4C, and 4D can be a wireless communication device, such as a wireless cellular phone, and can include multiple components that provide wireless communication capability. FIG. 5 illustrates one embodiment of a wireless device 100 that includes wireless communication capability. The wireless device 100 can be included in any one of the devices shown in FIGS. 4A, 4B, 4C, and 4D, although other embodiments of those devices may include more or fewer components than the wireless device 100. Moreover, all or part of the wireless device 100 can be implemented as part of the data processing system 203, and the wireless device 100 can operate with a peripheral device in the manner described in this disclosure.

  The wireless device 100 can include an antenna system 101. The wireless device 100 can also include a digital and/or analog radio frequency (RF) transceiver 102 coupled to the antenna system 101 for transmitting and/or receiving voice, digital data, and/or media signals via the antenna system 101.

  The wireless device 100 may also include a digital processing system 103 to control the digital RF transceiver and manage voice, digital data and / or media signals. The digital processing system 103 may be a general purpose processing device such as a microprocessor or controller. The digital processing system 103 may also be a dedicated processing device such as an ASIC (Application Specific Integrated Circuit), FPGA (Field Programmable Gate Array) or DSP (Digital Signal Processor). The digital processing system 103 can also include other devices as known in the art for interfacing with other components of the wireless device 100. For example, the digital processing system 103 can include analog-to-digital and digital-to-analog converters for interfacing with other components of the wireless device 100. The digital processing system 103 can include a media processing system 109, which can include a general purpose or dedicated processing device for managing media such as files of audio data.

  The wireless device 100 can also include a storage device 104 coupled to the digital processing system to store data and / or operating programs for the wireless device 100. The storage device 104 can be, for example, any type of solid or magnetic memory device.

  The wireless device 100 may also include one or more input devices 105 coupled to the digital processing system 103 to accept user input (e.g., phone numbers, names, addresses, media selections, etc.). The input device 105 can be, for example, one or more of a keypad, touchpad, touch screen, pointing device combined with a display device, or similar input device.

  Wireless device 100 also includes at least one display device 106 coupled to the digital processing system 103 to display information such as messages, telephone call information, contact information, images, movies, and/or titles or other indicia of media selected via the input device 105. The display device 106 can be, for example, an LCD display device. Display device 106 may include a backlight 106a that illuminates display device 106 under certain circumstances. It will be appreciated that the wireless device 100 may include multiple displays.

  Wireless device 100 also includes a battery 107 that supplies operating power to components of the system, including the digital RF transceiver 102, digital processing system 103, storage device 104, input device 105, microphone 105A, audio transducer 108, media processing system 109, sensor 110, and display device 106. The battery 107 can be, for example, a rechargeable or non-rechargeable lithium or nickel metal hydride battery.

  The wireless device 100 can also include an audio transducer 108 that can include one or more speakers and at least one microphone 105A.

  The wireless device 100 can also include one or more sensors 110 coupled to the digital processing system 103. The sensor 110 can include, for example, one or more of a proximity sensor, an accelerometer, a touch input panel, an ambient light sensor, an ambient noise sensor, a temperature sensor, a gyroscope, a hinge detector, a position determination device, an orientation determination device, a motion sensor, an audio sensor, a radio frequency electromagnetic sensor, and other types of sensors, and combinations thereof. One or more of these sensors may also be included on a peripheral device that is configured to operate with (e.g., exchange data with) the data processing system. Based on data acquired by the sensor 110 and by sensors on the peripheral device, the data processing system and/or the peripheral device can automatically perform various responses, such as changing the orientation of the display, muting the audio signal, activating or deactivating the backlight 106a, changing a setting of the input device 105 (for example, switching between processing and not processing input data from the input device as intentional user input), and other responses and combinations thereof.
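The sensor-driven responses described above can be illustrated with a minimal sketch. This code is not part of the patent disclosure; the sensor names, thresholds, and the `Device` class are illustrative assumptions only.

```python
class Device:
    """Illustrative device state driven by sensor events (assumed API)."""

    def __init__(self):
        self.backlight_on = True
        self.audio_muted = False
        self.touch_input_enabled = True

    def handle_sensor_event(self, sensor, value):
        if sensor == "proximity" and value < 2.0:
            # Object (e.g. the user's ear) close to the window: darken the
            # display and stop treating touch data as intentional input.
            self.backlight_on = False
            self.touch_input_enabled = False
        elif sensor == "ambient_light" and value < 5.0:
            # Dark surroundings: activate the backlight.
            self.backlight_on = True
        elif sensor == "accelerometer":
            # value is an (x, y, z) gravity tuple; z < -0.9 approximates
            # a face-down placement, which mutes the audio signal.
            if value[2] < -0.9:
                self.audio_muted = True

device = Device()
device.handle_sensor_event("proximity", 1.5)    # object close above the sensor
assert not device.backlight_on and not device.touch_input_enabled
```

In a real device these responses would be dispatched by the digital processing system 103; the point of the sketch is only that each response is an automatic function of sensor data rather than of explicit user commands.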

  In one embodiment, digital RF transceiver 102, digital processing system 103, and / or storage device 104 may include one or more integrated circuits disposed on a printed circuit board (PCB).

  FIGS. 6 and 7 illustrate exemplary proximity sensors according to embodiments of the present invention. It will be appreciated that in alternative embodiments, other types of proximity sensors, such as capacitive sensors or sonar-like sensors, may be used instead of the proximity sensors shown in FIGS. 6 and 7. In FIG. 6, the proximity sensor 120 includes an emitter 122, a detector 124, and a window 126. The emitter 122 generates light in the infrared (IR) region and can be, for example, a light emitting diode (LED). The detector 124 is configured to detect a change in light intensity and can be, for example, a phototransistor. The window 126 can be formed of a light transmissive or semi-transmissive material. In one embodiment, the window 126 is an acoustic mesh, such as the mesh typically found over a portable device's microphone or speaker. In another embodiment, the window 126 can be MicroPerf, a mesh of IR-transparent strands, or a cold mirror.

  During operation, when the object 128 is above the window 126, light from the emitter 122 strikes the object and scatters. The light from the emitter is emitted in square wave pulses having a known frequency, which allows the detector 124 to distinguish ambient light from light that originated at the emitter 122 and was reflected back to the detector 124 by an object, such as the user's hand or ear or a material in the user's pocket. At least a portion of the scattered light is reflected toward the detector 124. The resulting increase in light intensity is detected by the detector 124 and is interpreted by the processing system (not shown in FIG. 6) to mean that an object is within a short distance of the detector 124. If no object is present, or if the object is beyond a certain distance from the detector 124, an insufficient or smaller amount of emitted light is reflected back to the detector 124, which the processing system (not shown in FIG. 6) interprets to mean that no object is present or that the object is at a relatively large distance. In each case, the proximity sensor measures the intensity of the reflected light, which is related to the distance between the detector 124 and the object that reflects the light.
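The modulation scheme described above — pulsing the emitter at a known frequency so that reflected emitter light can be separated from ambient light — can be sketched as simple synchronous detection: subtract the reading taken during the emitter's "off" half-cycle from the reading taken during its "on" half-cycle. This is an illustrative assumption of one common approach, not the patent's implementation; the function names and threshold are invented for the example.

```python
def proximity_reading(sample_with_emitter_on, sample_with_emitter_off):
    """Reflected-light intensity with the ambient component cancelled.

    Sampling synchronously with the emitter's square-wave pulses and
    subtracting the 'emitter off' half-cycle removes the (roughly constant)
    ambient light, leaving only light that originated at the emitter and
    was reflected back by a nearby object.
    """
    return max(0, sample_with_emitter_on - sample_with_emitter_off)

def object_is_near(on_sample, off_sample, threshold=50):
    # Higher reflected intensity implies a shorter distance to the object.
    return proximity_reading(on_sample, off_sample) >= threshold

# Bright room, nothing near: both samples are dominated by ambient light.
assert not object_is_near(on_sample=210, off_sample=200)
# Object close above the window: the 'on' half-cycle gains reflected light.
assert object_is_near(on_sample=310, off_sample=200)
```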

  In one embodiment, emitter 122 and detector 124 are disposed within a portable device or peripheral housing as described in this disclosure.

  In FIG. 7, the proximity sensor's emitter 122 and detector 124 are angled inward toward each other to improve detection of reflected light, but the proximity sensor of FIG. 7 otherwise operates in the same manner as the proximity sensor of FIG. 6.

  It will be appreciated that at least some of the sensors used in embodiments of the present disclosure can measure or provide data representing analog values. In other words, the data represents a value that can be any one of a set of values that change continuously or nearly continuously, rather than a discrete value reached by discrete jumps from one value to the next. Furthermore, the value represented by the data need not be predetermined. For example, in the case of a distance measured by a proximity sensor, the distance is not predetermined, unlike a key value on a keypad, which represents a predetermined value. A proximity sensor can determine or provide data representing a distance that can vary continuously or nearly continuously in an analog fashion; for such a proximity sensor, the distance can correspond to the intensity of reflected light that originated from the proximity sensor's emitter. A temperature sensor can determine or provide data representing a temperature, which is an analog value. An optical sensor, such as an ambient light sensor, can determine or provide data representing a light intensity, which is an analog value. A motion sensor, such as an accelerometer, can determine or provide data representing a measurement of motion (e.g., velocity and/or acceleration). A gyroscope can determine or provide data representing a measurement of orientation (e.g., an amount of pitch, yaw, or roll). An audio sensor can determine or provide data representing a measurement of acoustic intensity. For other types of sensors, the data determined or provided by the sensor can similarly represent an analog value.
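The contrast drawn above between predetermined discrete values (such as keypad keys) and analog sensor values can be illustrated briefly; the names and value ranges below are assumptions for illustration only.

```python
# A keypad input is drawn from a small, predetermined set of values.
KEYPAD_VALUES = {"0", "1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "#"}

def read_keypad():
    """Discrete input: the result is always one of the predetermined keys."""
    return "5"

def read_ambient_light_lux():
    """Analog input: the result can fall anywhere on a continuum
    (here assumed to be 0 to 100,000 lux)."""
    return 437.2815   # e.g. a reading taken in an office interior

key = read_keypad()
lux = read_ambient_light_lux()
assert key in KEYPAD_VALUES          # value from a fixed, predetermined set
assert 0.0 <= lux < 100000.0         # value anywhere within a continuous range
```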

  FIG. 8 illustrates another example of a device according to one embodiment of the present disclosure. The device 400 can include a processor, such as a microprocessor 402, and a memory 404, which are coupled to each other via a bus 406. The device 400 can optionally include a cache 408 coupled to the microprocessor 402. The device may also optionally include a display controller and display device 410 coupled to the other components via the bus 406. One or more input/output controllers 412 are also coupled to the bus 406 to provide an interface to input/output devices 414 and to provide an interface to one or more sensors 416 for sensing user activity. The bus 406 may include one or more buses connected to each other via various bridges, controllers, and/or adapters, as is known in the art. The input/output device 414 may include a keypad or keyboard, or a cursor control device such as a touch input panel. Further, the input/output device 414 may include at least one network interface for either a wired network or a wireless network (e.g., an RF transceiver such as a WiFi or WPAN RF transceiver). The sensor 416 can be any one of the sensors described herein, including, for example, a proximity sensor or an ambient light sensor. In at least some implementations of the device 400, the microprocessor 402 can receive data from one or more sensors 416 and can perform an analysis of the data in the manner described herein. For example, the data can be analyzed, and the microprocessor 402 can then automatically adjust one or more settings of the device.

  In at least some embodiments, the data processing system 400 includes at least one sensor 416 to detect whether the data processing system 400 has moved within an angle with respect to a ground reference for a first time period. The system 400 further includes a processor 402 coupled to the at least one sensor 416. The processor 402 is configured to respond to data received from the at least one sensor 416 by switching the orientation of the data processing system when the data processing system 400 moves beyond that angle.

  The processor 402 may further be configured, in response to data from the at least one sensor 416, to switch the orientation when the device has moved within the angle for at least the first time period. The processor 402 can be configured, in response to data from the at least one sensor 416, to determine the orientation of the data processing system 400 and to determine whether the data processing system 400 has moved based on whether the current position has changed compared with the last position of the data processing system. The processor 402 can further be configured to determine whether the data processing system 400 has moved within a certain time period and, if the data processing system 400 has not moved, or if the data processing system has moved for at least a second time period, to switch the orientation when the orientation of the data processing system 400 is not the portrait (longitudinal) orientation. The orientations can include portrait (longitudinal), counterclockwise landscape, clockwise landscape, upside down, face up, face down, and ambiguous orientations.
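The angle-and-time-based orientation switching described in the preceding paragraphs can be sketched as follows. This is a simplified illustrative model, not the patented implementation: the axis conventions, the 30-degree window, and the consecutive-sample stand-in for the time periods are all assumptions.

```python
import math

PORTRAIT = "portrait"
LANDSCAPE_CCW = "landscape_ccw"
LANDSCAPE_CW = "landscape_cw"

def choose_orientation(ax, ay, within_angle_deg=30.0):
    """Map 2-D accelerometer components to a display orientation.

    ax/ay are gravity components along the device's x (right) and y (up)
    axes; holding the device upright gives (0, -1). An orientation is
    reported only while the tilt stays within `within_angle_deg` of a
    canonical position relative to the ground reference; otherwise the
    result is ambiguous (None).
    """
    angle = math.degrees(math.atan2(ax, -ay))
    if abs(angle) <= within_angle_deg:
        return PORTRAIT
    if abs(angle + 90.0) <= within_angle_deg:
        return LANDSCAPE_CCW
    if abs(angle - 90.0) <= within_angle_deg:
        return LANDSCAPE_CW
    return None  # ambiguous: caller keeps the current orientation

class OrientationFilter:
    """Switch only after the same candidate persists for `hold_samples`
    consecutive readings -- a stand-in for the 'first time period'."""

    def __init__(self, hold_samples=2):
        self.hold = hold_samples
        self.current = PORTRAIT
        self.candidate = None
        self.count = 0

    def update(self, ax, ay):
        cand = choose_orientation(ax, ay)
        if cand is None or cand == self.current:
            self.candidate, self.count = None, 0   # nothing new to switch to
        elif cand == self.candidate:
            self.count += 1
            if self.count >= self.hold:
                self.current = cand                # held long enough: switch
        else:
            self.candidate, self.count = cand, 1   # new candidate observed
        return self.current

f = OrientationFilter(hold_samples=2)
assert f.update(-1.0, 0.0) == PORTRAIT        # first reading: not yet held
assert f.update(-1.0, 0.0) == LANDSCAPE_CCW   # persisted: orientation switches
```

The hold-count acts as hysteresis: a transient jolt that momentarily crosses the angle does not flip the display, matching the requirement that the device stay within the angle for the first time period before the orientation is switched.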

  FIG. 9 relates to another aspect of the disclosure described herein. In this aspect, the data processing system 203 can be considered a peripheral of another data processing system, such as the data processing system 451, which in at least some embodiments may be a general purpose computer system, such as the system shown in FIG. 8. The system 450 shown in FIG. 9 includes the data processing system 451, which includes a network interface, a peripheral interface, and storage. In at least some embodiments, the data processing system 451 may be a general purpose computer system having a keyboard, a cursor control device, a display, and a network interface for coupling the data processing system to a network 459, which can be the Internet or another network such as a local area network, a telephone network, or a cable TV system network. The network interface can be coupled to the network via either a wired or a wireless coupling, and there can be multiple network interfaces for different networks, or for different ways of coupling to the same network or to multiple networks. The data processing system typically includes a non-volatile mass storage device that can store user programs and an operating system as well as user data, such as address or contact information, calendar information, and URLs such as favorites or bookmarks for browsing the Internet. The peripheral interface of the data processing system 451 is used to couple the data processing system 451 to a dock or other connector for peripherals. The dock or other connector can be coupled to the data processing system 451 in a wired or wireless manner via the peripheral interface. The dock or connector 453 is designed to couple to one or more peripheral devices, such as a first peripheral device 457, which can be a wireless headset, and a second peripheral device 455, which can be a wireless cellular phone that includes PDA functionality.
In one embodiment, the data processing system 203 can be the second peripheral device 455 and the peripheral device 201 can be the first peripheral device 457. The dock can mechanically hold both peripherals, either separately or simultaneously, and can be electrically coupled to both peripherals to provide power to the peripherals, recharge the peripherals' batteries, and exchange data between the peripherals and the data processing system 451. The second peripheral device 455 can include storage for user information, such as contacts, calendars, and URLs, which can be synchronized with similar types of user data on the data processing system 451. A user can place one or both peripheral devices on the dock or connector 453 to cause certain actions described herein to be performed automatically, or can remove one or both peripheral devices to cause certain actions described herein to be performed automatically. The dock and/or the peripheral devices can include mechanical or electrical sensors to detect the placement of a peripheral device in the dock or connector and the removal of a peripheral device from the dock or connector.
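The dock behavior described above — detecting placement or removal of a peripheral and automatically performing actions such as synchronizing contacts, calendars, and URLs — can be sketched as follows; the event names, merge policy, and data fields are illustrative assumptions, not part of the patent disclosure.

```python
class Peripheral:
    """Illustrative peripheral (e.g. a cellular phone with PDA functionality)."""

    def __init__(self, data):
        self.data = data          # user information: contacts, calendars, URLs
        self.charging = False

class Dock:
    """Illustrative dock/connector 453 coupled to data processing system 451."""

    def __init__(self, host_data):
        self.host_data = host_data   # user data stored on the host system

    def on_insert(self, peripheral):
        # Placement detected (mechanical or electrical sensor): start
        # charging and synchronize user data in both directions (a simple
        # union-merge policy is assumed here).
        peripheral.charging = True
        for field in ("contacts", "calendars", "urls"):
            merged = sorted(set(peripheral.data[field]) | set(self.host_data[field]))
            peripheral.data[field] = merged
            self.host_data[field] = merged

    def on_remove(self, peripheral):
        # Removal detected: stop charging.
        peripheral.charging = False

host = {"contacts": ["alice"], "calendars": [], "urls": ["a.example"]}
phone = Peripheral({"contacts": ["bob"], "calendars": [], "urls": []})
dock = Dock(host)
dock.on_insert(phone)
assert phone.charging and phone.data["contacts"] == ["alice", "bob"]
dock.on_remove(phone)
assert not phone.charging
```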

  In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will be apparent that various modifications may be made to the invention without departing from the broad spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.

30 wireless device; 32 housing; 34 display device;
36 input device; 38 speaker; 40 microphone;
42 antenna; 44 proximity sensor; 46 accelerometer;
50 portable device; 52 housing;
54 display/input device; 56 speaker;
58 microphone; 60 antenna; 62 proximity sensor;
64 accelerometer;
70 portable device; 84 proximity sensor; 87 hinge;
88 display; 89 display housing;
91 keypad housing; 94 temperature sensor; 95 keypad;
96 speaker; 97 microphone; 98 speaker.

Claims (9)

  1. Receiving a motion event from at least one sensor positioned within the device;
    Determining the current orientation of the display of the device;
    Determining whether the device is currently moving based on a comparison of previous motion information and current motion information;
    If the device is not moving, determining whether the device has moved within an angle with respect to a ground reference over a first period;
    Switching the orientation of the display if the device has moved within the angle for at least the first period; and
    A program for causing a data processing system to execute a method including:
  2.   The program according to claim 1, wherein the method further includes a step of switching an orientation of the display when the device moves beyond the angle.
  3. The program according to claim 1, wherein the method further includes:
    determining whether the currently moving device has moved in a second period of time;
    determining, if the device is not moving or if the device has moved for at least the second period of time, whether the default orientation of the display matches the current orientation of the display; and
    switching the display orientation if the default orientation of the display does not match the current orientation of the display;
    and wherein the step of determining whether the device is currently moving based on a comparison of previous motion information and current motion information comprises comparing the previous motion information and the current motion information against at least one threshold.
  4. A data processing system,
    At least one sensor for detecting three-dimensional motion data;
    A processor coupled to the at least one sensor;
    With
    The processor, in response to the motion data received from the at least one sensor, determines whether the data processing system is currently moving by comparing previous motion information with current motion information; determines, if the data processing system is not currently moving, whether the data processing system has moved within an angle with respect to a ground reference; and is configured to switch the orientation of the display of the data processing system when it is determined that the data processing system has moved within the angle with respect to the ground reference for at least a first period of time.
  5.   The data processing system of claim 4, wherein the processor is configured to switch the orientation of the display when the data processing system moves beyond the angle.
  6.   The data processing system of claim 4, wherein the processor, in response to the motion data from the at least one sensor, determines a current orientation of the display; determines, in order to determine whether the data processing system has moved, whether the data processing system has moved in a second period of time; determines, if the data processing system has moved for at least the second period of time, whether the default orientation of the display matches the current orientation of the display; and switches the display orientation when the default orientation of the display does not match the current orientation of the display.
  7. Means for receiving a motion event from at least one sensor located within the device;
    Means for determining the current orientation of the display of the device;
    Means for determining whether the device is currently moving by comparing previous motion information with current motion information;
    Means for determining if the device has moved within an angle with respect to a ground reference if the device is not currently moving;
    Means for switching the orientation of the display if it is determined that the device has moved within the angle for at least a first period;
    A device comprising:
  8. Receiving a motion event from at least one sensor positioned within the device;
    Determining the current orientation of the display of the device;
    Determining whether the device is currently moving by comparing previous motion information with current motion information;
    Determining whether the current orientation of the display matches a default orientation of the display based on the device not currently moving;
    Determining whether the device has moved within an angle with respect to a ground reference over a first time period if the current orientation matches the default orientation;
    Switching the orientation of the display when the device moves beyond the angle; and
    A program for causing a data processing system to execute a method including:
  9.   The program according to claim 8, wherein the method further comprises the step of switching the orientation when the current orientation does not match the default orientation.
JP2010511156A 2007-06-08 2008-05-06 Method and system for providing sensory information to devices and peripherals Active JP5385265B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/811,174 2007-06-08
US11/811,174 US8004493B2 (en) 2007-06-08 2007-06-08 Methods and systems for providing sensory information to devices and peripherals
PCT/US2008/005819 WO2008153639A1 (en) 2007-06-08 2008-05-06 Methods and systems for providing sensory information to devices and peripherals

Publications (2)

Publication Number Publication Date
JP2010529552A JP2010529552A (en) 2010-08-26
JP5385265B2 true JP5385265B2 (en) 2014-01-08

Family

ID=39539629

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010511156A Active JP5385265B2 (en) 2007-06-08 2008-05-06 Method and system for providing sensory information to devices and peripherals

Country Status (7)

Country Link
US (3) US8004493B2 (en)
EP (2) EP2156267A1 (en)
JP (1) JP5385265B2 (en)
KR (2) KR101204535B1 (en)
CN (1) CN101681185B (en)
DE (2) DE112008004269A5 (en)
WO (1) WO2008153639A1 (en)

Families Citing this family (227)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645137B2 (en) 2000-03-16 2014-02-04 Apple Inc. Fast, language-independent method for user authentication by voice
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US9390229B1 (en) 2006-04-26 2016-07-12 Dp Technologies, Inc. Method and apparatus for a health phone
US8902154B1 (en) 2006-07-11 2014-12-02 Dp Technologies, Inc. Method and apparatus for utilizing motion user interface
US8620353B1 (en) 2007-01-26 2013-12-31 Dp Technologies, Inc. Automatic sharing and publication of multimedia from a mobile device
US8949070B1 (en) 2007-02-08 2015-02-03 Dp Technologies, Inc. Human activity monitoring device with activity identification
US7929964B2 (en) * 2007-06-08 2011-04-19 Alcatel-Lucent Usa Inc. Managing mobile station Wi-Fi communications
US8004493B2 (en) 2007-06-08 2011-08-23 Apple Inc. Methods and systems for providing sensory information to devices and peripherals
US8555282B1 (en) 2007-07-27 2013-10-08 Dp Technologies, Inc. Optimizing preemptive operating system with motion sensing
US7800044B1 (en) 2007-11-09 2010-09-21 Dp Technologies, Inc. High ambient motion environment detection eliminate accidental activation of a device
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
WO2009109191A1 (en) * 2008-03-06 2009-09-11 Gn Netcom A/S Headset as hub in remote control system
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
US8285344B2 (en) 2008-05-21 2012-10-09 DP Technlogies, Inc. Method and apparatus for adjusting audio for a user environment
US8996332B2 (en) 2008-06-24 2015-03-31 Dp Technologies, Inc. Program setting adjustments based on activity identification
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US8204533B2 (en) * 2008-08-07 2012-06-19 Broadcom Corporation Method and system for bluetooth HID activity prediction for wireless coexistence throughput optimization
US8872646B2 (en) 2008-10-08 2014-10-28 Dp Technologies, Inc. Method and system for waking up a device due to motion
US9959870B2 (en) 2008-12-11 2018-05-01 Apple Inc. Speech recognition involving a mobile device
US20100156939A1 (en) * 2008-12-22 2010-06-24 Research In Motion Limited Portable electronic device and method of controlling same
EP2199885A1 (en) 2008-12-22 2010-06-23 Research In Motion Limited Portable electronic device and method of controlling same
US8030914B2 (en) * 2008-12-29 2011-10-04 Motorola Mobility, Inc. Portable electronic device having self-calibrating proximity sensors
US8275412B2 (en) * 2008-12-31 2012-09-25 Motorola Mobility Llc Portable electronic device having directional proximity sensors based on device orientation
KR101572847B1 (en) * 2009-01-09 2015-11-30 삼성전자주식회사 Method and apparatus for motion detecting in portable terminal
US8355031B2 (en) * 2009-03-17 2013-01-15 Harris Corporation Portable electronic devices with adjustable display orientation
US20100271331A1 (en) * 2009-04-22 2010-10-28 Rachid Alameh Touch-Screen and Method for an Electronic Device
JP5359536B2 (en) * 2009-05-07 2013-12-04 富士通モバイルコミュニケーションズ株式会社 Mobile phone and display direction control program for mobile phone
US8304733B2 (en) * 2009-05-22 2012-11-06 Motorola Mobility Llc Sensing assembly for mobile device
US8269175B2 (en) 2009-05-22 2012-09-18 Motorola Mobility Llc Electronic device with sensing assembly and method for detecting gestures of geometric shapes
US8788676B2 (en) 2009-05-22 2014-07-22 Motorola Mobility Llc Method and system for controlling data transmission to or from a mobile device
US8619029B2 (en) 2009-05-22 2013-12-31 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting consecutive gestures
US8391719B2 (en) * 2009-05-22 2013-03-05 Motorola Mobility Llc Method and system for conducting communication between mobile devices
US8294105B2 (en) * 2009-05-22 2012-10-23 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting offset gestures
US8542186B2 (en) 2009-05-22 2013-09-24 Motorola Mobility Llc Mobile device with user interaction capability and method of operating same
US8344325B2 (en) * 2009-05-22 2013-01-01 Motorola Mobility Llc Electronic device with sensing assembly and method for detecting basic gestures
US9529437B2 (en) 2009-05-26 2016-12-27 Dp Technologies, Inc. Method and apparatus for a motion state aware device
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US9431006B2 (en) 2009-07-02 2016-08-30 Apple Inc. Methods and apparatuses for automatic speech recognition
US8319170B2 (en) * 2009-07-10 2012-11-27 Motorola Mobility Llc Method for adapting a pulse power mode of a proximity sensor
US8817048B2 (en) * 2009-07-17 2014-08-26 Apple Inc. Selective rotation of a user interface
JP4823342B2 (en) * 2009-08-06 2011-11-24 株式会社スクウェア・エニックス Portable computer with touch panel display
TWI433525B (en) * 2009-08-12 2014-04-01 Sure Best Ltd Dect wireless hand free communication apparatus
US8854314B2 (en) * 2009-09-29 2014-10-07 Alcatel Lucent Universal interface device with housing sensor array adapted for detection of distributed touch input
EP2325421B1 (en) * 2009-10-22 2017-01-04 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Motor vehicle ignition key, motor vehicle navigation device, motor vehicle system and method
US8665227B2 (en) * 2009-11-19 2014-03-04 Motorola Mobility Llc Method and apparatus for replicating physical key function with soft keys in an electronic device
US8560309B2 (en) 2009-12-29 2013-10-15 Apple Inc. Remote conferencing center
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US8965441B1 (en) 2010-01-22 2015-02-24 Amazon Technologies, Inc. Reducing wireless interference with transmit power level management
US8989792B1 (en) 2010-01-22 2015-03-24 Amazon Technologies, Inc. Using inertial sensors to trigger transmit power management
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
TWM391242U (en) * 2010-04-30 2010-10-21 Chunghwa Picture Tubes Ltd Wireless human machine interface, cloud computing system and portable computer
US8963845B2 (en) 2010-05-05 2015-02-24 Google Technology Holdings LLC Mobile device with temperature sensing capability and method of operating same
US8452037B2 (en) 2010-05-05 2013-05-28 Apple Inc. Speaker clip
US8751056B2 (en) 2010-05-25 2014-06-10 Motorola Mobility Llc User computer device with temperature sensing capabilities and method of operating same
US9103732B2 (en) 2010-05-25 2015-08-11 Google Technology Holdings LLC User computer device with temperature sensing capabilities and method of operating same
US9107040B2 (en) * 2010-09-29 2015-08-11 Apple Inc. Systems, methods, and computer readable media for sharing awareness information
US8644519B2 (en) 2010-09-30 2014-02-04 Apple Inc. Electronic devices with improved audio
US9098437B2 (en) 2010-10-01 2015-08-04 Z124 Cross-environment communication framework
US9047102B2 (en) 2010-10-01 2015-06-02 Z124 Instant remote rendering
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US8726294B2 (en) 2010-10-01 2014-05-13 Z124 Cross-environment communication using application space API
CN103262057B (en) 2010-10-01 2016-02-10 Z124 Cross-environment communication framework
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US8842080B2 (en) 2010-10-01 2014-09-23 Z124 User interface with screen spanning icon morphing
US20120088452A1 (en) * 2010-10-11 2012-04-12 Gn Netcom A/S Method For Locating A Wirelessly Connected Device
US8761831B2 (en) * 2010-10-15 2014-06-24 Z124 Mirrored remote peripheral interface
US8320970B2 (en) 2011-02-16 2012-11-27 Google Inc. Mobile device display management
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US8811648B2 (en) 2011-03-31 2014-08-19 Apple Inc. Moving magnet audio transducer
US9007871B2 (en) 2011-04-18 2015-04-14 Apple Inc. Passive proximity detection
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US8594611B2 (en) 2011-06-22 2013-11-26 Harris Corporation Intrinsically safe portable radio architecture
US20130028443A1 (en) 2011-07-28 2013-01-31 Apple Inc. Devices with enhanced audio
US8994660B2 (en) 2011-08-29 2015-03-31 Apple Inc. Text correction processing
US8989428B2 (en) 2011-08-31 2015-03-24 Apple Inc. Acoustic systems in electronic devices
US20130057571A1 (en) * 2011-09-02 2013-03-07 Nokia Siemens Networks Oy Display Orientation Control
US20130080932A1 (en) 2011-09-27 2013-03-28 Sanjiv Sirpal Secondary single screen mode activation through user interface toggle
CN103091594B (en) * 2011-11-03 2015-11-25 宏达国际电子股份有限公司 The peripheral device of portable electronic equipment, portable electronic equipment connects method for sensing
US20130120106A1 (en) * 2011-11-16 2013-05-16 Motorola Mobility, Inc. Display device, corresponding systems, and methods therefor
US8879761B2 (en) 2011-11-22 2014-11-04 Apple Inc. Orientation-based audio
US8963885B2 (en) 2011-11-30 2015-02-24 Google Technology Holdings LLC Mobile device for interacting with an active stylus
US9063591B2 (en) 2011-11-30 2015-06-23 Google Technology Holdings LLC Active styluses for interacting with a mobile device
US9020163B2 (en) 2011-12-06 2015-04-28 Apple Inc. Near-field null and beamforming
US8903108B2 (en) 2011-12-06 2014-12-02 Apple Inc. Near-field null and beamforming
WO2013095409A1 (en) 2011-12-21 2013-06-27 Intel Corporation Near field communications-triggering for wireless display/docking
US9354748B2 (en) 2012-02-13 2016-05-31 Microsoft Technology Licensing, Llc Optical stylus interaction
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technoogy Licensing, LLC Flexible hinge spine
US9426905B2 (en) 2012-03-02 2016-08-23 Microsoft Technology Licensing, Llc Connection device for computing devices
US9298236B2 (en) 2012-03-02 2016-03-29 Microsoft Technology Licensing, Llc Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter
US9158383B2 (en) 2012-03-02 2015-10-13 Microsoft Technology Licensing, Llc Force concentrator
US9064654B2 (en) 2012-03-02 2015-06-23 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US8873227B2 (en) 2012-03-02 2014-10-28 Microsoft Corporation Flexible hinge support layer
US9360893B2 (en) 2012-03-02 2016-06-07 Microsoft Technology Licensing, Llc Input device writing surface
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
US8838085B2 (en) 2012-04-26 2014-09-16 Qualcomm Incorporated Use of proximity sensors for interacting with mobile devices
JP2013232804A (en) * 2012-04-27 2013-11-14 Fujitsu Ltd Terminal device, backlight control method, and backlight control program
US20130300590A1 (en) 2012-05-14 2013-11-14 Paul Henry Dietz Audio Feedback
US9280610B2 (en) 2012-05-14 2016-03-08 Apple Inc. Crowd sourcing information to fulfill user requests
US9237601B2 (en) 2012-05-18 2016-01-12 Qualcomm Incorporated Mobile device function selection using position in relation to a user
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US10031556B2 (en) 2012-06-08 2018-07-24 Microsoft Technology Licensing, Llc User experience adaptation
US9019615B2 (en) 2012-06-12 2015-04-28 Microsoft Technology Licensing, Llc Wide field-of-view virtual image projector
US8947353B2 (en) 2012-06-12 2015-02-03 Microsoft Corporation Photosensor array gesture detection
US9256089B2 (en) 2012-06-15 2016-02-09 Microsoft Technology Licensing, Llc Object-detecting backlight unit
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9268424B2 (en) 2012-07-18 2016-02-23 Sony Corporation Mobile client device, operation method, recording medium, and operation system
US8964379B2 (en) 2012-08-20 2015-02-24 Microsoft Corporation Switchable magnetic lock
US9360497B2 (en) 2012-08-29 2016-06-07 Blackberry Limited Controlling sensor use on an electronic device
CN102857567B (en) * 2012-09-07 2015-10-21 中科方德软件有限公司 Data transmission system and method based on Internet of Things sensors
US9576574B2 (en) 2012-09-10 2017-02-21 Apple Inc. Context-sensitive handling of interruptions by intelligent digital assistant
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
US9820033B2 (en) 2012-09-28 2017-11-14 Apple Inc. Speaker assembly
CN103713735B (en) * 2012-09-29 2018-03-16 华为技术有限公司 Method and apparatus for controlling a terminal device using non-contact gestures
US8858271B2 (en) 2012-10-18 2014-10-14 Apple Inc. Speaker interconnect
US9357299B2 (en) 2012-11-16 2016-05-31 Apple Inc. Active protection for acoustic device
WO2014097239A2 (en) * 2012-12-20 2014-06-26 Koninklijke Philips N.V. Automatic pairing of wireless peripherals in a system such as pressure support respiratory therapy system
US8942410B2 (en) 2012-12-31 2015-01-27 Apple Inc. Magnetically biased electromagnet for audio applications
BR112015018905A2 (en) 2013-02-07 2017-07-18 Apple Inc Operation method of voice activation feature, computer readable storage media and electronic device
US9524036B1 (en) * 2013-03-11 2016-12-20 Amazon Technologies, Inc. Motions for displaying additional content
US9282423B2 (en) * 2013-03-13 2016-03-08 Aliphcom Proximity and interface controls of media devices for media presentations
US9368114B2 (en) 2013-03-14 2016-06-14 Apple Inc. Context-sensitive handling of interruptions
US20140267148A1 (en) * 2013-03-14 2014-09-18 Aliphcom Proximity and interface controls of media devices for media presentations
WO2014144579A1 (en) 2013-03-15 2014-09-18 Apple Inc. System and method for updating an adaptive speech recognition model
CN105027197B (en) 2013-03-15 2018-12-14 苹果公司 Training an at least partial voice command system
US9304549B2 (en) 2013-03-28 2016-04-05 Microsoft Technology Licensing, Llc Hinge mechanism for rotatable component attachment
US10078372B2 (en) 2013-05-28 2018-09-18 Blackberry Limited Performing an action associated with a motion based input
US9165533B2 (en) 2013-06-06 2015-10-20 Microsoft Technology Licensing, Llc Display rotation management
EP3005669B1 (en) 2013-06-06 2019-04-17 Dolby Laboratories Licensing Corporation Lighting for audio devices
WO2014197336A1 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
WO2014197334A2 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
WO2014197335A1 (en) 2013-06-08 2014-12-11 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
CN105264524B (en) 2013-06-09 2019-08-02 苹果公司 Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
WO2014200731A1 (en) 2013-06-13 2014-12-18 Apple Inc. System and method for emergency calls initiated by voice command
DE102013214576A1 (en) * 2013-07-25 2015-01-29 Bayerische Motoren Werke Aktiengesellschaft Method and device for determining a position by means of a portable motion sensor
CN104423847A (en) * 2013-09-11 2015-03-18 联想(北京)有限公司 Information processing method, electronic device and external device
KR20150030455A (en) * 2013-09-12 2015-03-20 (주)스피치이노베이션컨설팅그룹 A Portable Device and A Method for Controlling the Same
US20150095667A1 (en) * 2013-09-27 2015-04-02 Gregory A. Nielsen Managing component performance
WO2015056038A1 (en) * 2013-10-16 2015-04-23 Sony Corporation Detecting intentional rotation of a mobile device
KR20150059517A (en) * 2013-11-22 2015-06-01 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
KR101839396B1 (en) * 2013-12-28 2018-03-16 인텔 코포레이션 System and method for device action and configuration based on user context detection from sensors in peripheral devices
US9690340B2 (en) 2014-09-25 2017-06-27 Intel Corporation System and method for adaptive thermal and performance management in electronic devices
US9664540B2 (en) * 2014-01-07 2017-05-30 Samsung Electronics Co., Ltd. Device including optical sensor
JP2014082798A (en) * 2014-02-10 2014-05-08 Kyocera Corp Portable terminal device, program and display control method
US9930251B2 (en) * 2014-02-27 2018-03-27 Huawei Device Co., Ltd. Method for quick photographing by mobile terminal, and mobile terminal
US9451354B2 (en) 2014-05-12 2016-09-20 Apple Inc. Liquid expulsion from an orifice
US9620105B2 (en) 2014-05-15 2017-04-11 Apple Inc. Analyzing audio input for efficient speech and music recognition
US9502031B2 (en) 2014-05-27 2016-11-22 Apple Inc. Method for supporting dynamic grammars in WFST-based ASR
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9734193B2 (en) 2014-05-30 2017-08-15 Apple Inc. Determining domain salience ranking from ambiguous words in natural speech
WO2015184186A1 (en) 2014-05-30 2015-12-03 Apple Inc. Multi-command single utterance input method
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US10289433B2 (en) 2014-05-30 2019-05-14 Apple Inc. Domain specific language for encoding assistant dialog
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
TWI584154B (en) * 2014-07-08 2017-05-21 拓連科技股份有限公司 Angle-based item determination methods and systems, and related computer program products
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US9424048B2 (en) 2014-09-15 2016-08-23 Microsoft Technology Licensing, Llc Inductive peripheral retention device
US9606986B2 (en) 2014-09-29 2017-03-28 Apple Inc. Integrated word N-gram and class M-gram language models
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US9525943B2 (en) 2014-11-24 2016-12-20 Apple Inc. Mechanically actuated panel acoustic system
US9711141B2 (en) 2014-12-09 2017-07-18 Apple Inc. Disambiguating heteronyms in speech synthesis
EP3056825B1 (en) * 2015-02-11 2019-03-27 Danfoss A/S A thermostatic control device with an orientation sensor
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
FR3035718B1 (en) * 2015-04-28 2017-05-26 Centre Nat D'etudes Spatiales (Cnes) Method for controlling a calculation device via a mobile element and a control system using the same
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US9900698B2 (en) 2015-06-30 2018-02-20 Apple Inc. Graphene composite acoustic diaphragm
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US9858948B2 (en) 2015-09-29 2018-01-02 Apple Inc. Electronic equipment with ambient noise sensing input circuitry
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
CN107172115A (en) * 2016-03-08 2017-09-15 阿里巴巴集团控股有限公司 Data message sending and receiving method, client, server, and system
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
DK201670578A1 (en) 2016-06-09 2018-02-26 Apple Inc Intelligent automated assistant in a home environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK179343B1 (en) 2016-06-11 2018-05-14 Apple Inc Intelligent task discovery
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10332518B2 (en) 2017-05-09 2019-06-25 Apple Inc. User interface for correcting recognition errors
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
US10410637B2 (en) 2017-05-12 2019-09-10 Apple Inc. User-specific acoustic models
US10482874B2 (en) 2017-05-15 2019-11-19 Apple Inc. Hierarchical belief states for digital assistants
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
US10403283B1 (en) 2018-06-01 2019-09-03 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US10504518B1 (en) 2018-06-03 2019-12-10 Apple Inc. Accelerated task performance
US10491734B1 (en) * 2019-01-04 2019-11-26 Faraday&Future Inc. User-friendly vehicle bluetooth pairing scheme

Family Cites Families (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4081623A (en) * 1976-11-15 1978-03-28 Bio-Systems Research, Inc. Sight operated telephone and machine controller
JPS6395661A (en) 1986-10-13 1988-04-26 Toshiba Corp Semiconductor element electrode
US6262769B1 (en) * 1997-07-31 2001-07-17 Flashpoint Technology, Inc. Method and system for auto rotating a graphical user interface for managing portrait and landscape images in an image capture unit
US6160540A (en) * 1998-01-12 2000-12-12 Xerox Corporation Zoomorphic computer user interface
US6078825A (en) * 1998-02-20 2000-06-20 Advanced Mobile Solutions, Inc. Modular wireless headset system for hands free talking
JP2000122635A (en) * 1998-10-09 2000-04-28 Victor Co Of Japan Ltd Screen control device
JP3739951B2 (en) 1998-11-25 2006-01-25 東芝電子エンジニアリング株式会社 Semiconductor light emitting device and manufacturing method thereof
US6424410B1 (en) * 1999-08-27 2002-07-23 Maui Innovative Peripherals, Inc. 3D navigation system using complementary head-mounted and stationary infrared beam detection units
US8120625B2 (en) 2000-07-17 2012-02-21 Microsoft Corporation Method and apparatus using multiple sensors in a device with a display
US7688306B2 (en) 2000-10-02 2010-03-30 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US6520013B1 (en) 2000-10-02 2003-02-18 Apple Computer, Inc. Method and apparatus for detecting free fall
IL138831A (en) 2000-10-03 2007-07-24 Rafael Advanced Defense Sys Gaze-actuated information system
JP2002132806A (en) 2000-10-18 2002-05-10 Fujitsu Ltd Server system, and information providing service system and method
US7305256B2 (en) * 2001-02-05 2007-12-04 Verizon Corporate Services Group Inc. Method, apparatus and program for providing user-selected alerting signals in telecommunications devices
US7532901B1 (en) 2001-03-16 2009-05-12 Radeum, Inc. Methods and apparatus to detect location and orientation in an inductive system
AU2002247355A1 (en) 2001-03-16 2002-10-03 Aura Communications, Inc. Techniques for inductive communication systems
US6798429B2 (en) 2001-03-29 2004-09-28 Intel Corporation Intuitive mobile device interface to virtual spaces
US7024228B2 (en) * 2001-04-12 2006-04-04 Nokia Corporation Movement and attitude controlled mobile station control
US6583676B2 (en) 2001-06-20 2003-06-24 Apple Computer, Inc. Proximity/touch detector and calibration circuit
DE10148010A1 (en) 2001-09-28 2003-04-24 Siemens Ag Method for controlling functions on a mobile telephone or on an audio replay device, uses acoustic signals like voice and a sensor on a user's head and on the mobile telephone
US7345671B2 (en) 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs
KR20030048303A (en) * 2001-12-12 2003-06-19 주식회사 하빈 Digital audio player enabling auto-adaptation to the environment
DE10202110A1 (en) * 2002-01-21 2003-04-24 Siemens Ag Method for teaching the use of a data terminal device, e.g. a mobile phone, so that it can be controlled by head gestures, whereby in response to a user's head movement audible information is generated from an apparent direction
US20040192225A1 (en) * 2002-09-04 2004-09-30 Mahn Roger C. Method of branding the speaker of a wireless device
JP2004219791A (en) * 2003-01-16 2004-08-05 Matsushita Electric Ind Co Ltd Mobile display equipment
US7627343B2 (en) 2003-04-25 2009-12-01 Apple Inc. Media player system
US7027840B2 (en) * 2003-09-17 2006-04-11 Motorola, Inc. Method and apparatus of muting an alert
US7085590B2 (en) 2003-12-31 2006-08-01 Sony Ericsson Mobile Communications Ab Mobile terminal with ergonomic imaging functions
JP2005277452A (en) 2004-03-22 2005-10-06 Nec Corp Portable electronic apparatus and its display switching method
US7532197B2 (en) 2004-06-22 2009-05-12 Lenovo (Singapore) Pte Ltd. Method and system for automated monitoring of a display
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
FI20045300A (en) * 2004-08-17 2006-02-18 Nokia Corp Electronic device and procedure for controlling the functions of the electronic device and software product for implementing the procedure
US20060052109A1 (en) 2004-09-07 2006-03-09 Ashman William C Jr Motion-based user input for a wireless communication device
US7382353B2 (en) 2004-11-18 2008-06-03 International Business Machines Corporation Changing a function of a device based on tilt of the device for longer than a time period
US7787012B2 (en) * 2004-12-02 2010-08-31 Science Applications International Corporation System and method for video image registration in a heads up display
JP4679194B2 (en) * 2005-03-23 2011-04-27 Necカシオモバイルコミュニケーションズ株式会社 Image processing apparatus and program thereof
US8331603B2 (en) 2005-06-03 2012-12-11 Nokia Corporation Headset
US7599044B2 (en) 2005-06-23 2009-10-06 Apple Inc. Method and apparatus for remotely detecting presence
US20070004451A1 (en) * 2005-06-30 2007-01-04 C Anderson Eric Controlling functions of a handheld multifunction device
US8045727B2 (en) * 2005-09-30 2011-10-25 Atmel Corporation Headset power management
US8112125B2 (en) 2006-11-10 2012-02-07 At&T Intellectual Property I, Lp Voice activated dialing for wireless headsets
US8006002B2 (en) 2006-12-12 2011-08-23 Apple Inc. Methods and systems for automatic configuration of peripherals
WO2008154408A1 (en) * 2007-06-06 2008-12-18 Tobey Wayland E Modular hybrid snake arm
US8004493B2 (en) 2007-06-08 2011-08-23 Apple Inc. Methods and systems for providing sensory information to devices and peripherals

Also Published As

Publication number Publication date
US8830169B2 (en) 2014-09-09
US20120162066A1 (en) 2012-06-28
CN101681185B (en) 2015-07-08
US8004493B2 (en) 2011-08-23
DE112008004269A5 (en) 2013-08-08
DE112008001600T5 (en) 2010-05-12
KR101312899B1 (en) 2013-09-30
CN101681185A (en) 2010-03-24
JP2010529552A (en) 2010-08-26
US8619050B2 (en) 2013-12-31
KR101204535B1 (en) 2012-11-23
WO2008153639A1 (en) 2008-12-18
EP2237130A1 (en) 2010-10-06
US20080303681A1 (en) 2008-12-11
US20110273475A1 (en) 2011-11-10
KR20120056901A (en) 2012-06-04
KR20100036305A (en) 2010-04-07
EP2156267A1 (en) 2010-02-24

Similar Documents

Publication Publication Date Title
EP2244169B1 (en) Mobile terminal capable of providing multi-haptic effect and method of controlling the mobile terminal
US8935637B2 (en) Mobile terminal and method for operating the mobile terminal
EP2770400B1 (en) Multi-functional hand-held device
US10387020B2 (en) Display device, corresponding systems, and methods therefor
US9098069B2 (en) Display device, corresponding systems, and methods for orienting output on a display
US9225811B2 (en) Mobile terminal and method for controlling the same
KR101685363B1 (en) Mobile terminal and operation method thereof
CN1182693C (en) Portable telephone set
US8565829B2 (en) Mobile terminal with detachably coupled sub-device
KR101549558B1 (en) Mobile terminal and control method thereof
JP6407206B2 (en) Cover attachment with flexible display
CN102232211B (en) Automatic user interface switching method for a handheld terminal device, and handheld terminal device
JP2014139811A (en) Sending parameter based on screen size or screen resolution of multi-panel electronic device to server
KR20110028834A (en) Method and apparatus for providing user interface using touch pressure on touch screen of mobile station
US8503932B2 (en) Portable communication device and remote motion input device
KR20110080348A (en) Mobile terminal, mobile terminal system and operation control method thereof
KR101668240B1 (en) Mobile terminal and operation control method thereof
AU2006218381B2 (en) Multi-functional hand-held device
US8279174B2 (en) Display device and method of controlling the display device
EP2283407B1 (en) Portable computer with multiple display configurations
CN104104769B (en) Dynamic routing of audio among multiple audio devices
US20110143769A1 (en) Dual display mobile communication device
US20080167071A1 (en) User Programmable Switch
US8130200B2 (en) Combination thumb keyboard and mouse
KR101629645B1 (en) Mobile Terminal and Operation method thereof

Legal Events

Date Code Title Description
RD03 Notification of appointment of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7423

Effective date: 20100701

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20100709

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A821

Effective date: 20100709

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120131

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120518

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120817

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130329

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130627

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130906

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20131003

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250