US20130022220A1 - Wearable Computing Device with Indirect Bone-Conduction Speaker - Google Patents
- Publication number
- US20130022220A1 (U.S. application Ser. No. 13/269,935)
- Authority
- US
- United States
- Prior art keywords
- wearer
- side section
- vibration transducer
- support structure
- vibration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/02—Casings; Cabinets ; Supports therefor; Mountings therein
- H04R1/028—Casings; Cabinets ; Supports therefor; Mountings therein associated with devices performing functions other than acoustics, e.g. electric candles
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0176—Head mounted characterised by mechanical features
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
- H04R1/1058—Manufacture or assembly
- H04R1/1066—Constructional aspects of the interconnection between earpiece and earpiece support
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R5/00—Stereophonic arrangements
- H04R5/02—Spatial or constructional arrangements of loudspeakers
- H04R5/023—Spatial or constructional arrangements of loudspeakers in a chair, pillow
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2460/00—Details of hearing devices, i.e. of ear- or headphones covered by H04R1/10 or H04R5/033 but not provided for in any of their subgroups, or of hearing aids covered by H04R25/00 but not provided for in any of its subgroups
- H04R2460/13—Hearing devices using bone conduction transducers
Definitions
- Computing devices such as personal computers, laptop computers, tablet computers, cellular phones, and countless types of Internet-capable devices are increasingly prevalent in numerous aspects of modern life. Over time, the manner in which these devices are providing information to users is becoming more intelligent, more efficient, more intuitive, and/or less obtrusive.
- The trend toward miniaturization of computing hardware, peripherals, as well as of sensors, detectors, and image and audio processors, among other technologies, has helped open up a field sometimes referred to as “wearable computing.”
- In the area of image and visual processing and production in particular, it has become possible to consider wearable displays that place a very small image display element close enough to a wearer's (or user's) eye(s) such that the displayed image fills or nearly fills the field of view, and appears as a normal-sized image, such as might be displayed on a traditional image display device.
- the relevant technology may be referred to as “near-eye displays.”
- Near-eye displays are fundamental components of wearable displays, also sometimes called “head-mounted displays” (HMDs).
- a head-mounted display places a graphic display or displays close to one or both eyes of a wearer.
- a computer processing system may be used to generate the images on a display.
- Such displays may occupy a wearer's entire field of view, or only occupy part of wearer's field of view.
- head-mounted displays may be as small as a pair of glasses or as large as a helmet.
- an exemplary wearable-computing system may include: (a) one or more optical elements; (b) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements; (c) an audio interface configured to receive an audio signal; and (d) at least one vibration transducer located on the at least one side section, wherein the at least one vibration transducer is configured to vibrate at least a portion of the support structure based on the audio signal.
- the vibration transducer is configured such that when the support structure is worn, the vibration transducer vibrates the support structure without directly vibrating a wearer.
- the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.
- an exemplary wearable-computing system may include: (a) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements; (b) a means for receiving an audio signal; and (c) a means for vibrating at least a portion of the support structure based on the audio signal, wherein the means for vibrating is located on the at least one side section.
- the means for vibrating is configured such that when the support structure is worn, the means for vibrating vibrates the support structure without directly vibrating a wearer.
- the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.
- FIG. 1 illustrates a wearable computing system according to an exemplary embodiment.
- FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1 .
- FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system.
- FIG. 4 is a simplified illustration of a head-mounted display configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 5 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 6 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- the disclosure generally involves a wearable computing system with a head-mounted display (HMD), and in particular, an HMD having at least one vibration transducer that functions as a speaker.
- An exemplary HMD may employ vibration transducers that are commonly referred to as bone-conduction transducers.
- standard applications of bone-conduction transducers involve direct transfer of sound to the inner ear by attaching the transducer directly to the bone (or a pad that is adjacent to the bone).
- An exemplary HMD may include a bone-conduction transducer (or another type of vibration transducer) that transfers sound to the wearer's ear via “indirect bone conduction.”
- an exemplary HMD may include a vibration transducer that does not vibrationally couple to the wearer's bone structure (e.g., a vibration transducer that is located so as to avoid substantial contact with the wearer when the HMD is worn). Instead, the vibration transducer is configured to vibrate the frame of the HMD. The HMD frame is in turn vibrationally coupled to the wearer's bone structure. As such, the HMD frame transfers vibration to the wearer's bone structure such that sound is perceived in the wearer's inner ear. In this arrangement, the vibration transducer does not directly vibrate the wearer, and thus may be said to function as an “indirect” bone-conduction speaker.
- the vibration transducer may be placed at a location on the HMD that does not contact the wearer.
- a vibration transducer may be located on a side-arm of the HMD, near where the side-arm connects to the front of the HMD.
- the HMD may be configured such that when worn, there is space (e.g., air) between the portion of the HMD where the vibration transducer is located and the wearer. As such, the portion of the HMD that contacts and vibrationally couples to the wearer may be located away from the vibration transducer.
- the frame may transmit the audio signal through the air as well.
- the airborne audio signal may be heard by the wearer, and may actually enhance the sound perceived via indirect bone conduction. At the same time, this airborne audio signal may be much quieter than airborne audio signals emanating from traditional diaphragm speakers, and thus may provide more privacy to the wearer.
- one or more couplers may be attached to the HMD frame to enhance the fit of the HMD to the wearer and help transfer of vibrations from the frame to the wearer's bone structure.
- a fitting piece which may be moldable and/or made of rubber or silicone gel, for example, may be attached to the HMD frame.
- the fitting piece may be attached to the HMD frame in various ways. For instance, a fitting piece may be located behind the wearer's temple and directly above their ear, or in the pit behind the wearer's ear lobe, among other locations.
- an exemplary system may be implemented in or may take the form of a wearable computer (i.e., a wearable-computing device).
- a wearable computer takes the form of or includes an HMD.
- an exemplary system may also be implemented in or take the form of other devices, such as a mobile phone, among others.
- an exemplary system may take the form of a non-transitory computer readable medium, which has program instructions stored thereon that are executable by a processor to provide the functionality described herein.
- An exemplary system may also take the form of a device such as a wearable computer or mobile phone, or a subsystem of such a device, which includes such a non-transitory computer readable medium having such program instructions stored thereon.
- an HMD may generally be any display device that is worn on the head and places a display in front of one or both eyes of the wearer.
- An HMD may take various forms such as a helmet or eyeglasses.
- references to “eyeglasses” herein should be understood to refer to an HMD that generally takes the form of eyeglasses. Further, features and functions described in reference to “eyeglasses” herein may apply equally to any other kind of HMD.
- FIG. 1 illustrates a wearable computing system according to an exemplary embodiment.
- the wearable computing system is shown in the form of eyeglasses 102 .
- eyeglasses 102 include a support structure that is configured to support the one or more optical elements.
- the support structure of an exemplary HMD may include a front section and at least one side section.
- the support structure has a front section that includes lens-frames 104 and 106 and a center frame support 108 .
- side-arms 114 and 116 serve as a first and a second side section of the support structure for eyeglasses 102 .
- the front section and the at least one side section may vary in form, depending upon the implementation.
- the support structure of an exemplary HMD may also be referred to as the “frame” of the HMD.
- the support structure of eyeglasses 102 which includes lens-frames 104 and 106 , center frame support 108 , and side-arms 114 and 116 , may also be referred to as the “frame” of eyeglasses 102 .
- the frame of the eyeglasses 102 may function to secure eyeglasses 102 to a user's face via a user's nose and ears. More specifically, the side-arms 114 and 116 are each projections that extend away from the frame elements 104 and 106 , respectively, and are positioned behind a user's ears to secure the eyeglasses 102 to the user. The side-arms 114 and 116 may further secure the eyeglasses 102 to the user by extending around a rear portion of the user's head. Additionally or alternatively, for example, the system 100 may connect to or be affixed within a head-mounted helmet structure. Other possibilities exist as well.
- each of the frame elements 104 , 106 , and 108 and the side-arms 114 and 116 may be formed of a solid structure of plastic or metal, or may be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the eyeglasses 102 .
- Other materials or combinations of materials are also possible.
- the size, shape, and structure of eyeglasses 102 , and the components thereof, may vary depending upon the implementation.
- each of the optical elements 110 and 112 may be formed of any material that can suitably display a projected image or graphic.
- Each of the optical elements 110 and 112 may also be sufficiently transparent to allow a user to see through the optical element. Combining these features of the optical elements can facilitate an augmented reality or heads-up display where the projected image or graphic is superimposed over a real-world view as perceived by the user through the optical elements.
- the system 100 may also include an on-board computing system 118 , a video camera 120 , a sensor 122 , and finger-operable touchpads 124 , 126 .
- the on-board computing system 118 is shown to be positioned on the side-arm 114 of the eyeglasses 102 ; however, the on-board computing system 118 may be provided on other parts of the eyeglasses 102 .
- the on-board computing system 118 may include a processor and memory, for example.
- the on-board computing system 118 may be configured to receive and analyze data from the video camera 120 and the finger-operable touchpads 124 , 126 (and possibly from other sensory devices, user interfaces, or both) and generate images for output from the optical elements 110 and 112 .
- the video camera 120 is shown to be positioned on the side-arm 114 of the eyeglasses 102 ; however, the video camera 120 may be provided on other parts of the eyeglasses 102 .
- the video camera 120 may be configured to capture images at various resolutions or at different frame rates. Many video cameras with a small form-factor, such as those used in cell phones or webcams, for example, may be incorporated into an example of the system 100 .
- While FIG. 1 illustrates one video camera 120 , more video cameras may be used, and each may be configured to capture the same view, or to capture different views.
- the video camera 120 may be forward facing to capture at least a portion of the real-world view perceived by the user. This forward facing image captured by the video camera 120 may then be used to generate an augmented reality where computer generated images appear to interact with the real-world view perceived by the user.
- the sensor 122 is shown mounted on the side-arm 116 of the eyeglasses 102 ; however, the sensor 122 may be provided on other parts of the eyeglasses 102 .
- the sensor 122 may include one or more of a gyroscope or an accelerometer, for example. Other sensing devices may be included within the sensor 122 or other sensing functions may be performed by the sensor 122 .
- sensors such as sensor 122 may be configured to detect head movement by a wearer of eyeglasses 102 .
- a gyroscope and/or accelerometer may be arranged to detect head movements, and may be configured to output head-movement data. This head-movement data may then be used to carry out functions of an exemplary method, for instance.
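As a rough illustration of how head-movement data from such a sensor might be consumed, the sketch below thresholds gyroscope samples to flag a head movement. The function name, sample format, and threshold are assumptions for illustration, not details from the patent.

```python
# Hypothetical sketch: detect head movement from gyroscope samples.
# Each sample is a (x, y, z) tuple of angular rates in rad/s; the
# 0.5 rad/s threshold is an illustrative assumption.

def detect_head_movement(angular_rates, threshold=0.5):
    """Return True if any per-axis angular rate exceeds the threshold."""
    return any(abs(axis) > threshold
               for sample in angular_rates
               for axis in sample)

# Small sensor drift stays below the threshold; a head turn exceeds it.
still = [(0.01, -0.02, 0.0), (0.0, 0.01, 0.01)]
turning = [(0.01, 0.0, 0.0), (1.2, 0.1, 0.0)]
```

A real implementation would likely filter out gravity and sensor bias first; this only shows the shape of the data flow from sensor output to a head-movement decision.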
- the finger-operable touchpads 124 , 126 are shown mounted on the side-arms 114 , 116 of the eyeglasses 102 . Each of finger-operable touchpads 124 , 126 may be used by a user to input commands.
- the finger-operable touchpads 124 , 126 may sense at least one of a position and a movement of a finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities.
- the finger-operable touchpads 124 , 126 may be capable of sensing finger movement in a direction parallel or planar to the pad surface, in a direction normal to the pad surface, or both, and may also be capable of sensing a level of pressure applied.
- the finger-operable touchpads 124 , 126 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Edges of the finger-operable touchpads 124 , 126 may be formed to have a raised, indented, or roughened surface, so as to provide tactile feedback to a user when the user's finger reaches the edge of the finger-operable touchpads 124 , 126 . Each of the finger-operable touchpads 124 , 126 may be operated independently, and may provide a different function.
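To make the touchpad-to-command path concrete, here is a minimal sketch that classifies a finger trace into a swipe command. The coordinate convention, distance threshold, and command names are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch: turn a finger trace on a side-arm touchpad
# into a swipe command. A trace is a list of (x, y) positions sampled
# over time; x increases toward the front of the side-arm (assumed).

def classify_swipe(trace, min_dist=10.0):
    """Return 'swipe-forward', 'swipe-back', or None for short traces."""
    if len(trace) < 2:
        return None
    dx = trace[-1][0] - trace[0][0]   # net horizontal finger travel
    if dx >= min_dist:
        return "swipe-forward"
    if dx <= -min_dist:
        return "swipe-back"
    return None                        # movement too small to be a swipe
```

A fuller gesture recognizer would also use the vertical axis, pressure, and timing that the text describes, but the mapping from sensed positions to commands follows this pattern.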
- FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1 .
- the optical elements 110 and 112 may act as display elements.
- the eyeglasses 102 may include a first projector 128 coupled to an inside surface of the side-arm 116 and configured to project a display 130 onto an inside surface of the optical element 112 .
- a second projector 132 may be coupled to an inside surface of the side-arm 114 and configured to project a display 134 onto an inside surface of the optical element 110 .
- the optical elements 110 and 112 may act as a combiner in a light projection system and may include a coating that reflects the light projected onto them from the projectors 128 and 132 . In some embodiments, a special coating may not be used (e.g., when the projectors 128 and 132 are scanning laser devices).
- the optical elements 110 , 112 themselves may include: a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display, one or more waveguides for delivering an image to the user's eyes, or other optical elements capable of delivering an in focus near-to-eye image to the user.
- a corresponding display driver may be disposed within the frame elements 104 and 106 for driving such a matrix display.
- a laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user's eyes. Other possibilities exist as well.
- While FIGS. 1 and 2 show two touchpads and two display elements, it should be understood that many exemplary methods and systems may be implemented in wearable computing devices with only one touchpad and/or with only one optical element having a display element. It is also possible that exemplary methods and systems may be implemented in wearable computing devices with more than two touchpads.
- FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system.
- a computing device 138 communicates using a communication link 140 (e.g., a wired or wireless connection) to a remote device 142 .
- the computing device 138 may be any type of device that can receive data and display information corresponding to or associated with the data.
- the computing device 138 may be a heads-up display system, such as the eyeglasses 102 described with reference to FIGS. 1 and 2 .
- the computing device 138 may include a display system 144 comprising a processor 146 and a display 148 .
- the display 148 may be, for example, an optical see-through display, an optical see-around display, or a video see-through display.
- the processor 146 may receive data from the remote device 142 , and configure the data for display on the display 148 .
- the processor 146 may be any type of processor, such as a micro-processor or a digital signal processor, for example.
- the computing device 138 may further include on-board data storage, such as memory 150 coupled to the processor 146 .
- the memory 150 may store software that can be accessed and executed by the processor 146 , for example.
- the remote device 142 may be any type of computing device or transmitter including a laptop computer, a mobile telephone, etc., that is configured to transmit data to the device 138 .
- the remote device 142 and the device 138 may contain hardware to enable the communication link 140 , such as processors, transmitters, receivers, antennas, etc.
- the communication link 140 is illustrated as a wireless connection; however, wired connections may also be used.
- the communication link 140 may be a wired link via a serial bus such as a universal serial bus or a parallel bus.
- a wired connection may be a proprietary connection as well.
- the communication link 140 may also be a wireless connection using, e.g., Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), Cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), or Zigbee® technology, among other possibilities.
- the remote device 142 may be accessible via the Internet and may comprise a computing cluster associated with a particular web service (e.g., social-networking, photo sharing, address book, etc.).
- FIG. 4 is a simplified illustration of an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- HMD 400 includes two optical elements 402 , 404 .
- the frame 406 of HMD 400 includes two side arms 408 -L and 408 -R, two lens-frames 409 -L and 409 -R, and a nose bridge 407 .
- the side arms 408 -L and 408 -R are arranged to fit behind a wearer's ears and, together with the nose bridge 407 , hold the optical elements 402 and 404 in front of the wearer's eyes via attachments to the lens-frames 409 -L and 409 -R.
- HMD 400 may include various audio sources, from which audio signals may be acquired.
- HMD 400 includes a microphone 410 .
- HMD 400 may additionally or alternatively include an internal audio playback device.
- an on-board computing system (not shown) may be configured to play digital audio files.
- HMD 400 may be configured to acquire an audio signal from an auxiliary audio playback device 412 , such as a portable digital audio player, smartphone, home stereo, car stereo, and/or personal computer.
- Other audio sources are also possible.
- An exemplary HMD may also include one or more audio interfaces for receiving audio signals from various audio sources, such as those described above.
- HMD 400 may include an interface for receiving an audio signal from microphone 410 .
- HMD 400 may include an interface 411 for receiving an audio signal from auxiliary audio playback device 412 (e.g., an “aux in” input).
- the interface to the auxiliary audio playback device 412 may be a tip, ring, sleeve (TRS) connector, or may take another form.
- HMD 400 may additionally or alternatively include an interface to an internal audio playback device.
- an on-board computing system (not shown) may be configured to process digital audio files and output audio signals to a speaker or speakers. Other audio interfaces are also possible.
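The multiple audio interfaces above (microphone, aux-in, internal playback) suggest a simple routing layer that pulls samples from whichever source is selected. The sketch below is a hypothetical illustration of that idea; all class, method, and source names are assumptions, not from the patent.

```python
# Hypothetical routing layer: register several audio sources and pull
# sample frames from the currently selected one for the transducer.

class AudioRouter:
    def __init__(self):
        self.sources = {}   # source name -> callable(n_frames) -> samples
        self.active = None

    def register(self, name, read_fn):
        self.sources[name] = read_fn

    def select(self, name):
        if name not in self.sources:
            raise KeyError(f"unknown audio source: {name}")
        self.active = name

    def read(self, n_frames):
        """Pull n_frames samples from the active source."""
        return self.sources[self.active](n_frames)

router = AudioRouter()
# Stand-in sources: constant-valued frames instead of real audio.
router.register("aux_in", lambda n: [0.1] * n)
router.register("microphone", lambda n: [0.0] * n)
router.select("aux_in")
```

In a real device the callables would wrap hardware buffers (e.g., a TRS aux input or a decoded audio file), but the selection logic would look much the same.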
- HMD 400 also includes a vibration transducer 414 located on side arm 408 -R, which functions as an indirect bone-conduction speaker.
- In an exemplary embodiment, vibration transducer 414 may take the form of a bone-conduction transducer (BCT).
- Any component that is arranged to vibrate the HMD 400 may be incorporated as a vibration transducer, without departing from the scope of the invention.
- Vibration transducer 414 is configured to vibrate at least a portion of the eyeglass frame 406 based on an audio signal received via one of the audio interfaces.
- the side arm 408 -R is configured such that when a user wears HMD 400 , an inner wall of a first portion of side arm 408 -R contacts the wearer so as to vibrationally couple to a bone structure of the wearer.
- side arm 408 -R may contact the wearer at or near where the side-arm is placed between the wearer's ear and the side of the wearer's head, such as at the wearer's mastoid. Other points of contact are also possible.
- Eyeglass frame 406 may be arranged such that when a user wears HMD 400 , the eyeglass frame contacts the wearer. As such, when the vibration transducer 414 vibrates the eyeglass frame 406 , the eyeglass frame can transfer vibration to the bone structure of the wearer. In particular, vibration of the eyeglass frame 406 can be transferred at areas where the eyeglass frame contacts the wearer directly. For instance, the eyeglass frame 406 may transfer vibration, via contact points on the wearer's ear, the wearer's nose, the wearer's temple, the wearer's eyebrow, or any other point where the eyeglass frame 406 directly contacts the wearer.
- vibration transducer 414 is located on a second portion of the side-arm 408 -R, away from the first portion of the side-arm 408 -R that vibrationally couples to the wearer. Further, in an exemplary embodiment, vibration transducer 414 vibrates the support structure without directly vibrating the wearer. To achieve this result, the second portion of the side-arm 408 -R, at which vibration transducer 414 is located, may have an inner wall that does not contact the wearer.
- This configuration may leave a space between the second portion of the side-arm 408 -R and the wearer, such that the vibration transducer indirectly vibrates the wearer by transferring vibration from the second portion to the first portion of the side-arm 408 -R, which in turn is vibrationally coupled to the wearer.
- the spacing of the vibration transducer may be accomplished by housing the transducer in or attaching the transducer to the side-arm in various ways.
- the vibration transducer 414 is attached to and protruding from the exterior wall of side arm 408 -R.
- a vibration transducer may be attached to an inner wall of the side arm (while still configured to leave space between the vibration transducer and the wearer).
- a vibration transducer may be enclosed within the side arm.
- a vibration transducer may be partially or wholly embedded in an exterior or interior wall of a side arm.
- vibration transducers may be arranged in other locations on or within side-arm 408 -L and/or 408 -R. Additionally, vibration transducers may be arranged on or within other parts of the frame, such as the nose bridge 407 and/or lens-frames 409 -L and 409 -R.
- the location of the vibration transducer may enhance the vibration of the side-arm 408 -R.
- side-arm 408 -R may contact and be held in place by the lens-frame 409 -R on one end, and may contact and be held in place by the wearer's ear on the other end (e.g., at the wearer's mastoid), allowing the portion of the side-arm between these points of contact to vibrate more freely. Therefore, placing vibration transducer 414 between these points of contact may help the transducer vibrate the side-arm 408 -R more efficiently. This in turn may result in more efficient transfer of vibration from the eyeglass frame to the wearer's bone structure.
- vibrations from the vibration transducer may also be transmitted through the air, and thus may be heard by the wearer over the air.
- the user may perceive the sound from vibration transducer 414 using both tympanic hearing and bone-conduction hearing.
- the sound that is transmitted through the air and perceived using tympanic hearing may complement the sound perceived via bone-conduction hearing.
- While the sound transmitted through the air may enhance the sound perceived by the wearer, it may be unintelligible to others nearby.
- the sound transmitted through the air by the vibration transducer may be inaudible to others nearby (possibly depending upon the volume level).
- FIG. 5 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 5 shows an HMD 500 , in which a vibration transducer 514 is arranged on an outer wall of side-arm 508 -R.
- a portion of side-arm 508 -R on which vibration transducer 514 is located is proximate to the pit behind the ear lobe of the wearer.
- When located as such, a gap may exist between the wearer and the portion of side-arm 508 -R to which vibration transducer 514 is attached. As a result, driving vibration transducer 514 with an audio signal vibrates side-arm 508 -R. While the portion of side-arm 508 -R to which vibration transducer 514 is attached does not contact the wearer, side-arm 508 -R may contact the wearer elsewhere. In particular, side-arm 508 -R may contact the wearer in between the ear and the head, such as at location 516 , for example. Accordingly, vibrations of side-arm 508 -R may be transferred to a wearer's bone structure at location 516 . Therefore, by vibrating side-arm 508 -R, which in turn vibrates the wearer's bone structure, vibration transducer 514 may serve as an indirect bone-conduction speaker.
- FIG. 6 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 6 shows an HMD 600 , which includes two vibration transducers 614 and 615 .
- vibration transducers 614 and 615 are arranged on the side arms 608 -L and 608 -R, respectively.
- vibration transducers 614 and 615 are typically located on portions of side-arm 608 -L and 608 -R, respectively, which are proximate to a wearer's left and right temple, respectively.
- vibration transducer 614 may be located on the outer wall of side-arm 608 -L, in between the wearer's left eye and left ear.
- vibration transducer 615 may be located on the outer wall of side-arm 608 -R, in between the wearer's right eye and right ear.
- an exemplary HMD may include one or more couplers arranged on the HMD. These couplers may help to enhance the fit of the HMD frame to the wearer, such that the HMD fits in a more secure and/or a more comfortable manner. Further, in an exemplary embodiment, these couplers may help to more efficiently transfer vibration of the HMD frame to the bone structure of the wearer.
- HMD 600 includes two fitting pieces 616 -L and 616 -R.
- fitting piece 616 -L is located on an inner wall of side-arm 608 -L and extends from a portion of the inner wall that is proximate to the pit behind the left ear lobe of the wearer. Configured as such, fitting piece 616 -L may serve to fill any gap between the wearer's body and the side-arm 608 -L.
- fitting piece 616 -R may be configured in a similar manner as fitting piece 616 -L, but with respect to the right side of the wearer's body.
- the fitting pieces or any type of couplers may be attached to, embedded in, and/or enclosed in the HMD frame at various locations.
- fitting pieces may be located in various locations so as to fill space between an HMD frame and wearer's body, and thus help the HMD frame vibrate the wearer's bone structure.
- a fitting piece may be configured to contact a wearer's ear, temple, eyebrow, nose (e.g., at the bridge of the nose), or neck (e.g., at the pit behind the ear lobe), among other locations.
- an exemplary embodiment may include only one coupler, or may include two or more couplers. Further, it should be understood that an exemplary embodiment need not include any couplers.
- an HMD may be configured with multiple vibration transducers, which may be individually customizable. For instance, as the fit of an HMD may vary from user-to-user, the volume of speakers may be adjusted individually to better suit a particular user.
- an HMD frame may contact different users in different locations, such that a behind-ear vibration transducer (e.g., vibration transducer 514 of FIG. 5 ) may provide more-efficient indirect bone conduction for a first user, while a vibration transducer located near the temple (e.g., vibration transducer 414 of FIG. 4 ) may provide more-efficient indirect bone conduction for a second user.
- an HMD may be configured with one or more behind-ear vibration transducers and one or more vibration transducers near the temple, which are individually adjustable.
- the first user may choose to lower the volume or turn off the vibration transducers near the temple, while the second user may choose to lower the volume or turn off the behind-ear vibration transducers.
- Other examples are also possible.
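The per-user adjustment described above can be sketched in code. This is a minimal, hypothetical model (the channel names, `FitProfile` class, and gain range are illustrative assumptions, not from the patent) showing how a wearer might turn individual transducers down or off to suit their fit:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of per-transducer volume settings. A wearer whose
# frame contacts the head most efficiently behind the ear might disable
# the temple transducers, and vice versa.

@dataclass
class TransducerChannel:
    name: str
    gain: float = 1.0     # linear gain; 0.0 effectively turns the transducer off
    enabled: bool = True

@dataclass
class FitProfile:
    channels: dict = field(default_factory=dict)

    def add(self, name: str) -> None:
        self.channels[name] = TransducerChannel(name)

    def set_gain(self, name: str, gain: float) -> None:
        # Clamp to [0, 1]; a gain of zero disables the channel.
        ch = self.channels[name]
        ch.gain = max(0.0, min(1.0, gain))
        ch.enabled = ch.gain > 0.0

# A user for whom behind-ear conduction is efficient mutes the temple channels:
profile = FitProfile()
for name in ("behind_ear_left", "behind_ear_right", "temple_left", "temple_right"):
    profile.add(name)
profile.set_gain("temple_left", 0.0)
profile.set_gain("temple_right", 0.0)
```

The second user in the example would instead lower or zero the `behind_ear_*` gains while keeping the temple channels active.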
- different vibration transducers may be driven by different audio signals.
- a first vibration transducer may be configured to vibrate a left side-arm of an HMD based on a first audio signal
- a second vibration transducer may be configured to vibrate a second portion of the support structure based on a second audio signal.
- the above configuration may be used to deliver stereo sound.
- two individual vibration transducers (or possibly two groups of vibration transducers) may be driven by separate left and right audio signals.
- vibration transducer 614-L may vibrate side-arm 608-L based on a “left” audio signal
- vibration transducer 614-R may vibrate side-arm 608-R based on a “right” audio signal.
- Other examples of vibration transducers configured for stereo sound are also possible.
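The stereo arrangement above can be sketched as splitting an interleaved stereo signal into separate per-transducer streams. The frame format and function names below are illustrative assumptions; only the transducer labels 614-L and 614-R come from the figure:

```python
# Sketch: two transducers driven by separate left/right audio signals.

def demux_stereo(frames):
    """Split interleaved (left, right) sample pairs into two signals."""
    left = [l for l, _ in frames]
    right = [r for _, r in frames]
    return left, right

def drive(transducer_signals, frames):
    """Route each demultiplexed channel to its transducer, e.g.
    614-L on side-arm 608-L and 614-R on side-arm 608-R."""
    left, right = demux_stereo(frames)
    transducer_signals["614-L"] = left
    transducer_signals["614-R"] = right
    return transducer_signals

signals = drive({}, [(0.1, -0.1), (0.2, -0.2)])
```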
- an HMD may include more than two vibration transducers (or possibly more than two groups of vibration transducers), which each are driven by a different audio signal.
- multiple vibration transducers may be individually driven by different audio signals in order to provide a surround sound experience.
- vibration transducers may be configured for different purposes, and thus driven by different audio signals.
- one or more vibration transducers may be configured to deliver music, while another vibration transducer may be configured for voice (e.g., for phone calls, speech-based system messages, etc.).
- Other examples are also possible.
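Purpose-based routing like this amounts to mapping content types to transducer groups and mixing whatever is routed to each transducer. A minimal sketch, assuming hypothetical group names and equal-length sample buffers (neither is specified in the patent):

```python
# Hypothetical routing table: music to the behind-ear transducers,
# voice (calls, system messages) to the temple transducers.
ROUTES = {
    "music": ("behind_ear_left", "behind_ear_right"),
    "voice": ("temple_left", "temple_right"),
}

def mix_for_transducers(streams, routes=ROUTES):
    """streams: list of (content_type, samples) pairs, all the same length.
    Returns a dict mapping each transducer to the sum of samples routed to it."""
    out = {}
    for content_type, samples in streams:
        for transducer in routes[content_type]:
            mixed = out.setdefault(transducer, [0.0] * len(samples))
            for i, s in enumerate(samples):
                mixed[i] += s
    return out

out = mix_for_transducers([("music", [0.5, 0.5]), ("voice", [0.1, 0.1])])
```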
- an exemplary HMD may include one or more vibration dampeners that are configured to substantially isolate vibration of a particular vibration transducer or transducers.
- a first vibration transducer may be configured to vibrate a left side-arm based on a “left” audio signal
- a second vibration transducer may be configured to vibrate a right side-arm based on a “right” audio signal.
- one or more vibration dampeners may be configured to (a) substantially reduce vibration of the right side-arm by the first vibration transducer and (b) substantially reduce vibration of the left side-arm by the second vibration transducer. By doing so, the left audio signal may be substantially isolated on the left side-arm, while the right audio signal may be substantially isolated on the right side-arm.
- Vibration dampeners may vary in location on an HMD. For instance, continuing the above example, a first vibration dampener may be coupled to the left side-arm and a second vibration dampener may be coupled to the right side-arm, so as to substantially isolate the vibrational coupling of the first vibration transducer to the left side-arm and the vibrational coupling of the second vibration transducer to the right side-arm. To do so, the vibration dampener or dampeners on a given side-arm may be attached at various locations along the side-arm. For instance, referring to FIG. 4, vibration dampeners may be attached at or near where side-arms 408-L and 408-R are attached to lens-frames 409-L and 409-R, respectively.
- vibration transducers may be located on the left and right lens-frames, as illustrated in FIG. 6 by vibration transducers 618-L and 618-R.
- HMD 600 may include vibration dampeners (not shown) that help to isolate vibration of the left side of HMD 600 from the right side of HMD 600.
- vibration dampeners may be attached at or near where lens-frames 609-L and 609-R couple to nose bridge 607.
- a vibration dampener may be located on or attached to nose bridge 607, in order to help prevent: (a) vibration transducer 618-L from vibrating the right side of HMD 600 (e.g., lens-frame 609-R and/or side-arm 608-R) and (b) vibration transducer 618-R from vibrating the left side of HMD 600 (e.g., lens-frame 609-L and/or side-arm 608-L).
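The isolation a dampener provides can be illustrated with a toy model that characterizes the dampener by a single attenuation figure in decibels. This parameterization is an assumption for illustration; the patent does not quantify dampener performance:

```python
# Toy model: amplitude of a vibration that leaks past a dampener
# (e.g., across nose bridge 607) to the opposite side of the frame.

def residual_crosstalk(level, attenuation_db):
    """Amplitude reaching the far side through a dampener that
    attenuates by `attenuation_db` dB (amplitude ratio 10^(-dB/20))."""
    return level * 10 ** (-attenuation_db / 20.0)

# With an assumed 20 dB of attenuation, a unit-amplitude vibration from
# transducer 618-L arrives at the right side at one tenth its amplitude:
leak = residual_crosstalk(1.0, 20.0)
```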
- vibration dampeners may vary in size and/or shape, depending upon the particular implementation. Further, vibration dampeners may be attached to, partially or wholly embedded in, and/or enclosed within the frame of an exemplary HMD. Yet further, vibration dampeners may be made of various materials; for instance, silicone, rubber, and/or foam, among others. More generally, a vibration dampener may be constructed from any material suitable for absorbing and/or dampening vibration. Furthermore, in some embodiments, a simple air gap between parts of the HMD (e.g., where a side-arm connects to a lens-frame) may function as a vibration dampener.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Manufacturing & Machinery (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Details Of Audible-Bandwidth Transducers (AREA)
- Headphones And Earphones (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/269,935 US20130022220A1 (en) | 2011-07-20 | 2011-10-10 | Wearable Computing Device with Indirect Bone-Conduction Speaker |
PCT/US2012/047618 WO2013013158A2 (fr) | 2011-07-20 | 2012-07-20 | Dispositif de calcul portatif avec haut-parleur à conduction osseuse indirecte |
CN201280045795.7A CN103988113B (zh) | 2011-07-20 | 2012-07-20 | 具有间接骨传导扬声器的可佩戴计算设备 |
US15/066,253 US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161509997P | 2011-07-20 | 2011-07-20 | |
US13/269,935 US20130022220A1 (en) | 2011-07-20 | 2011-10-10 | Wearable Computing Device with Indirect Bone-Conduction Speaker |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/066,253 Continuation US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130022220A1 true US20130022220A1 (en) | 2013-01-24 |
Family
ID=47555766
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/269,935 Abandoned US20130022220A1 (en) | 2011-07-20 | 2011-10-10 | Wearable Computing Device with Indirect Bone-Conduction Speaker |
US15/066,253 Active 2031-12-29 US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/066,253 Active 2031-12-29 US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Country Status (3)
Country | Link |
---|---|
US (2) | US20130022220A1 (fr) |
CN (1) | CN103988113B (fr) |
WO (1) | WO2013013158A2 (fr) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9596536B2 (en) * | 2015-07-22 | 2017-03-14 | Google Inc. | Microphone arranged in cavity for enhanced voice isolation |
DE202016105934U1 (de) | 2016-10-21 | 2017-08-22 | Krones Ag | Andockstation für ein Etikettieraggregat |
WO2018230790A1 (fr) * | 2017-06-13 | 2018-12-20 | 주식회사 비햅틱스 | Visiocasque |
CN107280956A (zh) * | 2017-07-28 | 2017-10-24 | 马国华 | 一种电子音频理疗养生仪 |
CN109270710A (zh) * | 2018-12-13 | 2019-01-25 | 张�浩 | 骨传导眼镜架 |
US10659869B1 (en) * | 2019-02-08 | 2020-05-19 | Facebook Technologies, Llc | Cartilage transducer |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5282253A (en) | 1991-02-26 | 1994-01-25 | Pan Communications, Inc. | Bone conduction microphone mount |
US5457751A (en) | 1992-01-15 | 1995-10-10 | Such; Ronald W. | Ergonomic headset |
US6301367B1 (en) * | 1995-03-08 | 2001-10-09 | Interval Research Corporation | Wearable audio system with acoustic modules |
JPH1065996A (ja) | 1996-08-23 | 1998-03-06 | Olympus Optical Co Ltd | 頭部装着型表示装置 |
JP2001522063A (ja) * | 1997-10-30 | 2001-11-13 | ザ マイクロオプティカル コーポレイション | 眼鏡インターフェースシステム |
US6215655B1 (en) | 1997-10-31 | 2001-04-10 | Lacerta Enterprises, Inc. | Drive-in ordering apparatus |
US6463157B1 (en) | 1998-10-06 | 2002-10-08 | Analytical Engineering, Inc. | Bone conduction speaker and microphone |
US7150526B2 (en) | 2000-06-02 | 2006-12-19 | Oakley, Inc. | Wireless interactive headset |
US8482488B2 (en) | 2004-12-22 | 2013-07-09 | Oakley, Inc. | Data input management system for wearable electronically enabled interface |
US7461936B2 (en) | 2000-06-02 | 2008-12-09 | Oakley, Inc. | Eyeglasses with detachable adjustable electronics module |
US20020039427A1 (en) | 2000-10-04 | 2002-04-04 | Timothy Whitwell | Audio apparatus |
US20020124295A1 (en) | 2000-10-30 | 2002-09-12 | Loel Fenwick | Clothing apparatus, carrier for a biophysical sensor, and patient alarm system |
AU2003254210A1 (en) | 2002-07-26 | 2004-02-16 | Oakley, Inc. | Wireless interactive headset |
US7310427B2 (en) | 2002-08-01 | 2007-12-18 | Virginia Commonwealth University | Recreational bone conduction audio device, system |
US7233684B2 (en) | 2002-11-25 | 2007-06-19 | Eastman Kodak Company | Imaging method and system using affective information |
US7762665B2 (en) | 2003-03-21 | 2010-07-27 | Queen's University At Kingston | Method and apparatus for communication between humans and devices |
US7792552B2 (en) | 2003-04-15 | 2010-09-07 | Ipventure, Inc. | Eyeglasses for wireless communications |
JP2005352024A (ja) * | 2004-06-09 | 2005-12-22 | Murata Mfg Co Ltd | 眼鏡型インタフェース装置及びセキュリティシステム |
US7555136B2 (en) | 2004-06-25 | 2009-06-30 | Victorion Technology Co., Ltd. | Nasal bone conduction wireless communication transmitting device |
US20060034478A1 (en) | 2004-08-11 | 2006-02-16 | Davenport Kevin E | Audio eyeglasses |
US7580540B2 (en) | 2004-12-29 | 2009-08-25 | Motorola, Inc. | Apparatus and method for receiving inputs from a user |
US20070069976A1 (en) | 2005-09-26 | 2007-03-29 | Willins Bruce A | Method and system for interface between head mounted display and handheld device |
US8325964B2 (en) | 2006-03-22 | 2012-12-04 | Dsp Group Ltd. | Method and system for bone conduction sound propagation |
US7543934B2 (en) | 2006-09-20 | 2009-06-09 | Ipventures, Inc. | Eyeglasses with activity monitoring and acoustic dampening |
US7740353B2 (en) | 2006-12-14 | 2010-06-22 | Oakley, Inc. | Wearable high resolution audio visual interface |
JP2008165063A (ja) * | 2006-12-28 | 2008-07-17 | Scalar Corp | ヘッドマウントディスプレイ |
KR20080090720A (ko) | 2007-04-05 | 2008-10-09 | 최성식 | 진동유닛을 구비한 헤드폰 |
US8086288B2 (en) | 2007-06-15 | 2011-12-27 | Eric Klein | Miniature wireless earring headset |
EP2731358A1 (fr) * | 2008-02-11 | 2014-05-14 | Bone Tone Communications Ltd. | Système acoustique et procédé pour produire un son |
US20090259090A1 (en) * | 2008-03-31 | 2009-10-15 | Cochlear Limited | Bone conduction hearing device having acoustic feedback reduction system |
US20100149073A1 (en) * | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
WO2008145409A2 (fr) | 2008-08-29 | 2008-12-04 | Phonak Ag | Prothèse auditive et méthode permettant de fournir une aide auditive à un utilisateur |
WO2010062481A1 (fr) * | 2008-11-02 | 2010-06-03 | David Chaum | Système et appareil d'affichage proche de l'oeil |
CN101753221A (zh) * | 2008-11-28 | 2010-06-23 | 新兴盛科技股份有限公司 | 蝶颞骨传导通讯与/或助听装置 |
JP2010224472A (ja) * | 2009-03-25 | 2010-10-07 | Olympus Corp | 眼鏡装着型画像表示装置 |
US8094858B2 (en) | 2009-04-27 | 2012-01-10 | Joseph Adam Thiel | Eyewear retention device |
WO2011001498A1 (fr) | 2009-06-29 | 2011-01-06 | パイオニア株式会社 | Amortisseur de haut-parleur et dispositif haut-parleur |
US8964298B2 (en) * | 2010-02-28 | 2015-02-24 | Microsoft Corporation | Video display modification based on sensor input for a see-through near-to-eye display |
WO2012090944A1 (fr) * | 2010-12-27 | 2012-07-05 | ローム株式会社 | Téléphone mobile |
US8223088B1 (en) | 2011-06-09 | 2012-07-17 | Google Inc. | Multimode input field for a head-mounted display |
- 2011
  - 2011-10-10 US US13/269,935 patent/US20130022220A1/en not_active Abandoned
- 2012
  - 2012-07-20 CN CN201280045795.7A patent/CN103988113B/zh active Active
  - 2012-07-20 WO PCT/US2012/047618 patent/WO2013013158A2/fr active Application Filing
- 2016
  - 2016-03-10 US US15/066,253 patent/US9900676B2/en active Active
Cited By (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130201082A1 (en) * | 2008-06-11 | 2013-08-08 | Honeywell International Inc. | Method and system for operating a near-to-eye display |
US9594248B2 (en) * | 2008-06-11 | 2017-03-14 | Honeywell International Inc. | Method and system for operating a near-to-eye display |
US20140184550A1 (en) * | 2011-09-07 | 2014-07-03 | Tandemlaunch Technologies Inc. | System and Method for Using Eye Gaze Information to Enhance Interactions |
US20140029762A1 (en) * | 2012-07-25 | 2014-01-30 | Nokia Corporation | Head-Mounted Sound Capture Device |
US9094749B2 (en) * | 2012-07-25 | 2015-07-28 | Nokia Technologies Oy | Head-mounted sound capture device |
US20150334486A1 (en) * | 2012-12-13 | 2015-11-19 | Samsung Electronics Co., Ltd. | Glasses apparatus and method for controlling glasses apparatus, audio apparatus and method for providing audio signal and display apparatus |
US9712910B2 (en) * | 2012-12-13 | 2017-07-18 | Samsung Electronics Co., Ltd. | Glasses apparatus and method for controlling glasses apparatus, audio apparatus and method for providing audio signal and display apparatus |
US20140247951A1 (en) * | 2013-03-01 | 2014-09-04 | Lalkrushna Malaviya | Animal Headphone Apparatus |
US9628895B2 (en) * | 2013-03-01 | 2017-04-18 | Lalkrushna Malaviya | Animal headphone apparatus |
US9720083B2 (en) * | 2013-06-05 | 2017-08-01 | Google Inc. | Using sounds for determining a worn state of a wearable computing device |
WO2015009539A1 (fr) * | 2013-07-15 | 2015-01-22 | Google Inc. | Isolation de transducteur audio |
US9143848B2 (en) | 2013-07-15 | 2015-09-22 | Google Inc. | Isolation of audio transducer |
CN105518516A (zh) * | 2013-07-15 | 2016-04-20 | 谷歌公司 | 音频换能器的隔离 |
US9806795B2 (en) | 2013-08-05 | 2017-10-31 | Microsoft Technology Licensing, Llc | Automated earpiece cache management |
US9471101B2 (en) | 2013-09-11 | 2016-10-18 | Lg Electronics Inc. | Wearable computing device and user interface method |
US20150149092A1 (en) * | 2013-11-25 | 2015-05-28 | National Oilwell Varco, L.P. | Wearable interface for drilling information system |
US9547175B2 (en) | 2014-03-18 | 2017-01-17 | Google Inc. | Adaptive piezoelectric array for bone conduction receiver in wearable computers |
WO2015143018A1 (fr) * | 2014-03-18 | 2015-09-24 | Google Inc. | Barrette piézoélectrique adaptative pour récepteur à conduction osseuse dans des ordinateurs portatifs |
US9323983B2 (en) | 2014-05-29 | 2016-04-26 | Comcast Cable Communications, Llc | Real-time image and audio replacement for visual acquisition devices |
US10721594B2 (en) | 2014-06-26 | 2020-07-21 | Microsoft Technology Licensing, Llc | Location-based audio messaging |
US9999396B2 (en) | 2014-09-11 | 2018-06-19 | Industrial Technology Research Institute | Exercise physiological sensing system, motion artifact suppression processing method and device |
US9895110B2 (en) | 2014-09-11 | 2018-02-20 | Industrial Technology Research Institute | Exercise physiological sensing system, motion artifact suppression processing method and device |
US10820117B2 (en) | 2014-09-24 | 2020-10-27 | Taction Technology, Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US20170171666A1 (en) * | 2014-09-24 | 2017-06-15 | Taction Technology Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US10812913B2 (en) | 2014-09-24 | 2020-10-20 | Taction Technology, Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US9430921B2 (en) * | 2014-09-24 | 2016-08-30 | Taction Technology Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US10659885B2 (en) | 2014-09-24 | 2020-05-19 | Taction Technology, Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US9936273B2 (en) | 2015-01-20 | 2018-04-03 | Taction Technology, Inc. | Apparatus and methods for altering the appearance of wearable devices |
US9690326B2 (en) | 2015-02-02 | 2017-06-27 | Samsung Display Co., Ltd. | Wearable display device |
US9924265B2 (en) * | 2015-09-15 | 2018-03-20 | Intel Corporation | System for voice capture via nasal vibration sensing |
US10390139B2 (en) | 2015-09-16 | 2019-08-20 | Taction Technology, Inc. | Apparatus and methods for audio-tactile spatialization of sound and perception of bass |
US10573139B2 (en) | 2015-09-16 | 2020-02-25 | Taction Technology, Inc. | Tactile transducer with digital signal processing for improved fidelity |
US11263879B2 (en) | 2015-09-16 | 2022-03-01 | Taction Technology, Inc. | Tactile transducer with digital signal processing for improved fidelity |
US10324494B2 (en) | 2015-11-25 | 2019-06-18 | Intel Corporation | Apparatus for detecting electromagnetic field change in response to gesture |
US10444844B2 (en) | 2016-06-15 | 2019-10-15 | Immersion Corporation | Systems and methods for providing haptic feedback via a case |
US10095311B2 (en) * | 2016-06-15 | 2018-10-09 | Immersion Corporation | Systems and methods for providing haptic feedback via a case |
CN107526432A (zh) * | 2016-06-15 | 2017-12-29 | 意美森公司 | 用于经由罩壳提供触觉反馈的系统和方法 |
US10298282B2 (en) | 2016-06-16 | 2019-05-21 | Intel Corporation | Multi-modal sensing wearable device for physiological context measurement |
US10241583B2 (en) | 2016-08-30 | 2019-03-26 | Intel Corporation | User command determination based on a vibration pattern |
US11927734B2 (en) | 2016-11-08 | 2024-03-12 | Lumus Ltd. | Light-guide device with optical cutoff edge and corresponding production methods |
US10491739B2 (en) | 2017-03-16 | 2019-11-26 | Microsoft Technology Licensing, Llc | Opportunistic timing of device notifications |
US11966062B2 (en) * | 2017-10-22 | 2024-04-23 | Lumus Ltd. | Head-mounted augmented reality device employing an optical bench |
US11656472B2 (en) * | 2017-10-22 | 2023-05-23 | Lumus Ltd. | Head-mounted augmented reality device employing an optical bench |
US11762169B2 (en) | 2017-12-03 | 2023-09-19 | Lumus Ltd. | Optical device alignment methods |
CN113691914A (zh) * | 2017-12-22 | 2021-11-23 | 谷歌有限责任公司 | 二维分布式模式致动器 |
WO2019143864A1 (fr) * | 2018-01-17 | 2019-07-25 | Magic Leap, Inc. | Systèmes et procédés d'affichage pour déterminer l'enregistrement entre un affichage et les yeux d'un utilisateur |
US11290706B2 (en) | 2018-01-17 | 2022-03-29 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
US11883104B2 (en) | 2018-01-17 | 2024-01-30 | Magic Leap, Inc. | Eye center of rotation determination, depth plane selection, and render camera positioning in display systems |
US11880033B2 (en) | 2018-01-17 | 2024-01-23 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
US20190238971A1 (en) * | 2018-01-31 | 2019-08-01 | Bose Corporation | Eyeglass Headphones |
US11451901B2 (en) * | 2018-01-31 | 2022-09-20 | Bose Corporation | Eyeglass headphones |
US10555071B2 (en) * | 2018-01-31 | 2020-02-04 | Bose Corporation | Eyeglass headphones |
US11662311B2 (en) | 2018-04-08 | 2023-05-30 | Lumus Ltd. | Optical sample characterization |
US11567336B2 (en) | 2018-07-24 | 2023-01-31 | Magic Leap, Inc. | Display systems and methods for determining registration between display and eyes of user |
US11880043B2 (en) | 2018-07-24 | 2024-01-23 | Magic Leap, Inc. | Display systems and methods for determining registration between display and eyes of user |
US11768538B1 (en) | 2019-04-26 | 2023-09-26 | Apple Inc. | Wearable electronic device with physical interface |
US11106034B2 (en) * | 2019-05-07 | 2021-08-31 | Apple Inc. | Adjustment mechanism for head-mounted display |
US20220408177A1 (en) * | 2019-05-17 | 2022-12-22 | Meta Platforms Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US10873800B1 (en) * | 2019-05-17 | 2020-12-22 | Facebook Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US11902735B2 (en) * | 2019-05-17 | 2024-02-13 | Meta Platforms Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US11445288B2 (en) * | 2019-05-17 | 2022-09-13 | Meta Platforms Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US11729359B2 (en) | 2019-12-08 | 2023-08-15 | Lumus Ltd. | Optical systems with compact image projector |
US12019249B2 (en) | 2019-12-25 | 2024-06-25 | Lumus Ltd. | Optical systems and methods for eye tracking based on redirecting light from eye using an optical arrangement associated with a light-guide optical element |
US11747137B2 (en) | 2020-11-18 | 2023-09-05 | Lumus Ltd. | Optical-based validation of orientations of internal facets |
US20220382382A1 (en) * | 2021-06-01 | 2022-12-01 | tooz technologies GmbH | Calling up a wake-up function and controlling a wearable device using tap gestures |
Also Published As
Publication number | Publication date |
---|---|
WO2013013158A2 (fr) | 2013-01-24 |
US9900676B2 (en) | 2018-02-20 |
US20160192048A1 (en) | 2016-06-30 |
CN103988113B (zh) | 2017-05-03 |
CN103988113A (zh) | 2014-08-13 |
WO2013013158A3 (fr) | 2013-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9900676B2 (en) | Wearable computing device with indirect bone-conduction speaker | |
US9210494B1 (en) | External vibration reduction in bone-conduction speaker | |
US9609412B2 (en) | Bone-conduction anvil and diaphragm | |
US9031273B2 (en) | Wearable computing device with behind-ear bone-conduction speaker | |
US20140064536A1 (en) | Thin Film Bone-Conduction Transducer for a Wearable Computing System | |
US20160161748A1 (en) | Wearable computing device | |
US9547175B2 (en) | Adaptive piezoelectric array for bone conduction receiver in wearable computers | |
US9456284B2 (en) | Dual-element MEMS microphone for mechanical vibration noise cancellation | |
US9100732B1 (en) | Hertzian dipole headphone speaker | |
US9002020B1 (en) | Bone-conduction transducer array for spatial audio | |
US9143848B2 (en) | Isolation of audio transducer | |
US8965012B1 (en) | Smart sensing bone conduction transducer | |
US9998817B1 (en) | On head detection by capacitive sensing BCT | |
US10734706B1 (en) | Antenna assembly utilizing space between a battery and a housing | |
US9525936B1 (en) | Wireless earbud communications using magnetic induction | |
US11675200B1 (en) | Antenna methods and systems for wearable devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DONG, JIANCHUN;CHI, LIANG-YU TOM;HEINRICH, MITCHELL;AND OTHERS;SIGNING DATES FROM 20110907 TO 20110909;REEL/FRAME:027040/0777 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044142/0357 Effective date: 20170929 |