CN111512639A - Earphone with interactive display screen - Google Patents


Info

Publication number
CN111512639A
CN111512639A (application CN201880070808.3A)
Authority
CN
China
Prior art keywords
user
display screen
transparent display
headphone system
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201880070808.3A
Other languages
Chinese (zh)
Other versions
CN111512639B
Inventor
梁荣斌
罗纳德.庞
K.H.刘
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from US 15/695,600 (US10433044B2)
Application filed by Individual filed Critical Individual
Publication of CN111512639A
Application granted
Publication of CN111512639B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/10Earpieces; Attachments therefor ; Earphones; Monophonic headphones
    • H04R1/1041Mechanical or electronic switches, or control elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161Indexing scheme relating to constructional details of the monitor
    • G06F2200/1614Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/10Earpieces; Attachments therefor ; Earphones; Monophonic headphones
    • H04R1/1058Manufacture or assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/10Earpieces; Attachments therefor ; Earphones; Monophonic headphones
    • H04R1/1091Details not provided for in groups H04R1/1008 - H04R1/1083
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2420/00Details of connection covered by H04R, not provided for in its groups
    • H04R2420/07Applications of wireless loudspeakers or wireless microphones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00Stereophonic arrangements
    • H04R5/033Headphones for stereophonic communication

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Acoustics & Sound (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Headphones And Earphones (AREA)
  • Telephone Function (AREA)

Abstract

A novel headphone system includes a first speaker assembly, a second speaker assembly, and an interactive visual display system. The interactive visual display system includes a display screen operable to output visual content in accordance with interactions with a user. In one embodiment, one of the display screens may be a transparent display screen placed in front of the user's eyes. The real-world view through the transparent display screen may be merged with digital images generated by the transparent display screen to provide an augmented reality experience for the user. In another embodiment, the headphone system includes an external device interface that enables a user to interact with content displayed on the screen via an external device or the internet. In another embodiment, the interactive visual display system is removable from the rest of the headphone system and may optionally be installed in another compatible non-headphone device.

Description

Earphone with interactive display screen
Technical Field
The present invention relates to audio electronic devices, and more particularly to audio headphones.
Background
As consumer electronics become more prevalent in modern society, the demand for personal audio headsets will continue to increase. With increasing demand, there has recently been a proliferation in the design and development of headsets. In fact, headset developers are continually seeking new and improved designs and functions to attract consumers.
One approach to making headphones more appealing to consumers has been to add aesthetic features to their design. For example, some designs include variable color/design panels that allow the user to customize the appearance of the headphones.
While such design approaches add to the overall aesthetic appeal of a headset, they have their own drawbacks. For example, headsets with variable color/design panels are limited in that they can display content (e.g., a color, pattern, or image) only at the variable color/design panels themselves.
Disclosure of Invention
Aspects of the present invention provide a headphone system that is capable of interactively displaying content defined by the user or by other parameters, including but not limited to: GPS location, user motion, sound, voice, nearby images, and/or commands received via wireless communication from an authorized internet site or a nearby authorized device. The user can even control the interactive display of the headphones with his or her own brain waves.
Aspects of the present invention improve upon the prior art by transforming headphones into a sophisticated interactive display platform in addition to reproducing music and sound. As a result, the headset of the present invention enables a user to publicly express his/her feelings to nearby persons by displaying static, animated, or interactive images or video.
It is an object of the present invention to provide a portable, interchangeable display platform that can be worn on the head or body. The display content may be programmable and/or may interact with an external control device (e.g., a smart phone or tablet) in the vicinity. One advantage of the display screen is that it can visually attract, and/or communicate with, surrounding people.
An exemplary embodiment of the present invention adds a portable and/or interchangeable display platform to both sides of the headset and to the headband. Three display screens, one on the left earpiece, one on the right earpiece, and one on the headband, form a complete interactive display platform. Another exemplary embodiment includes only a headband display or displays on the left and right earphones. Such an embodiment would simplify implementation and reduce the cost of the interactive display headphones.
In one example, the headset and display platform have a link to an external stand-alone control device. This external control device can provide the necessary audio signals for the headphones to play and the appropriate video or still/moving image signals for the display platform to play. This link may be a physical link (e.g., a physical wire) or a wireless link (e.g., Bluetooth, ZigBee, Wi-Fi, NFC, 3G/4G, etc.) to connect to an external device. Such external devices may also be used to turn on/off and/or configure various headphone functions, such as noise cancellation, loudness control, spectral equalizers, or signal processing functions that other headphones may have.
The display platform, like a microcomputer, has its own central processing unit (CPU), memory, storage, and the interfaces necessary to perform display functions based on information and instructions sent by external devices over the link connecting the control device and the headset/display platform. Content transmitted over the link is first stored in internal storage and then displayed in a manner determined by the transmitted instructions. If a camera is mounted on the display platform, the display screen may also display video or images captured by the camera. If the display screen has "touch screen" input functionality, it can be used to control various headphone functions, such as noise cancellation, adjusting frequency response, and other signal processing functions. The display platform may include a gyroscope so that the orientation of the displayed image may be automatically adjusted according to the orientation of the user's head. Global Positioning System (GPS) information, provided by sensors built into the headset or by an external control device, allows the display screen and/or headset to present voice/sound, images, or video associated with the current location. A motion sensor built into the headset detects the user's motion so that the display screen can interact with it.
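The store-then-display flow described above can be sketched as a small message handler. All names below (`Message`, `DisplayPlatform`, the `"content"`/`"instruction"` message kinds) are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    kind: str           # "content" or "instruction" (assumed protocol)
    content_id: str
    payload: bytes = b""
    mode: str = "once"  # e.g. "once", "loop"

@dataclass
class DisplayPlatform:
    storage: dict = field(default_factory=dict)  # internal storage
    screen: list = field(default_factory=list)   # stands in for the display

    def on_link_message(self, msg: Message) -> None:
        if msg.kind == "content":
            # Content transmitted over the link is first stored internally...
            self.storage[msg.content_id] = msg.payload
        elif msg.kind == "instruction" and msg.content_id in self.storage:
            # ...and displayed later, in the manner the instruction specifies.
            self.screen.append((msg.content_id, msg.mode))

platform = DisplayPlatform()
platform.on_link_message(Message("content", "greeting", b"<image bytes>"))
platform.on_link_message(Message("instruction", "greeting", mode="loop"))
```

Here, content pushed over the link lands in internal storage, and a later instruction selects which stored item the screen plays and how.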
In one example, a headphone system includes a frame having a first region, a second region, and a middle region. The system also includes a first speaker assembly coupled to the first region of the frame, a second speaker assembly coupled to the second region of the frame, a display screen coupled to the frame, a controller coupled to the frame, a user interface for receiving input from a user, and a memory. The memory stores data and code, and the controller, responsive to user input, executes the code and displays an image on the display screen based at least in part on the user input.
In an exemplary embodiment, the user interface includes a data communication interface to facilitate data communication between the headset system and an external system. Further, the user interface includes an input sensor coupled to the frame. The controller may provide control signals to the external device via the data communication interface in response to inputs received by the input sensor.
Optionally, the controller is for controlling the headset system based at least in part on instructions received from an external system via the data communication interface. In an exemplary embodiment, the data communication interface is configured to receive audio control instructions, display control instructions, and/or camera control instructions from an external system (when the headset system includes a camera). The controller and communication interface facilitate real-time control of the headset system by an external system.
In one exemplary embodiment, the data communication interface is configured to communicate with an external system via a wired connection. Optionally, the data communication interface comprises a short-range or long-range wireless system configured to communicate with external systems.
The data communication interface is configured to receive an audio signal from an external system, and the first speaker assembly is configured to output an audible indication of the audio signal in real-time. Optionally, the controller is configured to receive digital audio data via the data communication interface and store the digital audio data in the memory. Alternatively, the data communication interface is configured to receive a display signal from an external system, and the controller is configured to display an image on the display screen in real time based on the display signal. Alternatively, the controller is configured to receive display data via the communication interface and store the display data in the memory.
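The four transfer modes in this passage (audio vs. display data, real-time output vs. storage in memory) amount to a small dispatch routine. A minimal sketch, with all names assumed rather than taken from the patent:

```python
def route(kind: str, realtime: bool, data: bytes,
          memory: dict, speaker: list, screen: list) -> str:
    """Dispatch an incoming transfer to the speaker, screen, or memory."""
    if kind == "audio":
        if realtime:
            speaker.append(data)   # audible output in real time
            return "played"
        memory["audio"] = data     # digital audio data stored in memory
        return "stored"
    if realtime:
        screen.append(data)        # image shown in real time
        return "shown"
    memory["display"] = data       # display data stored for later use
    return "stored"

memory, speaker, screen = {}, [], []
route("audio", True, b"pcm-frame", memory, speaker, screen)
route("display", False, b"bitmap", memory, speaker, screen)
```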
In one exemplary embodiment, the controller, memory and display screen are embedded in the first detachable display unit, which is removably coupled to the frame by mounting the first detachable display unit on the first speaker assembly. In a particular embodiment, the first speaker assembly includes a first electrical contact and the first detachable display unit includes a mating second electrical contact. The first electrical contact and the mating second electrical contact are adapted to be electrically connected to each other. The first detachable display unit is adapted to rotate about an axis relative to the first speaker assembly when the first detachable unit is mounted to the first speaker assembly. One of the first electrical contact and the mating second electrical contact comprises a generally circular ring-shaped (or circular arc-shaped) electrical conductor formed at least partially about an axis, the ring-shaped conductor adapted to slidingly engage the other of the first electrical contact and the mating second electrical contact when the first detachable display unit is rotated relative to the first speaker assembly. The other of the first electrical contact and the mating second electrical contact is a conductive biasing member adapted to exert a force on the loop conductor sufficient to maintain electrical contact with the loop conductor when the first detachable display unit is rotated relative to the first speaker assembly. In certain embodiments, the first speaker assembly includes a first set of threads and the first display unit includes a complementary second set of threads. The first and second thread sets facilitate mounting the display unit to the first speaker assembly.
Optionally, the headset system further comprises a second detachable display unit having a second display screen, both the display screen and the second display screen being controlled by the controller. In certain embodiments, the first detachable display unit is adapted to engage a first speaker assembly (e.g., coupled to the frame via the first speaker assembly), and the second detachable display unit is adapted to engage a second speaker assembly (e.g., coupled to the frame via the second speaker assembly). In a more specific embodiment, the headphone system further includes a third display unit having a third display screen, the third display unit being mounted to the middle region of the frame. The display screen, the second display screen and the third display screen are all controlled by the controller.
The controller may display static content or video on the display screen.
Optionally, the headset system further comprises a camera coupled to the frame.
In one exemplary embodiment, the user interface includes an input sensor coupled to the frame, and the controller is operable to control the headphone system based at least in part on an input of the input sensor. For example, the controller may display an image on the display screen based at least in part on input from the input sensor. As another example, the controller may control operation of at least one of the first speaker assembly and the second speaker assembly based at least in part on input from the input sensor. In embodiments including a camera, the controller may control operation of the camera based at least in part on input from the input sensor.
The input sensors may include sound sensors (e.g., microphones), orientation sensors (e.g., gyroscopes, tilt sensors, etc.), and/or motion sensors (e.g., gyroscopes, accelerometers, inclinometers, etc.). The input sensors may also include user manual input devices (e.g., touch screen displays, buttons, etc.).
In a particular embodiment, the input sensor generates an output indicative of an orientation of the display screen, and the controller is configured to automatically adjust the orientation of the displayed image on the display screen based at least in part on the output of the input sensor. In addition, the controller may provide control instructions to the external device in response to signals input to the sensor, so that the user may control the external device via the headphone system. The control instructions may include, but are not limited to, audio instructions (e.g., turn up volume, turn down volume, next selection, etc.). The headset system may also include a location determining device (e.g., a GPS device), and the controller may perform location-based operations using signals from the GPS device.
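The automatic re-orientation described above reduces to snapping a sensed tilt angle to the nearest quarter turn. A minimal sketch; the 90-degree snapping granularity is an assumption, not specified by the patent:

```python
def display_rotation(tilt_degrees: float) -> int:
    """Snap a sensed screen tilt to the nearest 90-degree image rotation."""
    return round(tilt_degrees / 90.0) % 4 * 90

# A sensed tilt of roughly 85 degrees yields a 90-degree image rotation.
```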
The invention also discloses a method for manufacturing the earphone. An exemplary method comprises providing a frame having a first region, a second region, and an intermediate region; coupling a first speaker assembly to the first region of the frame; and coupling a second speaker assembly to the second region of the frame. The method then assembles a user interface, a memory, a display screen, and a controller into a display unit and couples the display unit to the frame.
A headset system according to another embodiment of the invention includes a frame, a first speaker assembly, a transparent display screen, a user interface operable to receive input from a user, a memory for storing data and code, and a controller. The frame has a first region, a second region, and a headband extending between the first region and the second region, wherein the first region and the second region are configured to be positioned proximate a first ear and a second ear, respectively, of a user. Additionally, a first speaker assembly is coupled to the first region of the frame, and the frame is configured to position the first speaker assembly near a first ear of the user. The transparent display is also coupled to the frame and configured to be positioned in an optical path of the user when the first and second regions of the frame are positioned near the first and second ears of the user. A controller is also coupled to the frame and responsive to user input for executing the code and displaying the image on the transparent display for viewing based at least in part on the input from the user. Optionally, the transparent display may include nose pads (nose rest) and/or may be sized to be simultaneously visible to both eyes of the user.
In certain embodiments, the transparent display screen is rotatably coupled to the frame and configured to rotate between at least a first position and a second position. When in the first position, the transparent display screen is disposed in the optical path of the user, and when in the second position, the transparent display screen is disposed over the headband. Optionally, the transparent display screen is further configured to rotate from a second position to a third position, wherein the transparent display screen is configured to be disposed around a rear region of the user's head.
Various exemplary specific embodiments are disclosed. For example, the user interface may also include at least one sensor. The sensors may include one or more of motion sensors, sound sensors, brain wave sensors, orientation sensors, Global Positioning System (GPS) sensors, and the like. As another example, the headphone system can further include a second speaker assembly coupled to a second region of the frame, wherein the frame is configured to position the second speaker assembly near a second ear of the user. As yet another example, the headset system may further comprise a physical connection interface, such that the transparent display screen is detachable from the headset system via the physical connection interface.
In some embodiments, the network interface comprises a Wide Area Network (WAN), and in other embodiments, the network interface comprises a Local Area Network (LAN).
In yet another particular embodiment, the headset system includes a battery and an energy harvesting device adapted to charge the battery. The energy harvesting device may include, for example, a solar panel, a kinetic energy harvesting device operable to generate an electrical charge in response to movement of the headset system, or the like.
In yet another particular embodiment, the optical characteristics of the transparent display screen are adjustable. Where the headset system includes a camera coupled to the frame, the controller may be configured to adjust the optical characteristics of the transparent display screen in response to ambient light conditions detected by the camera.
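One plausible reading of this adjustment is a clamped linear mapping from camera-measured ambient brightness to screen tint. A hedged sketch; the lux scale, opacity limits, and linearity are all assumptions:

```python
def screen_opacity(ambient_lux: float,
                   min_opacity: float = 0.1,
                   max_opacity: float = 0.9,
                   full_sun_lux: float = 10_000.0) -> float:
    """Darker tint (higher opacity) in brighter surroundings."""
    level = min(max(ambient_lux / full_sun_lux, 0.0), 1.0)  # clamp to [0, 1]
    return min_opacity + level * (max_opacity - min_opacity)
```

In practice the controller would feed this value to the transparent display's tint driver each time the camera reports a new ambient-light estimate.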
Embodiments of the headphone system may also include at least one adapter having a first portion configured to selectively couple with the frame and a second portion configured to selectively couple with an adapter-enabled device. Various adapter-enabled devices may be used; in one embodiment, the transparent display itself is an adapter-enabled device.
Drawings
The present invention will be further described with reference to the accompanying drawings, wherein like reference numerals refer to substantially identical elements:
FIG. 1 is a perspective view of a headphone system connected to an external device via a data cable;
FIG. 2 is an exploded perspective view of the headphone system of FIG. 1;
FIG. 3 is a perspective view of a display unit of the headphone system of FIG. 1;
FIG. 4 is a schematic diagram of the headphone system of FIG. 1 worn by a user;
FIG. 5 is a block logic diagram of the control circuit of FIG. 3;
FIG. 6 is a representative image of an alternate component suitable for receiving the display unit of FIG. 2;
FIG. 7 is a side view of a headphone system including a transparent display screen according to one embodiment of the invention;
FIG. 8 is a side view showing the transparent display screen of FIG. 7 connected to the headphone system of FIG. 7;
FIG. 9 is a front view of a headphone system with a transparent display screen according to another embodiment of the present invention;
FIG. 10 is a diagram showing a plurality of the headphone systems of FIG. 7 interconnected via the internet;
FIG. 11 is a diagram showing a plurality of the headphone systems of FIG. 7 networked to one another over a local area network;
FIG. 12 is an exploded perspective view of an adapter according to an embodiment of the present invention;
FIG. 13 is a circuit diagram of the adapter of FIG. 12;
FIG. 14 is a block diagram of a controller of the headphone system of FIG. 7; and
FIG. 15 is a perspective view of a display unit of the headphone system of FIG. 7.
Detailed Description
The present invention overcomes the deficiencies of the prior art by providing a headphone system having an interactive display system. In the following description, numerous specific details are set forth (e.g., types of display screens, display content, particular sensor types, etc.) in order to provide a thorough understanding of the present invention. Those skilled in the art will appreciate that implementation details may vary from those described in the present disclosure. In other instances, details of well-known headset manufacturing techniques and the operation of electronic device components have been omitted so as not to unnecessarily obscure the present invention.
Fig. 1 is a perspective view of a headphone system 100 connected to an external device 102 via a wire 104. The headphone system 100 includes a frame 106 supporting a set of speaker assemblies 108 and an interactive visual display system, wherein, in the exemplary embodiment, the visual display system includes a first display unit 110, a second display unit 112, and a third display unit 114. In this example, the display units 110 and 112 are removable from the headphone system 100 so that they can be interchanged with other display units installed in the headphone system 100. Unlike the display units 110 and 112, the display unit 114 is not removable from the headphone system 100 in the present embodiment. However, any of the display units 110, 112, and/or 114 may be interchangeable or an integral part of the headphone system 100 without departing from the scope of the present invention.
The external device 102 is for example a smart phone provided with an application 116 enabling a user to control the headset system 100 and interact with the headset system 100. For example, the visual content 118 displayed by the display units 110, 112, and 114 may be controlled and interacted with in real-time by user I/O devices (e.g., touch screen, trackball, orientation sensor, microphone, acceleration sensor, etc.) and/or other devices (e.g., GPS position determination system) of the external device 102 while the application 116 is running. The audio content output from speaker component 108 may also be controlled and interacted with in real-time via user I/O devices of external device 102 under the operation of application 116. In addition, using application 116, audio and display content may be pre-loaded from external device 102 to one or more of display units 110, 112, and/or 114.
For example, the wire 104 is a Universal Serial Bus (USB) cable that provides a wired link for data communication between the headphone system 100 and an external device (e.g., the external device 102). Alternatively, a conventional auxiliary audio cable may be substituted for the wire 104.
The headset system 100 is adapted for short-range or long-range wireless communication with external devices having wireless communication capabilities. For example, the headset system 100 and the external device 102 are adapted to communicate via a short-range wireless link 120. As another example, the headset system 100 may be configured to communicate with the external device 102 via a 3G/4G wireless connection, which is not short-range communication. As yet another example, the headset system 100 is adapted for internet 122 communication via a wireless link 124. Likewise, the external device 102 is also adapted to communicate with the internet 122 via a wireless link 126.
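Firmware might choose among the transports above in a fixed preference order, e.g. short-range wireless first, then cellular, then the wired line. The ordering and link names below are assumptions, not from the patent:

```python
def pick_link(available: set) -> str:
    """Choose a transport to the external device in a fixed preference order."""
    for link in ("bluetooth", "wifi", "3g/4g", "usb"):  # assumed priority
        if link in available:
            return link
    raise RuntimeError("no link to the external device")
```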
The display unit 110 provides control signals to the display units 112 and 114 via a control cable/bus 128. Alternatively, the display units 110, 112, and 114 may enable wireless communication.
Fig. 2 shows a perspective view of the display units 110 and 112 with the headphone system 100 exploded along the axis 200. Each of the speaker assemblies 108 defines a recess 202 adapted to receive a respective one of the display units 110 and 112. In addition, each recess 202 defines a set of internal threads 204 and a set of electrical contacts 208. The internal threads 204 are adapted to mate with a complementary set of external threads 206 formed on the display units 110 and 112. For example, electrical contacts 208 include three (or more) conductive spring elements formed on each recess 202. Each set of electrical contacts 208 is adapted to slidingly engage another set of three concentric circular (or circular arc) electrical contacts 210 formed on the bottom side of the display units 110 and 112. As display units 110 and 112 are threaded into recess 202, each of electrical contacts 210 slidably engages a respective one of electrical contacts 208, thereby establishing an electrical connection therebetween. When compressed, the spring characteristics of electrical contacts 208 not only help establish an electrical connection with electrical contacts 210, but also provide a biasing force to secure display units 110 and 112 in recess 202. It should be appreciated that as long as the display units 110 and 112 are sufficiently threaded into the recess 202, the electrical contacts 208 and 210 remain in contact with each other regardless of the orientation of the display units 110 and 112 about the axis 200.
Fig. 3 shows a perspective view of the display unit 110 in an embodiment in accordance with the invention. In addition to the threads 206 and electrical contacts 210 (shown in fig. 2), the display unit 110 includes a housing 300 to support a display screen 302, a set of user input buttons 304, a microphone 306, a camera 308, an orientation sensor 310, a motion sensor 312, a Global Positioning System (GPS) module 314, a Universal Serial Bus (USB) port 316, an auxiliary cable port 318, a short-range wireless module 320, control circuitry 322, and a battery 324.
In the exemplary embodiment, display screen 302 is a touch screen operable to display visual content in the form of still images and/or video. The display screen 302 is also operable to receive user input through touch indications. The content displayed by the display screen 302 may be predetermined content (e.g., music videos, pictures, etc.) and/or content generated in real time via touch indications. In one example, the content generated in real time is a line drawn on the display screen 302 by running a fingertip across it. The display screen 302 also provides an optional means of entering user control instructions into the headphone system 100. For example, the volume of the audio signal output from the speaker assembly 108 may be adjusted by running a fingertip from a lower portion of the display screen 302 to an upper portion. As another optional feature, the display screen 302 may serve as a user input device for the external device 102; for example, a user may turn down the volume of the external device 102 via touch input on the display screen 302.
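The swipe-to-volume gesture described above can be sketched as a simple mapping from touch coordinates to a volume level. This is a minimal illustration, not from the patent: the function name, linear gain, and 0-100 volume range are all assumptions.

```python
def volume_from_swipe(y_start, y_end, screen_height, current_volume,
                      max_volume=100):
    """Map a vertical swipe on the touch screen to a new volume level.

    A swipe from the lower portion of the screen toward the top raises
    the volume in proportion to the distance travelled; a downward swipe
    lowers it. Coordinates assume y grows downward (top of screen = 0).
    """
    # Fraction of the screen height covered by the swipe; upward travel
    # (y_end < y_start) yields a positive fraction.
    fraction = (y_start - y_end) / screen_height
    new_volume = current_volume + fraction * max_volume
    # Clamp to the valid volume range.
    return max(0, min(max_volume, round(new_volume)))
```

For instance, a half-screen upward swipe shifts the volume by half the full range before clamping.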
The user input buttons 304 are mechanical devices for directly inputting user control commands to the headphone system 100. Alternatively, the user input buttons 304 may serve as an additional user input device for indirectly inputting user control commands to the external device 102. For example, the button 304 may be used to pause audio signals flowing from the external device 102 to the headphone system 100.
The microphone 306 is another input device for directly inputting user control commands to the headphone system 100. In other words, the microphone 306 enables the user to control the display unit 110 via voice/sound commands. For example, a user may speak "show artist" to instruct the display unit 110 to display an image of the artist of a song being played through the speaker assemblies 108. As another example, a user may instruct the camera 308 to capture video by speaking "record video". Also, the microphone 306 may serve as an additional user input device for indirectly inputting user control commands to the external device 102. For example, the user may change the audio track on the external device 102 by speaking "next track". The microphone 306 may also be used by the display unit 110 to record sound, or as an additional microphone for the external device 102 to record sound.
The camera 308 enables the display unit 110 to record digital video and/or still pictures. The camera 308 may be controlled directly by the user through the input devices (i.e., display screen 302, buttons 304, microphone 306, orientation sensor 310, motion sensor 312) of the display unit 110. Alternatively, the camera 308 may be operated from the external device 102.
The orientation sensor 310 is, for example, a micro-electromechanical system (MEMS) gyroscope, and provides a number of useful functions to the headphone system 100. For example, the orientation sensor 310 allows the display unit 110 to detect its orientation so that it can adjust the orientation of the content displayed by the display screen 302. Another useful function is that the orientation sensor 310 acts as a user input device for controlling the headphone system 100; for example, a user can increase and decrease the audio volume of the headphone system 100 by tilting the head in a first direction and an opposite second direction, respectively. As another example, a user may change the content displayed on the display screen 302 by changing the orientation of their head. Yet another useful function is that the orientation sensor 310 acts as a user input device for controlling the external device 102. For example, the user may adjust the ringer volume of the external device 102 by changing the orientation of the headphone system 100.
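The tilt-to-volume behavior could be sketched as follows; the gain constant and clamping are assumed tuning choices, and the function name is hypothetical.

```python
def adjust_volume_by_tilt(current_volume, tilt_degrees,
                          step_per_degree=0.5, max_volume=100):
    """Adjust volume from a head-tilt angle reported by the MEMS gyro.

    Positive tilt (first direction) raises the volume; negative tilt
    (opposite direction) lowers it. The gain per degree is an assumed
    tuning constant, not taken from the patent.
    """
    new_volume = current_volume + tilt_degrees * step_per_degree
    # Clamp to the valid volume range.
    return max(0, min(max_volume, round(new_volume)))
```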
The motion sensor 312 is, for example, an accelerometer, and provides several useful functions for the headphone system 100. One function is that the motion sensor 312 acts as a user input device for controlling the headphone system 100. For example, the user may increase the volume of the headphone system 100 by quickly turning the head in a first direction and decrease the volume by quickly turning the head in a second, opposite direction. The magnitude of the volume change may be proportional to the acceleration of the user's head turn. The user may also alter and/or change the content displayed by the display screen 302 by moving the head. For example, a ball displayed by the display screen 302 may bounce off the peripheral edge of the display screen 302 when the user shakes the head. Another useful function is that the motion sensor 312 acts as a user input device for controlling the external device 102. For example, the user may answer an incoming call on the external device 102 by tapping the headphone system in some predetermined manner.
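A sketch of the acceleration-proportional volume change described above: the threshold that filters out incidental head motion and the gain are illustrative assumptions, as are the units and names.

```python
def volume_delta_from_turn(angular_acceleration, gain=0.02,
                           threshold=100.0):
    """Volume change for a quick head turn measured by the accelerometer.

    Below the threshold the motion is treated as incidental and ignored;
    above it, the change is proportional to the acceleration and signed
    by the turn direction (first direction positive, opposite negative).
    """
    if abs(angular_acceleration) < threshold:
        return 0
    return round(angular_acceleration * gain)
```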
The GPS module 314 is a conventional GPS module that enables the headphone system 100 to perform location-sensitive functions; for example, GPS information provided by sensors in the GPS module facilitates the output of location-sensitive image, video, and audio content. As another example, the interactive headphone system 100 may play audio or video/image information related to a significant event currently occurring at the current location. As yet another example, the interactive headphone system 100 may play/display information related to nearby discounts or promotions, traffic conditions, inclement weather, and the like, by interacting with authorized resources via the internet.
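A location-sensitive function such as surfacing a nearby promotion reduces to a distance test against the current GPS fix. The sketch below uses the standard haversine great-circle formula; the 1 km radius policy and the function name are assumptions.

```python
import math

def within_range(user, place, radius_km=1.0):
    """Return True when a point of interest is near the user's GPS fix.

    `user` and `place` are (latitude, longitude) pairs in degrees.
    Computes the haversine great-circle distance on a spherical Earth
    (mean radius 6371 km).
    """
    lat1, lon1, lat2, lon2 = map(math.radians, (*user, *place))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    distance_km = 2 * 6371.0 * math.asin(math.sqrt(a))
    return distance_km <= radius_km
```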
In this particular embodiment there are three alternative external device interfaces through which the headphone system 100 can communicate with an external device (i.e., the external device 102), namely a USB port 316, an auxiliary cable port 318, and a short-range wireless module 320. The USB port 316 is adapted to receive a data line (i.e., line 104) through which data may be preloaded into the display unit 110 or streamed in real time. For example, media files (e.g., MP3 audio files, video files, image files, etc.) may be preloaded onto the display unit 110 from a computer (i.e., external device 102) via the USB port 316. Alternatively, the USB port 316 may be a data interface (e.g., an HDMI interface) through which media files may be streamed to the display unit 110 in real time. The USB port 316 not only facilitates data exchange between the display unit 110 and an external device, but may also be used to supply power to the display unit 110. Power supplied to the display unit 110 may be used to recharge the battery 324 and/or to provide operating power directly to the display unit 110. The auxiliary cable port 318 is adapted to receive an auxiliary audio cable through which audio data flows from an external audio signal source (e.g., an MP3 player) to the display unit 110. The short-range wireless module 320 provides a wireless link through which data can be preloaded to the display unit 110 or streamed in real time. For example, media files may be preloaded to the display unit 110 from a computer (i.e., the external device 102) via the short-range wireless module 320, or, optionally, transmitted in real time from the computer to the display unit 110 via the short-range wireless module 320.
The control circuit 322 provides for overall coordination and control of the various functions of the display unit 110. Control circuitry 322 is electrically coupled to display 302, buttons 304, microphone 306, camera 308, orientation sensor 310, motion sensor 312, GPS module 314, USB port 316, auxiliary cable port 318, short-range wireless module 320, and battery 324.
Fig. 4 illustrates the orientation-correction function of the headphone system 100 by depicting the display content 118 when the user 400 is looking down and up. As shown, the orientation of the display content 118 remains correct regardless of the orientation of the display unit 110. As previously mentioned, the orientation correction is effected by the orientation sensor 310 (shown in fig. 3). This particular functionality is useful not only for changes in the orientation of the headphone system 100, but also for changes in the orientation of the display unit 110 relative to the rest of the headphone system 100. For example, if the display unit 110 is partially unscrewed (e.g., by 90 degrees) from the recess 202, the orientation sensor 310 will detect the orientation offset and the orientation of the content 118 will be corrected.
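The orientation correction amounts to rendering the content counter-rotated by the detected offset of the display unit, e.g. a 90 degree unscrew is cancelled by a 270 degree content rotation. A minimal sketch (function name assumed):

```python
def content_rotation(unit_rotation_degrees):
    """Counter-rotation (degrees) keeping displayed content upright.

    The display unit can be threaded in at, or drift to, any angle
    about its axis; rendering the content rotated by the opposite
    angle cancels the offset.
    """
    # Normalize into [0, 360) so repeated corrections stay bounded.
    return (-unit_rotation_degrees) % 360
```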
Fig. 5 is a block diagram of control circuitry 322 according to one embodiment of the invention. The control circuitry 322 includes a power connection 500, one or more processing units 502, non-volatile memory 504, a speaker interface 506, a camera interface 508, a motion sensor interface 510, an orientation sensor interface 512, a wireless module interface 514, a GPS module interface 516, a microphone interface 518, an auxiliary cable interface 520, a USB interface 522, a touch screen interface 524, a button interface 526, and a working memory 528, all interconnected via a system bus 530. Power connection 500 provides a means for electrically connecting control circuitry 322 to battery 324 or some other source of operating power. The processing units 502 execute code and process data stored in the working memory 528 in order to cause the headphone system 100 to perform various functions. The non-volatile memory 504 (e.g., read-only memory) provides storage for data and code (e.g., boot code and programs, digital audio files, image/video files, etc.) that remains when the headphone system 100 is powered down. The speaker interface 506 provides a connection between the display unit 110 and the speaker assembly 108. Camera interface 508 facilitates electrical connection of camera 308 to control circuitry 322. The motion sensor interface 510 facilitates electrical connection of the motion sensor 312 to the control circuitry 322. The orientation sensor interface 512 facilitates electrical connection of the orientation sensor 310 to the control circuitry 322. Wireless module interface 514 facilitates electrical connection of short-range wireless module 320 to control circuitry 322. The GPS module interface 516 facilitates electrical connection of the GPS module 314 to the control circuitry 322. Microphone interface 518 facilitates electrical connection of microphone 306 to control circuitry 322.
The auxiliary line interface 520 facilitates indirect electrical connection of an external device (e.g., external device 102) to the control circuitry 322 via the auxiliary cable port 318. The USB interface 522 facilitates indirect electrical connection of an external device (e.g., external device 102) to the control circuitry 322 via the USB port 316. The touch screen interface 524 facilitates electrical connection of the display screen 302 to the control circuitry 322. The button interface 526 facilitates electrical connection between the button 304 and the control circuit 322.
The working memory 528 (e.g., random access memory) provides temporary storage for data and executable code that is loaded into the working memory 528 during startup and execution. The working memory 528 includes an operating system algorithm module 532, a speaker algorithm module 534, a camera algorithm module 536, a motion sensor algorithm module 538, a direction sensor algorithm module 540, a wireless algorithm module 542, a GPS algorithm module 544, a microphone algorithm module 546, an auxiliary port algorithm module 548, a USB communication algorithm module 550, a touch screen algorithm module 552, a button algorithm module 554, an external device communication algorithm module 556, and an internet communication algorithm module 558.
The modules of the working memory 528 provide the following functions. The operating system algorithm module 532 provides coordination and control of the various operating programs and modules of the headphone system 100. The speaker algorithm module 534 facilitates the output of analog audio signals from the speaker interface 506 to the speakers of the speaker assembly 108. The camera algorithm module 536 facilitates operation of the camera 308 (e.g., shutter operation, image processing/storage, etc.). The motion sensor algorithm module 538 performs various operations based on the motion measurement signals captured by the motion sensor 312. For example, when the motion sensor 312 detects a preset acceleration, the motion sensor algorithm module 538 may output an instruction to decrease the audio volume of the headphone system 100. The orientation sensor algorithm module 540 performs various operations according to the orientation detected by the orientation sensor 310. For example, when the orientation sensor 310 detects a change in the orientation of the display unit 110, the orientation sensor algorithm module 540 may output instructions to correct the orientation of the content displayed on the display screen 302. The wireless algorithm module 542 facilitates communication between the headphone system 100 and wireless devices, such as the external device 102, a wireless modem, and the like. The GPS algorithm module 544 facilitates operation of, and use of data from, the GPS module 314. The microphone algorithm module 546 performs operations based on audio signals detected/captured by the microphone 306. For example, the microphone algorithm module 546 may include an algorithm to pause audio output from the headphone system 100 in response to the word "pause" being spoken into the microphone 306.
The auxiliary port algorithm module 548 facilitates communication of the headphone system 100 with the external device 102 via a cable plugged into the auxiliary cable port 318. The USB communication algorithm module 550 facilitates communication of the headphone system 100 with the external device 102 via a cable (i.e., line 104) plugged into the USB port 316. The touch screen algorithm module 552 facilitates operation of the display screen 302. The button algorithm module 554 is operable to perform functions based on user command inputs from the buttons 304. The external device communication algorithm module 556 facilitates communication between the headphone system 100 and an external device, such as the external device 102. The internet communication algorithm module 558 facilitates connecting the headphone system 100 to the internet.
Fig. 6 is a representative image of an alternative system 600 that includes the display unit 110 and an alternative component 602 adapted to receive the display unit 110. Example embodiments of the alternative component 602 include, but are not limited to, belt buckles, handbags, waist-worn items, hats/headbands, coats, other clothing, and other non-headphone systems. Similar to the headphone system 100, the alternative component 602 includes a set of display screens 610, a set of speakers 612, and a set of auxiliary devices 614.
Fig. 7 shows a side view of a headphone system 700 according to another embodiment of the present invention. The headphone system 700 is similar to the headphone system 100, except that the headphone system 700 includes a removable transparent display 702 that is configured to be coupled to a display unit 704 near the left ear of the user and an opposing display unit (not shown) near the right ear of the user. The display unit 704 is similar to the display unit 110, but is also adapted to receive and operate the transparent display 702 via a physical connection interface 706 (e.g., an electromechanical port). The opposing display unit is also coupled to the frame 106 and includes a connection interface 706, but is otherwise similar to the display unit 112 (fig. 1). The opposing display unit may also include some or all of the components of the display unit 704 as desired. The transparent display 702 includes a plurality of connection interfaces 708, each of which includes an electromechanical plug adapted to connect to the interface 706 of the associated display unit 704 or of the opposing display unit. Note that features of the headphone system 700 and the display unit 704 that are the same as those of the headphone system 100 and the display unit 110, respectively, are denoted by the same reference numerals, and their description is omitted to avoid redundancy.
Fig. 8 shows a side view of the headset system 700, including the transparent display screen 702, worn by the user 400. Here, the interfaces 708 of the transparent display 702 are plugged into the complementary interfaces 706 of the display unit 704 and the opposing display unit, so that the transparent display 702 is positioned directly in the optical path 800 of the user 400. In this embodiment, the transparent display 702 is sized to be viewable by both eyes of the user 400 at the same time. In other embodiments, the transparent display 702 may be smaller.
The transparent display 702 includes a nose pad 802, a frame 804, and a transparent display screen 806. The nose pad 802 is affixed to the bottom of the transparent display 702 and is adapted to engage the user's nose to support the transparent display 702 and facilitate positioning it relative to the user's eyes. The frame 804 facilitates support and positioning of the transparent display 702 relative to the display unit 704. The frame 804 also includes internal circuitry (not shown) that carries power and display drive signals (e.g., data, control signals, etc.) between the display screen 806 and the control circuitry of the display unit 704 and/or the opposing display unit.
Although each side arm of the frame 804 includes an interface 708 (see fig. 7), in other embodiments the transparent display 702 may have only one interface 708, for example on the side of the frame 804 that connects to the display unit 704. In this case, the other side of the frame 804 may be eliminated, or may have a connector adapted only to engage the associated interface 706 for physical support. Indeed, many interface and transparent display designs are possible, and such modifications are within the scope of the present invention.
The display unit 704 and the opposing display unit may also communicate with each other (e.g., via the bus 128, short-range wireless, etc.) and cooperate to operate the transparent display 702. For example, the display unit 704 and/or the opposing display unit may provide one or more of power, data, and/or control signals to the respective interfaces 708 of the transparent display 702. In other embodiments, the display unit 704 may control the operation of the transparent display 702 alone. In still other embodiments, the display unit 704 and the opposing display unit may operate in a master-slave configuration, in which one display unit has primary control of the transparent display screen 702 while the other has secondary control, for example adding processing power at the request of the master unit or the user, taking over on a failover basis, or the like.
Note that when the transparent display 702 is in the optical path of the user 400 as in fig. 8, rotating the display unit 704 and the opposing display unit relative to the frame 106 will also rotate the transparent display 702 relative to the user's head. As described above, each set of electrical contacts 208 provides a resilient connection whereby the display units may be rotated while still maintaining electrical contact with the contacts 208.
The display screen 806 is, for example, a transparent LCD or LED display screen through which the user 400 can view the surroundings while useful images are superimposed thereon.
The transparent display 806 provides some valuable features. For example, the transparent display 806 may operate to provide augmented reality by presenting images on the transparent display 806 that are superimposed on the real-time scene viewable by the user 400 through the transparent display 702. As another example, the display 806 may be operated to correct vision defects of the user, such as myopia, hyperopia, and/or astigmatism. As yet another example, the display 806 may operate in a uniform, translucent mode to simulate sunglasses. More specifically, the user may control the color, pattern, and tint of the display 806 (e.g., via on-screen controls on the display screen 302 of the display unit 704, voice commands, body gestures, etc.). The light transmittance of the screen 806 may also be adjusted automatically if input from a light sensor (e.g., camera 308) is available. In addition, the sunglass mode may operate in the background in combination with other modes.
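The automatic sunglass mode could map an ambient-light reading (e.g. from the camera used as a light sensor) to a screen transmittance. The lux breakpoints and tint floor below are invented tuning values, not from the patent.

```python
def auto_transmittance(ambient_lux, dark_lux=100.0, bright_lux=10000.0,
                       min_t=0.2, max_t=1.0):
    """Pick a transmittance for the sunglass mode from ambient light.

    Linearly interpolates between full transparency in dim light and a
    darkened tint in bright sun; readings outside the breakpoints are
    clamped to the nearest endpoint.
    """
    if ambient_lux <= dark_lux:
        return max_t
    if ambient_lux >= bright_lux:
        return min_t
    frac = (ambient_lux - dark_lux) / (bright_lux - dark_lux)
    return max_t - frac * (max_t - min_t)
```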
The display 806 may also operate in accordance with information provided by the sensors of the system 700. For example, the display screen 806 may display, in real time, images/video captured by the camera 308 from behind the user and/or over a 360 degree field of view, which may be presented as a virtual reality view. Of course, the headphone system 700 and/or the display unit may have multiple cameras pointing in multiple directions to facilitate these and other functions. As yet another example, the system 700 can function as a translator where the relevant software is accessible. For example, speech recognition software may be used to receive speech in one language via the microphone 306, translate it into another language, and output the translated version through the speaker assemblies 108 with minimal delay. Similarly, the display 806 may be operated to display text translated into another language, for example from the original language detected in pictures taken by the camera 308.
With the audio and visual detection capabilities of the microphone 306 and camera 308, as well as feedback from other available sensors (e.g., GPS sensors), the system 700 may also use artificial intelligence software (e.g., built-in applications, downloaded and installed applications, services available in the internet cloud, etc.) as an autonomous travel guide. In particular, the system 700 may be operated to assist cyclists and drivers in navigating roads, avoiding possible hazards and anticipating traffic congestion ahead. Such information is output to the user through the transparent display 702 and/or the speaker 108. Indeed, with all on-board sensors (on-board sensors) and output devices, system 700 may implement various Virtual Reality (VR) and/or Augmented Reality (AR) functionality that may be helpful to user 400, and thus may be used as VR or AR goggles.
As another example, the headset system 700 may be particularly useful in electronic commerce. More specifically, in embodiments where the headset system 700 can communicate over the internet, the headset system 700 may be configured to perform payments for online purchases. The transparent display 702 may display purchase and/or payment information for an item, and the headset system 700 may be configured to facilitate purchase approval and payment, for example by accepting voice commands from the user 400 via the microphone 306 and/or recognizing movements of the head (nodding, shaking, etc.) captured by one or more other sensors (e.g., accelerometer, camera, etc.).
More generally, the headset system 700 enables the user 400 to perform various actions using voice commands and head movements. For example, voice recognition software may convert the user's voice commands into actions (e.g., making a purchase, taking a picture, etc.). Similarly, the user may use body gestures detectable by the sensors of the headset system 700, such as moving the head sideways as a decline action or up and down as a confirmation action.
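One plausible way to classify nod-versus-shake gestures from orientation-change samples is to count direction reversals per axis; the thresholds and data representation below are assumptions, since the patent only states that up/down confirms and sideways declines.

```python
def classify_gesture(pitch_deltas, yaw_deltas, min_swings=2,
                     amplitude=10.0):
    """Classify a head movement as 'confirm' (nod) or 'decline' (shake).

    Each input is a sequence of per-sample angle changes in degrees.
    Counts direction reversals above an amplitude threshold on each
    axis; whichever axis oscillates more wins. Returns None when
    neither axis shows a deliberate gesture.
    """
    def swings(deltas):
        big = [d for d in deltas if abs(d) >= amplitude]
        # A reversal is a sign change between consecutive large deltas.
        return sum(1 for a, b in zip(big, big[1:]) if a * b < 0)

    nod, shake = swings(pitch_deltas), swings(yaw_deltas)
    if max(nod, shake) < min_swings:
        return None
    return "confirm" if nod >= shake else "decline"
```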
In practice, the functions of the headset system 700 may become complicated. As described above, some of these complex programs (e.g., artificial intelligence, virtual reality, etc.) will at times require more processing power than the processor of the headset system 700 can provide. In this case, the headset system 700 may be configured to send the request and locally collected parameters to a service provider on the internet or, in the case of a local area network, to a local server. The external server may then process the request and return the results to the headset system 700. Thereafter, the headset system 700 may operate according to the received results, for example by displaying a returned image to the user 400 via the transparent display 702. This offloading process thus enables the headset system 700 to perform very complex functions while keeping processing delays to a minimum.
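The offloading decision described above can be sketched as routing by estimated cost: light tasks run on the on-board processor, heavy ones ship their parameters to a server and use the returned result. All names and the cost model here are illustrative, not from the patent.

```python
def run_task(task, local_capacity, execute_locally, offload_to_server):
    """Route a task locally or to a remote server by estimated cost.

    `task` is a dict with a name, an estimated cost, and the locally
    collected parameters. Tasks the on-board processor can handle run
    locally; heavier ones (e.g. AI inference, VR rendering) are sent
    with their parameters to a server, whose result is returned for
    display or further use.
    """
    if task["estimated_cost"] <= local_capacity:
        return execute_locally(task["params"])
    # Ship only the request and collected parameters; the server does
    # the heavy computation and returns the result.
    return offload_to_server(task["name"], task["params"])
```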
Fig. 9 shows a front view of a headphone system 900 according to an alternative embodiment of the invention. The headphone system 900 is similar to the headphone system 700, except that the headphone system 900 includes a transparent display screen 902 coupled to a display unit 904 and an opposing display unit 906 via respective rotatable ring assemblies 908. The rotatable ring assemblies 908 allow the transparent display screen 902 to rotate about the axis 900. More specifically, they allow the transparent display screen 902 to be rotated from a forward position in the optical path of the user 400 to an upright position 90 degrees from the forward position, and on to a rearward position 180 degrees from the forward position. Thus, the transparent display 902 may be rotated at least 180 degrees about the axis 900. When in the rearward position, the transparent display screen 902 may serve as a billboard for people walking behind the user 400.
In this embodiment, the transparent display screen 902 is secured to the rotatable ring assemblies 908. However, the transparent display 902 may be made removable by incorporating interfaces similar to the port 706 (or other connectors) into the rotatable ring assemblies 908 of the display units 904 and 906.
Fig. 9 also shows that the system 900 includes a set of brain wave sensors 910 mounted on the frame 106 so as to engage the head of the user 400. The sensors 910 are coupled to the control circuitry of one or more of the display units 904 and 906 via the bus 128 (see figs. 1 and 3). This allows the system 900 to receive user input in the form of brain activity: in response to specific brain activity acquired by the sensors 910, the headset system 900 may perform various tasks. In other words, a user may instruct the system 900 to perform a function simply by thinking. Although the sensors 910 are intended to contact the head of the user, non-contact brain wave sensors may also be used to monitor brain activity. In addition to the brain wave sensors 910, the headset system 900 may be configured to include or interface with (e.g., wirelessly, etc.) other biometric and/or behavioral sensors (e.g., heart rate sensors, blood pressure sensors, pedometers, etc.) to provide biometric data and analysis to the user 400 via the transparent display 902. Brain wave sensors and biometric sensors may also be incorporated into the other headset systems described herein. Indeed, the various sensors described herein may be used to collect various behavioral parameters that may be analyzed and used to provide targeted advertisements to the user 400 via the display screen 702, the speaker assemblies 108, and the like.
Fig. 10 shows a global social network of users 400₁-400ₙ, wherein the users 400₁-400ₙ communicate with each other via the internet 1000 or some other wide area network using respective headset systems 700. In one embodiment, the headset systems 700 may store applications that facilitate connecting and communicating directly with each other over the internet 1000.
In other embodiments, a central server 1002 hosts the global social network of the users 400₁-400ₙ via the internet 1000. Thus, the central server 1002 serves to perform various content functions, including, but not limited to, receiving content from the headset systems 700 of the various users 400₁-400ₙ, processing content, creating or augmenting content, delivering content to the headset systems 700 of the users 400₁-400ₙ, and so on. The central server 1002 may also push applications to the headset systems 700 so that the headset systems 700 can interface with the central server 1002 or other headset systems 700 and communicate content according to the applications. Alternatively, an application for interfacing with the central server 1002 may be preloaded into the control circuitry of the headset system 700.
The global social network of fig. 10 may provide a wide variety of functions and advantages. For example, once a group is formed, the users 400₁-400ₙ may act collaboratively or interactively. Images and/or videos captured by one or more users' cameras may be viewed by the other users 400. Similarly, the users 400₁-400ₙ may hear, through their own speaker assemblies 108, content captured by other users' microphones 306. Users 400 can also see each other and interact via an augmented reality or virtual reality display. Additionally, the users 400 may share and display the images they want to advertise via the display screens on the sides of the display units and/or the display screen 114 on the headband of the headset systems 700.
One particular social networking application is a virtual global concert, in which some or all of the attendees are located in different parts of the world and wear headset systems 700. Each user 400 can then listen to the same music from a program host (DJ), view centrally provided images/videos, and view the virtual dance floor or each dancer/performer via an augmented display on their transparent display 702. The global social network of systems 700 may also be used to enable interactive virtual meetings in which all participants are located in remote locations (e.g., distributed around the world) but can still collaborate.
Another advantage of connecting the headset system 700 to the internet 1000 is that complex computing operations can be performed remotely from the system 700, such as by the central server 1002 or another internet server (not shown), so as not to overload the local memory and processing power of the system 700. In other words, each system 700 may send local processing tasks, data, and requirements to a remote computer on the internet 1000 for analysis and processing, with the desired results then returned to the system 700 for further use (e.g., display on the transparent display 702).
Fig. 11 is similar to fig. 10, but shows the users 400₁-400ₙ communicating with each other over a local area network (LAN) 1100. In addition, the LAN 1100 used by the users 400₁-400ₙ wearing headset systems 700 is connected to the internet 1000 to facilitate data communication between the headset systems 700 and entities on the internet, such as the central server 1002 (e.g., for processing assistance, etc.). A local server 1104 may also be coupled to the LAN 1100 to host local applications (e.g., collaboration programs, etc.) for the users 400₁-400ₙ.
Fig. 12 shows a perspective view of an adapter 1200 disposed between the speaker assembly 108 and the display unit 110. The adapter 1200 is operable to carry adapter-enabled devices such as a transparent display screen, a conventional display screen, speakers, a camera, a microphone, brain wave sensors, game controllers, batteries, and the like. The adapter 1200 includes a male end 1202, a female end 1204, a custom port 1206, and a standard port 1208. The male end 1202 is substantially identical to the male end of the display unit 110 and is therefore adapted to electrically and mechanically mate with the recess 202. That is, the male end 1202 includes threads 1210 and contacts 1212 (shown in fig. 13) that mate with the threads 204 and contacts 208, respectively, of the recess 202. Thus, the contacts 1212 are electrically coupled to the bus 128. The female end 1204 is substantially identical to the recess 202 and is therefore adapted to electrically and mechanically mate with the threads 206 and contacts 210 (fig. 2) of the display unit 110. More specifically, the female end 1204 includes threads 1214 and contacts 1216 that are adapted to mate with the threads 206 and contacts 210, respectively, of the display unit 110. Accordingly, the contacts 1216 are adapted to electrically couple the contacts 210 of the display unit 110 to the bus 128.
The custom port 1206 facilitates electrically coupling adapter-enabled devices, such as the transparent display screen 702, to the bus 128. In this example, the custom port 1206 is substantially complementary to the interface 708 of the transparent display screen 702. Thus, prior headphone systems, such as the headphone system 100, may be retrofitted with the transparent display screen 702 using the adapter 1200. The standard port 1208 further facilitates electrically coupling standard-connector devices (e.g., a camera, a display screen, a speaker, etc.) to the bus 128. In an example embodiment, the standard port 1208 is a Universal Serial Bus (USB) port.
It should also be noted that multiple adapters 1200 may be nested to facilitate coupling multiple adapter-enabled devices to the headphone system. For example, the male end 1202 of a first adapter 1200(1) may be mounted in the recess 202 to receive the transparent display screen 702. The male end 1202 of a second adapter 1200(2) may then be installed in the female end 1204 of the first adapter 1200(1), such that the second adapter 1200(2) may host, for example, a 360-degree camera system. Thereafter, the display unit 110 may be threaded into the female end 1204 of the second adapter 1200(2). Indeed, nesting more than two adapters 1200 may add still more functionality to the headphone system.
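The nested-adapter chain above behaves like a linked list: each adapter passes the bus through to whatever is mounted in its female end, so the hosted devices can be enumerated by walking the chain. The following is a minimal structural sketch; the class and method names are illustrative and do not appear in the disclosure.

```python
# Illustrative model of nested adapters 1200(1), 1200(2), ... terminating
# in a display unit. Names invented for the sketch.

class Device:
    def __init__(self, name):
        self.name = name

class Adapter:
    def __init__(self, hosted_device=None):
        self.hosted_device = hosted_device  # device on the custom/standard port
        self.next = None                    # whatever mates with the female end

    def mount(self, item):
        """Mount another adapter, or a terminal device, in the female end."""
        self.next = item
        return item

def enumerate_devices(adapter):
    """Walk the chain outward from the speaker-assembly recess."""
    found = []
    node = adapter
    while isinstance(node, Adapter):
        if node.hosted_device:
            found.append(node.hosted_device.name)
        node = node.next
    if isinstance(node, Device):  # terminal device in the last female end
        found.append(node.name)
    return found

# Mirroring the text: adapter 1200(1) hosts the transparent display screen,
# adapter 1200(2) hosts a 360-degree camera, display unit 110 caps the chain.
a1 = Adapter(Device("transparent display 702"))
a2 = a1.mount(Adapter(Device("360-degree camera")))
a2.mount(Device("display unit 110"))
print(enumerate_devices(a1))
# → ['transparent display 702', '360-degree camera', 'display unit 110']
```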
It should also be noted that an adapter 1200 may be selectively mounted on the speaker assembly 108 on either side of the headphone system. For example, where the transparent display screen 702 includes two interfaces 708, each speaker assembly 108 of the headphone system may be provided with an adapter 1200. As another example, multiple adapters 1200 may be distributed on different sides of the headphone system to improve symmetry, maintain weight balance, and the like.
Fig. 13 is a block diagram showing the circuitry of the adapter 1200. As shown, each of the contacts 1212 of the male end 1202 is electrically coupled to a respective one of the contacts 1216 of the female end 1204, such that the contacts 210 of the display unit 704 may be indirectly coupled to the bus 128 through the adapter 1200. In addition, each of the contacts 1212 and 1216 is electrically coupled to the control circuitry 1300 of the adapter 1200. The control circuitry 1300 is further electrically coupled to the custom port 1206 and the standard port 1208, and facilitates coordination and control of the various functions and associated control signals of the adapter 1200.
Fig. 14 is a block diagram of the control circuit 1400 of the display unit 704. The control circuit 1400 is similar to the control circuit 322, but is shown with a bus interface 1402 and a transparent display interface 1404 coupled to the system bus 530. A bus interface 1402 facilitates electrical connection and communication between the control circuit 1400 and the bus 128. Transparent display interface 1404 facilitates electrically connecting a transparent display screen (such as transparent display screen 702) to control circuit 1400 via port 706.
The working memory 528 of FIG. 14 is shown as further comprising a transparent display algorithm 1406, an adapter-coupled device (ACD) application 1408, and an adapter communication protocol 1410. The transparent display algorithm 1406 facilitates control and operation of the transparent screen 806 of the transparent display screen 702 to perform the functions described herein. For example, the algorithm 1406 may facilitate displaying a desired image on the screen 806, darkening the screen 806 like sunglasses, and the like. The ACD application 1408 represents a program related to the adapter 1200 and the devices coupled thereto, and facilitates operation of the adapter and of those devices. Accordingly, the ACD application 1408 may be loaded into the working memory 528 from the adapter 1200. The adapter communication protocol 1410 facilitates communication between the control circuitry 1400 of the display unit 704 and the control circuitry 1300 of the adapter 1200.
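One function that algorithm 1406 might perform, consistent with the camera-driven tint adjustment of claims 19-20, can be sketched as a mapping from ambient brightness to a screen tint level. This is a hedged illustration: the lux breakpoints, tint range, and linear interpolation are all invented for the example and are not specified by the disclosure.

```python
# Hypothetical tint controller for transparent screen 806: clear in dim
# light, sunglasses-like in direct sun, interpolated in between.

def tint_for_ambient(lux, min_tint=0.0, max_tint=0.8):
    """Map ambient light (lux) to a tint level in [min_tint, max_tint]."""
    DARK, BRIGHT = 100.0, 10_000.0  # assumed indoor / direct-sun breakpoints
    if lux <= DARK:
        return min_tint             # fully clear in dim light
    if lux >= BRIGHT:
        return max_tint             # maximally darkened in direct sun
    frac = (lux - DARK) / (BRIGHT - DARK)
    return min_tint + frac * (max_tint - min_tint)

print(tint_for_ambient(50))      # dim room → 0.0
print(tint_for_ambient(20000))   # direct sun → 0.8
```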
FIG. 15 shows a perspective view of a display unit 704 according to one embodiment of the invention. The display unit 704 is similar to the display unit 110, but further includes an energy harvesting device 1500 electrically coupled to the battery 324 to extend the life of the battery 324. The energy harvesting device 1500 is, for example, a kinetic energy generating device that converts motion of the display unit 704 (e.g., caused by the user 400) into electrical energy stored in the battery 324. Alternatively, the energy harvesting device 1500 may be a solar panel located on the outside of the unit 704 to convert incident light into electrical energy stored in the battery 324. Fig. 15 also shows a port for the interface 706 formed in the display unit 704.
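The contribution of harvesting device 1500 to battery 324 can be modeled as charge accumulated from motion and incident light, clamped at battery capacity. The conversion rates below are invented numbers for illustration only; the disclosure does not quantify the harvesting.

```python
# Illustrative charging model for battery 324 fed by kinetic and solar
# harvesting. Rates are assumptions, not disclosed values.

def charge_battery(level_mah, capacity_mah, motion_events=0, lux_hours=0.0,
                   mah_per_motion=0.05, mah_per_lux_hour=0.001):
    """Return the new battery level after kinetic and solar harvesting."""
    harvested = motion_events * mah_per_motion + lux_hours * mah_per_lux_hour
    return min(capacity_mah, level_mah + harvested)  # never exceed capacity

print(charge_battery(100.0, 500.0, motion_events=2000, lux_hours=10000))
# → 210.0  (100 mAh + 100 mAh kinetic + 10 mAh solar)
```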
For example, alternative display types (e.g., an LED screen, an LCD, etc.) may replace the display screen 302. As another example, alternative data/power cable types (e.g., an HDMI interface, micro USB, mini USB, etc.) may replace the cord 104. As yet another example, the interactive display unit (e.g., display unit 206) may operate independently outside of the headphone environment. For example, the display unit 206 may be removed from the speaker assembly 108 and mounted to compatible components of other types of devices including, but not limited to, belt buckles, carrying bags, waist wear, brimmed hats, coats, or other clothing, etc.

Claims (22)

1. An earphone system, comprising:
a frame having a first region, a second region, and a headband extending between the first region and the second region, the first region and the second region configured to be positioned adjacent a first ear and a second ear, respectively, of a user;
a first speaker assembly coupled to the first region of the frame, the frame configured to position the first speaker assembly near a first ear of a user;
a transparent display coupled to the frame and configured to be positioned in an optical path of the user when the first and second regions of the frame are located near first and second ears of the user;
a user interface operable to receive input from the user;
a memory for storing data and code; and
a controller coupled to the frame, the controller responsive to the input from the user and operable to run the code and display an image on the transparent display screen for viewing based at least in part on the input from the user.
2. The headphone system of claim 1, wherein:
the transparent display screen is rotatably coupled to the frame;
the transparent display screen is configured to rotate between at least a first position and a second position;
the transparent display screen is disposed in the optical path of the user when the transparent display screen is in the first position; and
the transparent display screen is disposed above the headband when the transparent display screen is in the second position.
3. The headphone system of claim 2, wherein:
the transparent display screen is further configured to rotate from the second position to a third position; and
the transparent display screen is configured to be disposed around a rear area of the user's head when the transparent display screen is in the third position.
4. The headphone system of claim 1, further comprising a physical connection interface via which the transparent display screen is detachable from the headphone system.
5. The headphone system of claim 1, wherein the user interface comprises at least one sensor.
6. The headphone system of claim 5, wherein the sensor comprises a motion sensor.
7. The headphone system of claim 5, wherein the sensor comprises a sound sensor.
8. The headphone system of claim 5, wherein the sensors comprise brain wave sensors.
9. The headphone system of claim 5, wherein the sensor comprises an orientation sensor.
10. The headphone system of claim 5, wherein the sensor comprises a Global Positioning System (GPS) sensor.
11. The headphone system of claim 1, further comprising:
a second speaker assembly coupled to the second region of the frame; and
wherein the frame is configured to position the second speaker assembly near a second ear of the user.
12. The headphone system of claim 1, wherein the transparent display screen comprises a nose pad.
13. The headphone system of claim 1, further comprising a network interface, the controller operable to display an image on the transparent display screen based at least in part on data received via the network interface.
14. The headphone system of claim 13, wherein the network interface comprises a Wide Area Network (WAN) interface.
15. The headphone system of claim 13, wherein the network interface comprises a local area network (LAN) interface.
16. The headphone system of claim 1, further comprising:
a battery; and
an energy harvesting device adapted to charge the battery.
17. The headphone system of claim 16, wherein the energy harvesting device comprises a solar panel.
18. The headphone system of claim 16, wherein the energy harvesting device comprises a kinetic energy harvesting mechanism operable to generate an electrical charge in response to movement of the headphone system.
19. The headphone system of claim 1, wherein the optical characteristics of the transparent display screen are adjustable.
20. The headphone system of claim 19, further comprising:
a camera connected to the frame; and
wherein the controller is configured to adjust the optical characteristic of the transparent display screen in response to ambient light conditions detected by the camera.
21. The headphone system of claim 1, further comprising:
an adapter having a first portion configured to selectively couple with the frame and a second portion configured to selectively couple with an adapter-enabled device; and
wherein the transparent display screen is an adapter-enabled device.
22. The headphone system of claim 1, wherein the transparent display screen is sized to be viewable by both eyes of the user simultaneously.
CN201880070808.3A 2017-09-05 2018-09-04 Earphone with interactive display screen Active CN111512639B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US15/695,600 2017-09-05
US15/695,600 US10433044B2 (en) 2012-08-02 2017-09-05 Headphones with interactive display
PCT/US2018/049346 WO2019050836A1 (en) 2017-09-05 2018-09-04 Headphones with interactive display

Publications (2)

Publication Number Publication Date
CN111512639A true CN111512639A (en) 2020-08-07
CN111512639B CN111512639B (en) 2022-06-10

Family

ID=65634370

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880070808.3A Active CN111512639B (en) 2017-09-05 2018-09-04 Earphone with interactive display screen

Country Status (4)

Country Link
CN (1) CN111512639B (en)
AU (1) AU2018329647B2 (en)
GB (1) GB2579530B (en)
WO (1) WO2019050836A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111836088A (en) * 2020-07-22 2020-10-27 业成科技(成都)有限公司 Correction system and correction method
CN112820286A (en) * 2020-12-29 2021-05-18 北京搜狗科技发展有限公司 Interaction method and earphone equipment
KR102616113B1 (en) * 2021-06-15 2023-12-21 주식회사 프리텍 Earphone independent type VR headset apparatus

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5815126A (en) * 1993-10-22 1998-09-29 Kopin Corporation Monocular portable communication and display system
US20090214060A1 (en) * 2008-02-13 2009-08-27 Neurosky, Inc. Audio headset with bio-signal sensors
US20090323975A1 (en) * 2008-06-26 2009-12-31 Microsoft Corporation Headphones with embeddable accessories including a personal media player
US20130237146A1 (en) * 2012-03-08 2013-09-12 Lee Serota Headset with Adjustable Display and Integrated Computing System
CN103535051A (en) * 2012-08-02 2014-01-22 庞博文 Earphone with interdynamic display screen
CN105026984A (en) * 2013-05-07 2015-11-04 北京易申科技有限公司 Head-mounted display
US20160085305A1 (en) * 2014-09-18 2016-03-24 Mary A. Spio Audio computer system for interacting within a virtual reality environment
US20160349509A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Mixed-reality headset
CN106358104A (en) * 2015-07-15 2017-01-25 视讯联合科技股份有限公司 Headset with concealed video screen and various sensors

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2059597A1 (en) * 1991-01-22 1992-07-23 Paul A. Vogt Radio eyewear
US7911410B2 (en) * 2007-03-06 2011-03-22 Sony Ericsson Mobile Communications Ab Peripheral with a display
US8655004B2 (en) * 2007-10-16 2014-02-18 Apple Inc. Sports monitoring system for headphones, earbuds and/or headsets
US20100309097A1 (en) * 2009-06-04 2010-12-09 Roni Raviv Head mounted 3d display
JP5499633B2 (en) * 2009-10-28 2014-05-21 ソニー株式会社 REPRODUCTION DEVICE, HEADPHONE, AND REPRODUCTION METHOD
US9438988B2 (en) * 2014-06-05 2016-09-06 Todd Campbell Adaptable bone conducting headsets
US10579135B2 (en) * 2017-01-27 2020-03-03 Otoy, Inc. Headphone based modular VR/AR platform with rotating display

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112886483A (en) * 2021-01-15 2021-06-01 武汉森林马科技有限公司 Hang formula of AR interaction usefulness and prevent line winding device
CN113596659A (en) * 2021-07-27 2021-11-02 惠州市欧凡实业有限公司 Wireless motion earphone and step counting method based on wireless motion earphone detection
CN116820254A (en) * 2023-07-03 2023-09-29 深圳市高昂电子有限公司 Keyboard control device applied to earphone with interactive display screen
CN116820254B (en) * 2023-07-03 2024-03-08 深圳市高昂电子有限公司 Keyboard control device applied to earphone with interactive display screen

Also Published As

Publication number Publication date
WO2019050836A1 (en) 2019-03-14
AU2018329647A1 (en) 2020-04-02
CN111512639B (en) 2022-06-10
GB202003990D0 (en) 2020-05-06
AU2018329647B2 (en) 2020-04-30
GB2579530A (en) 2020-06-24
GB2579530B (en) 2021-02-17

Similar Documents

Publication Publication Date Title
US11044544B2 (en) Headphones with interactive display
CN111512639B (en) Earphone with interactive display screen
US10816807B2 (en) Interactive augmented or virtual reality devices
US10559110B2 (en) Virtual reality
JP2023113596A (en) Improved optical and sensory digital eyewear
CN104520787B (en) Wearing-on-head type computer is as the secondary monitor inputted with automatic speech recognition and head-tracking
JP6481057B1 (en) Character control method in virtual space
US20160381451A1 (en) Headphones with interactive display
US20180124497A1 (en) Augmented Reality Sharing for Wearable Devices
WO2019153650A1 (en) Display device
CN109224432A (en) Control method, device, storage medium and the wearable device of entertainment applications
WO2018075523A9 (en) Audio/video wearable computer system with integrated projector
US20240179448A1 (en) System and method for interactive headphones
CN109240498B (en) Interaction method and device, wearable device and storage medium
WO2022149497A1 (en) Information processing device, information processing method, and computer program
WO2022149496A1 (en) Entertainment system and robot
US20240036323A1 (en) Electronic device and method for adjusting volume using acoustic signal output from external object
KR20240097658A (en) Wearable device for displaying multimedia content provided by external electronic device and method thereof
KR20240084665A (en) Wearable device for displaying user interface associated with controlling of external electronic device and method thereof
KR20240097659A (en) Wearable device for displaying visual object and method thereof
KR20240036433A (en) Method and apparatus for determining persona of an avatar object disoposed in a virtual space
KR20240067751A (en) Wearable device for recording audio signal and method thereof
KR20240097656A (en) Wearable device for switching screen based on biometric data obtained from external electronic device and method thereof
JP2019133677A (en) Method of controlling character in virtual space
WO2023183340A1 (en) Devices, methods, and graphical user interfaces for three-dimensional user experience sessions in an extended reality environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant