WO2018075523A9 - Audio/video wearable computer system with integrated projector - Google Patents

Audio/video wearable computer system with integrated projector

Info

Publication number
WO2018075523A9
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
headphones
head mounted
audio
data
Prior art date
Application number
PCT/US2017/056986
Other languages
English (en)
Other versions
WO2018075523A1 (fr)
Inventor
Jason HARDI
Original Assignee
Muzik, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Muzik, Llc filed Critical Muzik, Llc
Priority to CN201780077088.9A (published as CN110178159A)
Priority to EP17863227.9A (published as EP3526775A4)
Publication of WO2018075523A1
Publication of WO2018075523A9


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/02 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/10 Intensity circuits
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/237 Communication with additional data server
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00 Aspects of display data processing
    • G09G 2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G09G 2340/125 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels wherein one of the images is motion video
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2354/00 Aspects of interface with display user

Definitions

  • Provisional Patent Application Serial No. 61/660,662, filed June 15, 2012, and to U.S. Patent Application Serial No. 14/751,952, filed June 26, 2015, in the USPTO, which is a continuation of U.S. Patent Application Serial No. 13/918,451, filed June 14, 2013, which claims benefit of U.S. Provisional Patent Application Serial No. 61/660,662, filed June 15, 2012, claims benefit of U.S. Provisional Patent Application Serial No. 61/749,710, filed January 7, 2013, and claims benefit of U.S. Non-Provisional Patent Application Serial No. 61/762,605, filed February 8, 2013, the contents of all of which are hereby incorporated herein by reference.
  • audio content that is stored on the mobile device is wirelessly streamed to the headphones for listening.
  • headphones can wirelessly transmit commands to the mobile device for controlled streaming.
  • the audio headphones may transmit commands such as pause, play, skip, etc. to the mobile device which may be utilized by an application executed on the mobile device.
  • such audio headphones support wirelessly receiving audio content for playback to the user as well as wireless transmission of commands to the mobile device for control of the audio playback to the user on the headphones.
  • FIG. 1 is a block diagram illustrating an environment for operation of headphones in some embodiments according to the inventive concept.
  • FIG. 2 is a flowchart illustrating a method for presenting views of a user environment associated with the headphones in some embodiments according to the inventive concept.
  • FIG. 3 is a block diagram of a processing system included in the headphones in some embodiments according to the inventive concept.
  • FIGs. 4 and 5 are flowcharts illustrating methods of establishing live streaming of audio and/or video from the headphones to an endpoint in some embodiments according to the inventive concept.
  • FIG. 6 is a schematic representation of a composite view including video streamed from an electronic device, such as a mobile phone, combined with video and/or audio streamed from the headphones on the electronic device in some embodiments according to the inventive concept.
  • FIG. 7 is a flowchart illustrating methods of providing the composite view including video streamed from an electronic device, such as a mobile phone, combined with video and/or audio streamed from the headphones on the electronic device in some embodiments according to the inventive concept.
  • FIG. 8 is an illustration of a camera on an earcup of the headphones in some embodiments according to the inventive concept.
  • FIG. 9 is an illustration of a rotatable camera apparatus in some embodiments according to the inventive concept.
  • FIG. 10 is a block diagram of the processing system included in the headphones in some embodiments according to the inventive concept.
  • FIG. 11 is an illustration of a touch sensitive control surface of the headphones in some embodiments according to the inventive concept.
  • FIG. 12 illustrates an environment for streaming audio and/or video from the headphones to an endpoint in some embodiments according to the inventive concept.
  • FIG. 12 is a flow diagram that illustrates a configuration for live streaming video/audio through a local mobile device to a server that is remote from the headphones in some embodiments according to the invention.
  • FIG. 13 is a flow diagram that illustrates a configuration for streaming of live audio/video from the headphones over a local WiFi connection to a server that is remote from the headphones in some embodiments according to the invention.
  • FIG. 14 is a flow diagram that illustrates generation of a preview image provided by the headphones in some embodiments according to the invention.
  • FIG. 15 is a flow diagram that illustrates the configuration of an endpoint established for content sharing via a Webserver integrated into the headphones in some embodiments according to the invention.
  • FIG. 16 is a flow diagram that illustrates the downloading of images stored on the headphones to a mobile device in some embodiments according to the invention.
  • FIG. 17 is a flow diagram illustrating access to an image preview function supported by the Webserver hosted on the headphones in some embodiments according to the invention.
  • FIG. 18 is a flow diagram that illustrates streamed video/audio from the headphones using the locally hosted Webserver to an endpoint at a remote server via a mobile device in some embodiments according to the invention.
  • FIG. 19 is a schematic representation of the headphones including left and right earpieces, configured to couple to the ears of a user.
  • FIG. 20 is a block diagram showing an example architecture of an electronic device, such as headphones, as described herein.
  • FIG. 21 illustrates an embodiment of a headphone according to the inventive concepts within an operating environment.
  • FIG. 22 is a schematic representation of the headphones including the plurality of cameras used to determine positional data in an environment that includes a feature with six DOF in some embodiments.
  • FIG. 23 is a schematic representation of operations between the headphones and a separate electronic device to determine positional data for the headphones as part of an immersive experience provided by the separate electronic device.
  • FIG. 24 illustrates an embodiment of the headphones according to the inventive concepts within an operating environment.
  • FIG. 25 illustrates an embodiment for a cross-platform application programming interface for connected audio devices, such as the headphones in some embodiments.
  • FIG. 26 illustrates another embodiment for a cross-platform application programming interface for connected audio devices, such as the headphones in some embodiments.
  • FIGs. 27-35 illustrate various embodiments of a remote used to control devices, such as the headphones, in some embodiments according to the invention.
  • FIG. 36 is a schematic representation of a series of screens presented on the mobile device running an application configured to connect the headphones to the application for syncing in some embodiments according to the invention.
  • Figure 37 is a schematic representation of the headphones included in a telemedicine system in some embodiments according to the invention.
  • Figure 38 is a schematic representation of a plurality of headphones included in a distributed system configured to detect symptoms among a population and to issue alerts based thereon in some embodiments according to the invention.
  • Figure 39 is a block diagram of a wearable computer system including at least one projector in some embodiments according to the invention.
  • Figure 40 is a perspective view of an earcup of a particular type of wearable computer system showing a projector integrated into the earcup in some embodiments according to the invention.
  • Figure 41 is a block diagram illustrating various sources of augmentation data for use in the wearable computer system shown in Figure 39 in some embodiments according to the invention.
  • Figure 42A is a schematic representation of a head wearable computer system generating a projection image onto an arbitrary object or surface in some embodiments according to the invention.
  • Figure 42B is a schematic representation of a particular type of head wearable computer system embodied as audio/video enabled headphones with two integrated projectors and a camera in some embodiments according to the invention.
  • headphones may be used to stream a user's local environmental experience over a network by capturing an image or video of the user's view with a camera included in headphones worn by the user, paired or otherwise associated with an electronic device, such as a mobile phone, and connected to a wireless network.
  • a user wearing headphones having an integrated camera can capture images and/or video content of the surroundings and stream such captured content over a network to an endpoint, such as a social media server.
  • audio content may also be streamed from a microphone included in the headphones.
  • the captured content is streamed over a wireless connection to a mobile device hosting an application.
  • the mobile application can render the captured content and provide a live stream to the endpoint.
  • the endpoint can be any resource that can be operatively coupled to a network and can ingest the streamed content such as social media servers, media storage sites, educational sites, commercial sales sites, or the like.
  • the headphones can include a first ear piece (sometimes referred to as an earcup) having a Bluetooth (BT) transceiver circuit (also including a BT low energy (BTE) circuit), a second earpiece having a WiFi transceiver circuit, a control processor, at least one camera, at least one microphone, and a user touchpad for controlling functions on the headphones.
  • the headphones are paired with a mobile device, wherein the user touchpad can be used to control features and operations of an application operating on the mobile device that is associated with the headphones.
  • the headphones are paired for communication using a wireless network.
  • the headphones can operate using the BT circuit and the WiFi circuit concurrently, where some operations are carried out using the WiFi circuit whereas other operations are carried out using the BT circuit.
  • although the headphones are sometimes described herein as having particular circuits located in particular portions of the headphones, any arrangement may be used in some embodiments according to the present invention.
  • any type of wireless communications network may be used to carry out the operations of the headphones given that such a wireless communications network can provide the performance called for by the headphones and the applications that are operatively coupled to the headphones, such as maximum latency and minimum bandwidth requirements for such operations and applications.
  • the headphones may include a telecommunication network interface, such as an LTE interface, so that a mobile device or local WiFi connection may be unnecessary for communications between the headphones and an endpoint.
  • any telecommunication network interface that provides the performance called for by the headphones and the applications that are operatively coupled to the headphones may be used. Accordingly, when particular operations or applications are described as being carried out using a mobile device (such as a mobile phone) in conjunction with the headphones, it will be understood that equivalent operations and applications may be carried out without the mobile device by using a telecommunication network interface in some embodiments.
  • as used herein, the term "/" includes either of the items or both.
  • the streaming of audio/video includes the streaming of audio alone, video alone, or audio and video.
  • FIG. 1 depicts an exemplary suitable environment 100, which includes headphones 110 associated with a mobile device 130 supporting one or more mobile applications 135, a wireless network 125, a telecommunications network 132, and an application server 140 that provides a user environment capture system 150.
  • the headphones 110 communicate with the mobile device 130 directly or over the network 125 (such as the internet), to provide the application server 140 with information or content captured by a camera(s) and/or microphone(s) on the headphones 110.
  • the content can include images, video, or other visual information from an environment surrounding a user of the headphones 110, although other content can also be provided.
  • the headphones 110 may also communicate with the mobile device 130 via Bluetooth® or other near-field communication interfaces, which provides the captured information to the application server 140 via the wireless network 125 and/or the telecommunications network 132.
  • the mobile device 130 via the mobile application 135, may capture information from the environment surrounding the headphones 110, and provide the captured information to the application server 140.
  • the user environment capture system 150 may, upon accessing or receiving audio and/or video captured by the headphones 110, perform various actions using the accessed or received information. For example, the user environment capture system 150 may cause a display device 160 to present the captured information, such as images from the camera(s) on the headphones 110.
  • the display device 160 may be, for example, an associated display, a gaming system, a television or monitor, the mobile device 130, and/or other computing devices configured to present images, video, and/or other multimedia presentations, such as other mobile devices.
  • the user environment capture system 150 performs actions (e.g., presents a view of an environment) using images captured by a camera of the headphones 110.
  • FIG. 2 is a flowchart illustrating a method 200 for presenting views of an environment surrounding a user using captured content. The method 200 may be performed by the user environment capture system 150 and, accordingly, is described herein merely by way of reference thereto. It will be appreciated that the method 200 may be performed on any suitable system.
  • the user environment capture system 150 accesses audio information captured by the headphones 110.
  • the headphones 110 may use one or more microphones on the headphones 110 to capture ambient noise or to capture the user's own commentary.
  • a microphone may be used to reduce ambient noise using noise reduction.
  • the user environment capture system 150 accesses images/video captured by one or more cameras on the headphones 110.
  • a camera integrated with an earcup of the headphones 110 may capture images and/or video clips of the environment to provide a first person view of the environment (e.g., visual information seen using the approximate reference point of the user within the environment).
  • the user environment capture system 150 performs an action based on the captured information. For example, the user environment capture system 150 may cause the display device 160 to render or otherwise present a view of the environment associated with images captured by the headphones 110. The user environment capture system 150 may perform additional actions including causing a delay before otherwise causing the display device 160 to present the captured images or sound.
  • the user environment capture system 150 may add data to captured content, including location data, consumer or marketing data, information about data consumed by the user of the headphone 110, such as a song played on the headphone 110, or identification of a song played in the user environment.
  • the user environment capture system 150 may further stream user commentary or user voice data concurrently with the captured video.
  • the user environment capture system 150 may perform other actions using captured visual information.
  • the capture system 150 may cause a social network platform, or other website, to post information that includes some or all of the captured visual information along with audio information played to the user wearing the headphones 110 when the visual information was captured, and/or may share the visual and audio information with other users associated with the user.
  • the user environment capture system 150 may generate a tweet and automatically post the tweet on behalf of the user that includes a link to a song currently being played by the user, as well as an image of what the user is currently seeing while listening to the song via the audio headphone worn by the user.
  • the headphones 110 include various computing components, and can connect directly to a WiFi network.
  • the headphones 110 may include a Bluetooth connection to a mobile device executing an application that allows the user to configure the headphones to select a particular WiFi network and enter secure password information.
  • the user creates a WiFi hot spot with the mobile device, for example via a BT connection, to configure the headphones 110 to use a desired WiFi network with a secure password.
  • the headphones connect directly to a WiFi network in a home, office or other location wherein the mobile device can, via a BT connection, configure the headphones 110 to use the desired network with a secure password.
  • when the headphones 110 are on a WiFi network with internet access along with a user's mobile device, the headphones 110 may appear on the network as an IP camera.
  • Applications such as Periscope and Skype may be used with such IP cameras.
  • a user may turn on the headphones' IP camera and the WiFi using a programmable hot key located on the headphones, or alternatively may activate the IP camera (and other functions) using voice recognition commands. When not in use, the camera and WiFi can be shut down to preserve battery life.
  • the headphones 110 are activated so that pairing with the mobile device 130 can be established via a Bluetooth connection.
  • the pairing may be established automatically upon power on.
  • a separate mechanism may be utilized to initiate the pairing.
  • the headphones may activate the local camera and a WiFi connection to an access point or a local mobile device in response to an input to the headphones 110.
  • the input can be a programmable "hotkey" or other input such as a voice command or gesture to activate the camera.
  • Other inputs may also be used.
  • an application on the mobile device can provide a list of WiFi networks that are accessible and available for use by the headphones 110 for streaming audio/video.
  • the application running on the mobile device 130 can transmit the selected WiFi network to the headphones 110 using a Bluetooth low energy command. Other types of network protocols may also be used to transmit commands.
  • the user may enter authentication information such as a password which is also transmitted to the headphones 110 from the application on the mobile device 130 also over the Bluetooth low energy interface.
  • a companion application can be launched on the mobile device 130 in response to an input at the headphones 110 or via an input to the mobile device 130 itself.
  • the companion app can be started on the mobile device 130 in response to a hotkey pressed on the headphones 110 and transmitted to the mobile device 130.
  • the companion app may be an application such as Periscope.
  • the companion application operating on the mobile device 130 can access the WiFi connection utilized by the headphones 110 to transmit the streaming video. In some embodiments according to the invention, the user may then select that WiFi connection for use by the companion app.
  • the companion app can connect to the selected WiFi connection that carries the video and/or audio from the headphones 110 which can then be used for streaming from the mobile device 130 in whatever form that the particular companion app supports. It will be further understood that the operations shown in FIG. 4 and described herein can be controlled by the companion app via the SDK described herein which allows control of functionality provided by the headphones 110 in the application on mobile device 130 or on the headphones itself.
  • a user may press a hot key on the headphones to perform various actions.
  • a user can press one of the hot keys on the headphones to activate a companion application on a smartphone that is compatible with an IP camera such as Periscope or Skype.
  • a user can press a hot key that automatically wakes up the WiFi and establishes a connection to a known, previously configured network.
  • a user can press a hot key that automatically turns on WiFi, establishes a connection, opens a companion app (e.g., Periscope) on a smartphone, tablet or laptop and starts the live stream.
  • a user can press a hot key to capture still pictures.
  • a user can press a hot key to capture still pictures and automatically share to social networks such as Facebook and Twitter.
  • a microphone on the headphone can include user voice data along with video data. Music and/or audio playing on the headphones can be sent along with video data.
  • the headphones 110 may be paired with the mobile device 130 in response to an input at the headphones 110, such as a hotkey, audio input, a gesture, or the like to initiate the pairing of the headphones 110 to the mobile device 130 via, for example, a Bluetooth connection.
  • an input at the headphones 110 such as a hotkey, audio input, a gesture, or the like to initiate the pairing of the headphones 110 to the mobile device 130 via, for example, a Bluetooth connection.
  • the video camera associated with the headphones 110 can be activated in response to another input at the headphones 110, which may also activate a WiFi connection from the headphones 110. It will also be understood that in some embodiments according to the invention, the operations described above in reference to 505 and 510 can be integrated into a single operation or can be combined so that a single input performs both steps described therein.
  • an application on the mobile device 130 can be activated or utilized to select the particular WiFi connection that is activated in operation 510.
  • the WiFi connection can be selected via a native application or capability embedded in the mobile device 130 such as a settings menu, etc.
  • authentication information can be provided to the headphones 110 via the application, such as a user name and password which may be transmitted to the headphone 110 over the Bluetooth connection or a low energy Bluetooth connection.
  • a native application can be launched on the headphones 110 to stream audio/video over the WiFi connection without passing through the mobile device 130.
  • a user can capture a composite view including a video stream from a front facing camera of a mobile device 130 (i.e., a selfie view) and a first-person view generated by the camera(s) of the headphones 110.
  • a camera on the mobile device 130 can be used to generate what is sometimes referred to as a "selfie view," which is generated as a preview and provided on the display of the mobile device 130. It will be understood that the recording by the mobile device 130 can be activated manually or automatically in response to an orientation or movement when the mobile device 130 is set into a particular mode, such as a composite video mode.
  • At least one of the cameras associated with the headphones 110 is activated and generates a first-person view.
  • the first-person view is generated as a video feed which is forwarded to the mobile device 130.
  • the mobile device 130 includes an application that generates a composite view on the display of the mobile device 130.
  • the composite view can include a representation of the selfie view provided from the camera on the mobile device 130 as well as at least one first-person view provided by the video feed from the headphones 110.
  • the depiction of the composite view shown on the mobile device 130 in FIG. 6 is representative and is not to be construed as limiting the composite view.
  • the composite view generated in FIG. 6 can be any view provided on the display of the mobile device 130 that includes both the selfie view as well as at least one first-person view provided by the headphones 110.
  • the operations shown in FIG. 6 can be carried out as shown in operations 705-730 in some embodiments according to the invention.
  • the headphones 110 can be activated whereupon a connection is established between the headphones 110 and the mobile device 130 via, for example, a Bluetooth connection.
  • the video camera located on the headphones 110 can be activated responsive to an input at the headphones 110.
  • the input to the headphones 110 used to activate the video camera can be any input, such as a hotkey, press, or other input such as a gesture or voice command.
  • a WiFi connection is established in response to the input at the headphones 110.
  • an application executing on the mobile device 130 is utilized to indicate the WiFi connections available for the streaming of video from the camera on the headphone 110.
  • the available WiFi connections can be provided on the display of mobile device 130 using an application executing thereon whereupon the user can select the WiFi connection that is to be used for the streaming of audio/video from the headphones 110. Still further, the user may be prompted to provide authentication information for access to the first-person video view from the headphones 110.
  • an input at the headphones 110 can be utilized to launch a companion application on the mobile device 130.
  • the companion app can be launched in response to an input at the headphones 110 such as a hotkey or audio or gesture input.
  • the companion app running on the mobile device 130 accesses the selected WiFi connection to receive the streamed video from the headphones 110 (as well as audio information provided by the headphones 110) which is then directed to the companion app running on the mobile device 130.
  • the companion app connects to the WiFi network provided from the headphones 110 to access the streamed video/audio and generates the composite image using the first-person view provided by the headphones 110 along with the selfie video feed provided from the camera located on the mobile device 130.
  • the composite view can be provided by combining the selfie video feed with the first-person view provided by the headphones 110.
  • any format can be used on the display of the mobile device 130.
  • the operations described herein can be provided via an SDK that allows control of the headphones 110 by the companion application that is executed on the mobile device 130.
  • the video feed can be sent to existing and future applications running on the smartphone, tablet or laptop that support dual streaming video feed such as Periscope, Skype, Facebook, etc.
  • voice data can be sent along with the video streams.
  • Music and/or audio data can be sent along with the video streams.
  • FIG. 8 is an illustration of a video camera 810 on an earcup 805 of the headphones 110 in some embodiments according to the inventive concept. Because different users may wear the headphones 110 in different orientations, or even the same user may change the orientation of the headphones 110, either by moving the position of the headphones 110 on the head, or by moving the head while wearing the headphones 110, the video camera 810 is adaptable to different orientations. In some embodiments, the video camera 810 rotates about a ring through an arc of between about 60 degrees and about 120 degrees. As illustrated, the earcup 805 comprises an earpiece 807, a camera ring 809, a touch sensitive control surface 811, and an operating indication light 812.
  • FIG. 9 is an illustration of a rotatable video camera apparatus in some embodiments according to the inventive concept shown overlaid on an orientation axis.
  • an accelerometer 905 is mounted on the rotating video camera ring 809.
  • the accelerometer 905 provides orientation information with respect to a gravity vector to a processor circuit 920.
  • a servo motor 910 can be controlled by the processor circuit 920 to rotate the video camera 810 around the ring 809 to keep the camera oriented in the direction of the horizon vector. In this manner, the field of view of the camera can be maintained generally the same as the line of sight of the user.
  • image stabilization technology may be incorporated into the processing of the video data.
  • the user can activate privacy mode, which can rotate the video camera 810 away from the horizon vector so that the video camera is not maintained in the same line of sight as the user.
  • the user can activate gesture mode for the headphones 110 to rotate the video camera 810 to a custom orientation for the particular user.
  • the video camera rotates to the custom orientation (such as about 45 degrees between the horizon and gravity vectors) and begins gesture processing once the rotation is complete. In this way, the user can choose the custom orientation that fits their preference or is appropriate for a particular situation such as when a user is lying down.
  • FIG. 10 illustrates an example embodiment of a particular configuration of headphones 110 suitable for streaming content such as audio and video.
  • the headphones 110 can be coupled to a mobile device 130 (such as a mobile phone) via a Bluetooth connection as well as a low energy Bluetooth connection (i.e. BLE).
  • the Bluetooth connection can be utilized to stream music from the mobile device 130 to the headphones 110 for listening.
  • the application on the mobile device 130 can be controlled over the low energy Bluetooth interface, which is configured to transmit commands to/from the headphones 110.
  • the headphones 110 can include "hotkeys" that can be programmed to be associated with predefined commands that can be transmitted to the application on the mobile device 130 in response to a button push over the low energy Bluetooth interface.
  • the application can transmit music to the headphones 110 over the Bluetooth connection.
  • the Bluetooth as well as the low energy Bluetooth interfaces can be provided in a particular portion of the headphones 110, such as in a right side earcup. It will be understood, however, that the interfaces described herein can be provided at any portion of the headphones 110 which is convenient.
  • the headphones 110 can also include a WiFi interface that is configured for carrying out higher powered functions provided by the headphones 110.
  • a WiFi connection can be established so that video streaming can be provided from a video camera on the headphones 110 to a remote server or an application on the mobile device 130.
  • the WiFi interface can be utilized to sync media to/from the headphones 110 as well as store audio files for playback.
  • photos and other media can be provided over the WiFi connection to a remote server or mobile device.
  • the WiFi interface can be operatively associated with a relatively high powered processor (i.e., relative to the circuitry configured to provide the Bluetooth and Bluetooth low energy interfaces described above).
  • the relatively high powered processor can provide, for example, the functionality associated with image processing and audio/video streaming as well as functions typically associated with what is commonly referred to as a "Smartphone".
  • the Bluetooth/Bluetooth low energy processing can be provided as a default mode of operation for the headphones 110 until a command is received to begin operations that are more suitably carried out by the processor associated with the WiFi interface.
  • the Bluetooth and Bluetooth low energy circuits can provide a persistent voice control application that listens for a particular phrase (such as "okay, Muzik"), whereupon the headphones 110 transmit the command over the low energy Bluetooth interface to an application on the mobile device 130 (or to a native application in the headphones 110 or a remote application on a server).
  • the application executes a predefined operation associated with the command sent by the headphones 110, such as an application that translates voice data to text.
  • the processor associated with the WiFi interface remains in a standby mode while the Bluetooth/Bluetooth low energy circuitry remains active.
  • the Bluetooth/Bluetooth low energy circuitry can enable the processor associated with the WiFi interface when a particular operation associated with the processor is called for.
  • a command can be received by the Bluetooth/Bluetooth low energy circuitry that causes the processor to exit standby mode and become active, such as when video streaming is enabled.
  • the high powered processor portion of the headphones 110 can support embedded mobile applications that are maintained in standby mode while the Bluetooth/Bluetooth low energy circuitry calls upon the higher powered processor for particular functions.
  • the higher powered processor may load the mobile applications that are maintained in standby mode on the headphones 110 so that operations requiring the higher powered processor may begin, such as when live streaming is activated.
  • the headphones 110 includes a first or left earcup that may be thought of as comprising the WiFi processing.
  • the headphones 110 further include a second or right earcup which includes the Bluetooth processing.
  • the left earcup 1010 comprises WiFi processor 1012, such as a Qualcomm Snapdragon 410 processor, having a WiFi stack 1013 connected to a WiFi chipset 1014 and WiFi transceiver 1015.
  • WiFi processor 1012 is also connected to additional memory such as flash memory 1016 and DRAM 1017. Video camera 1020 is housed on the left earcup 1010 and connected to WiFi chipset 1014.
  • LED indicators such as a flash LED 1021 and a camera on LED 1022 may be used in conjunction with video camera 1020.
  • One or more sensors may further be housed in the left earcup 1010 including an accelerometer 1018. Other sensors may be incorporated as well, including a gyroscope, magnetometer, thermal or IR sensor, heart rate monitor, decibel monitor, etc.
  • Microphone 1019 is provided for audio associated with video captured by video camera 1020. Microphone 1019 is connected through PMIC card 1022 to the WiFi processor 1012.
  • USB adaptor 1024 further connects through the PMIC card 1022.
  • Positive and negative audio cables 1025+ and 1025- run from the PMIC card 1022 to a multiplexer (Audio Mux) 1040 housed in the right earcup 1030.
  • Audio Mux multiplexer
  • Right earcup 1030 includes a Bluetooth processor 1032, such as a CSR8670 processor, connected to a Bluetooth transceiver 1033.
  • Battery 1031 is connected to the
  • Multiple microphones may be connected to the Bluetooth processor 1032, for example voice microphone 1035 and wind cancellation microphone 1036 are connected to and provide audio input to the Bluetooth processor 1032.
  • Audio signals are output from the Bluetooth processor 1032 to a differential amplifier 1037 and further output as positive and negative audio signals 1038 and 1039 respectively to the left speaker 1011 in the left earcup 1010 and the right speaker 1031 in the right earcup 1030.
  • The operation of the headphone 110 and coordination between the WiFi and Bluetooth processing is accomplished using microcontroller 1050, which is connected to the WiFi processor 1012 and the Bluetooth processor 1032 via an I²C bus 1051.
  • Bluetooth processor 1032 and WiFi processor 1012 may be in direct communication via a UART protocol.
  • a user may control the various functions of the headphone 110 via a touch pad, control wheel, hot keys or a combination thereof, input through capacitive touch sensor 1052, which may be housed on the external surface of the right earcup 1030 and is connected to the microcontroller 1050. Additional control features may be included with the right earcup 1030, such as LEDs 1055 to indicate various modes of operation, one or more hot keys 1056, a power on/off button 1057, and a proximity sensor 1058.
  • FIG. 11 illustrates an example of the capacitive touch panel (sensor) 1110 used in conjunction with the capacitive touch sensor 1052 described above.
  • earcup 1105 includes capacitive touch panel 1110 having capacitive touch ring 1112 and a first button 1113 and a second button 1114.
  • Various user controls can be programmed into the headphone.
  • Table 2 below provides example user controls associated with the first and second buttons.
  • the headphone may accept control instruction by voice operation using a voice recognition protocol integrated with the control system of the headphone.
  • Table 3 below provides examples of various voice commands for control of the headphone and associated paired mobile device.
  • the user headphone is paired via a wireless connection, either Bluetooth or WiFi or both, to a mobile device running an application for sharing the images and audio captured by the headphone with third party applications running on the internet.
  • FIGs. 12 and 13 illustrate examples for sharing audio and video captured by the camera and microphones on the headphones 110.
  • the left side of the headphones uses FFMPEG alongside Android MediaCodec to create a suitable RTMP stream for use on live streaming platforms.
  • the RTMP Server JNI bindings and helper code to Android are derived from Kickflip.io's SDK.
  • the RTMP Server may be used in two ways: first, connected through a user environment capture WiFi AP using a relay app on the mobile device as illustrated in FIG. 12. In this example the headphone records video/audio and converts it to RTMP format.
  • the converted audio/video content is transmitted via a WiFi connection to the mobile device that is running a program to share the converted content to the Internet.
  • the mobile device then shares the converted content via a cellular connection, such as an LTE connection to RTMP endpoints on the Internet or Cloud such as Youtube, Facebook, Periscope, etc.
  • the headphones 110 can provide low latency video feed as described hereinabove in some embodiments according to the invention.
  • the headphones 110 can include a real time message protocol (RTMP) server that is configured to accept the video/audio stream generated by the camera associated with the headphones 110 and produce data for the audio/video stream in the packet format associated with the RTMP protocol.
  • the message protocol can be supported by a wide range of services that ingest video for posting or streaming.
  • the streaming audio/video provided in the RTMP packetized format is provided to the mobile device 130 over an access point WiFi connection 1210 generated by the headphones 110.
  • the mobile device 130 includes an application that is configured to relay the packetized RTMP data for the audio/video stream to a remote endpoint, such as the endpoint 1225, over the telecommunications network 1220.
  • the mobile device 130 can include an additional application that provides for authentication of the user's account that is associated with an endpoint for the video streaming.
  • an additional application can be included on the mobile device 130 so that the user's account can be authenticated so that when the video stream is forwarded to the endpoint (i.e., the user's Facebook page) the server can ingest the RTMP formatted audio/video stream associated with the user's account.
  • the RTMP packetized format of the audio/video feed is forwarded to the identified endpoint 1225 for the livestream via the LTE network connection.
  • the RTMP packetized data format is forwarded directly to the telecommunications network 1220 (i.e., such as an LTE network connection) without passing through the mobile device 130.
  • the headphones 110 can stream the packetized audio/video directly to the LTE network connection shown in FIG. 12, which is then forwarded to the identified endpoint 1225 without use of the mobile device 130.
  • the headphone is connected directly to a local WiFi network, as illustrated in FIG. 13.
  • utilizing a direct WiFi connection connects the user environment capture feature on the headphone directly to the Internet to allow usage of the cloud-based endpoint. The user sets the desired WiFi network connection between the headphone and the local WiFi network. In some embodiments this is done with an app hosted on the mobile device to enter the SSID and keys. In other embodiments, connecting the headphone to the local WiFi network may be automated after initial setup. After connecting to the local WiFi network, the mobile device sets the desired RTMP destination and sends it to the headphone.
  • the RTMP packetized audio/video stream is generated by the headphones 110 connected to a WiFi network without channeling through the mobile device 130 in some embodiments according to the invention.
  • the application running on the mobile device 130 can establish the desired WiFi network 1305 for streaming of the RTMP packetized data using, for example, a Bluetooth connection and identifying the particular WiFi network 1305 to be used. Still further, the application on the mobile device 130 can also set the destination endpoint for the RTMP packetized data generated by the headphones 110.
  • the application can provide user authentication and identification to the headphones 110 for inclusion with the RTMP packetized data over the WiFi network.
  • the RTMP packetized audio/video data is provided directly to the RTMP endpoint via the WiFi without channeling through the mobile device 130 in some embodiments according to the invention.
  • the user may desire to preview the video feed being sent over the internet to the RTMP endpoints.
  • a preview method is provided for delivering a live feed from the camera to the mobile device to function as a viewfinder for the camera.
  • the preview function encodes video with MotionJPEG.
  • MotionJPEG is a standard that allows a web server to serve moving images in a low latency manner.
  • the MotionJPEG encoding utilizes methods from the open source SKIA image library.
  • FIG. 14 illustrates a process to preview the image recorded on the headphone camera 1405.
  • the preview frame from the camera is captured/encoded and a processor on the headphones 110 converts the preview frame 1410 in memory to MotionJPEG using SKIA 1415.
  • a socket 1420 is then created and configured to deliver the MotionJPEG over an internet HTTP connection.
  • the socket is connectable over the WiFi network 1425 using a purpose built app on the mobile device 130 or using an off the shelf app such as the Shared Home App.
  • the preview stream is then viewable on the mobile device as a standard webview.
  • the headphone of the present disclosure hosts an HTTP Server.
  • the server is configured to be used as a method for controlling and configuring the user environment capture and sharing features of the camera enabled headphone, via an HTTP POST with JSON.
  • the light web server on the headphone essentially creates a web server that is embedded in the headphones 110. There are many applications for this technology, including but not limited to: personalized live streaming to be consumed by one or more friends via social media; electronic news gathering for television networks; virtualized spectators at concerts, sports, or other activities, which basically allows one to see the event through the eyes of the user, live; personalized decentralized websites for users of the product; personalized decentralized social media profiles for users of the product; and a personalized decentralized blogging platform for users of the product.
  • the user is able to capture images for products and access web based services for product identification and/or purchase.
  • the user may use many different web or cloud based applications such as QR code scanning applications, group chatting functions, and more.
  • the user may fully operate with cloud based applications and web based features without a graphic interface.
  • the headphone web server also facilitates configuration of the RTMP destination in the content sharing application of the present invention.
  • a Webserver 1505 is hosted on the headphones 110 and can be accessed by an application on the mobile device 130.
  • the Webserver 1505 on the headphones 110 can establish a WiFi access point mode network 1510 over which the application on the mobile device 130 can be contacted.
  • the application on the mobile device 130 can forward information that is to be used in a live video feed (such as an endpoint 1225 at which live video is to be ingested).
  • the communication can also include an address of the mobile device 130 on which the application is executing.
  • the information is transmitted to the Webserver 1505 over the WiFi access point mode network 1510 which is then forwarded to the RTMP server 1515 located on the headphones 110.
  • the RTMP server 1515 generates the live video stream which is forwarded to the mobile device 130 using the information forwarded to the Webserver 1505.
  • the RTMP packetized data is relayed to the application on the mobile device 130 using the address of the mobile device 130 and also including the endpoint 1225 information associated with the live video feed.
  • the application on the mobile device 130 can reformat the live video feed which can then be forwarded to the endpoint 1225 over a communications network 1220, such as an LTE network connection in some embodiments according to the invention.
  • FIG. 15 illustrates an example of this process.
  • the user's mobile device is connected via the WiFi network to the headphone server.
  • the mobile device then sends a post containing the URL and the phone's IP address to the server on the headphone.
  • the server then sends the received mobile device configuration to the RTMP server.
  • the RTMP server sends converted RTMP data via the app hosted on the headphone to the mobile device.
  • the mobile device can then send the RTMP data to RTMP endpoints via a cellular connection such as an LTE connection.
  • the headphone server also facilitates downloading images from the headphone to the mobile device.
  • FIG. 16 illustrates an example of such a process.
  • the mobile device sends via the WiFi connection a request to the headphone server for an image from an image list stored on storage media.
  • the server responds with the image list in a JSON array.
  • the mobile device requests a specific image with its path from the JSON; the response is provided via getMedia.
  • the server responds with the image for viewing on the mobile device or for downloading.
  • the headphone server 1505 may also provide for enabling and/or disabling the image preview function of the user environment capture system.
  • the mobile device may send an on/off preview function command to the headphone server, and the server enables or disables the preview function with the associated content configuration described herein.
  • the server then starts/stops delivery of frames via the preview function.
  • the connection to the live preview can be established without a preliminary request as described above.
  • the application on the mobile device 130 sends a signal to the server 1505 to access the live preview which is generated by the camera 1405 on the headphones 110.
  • the preview is then forwarded to the application on the mobile device 130 by the camera 1405.
  • the mobile device 130 in turn can receive and reformat the media, which is forwarded to the identified endpoint via an LTE network connection. It will be understood, however, that other types of telecommunication networks can be used.
  • FIG. 18 illustrates an example use case for providing a delay in the streaming content.
  • the user requests to enable the streaming preview function.
  • the request is sent from the mobile device to the headphone server.
  • the headphone server enables the preview function.
  • the headphone server starts delivery of preview frames in the proper formatting in real-time.
  • the mobile device then sends the RTMP endpoint destinations and delay settings to the headphone server.
  • the headphone server configures the MotionJPEG server and RTMP server to relay the RTMP data to the mobile device at a specified delay while the preview is consumed in real-time.
  • the RTMP stream can be stopped within the delayed time to drop the stream before it is consumed by the RTMP endpoint.
  • the mobile device streams the delayed RTMP content to the RTMP endpoints via a cellular connection such as an LTE connection. In some embodiments the blocked RTMP stream can be resumed once the disturbing content is out of the picture.
  • the current configuration, including the headphone having a light web server, allows headphones to identify each other as RTMP endpoints.
  • headphones can stream audio data to each other. For example, if two or more headphones are connected via a local WiFi network, each headphone can be identified as an RTMP endpoint. This enables voice communication between the connected headphones.
  • Example scenarios include networked headphones in a call center, a coach in communication with team members, managers in contact with employees, or any situation where voice communication is desirable between connected headphones.
  • a headphone may be provided without a camera but with all the same functionality above. This may be advantageous for in ear applications, or for sport applications. Audio content, and other collected data from the user (e.g., accelerometer data, heart rate, activity level, etc.) can be streamed to an RTMP endpoint such as a coach or social media members.
  • a raw stream can be provided from the camera 1405 as the RTMP data without a specified delay.
  • the raw stream is received by the application on the mobile device 130 and is processed to generate a delayed version of the raw stream which is analogous to the relayed RTMP data provided at the specified delay as described above. Therefore, the same functionality can be provided in the delayed stream produced by the application such that the stream can be stopped within the delayed time before it is consumed by the endpoint.
  • the application can produce an alternative raw video stream which is unedited for content.
  • consumers may choose between raw or delayed streamed content.
  • the headphones 110 may provide more electronics "real estate" than is typically utilized, which otherwise goes unused. Moreover, the capability of the headphones to communicate with, as well as the typical proximity of the headphones to, the user's other electronic devices can offer the opportunity to augment operations of those other electronics using hardware/software associated with the headphones 110, thereby offering ways to complete or enhance operations of the other electronic devices.
  • the headphones 110 can be configured to assist a separate portable electronic device by offloading the determination of positional data associated with the headphones, which may in turn be used to determine positional data for the user, which may improve the user's experience in immersive type applications supported by the separate mobile electronic device. Other types of offloading and/or augmentation can also be provided. It will be understood that the electronic device can be the mobile device 130 described herein and that the headphones 110 may operate as described herein without the electronic device.
  • FIG. 19 is a schematic representation of the headphones 110 including left and right earpieces 10A and 10B, respectively, configured to couple to the ears of a user.
  • the headphones 110 further include a plurality of sensors 5A-5D including the video camera and microphones described herein.
  • the sensors 5A-5D may be configured to assist in the determination of positional data.
  • the positional data can be used to determine a position of the headphones 110 in an environment, with six degrees of freedom (DOF).
• the sensors 5A-5D can be any type of sensor used to determine location in what is sometimes referred to as an inside-out tracking system where, for example, the sensors 5A-5D receive electromagnetic and/or other physical energy (such as radio, optical, and/or ultrasound signals, etc.) from the surrounding environment to provide signals that may be used to determine a location of the headphones with six DOF.
  • the plurality of sensors may be used to determine a head position of the user based on a determined position of the headphones 110.
  • the plurality of sensors 5A-5D can be located on any portion of the headphones or proximate to the headphones.
  • the sensors 5A are on the left earpiece 10A
  • the sensors 5B are on the headband
  • the sensors 5C are on the right earpiece 10B
  • the sensors 5D are separated from the headphones 110 but located proximate enough to be in wireless or wired communication with augmentation functions in the headphones 110.
• the sensors 5D can be located with separate electronic devices that may be worn by the user and may be utilized as part of an immersive experience provided by the separate electronic device, such as a bracelet, necklace, wand, or the like.
• the location of the sensors 5A-5D on the headphones can be selected so that the sensors can sufficiently receive electromagnetic and/or other physical energy as part of the inside-out tracking system to determine positional data for the headphones with six DOF.
• Although FIG. 19A illustrates a particular configuration and location of the sensors 5A-5D, it will be understood by one of skill in the art that other configurations of the sensors are possible without deviating from the inventive concept.
• FIG. 19B is a schematic representation of an augmentation function located, for example, in a first earpiece 10A of headphones 110, including a sensor interface 660 as further illustrated in FIG. 20. It will be understood that the sensor interface can be provided as part of the processor shown in the figures herein. According to FIG. 19B, the sensors 5A-5D are coupled to the sensor interface 660, which can operate the sensors 5A-5D to determine positional data for the headphones 110 with six DOF. As illustrated in FIG. 19B, the sensors 5A-5D may be co-located with the sensor interface 660 in an earpiece of the headphones 110, located on some other portion of the headphones 110 (e.g., on the headband), or separated from the headphones 110 (e.g., the sensors 5D).
  • the sensor interface 660 controls the sensors 5A-5D to detect electromagnetic and/or physical signals that can be used to determine the positional data for the headphones.
• the sensor interface 660 can control the cameras to capture images of the environment which can be used to determine the position of the headphones based on the location of environmental features detected within the images.
• the sensors 5A-5D are RFID sensors
  • the sensor interface 660 can control the RFID sensors to determine the position of the headphones based on triangulation of radio signals.
  • the sensors 5A-5D are accelerometers
  • the sensor interface 660 can control the accelerometer sensors to determine the orientation and/or movement of the headphones based on detected movement of the accelerometers.
• other sensors are possible, including combinations of multiple types of sensors, to achieve determination of the position and/or other characteristics of the headphones 110.
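• As a minimal sketch of how such a combined sensor interface might fuse several sensor types into six-DOF positional data (all class, method, and field names here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # Position (x, y, z) plus orientation (roll, pitch, yaw): six DOF.
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0

class SensorInterface:
    """Controls heterogeneous sensors and fuses their readings into a pose."""

    def __init__(self, cameras=(), rfid_sensors=(), accelerometers=()):
        self.cameras = list(cameras)
        self.rfid_sensors = list(rfid_sensors)
        self.accelerometers = list(accelerometers)

    def read_pose(self) -> Pose:
        pose = Pose()
        if self.cameras:
            # Cameras: locate known environmental features in captured images.
            pose.x, pose.y, pose.z = self._position_from_features()
        if self.rfid_sensors:
            # RFID: refine the position by triangulating radio signals.
            pose.x, pose.y, pose.z = self._triangulate_rfid((pose.x, pose.y, pose.z))
        if self.accelerometers:
            # Accelerometers: estimate orientation and movement.
            pose.roll, pose.pitch, pose.yaw = self._orientation_from_imu()
        return pose

    # Placeholders standing in for real image and signal processing.
    def _position_from_features(self):
        return (0.0, 0.0, 0.0)

    def _triangulate_rfid(self, seed):
        return seed

    def _orientation_from_imu(self):
        return (0.0, 0.0, 0.0)
```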
  • the first earpiece 10A of the headphones 110 may contain an augmentation function.
  • the augmentation function may perform operations configured to augment the operations of the headphones 110.
  • the augmentation function may perform operations responsive to a request and/or data provided to the headphones 110 and return a result of the request/data to the requestor.
  • the augmentation function may be provided a request/data from a separate electronic device.
  • the headphones 110 may perform calculations and/or other operations related to the request/data and provide a response to the separate electronic device.
• the separate electronic device can use the augmentation function of the headphones 110 to perform calculations and/or operations on behalf of the separate electronic device 30.
• the second earpiece 10B of the headphones 110 may contain other electronics used in the operations of the headphones 110. As illustrated in FIG. 19C, the second earpiece 10B of the headphones 110 may also contain an augmentation function similar to the augmentation function in the first earpiece 10A. That is to say that the headphones 110 may contain an augmentation function in either or both of the earpieces 10A and 10B. When a plurality of augmentation functions are provided in the headphones 110, they may operate on a request/data provided by a separate electronic device separately or in coordination with one another. In some embodiments, the one or more augmentation functions may be used to process both requests provided by a separate electronic device as well as operations required by the headphones 110.
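• The request/response pattern between a separate electronic device and an augmentation function might look like the following sketch, which reuses the hypothetical SensorInterface above; the operation name and message shape are assumptions:

```python
class AugmentationFunction:
    def __init__(self, sensor_interface):
        self.sensor_interface = sensor_interface

    def handle(self, request: dict) -> dict:
        # The separate electronic device sends a request (plus any data);
        # the headphones perform the work and return the result.
        if request.get("op") == "get_pose_6dof":
            pose = self.sensor_interface.read_pose()
            return {"ok": True, "pose": vars(pose)}
        return {"ok": False, "error": "unsupported operation"}
```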
  • FIG. 19C also illustrates that the second earpiece 10B of the headphones 110 may also contain one or more sensors 5C. These sensors 5C can be coupled to the sensor interface 660 in the first earpiece 10A of the headphones 110. As will be understood by one of skill in the art, this coupling can be done via several mechanisms, including but not limited to an electronic connection through the headband of the headphones 110.
• FIGS. 19A, 19B, and 19C are merely representative, and other configurations of the various circuits can be made without deviating from the inventive concept.
• FIG. 20 illustrates a high-level block diagram showing an example architecture of an electronic device, such as the headphones 110, as described herein, which may implement the operations described above.
• the headphones 110 include one or more processors 610 and memory 620 coupled to an interconnect 630.
  • the interconnect 630 may be an abstraction that represents any one or more separate physical buses, point to point connections, or both connected by appropriate bridges, adapters, or controllers.
• the interconnect 630 may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called “Firewire.”
  • the processor(s) 610 is/are the central processing unit (CPU) of the headphones 110 and, thus, control the overall operation of the headphones 110. As discussed herein, the one or more processors 610 may be configured to perform an augmentation function, such as those illustrated in FIGS. 19B and 19C. In certain embodiments, the processor(s) 610 accomplish this by executing software or firmware stored in memory 620.
  • the processor(s) 610 may be, or may include, one or more programmable general purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), trusted platform modules (TPMs), or a combination of such or similar devices.
  • the memory 620 is or includes the main memory of the headphones 110.
  • the memory 620 represents any form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 620 may contain code 670 containing instructions according to the techniques disclosed herein.
• Also connected to the processor(s) 610 through the interconnect 630 are a network adapter 640 and a mass storage device 650.
• the network adapter 640 provides the headphones 110 with the ability to communicate with remote devices over a network and may be, for example, an Ethernet adapter, a Bluetooth adapter, etc.
• the network adapter 640 may also provide the headphones 110 with the ability to communicate with other computers.
  • the code 670 stored in memory 620 may be implemented as software and/or firmware to program the processor(s) 610 to carry out actions described above.
• such software or firmware may be initially provided to the headphones 110 by downloading it from a remote system through the headphones 110 (e.g., via network adapter 640).
• the sensor interface 660 may receive input from one or more sensors, such as the sensors 5A-5D of FIG. 19. Though illustrated as a single element, the headphones 110 may include multiple sensor interfaces 660. In some embodiments, the sensor interfaces 660 may process sensors of different types. The sensor interface 660 may communicate via the interconnect 630 with the memory 620, the processors 610, the network adapter 640, and/or the mass storage device 650 to store, analyze, and/or communicate the input received by the sensor interface 660 to the headphones 110 or a separate electronic device. As shown, the camera and microphone can be accessed via the interface 660.
  • FIG. 21 illustrates an embodiment of a headphones 110 according to the inventive concepts within an operating environment.
  • the headphones 110 may be communicatively coupled to an electronic device 30 by one or more communication paths 20A-n.
  • the communication paths 20A-n may include, for example, WiFi, USB, IEEE 1394, radio, though the present inventive concepts are not limited thereto.
  • the communication paths 20A-n may be used simultaneously and, in some embodiments, in coordination with one another.
  • the headphones 110 may exchange data and/or requests with the separate electronic device 30.
  • the headphones 110 may be communicatively coupled to one or more sensors 5A-5D.
  • the sensors 5A-5D may be integral to the headphones 110, attached to the headphones 110, or separate from the headphones 110.
• one or more sensors may be communicatively coupled to the separate electronic device 30, such as the sensors 30A-B illustrated in FIG. 21.
  • the sensors 30A-30B may be integral to the electronic device 30, attached to the electronic device 30, or separate from the electronic device 30.
  • the electronic device 30 and the headphones 110 may share input received from the sensors 5A-5D and 30A-30B to determine a position of a user of the electronic device 30 and the headphones 110.
  • the electronic device 30 may be in further communication with an external server 40 through a network 125.
  • the network 125 may be a large network such as the global network more commonly known as the Internet.
  • the electronic device 30 may be connected to the network 125 through intermediate gateways such as the network gateway 35.
  • the electronic device may be connected to the network gateway through various means.
  • the network gateway 35 may be a radio-based telecommunication gateway, such as a base station, and the electronic device 30 may communicate with the network gateway 35 via radio communication such as that commonly used in cellular telephone networks.
• the network gateway 35 may be a network access point, and the electronic device 30 may communicate with the network gateway 35 via a wireless network (“WiFi”).
• the network gateway 35 may further communicate with the network 125 via a communication method that is similar to or different from the one used between the electronic device 30 and the network gateway 35.
  • the headphones 110 can access the network gateway 35 directly.
• the electronic device 30 may communicate with the server to exchange information, data, and/or requests. In some embodiments, the electronic device 30 may share data provided by the headphones 110 with the server 40. In some embodiments, as discussed further herein, the electronic device 30 may retrieve instructions and/or data from the server 40 which may be sent to the headphones 110 for offloading and/or augmentation. In some embodiments, the electronic device 30 may provide requests/data to the headphones 110 for operation thereon, and resulting data provided by the headphones 110 responsive to the requests/data may be further sent from the electronic device 30 to the server 40. In some embodiments, the data provided by the headphones 110 to the electronic device 30 may be combined with data determined by the electronic device 30, such as sensor input from sensors 30A-30B, before being provided to the server 40.
• FIG. 22 is a schematic representation of the headphones 110 including the plurality of cameras 5A-5B used to determine positional data in an environment that includes a feature 80, with six DOF in some embodiments.
  • the feature 80 can be at a fixed and/or known location in the environment that is visible to some or each of the sensors 5A-5B.
• the sensor interface 660 can control the sensors 5A-5B to capture data, for example images (or video) when a sensor is a camera, depicting the different perspectives 87A-87B of the feature 80 from the respective sensors 5A-5B.
  • the different perspectives can be used by the sensor interface 660 to determine positional data of the headphones 110.
• three sensors may triangulate a position of the feature 80 by analyzing data from three views of the feature 80.
  • the feature 80 can further include a marker 85 which may further assist the sensor interface 660 in locating the feature 80 as well as in determining the positional data.
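• As a toy illustration (assumed geometry, not the disclosed algorithm) of how two perspectives such as 87A-87B could yield a feature position, the sketch below intersects two bearing rays from known sensor positions:

```python
import numpy as np

def locate_feature(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + t*d1 and p2 + s*d2.
    Assumes the rays are not parallel."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # zero only for parallel rays
    t = (b * e - c * d) / denom    # parameter along ray 1
    s = (a * e - b * d) / denom    # parameter along ray 2
    return ((p1 + t * d1) + (p2 + s * d2)) / 2.0

# Two sensors at known positions, both sighting a feature at (1, 1, 2):
print(locate_feature(np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 2.0]),
                     np.array([2.0, 0.0, 0.0]), np.array([-1.0, 1.0, 2.0])))
```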
  • FIG. 23 is a schematic representation of operations between the headphones 110 and a separate electronic device 30 to determine positional data for the headphones as part of an immersive experience provided by the separate electronic device 30.
  • the headphones 110 may be connected to the separate electronic device 30 by one or more communication channels 20A-n.
  • the headphones 110 may be connected to the separate electronic device 30 by Bluetooth, WiFi, NFC, and/or USB, but the present inventive concept is not limited thereto.
• a plurality of the communication channels 20A-n may be used simultaneously.
• according to FIG. 23, the separate electronic device 30 may transmit requests to the headphones 110 over the communication channels 20A-n via, for example, an application programming interface (API) for the headphones 110, requesting positional data within the environment with six DOF.
  • the request may include additional data to assist with performing the request.
  • the requests can be received by the augmentation function, which can operate the sensors to generate the requested positional data or other requested service.
  • the generated positional data can then be transmitted to the separate electronic device 30 for use, for example, in generating a display on the separate electronic device 30 as part of the immersive application provided to the user.
• the separate electronic device 30 may utilize the augmentation function and sensors 5A-5D in the headphones 110 to determine a position of the user’s head, for example, so that the display may be more satisfying to the user. Moreover, this may be provided while also relieving the separate electronic device 30 from determining the positional data.
  • the separate electronic device 30 may have its own sensors and provide a portion of the positional data (such as GPS data and orientation data for the device via an associated accelerometer) and therefore request supplemental positional data from the headphones 110.
  • the separate electronic device 30 may transmit the requests for supplemental positional data which, when returned by the headphones 110, can be combined with the portion of the positional data provided by the additional sensors of the separate electronic device 30.
• the separate electronic device 30 may therefore provide an improved immersive experience (such as a VR or AR immersive experience).
  • the separate electronic device 30 may provide the portion of the positional data (such as GPS data and orientation data for the device 30 via an associated accelerometer) from the sensors of the separate electronic device 30 to the headphones 110. In such embodiments, the separate electronic device 30 may transmit the requests for the headphones 110 to determine a position based on the portion of the positional data provided by the separate electronic device 30 and the positional data determined by the headphones 110.
  • the headphones 110 may then provide the absolute and/or relative position back to the separate electronic device 30.
  • the separate electronic device 30 may therefore provide an experience with improved performance, as certain calculations are offloaded to the headphones 110.
  • This approach can allow for distribution of computational tasks between the electronic device 30 and the headphones 110. This could range from a simple offloading of selected tasks to the headphones 110, to hosting of an application on the headphones 110 that is accessed via a user interface in the electronic device 30.
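• A device-side sketch of such offloading, building on the hypothetical handle() interface above (the merge logic and field names are assumptions):

```python
def build_user_pose(headphone_link, gps_fix, device_orientation):
    # The device contributes the positional data it already has and asks the
    # headphones for the rest.
    response = headphone_link.handle({
        "op": "get_pose_6dof",
        "seed": {"gps": gps_fix, "orientation": device_orientation},
    })
    if not response.get("ok"):
        # Fall back to device-only data if the headphones cannot help.
        return {"gps": gps_fix, "orientation": device_orientation}
    # Combine the device's own data with the supplemental headphone data.
    return {"gps": gps_fix, **response["pose"]}
```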
• the separate electronic device 30 may use the augmentation function of the headphones 110 to perform text-to-audio translation (i.e., generate spoken audio corresponding to provided text).
  • the separate electronic device 30 may transmit text data in addition to the request to the augmentation function as part of an electronic book reader application.
  • the text data can be received by the augmentation function for conversion to audio for listening by the user through the earpieces of the headphones 110.
  • the user may select an option in the electronic book reader application to play audio output that corresponds to the written text of an electronic book.
  • the text data is transmitted to the augmentation function for conversion to audio, which therefore relieves the electronic book reader application from converting the text to audio.
• the data transmitted to the headphones 110 may designate a characteristic of the audio playback, such as an accent, gender, or identity of the voice (such as a voice characteristic associated with a celebrity).
  • the characteristics may be stored with the headphones 110, such that the user of the headphones 110 can customize their experience in a way that is persistent regardless of the device providing the text.
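• A sketch of the text-to-audio offload with a voice profile persisted on the headphones; synthesize() is a placeholder for a real TTS engine, and all names here are assumptions:

```python
class TextToAudioService:
    def __init__(self, storage: dict):
        # The profile lives on the headphones, so it follows the user
        # regardless of which device supplies the text.
        self.profile = storage.get("voice_profile",
                                   {"accent": "neutral", "identity": "default"})

    def handle(self, request: dict) -> bytes:
        # A request may override the stored characteristics for this playback.
        profile = {**self.profile, **request.get("voice", {})}
        return self.synthesize(request["text"], profile)

    def synthesize(self, text: str, profile: dict) -> bytes:
        # Placeholder: a real implementation would run a TTS engine here.
        return f"[{profile['accent']}/{profile['identity']}] {text}".encode()

service = TextToAudioService(storage={})
audio = service.handle({"text": "Chapter one.", "voice": {"identity": "celebrity_a"}})
```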
  • the headphones 110 can be controlled using applications provided on the mobile device 130 or embedded in the headphones 110 itself via an SDK.
  • FIG. 24 illustrates an embodiment of the headphones 110 according to the inventive concepts within an operating environment.
  • the headphones 110 may be communicatively coupled to an electronic device 30 (sometimes referred to as a mobile device 130) by one or more communication paths 20A-n.
  • the communication paths 20A-n may include, for example, WiFi, USB, IEEE 1394, radio, though the present inventive concepts are not limited thereto.
  • the communication paths 20A-n may be used simultaneously and, in some embodiments, in coordination with one another.
  • the headphones 110 may exchange data and/or requests with the separate electronic device 30.
  • the headphones 110 may be communicatively coupled to one or more sensors 5A-5D.
  • the sensors 5A-5D may be integral to the headphones 110, attached to the headphones 110, or separate from the headphones 110.
• one or more sensors may be communicatively coupled to the separate electronic device 30, such as the sensors 30A-B illustrated in FIG. 24.
  • the sensors 30A-30B may be integral to the electronic device 30, attached to the electronic device 30, or separate from the electronic device 30.
  • the electronic device 30 may be in further communication with an external server 40 through a network 125.
  • the network 125 may be a large network such as the global network more commonly known as the Internet.
• the electronic device 30 may be connected to the network 125 through intermediate gateways such as the network gateway 35.
  • the electronic device may be connected to the network gateway through various means.
  • the network gateway 35 may be a radio-based telecommunication gateway, such as a base station, and the electronic device 30 may communicate with the network gateway 35 via radio communication such as that commonly used in cellular telephone networks.
• the network gateway 35 may be a network access point, and the electronic device 30 may communicate with the network gateway 35 via a wireless network (“WiFi”).
  • the network gateway 35 may further communicate with the network 125 via a communication method that is similar or different than the one used between the electronic device 30 and the network gateway 35.
• the electronic device 30 may communicate with the server to exchange information, data, and/or requests. In some embodiments, the electronic device 30 may share data provided by the headphones 110 with the server 40. In some embodiments, as discussed further herein, the electronic device 30 may retrieve instructions and/or data from the server 40 which may be sent to the headphones 110 for offloading and/or augmentation. In some embodiments, the electronic device 30 may provide requests/data to the headphones 110 for operation thereon, and resulting data provided by the headphones 110 responsive to the requests/data may be further sent from the electronic device 30 to the server 40. In some embodiments, the data provided by the headphones 110 to the electronic device 30 may be combined with data determined by the electronic device 30, such as sensor input from sensors 30A-30B, before being provided to the server 40.
  • the sensors 5A-5D and 30A-30B may be still cameras, video cameras, microphones, and/or position detectors.
  • the headphones 110 may also have operational controls 7 which can be transmitted to the electronic device 30.
  • the operational controls 7 may interact with applications running on the electronic device 30 so as to control operations of the headphones 110.
  • the electronic device 30 may be communicatively coupled to a connected device 34.
  • the connected device can be any connected device that supports an associated app running in an operating environment of the electronic device 30.
  • one or more of the sensors 5A-5D and/or 30A-30B may be associated with the connected device 34.
  • FIG. 25 illustrates an embodiment for a cross-platform application programming interface for connected audio devices. As illustrated in FIG. 25, the electronic device 30 may run a device operating system. In some embodiments, the device operating system may be a portable device operating system such as iOS or Android.
  • a headphone application may execute.
  • the headphone application may be communicatively coupled to the headphones 110 via the electronic device 30. Though illustrated as headphones 110 and headphone application within the figures, it will be understood that the present inventive concepts may apply to any connected wearable device.
  • the sensor data processor may communicate with sensors on the headphones 110 and/or the connected device 34.
  • the sensor data processor may operate to provide data from the sensors to third party applications.
  • the sensor data processor may provide a video stream from a camera coupled to the headphones 110 to a third party application for further processing by the third party application (e.g. Facebook Live).
  • the integration with the third party applications may be accomplished via an API framework coupled to the sensor data processor.
  • the third party applications may provide respective third party applets which are configured to execute within the headphone application.
  • the third party applets may be statically or dynamically linked to the headphone application.
  • the third party applets may be configured to send and/or receive data from the sensor data processor via the API framework.
  • the API framework may be a complete implementation of all the functions by which data may be exchanged between the third party applets and the sensor data processor. Individual ones of the third party applets may implement some or all of the functions defined within the API framework.
  • Portions of the API framework may support specific classes of devices and/or device implementations.
  • the API framework may define classes such as an AUDIO device and/or a VIDEO device.
  • Third party applets may implement commands to the generic devices and/or may implement customized commands specific to their implementation.
  • the third party applets may, in turn, communicate directly to their respective third party applications.
  • the third party applications may also be executing within the device operating system.
  • the third party applications may communicate with additional externally connected devices.
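• A skeleton of how the API framework's generic device classes and a third party applet might relate (assumed shape; the class and method names are illustrative only):

```python
from abc import ABC, abstractmethod

class AudioDevice(ABC):
    # Generic AUDIO device functions defined by the API framework.
    @abstractmethod
    def play(self, stream): ...

    @abstractmethod
    def set_volume(self, level: int): ...

class LiveStreamApplet(AudioDevice):
    """A third party applet implementing the generic AUDIO commands and
    adding a customized command specific to its own implementation."""

    def play(self, stream):
        print("relaying stream to the third party application")

    def set_volume(self, level: int):
        print(f"volume -> {level}")

    def custom_command(self, name: str, payload: dict):
        print(f"custom {name}: {payload}")
```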
  • the headphone application can provide connective functionality between the headphones 110 and other external devices and/or functions.
• the visually impaired can use video cameras on the headphones 110 to receive assistance in seeing while crossing the road.
  • Video from the video cameras on the headphones 110 may be provided to a third party application on the electronic device 30 to analyze the video stream.
• the video cameras may act as eyes, and the third party application may then audibly inform the wearer of the headphones 110 that it is safe to cross.
• users can look at products in a store, and a video camera on the headphones 110 will capture video of what the user is seeing and provide the video to a third party application.
• the third party application may provide targeted sales information based on user preferences, share product information, best prices, and reviews, and provide the ability to buy now.
  • teams can share and collaborate quickly on what they are working on via cameras on the headphones 110 as they look at their computer screens, job sites, fashion shows, medical demonstrations, concerts, etc.
• the headphones 110 may have built-in technology augmented with third party applications to help teams be more efficient.
  • the headphones 110 may include a cross platform SDK that allows users to interact with third party applications that include artificial intelligence platforms, such as, for example, Siri, Cortana, Google Voice, Watson, etc.
• the headphones 110 may be remotely updatable and may learn user behavior and continue to enhance user experiences with machine learning and bot integration.
  • headphones 110 include still and/or video cameras
  • users can take pictures or videos of everything they see, not just what they see on a screen of the electronic device 30.
• the headphones 110 may send the content directly to the electronic device 30, to the cloud, or through streaming audio and video to external platforms and/or applications such as Facebook Live, Youtube Live, Periscope, Snapchat, etc.
  • FIG. 26 illustrates another embodiment for a cross-platform application programming interface for connected audio devices.
• the embodiments of FIG. 26 are similar to those illustrated in FIG. 25 in that they include a Sensor Data Processor and API framework within a headphone application executing in a device operating system on the electronic device 30.
  • the third party applications may communicate directly with the API framework without requiring the presence of third-party applets within the headphone application.
  • the third party applications can dynamically access functionality of the API framework without a pre-existing third party applet.
  • the API framework may be provided as a client-server framework handling requests sent from the third party applications.
  • the headphone application may recognize the existence of third party applications within the device operating system which do not have a current connection to the headphone application.
  • the unconnected third party application may represent a newly-added connected device. Responsive to this detection, the headphone application may initiate communication with the third party application and/or prompt the user to perform actions to integrate the third party application. The communication with the third party application may take place over the API framework.
• communication between the headphone application and respective ones of the third party applications may be unidirectional or bidirectional, and may be initiated by the headphone application or the third party application.
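• The detection-and-integration flow of this client-server variant might be sketched as follows (the scanning and prompting calls are hypothetical):

```python
class ApiFrameworkServer:
    def __init__(self):
        self.connected = set()

    def poll_for_new_applications(self, installed_apps):
        for app in installed_apps:
            if app not in self.connected:
                # A newly detected application may represent a newly added
                # connected device; initiate communication and/or prompt
                # the user to integrate it.
                self.initiate_integration(app)

    def initiate_integration(self, app):
        print(f"prompting user to integrate {app}")
        self.connected.add(app)

server = ApiFrameworkServer()
server.poll_for_new_applications(["thermostat_app", "music_app"])
```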
• the embodiments of FIGS. 25 and 26 may be combined into an embodiment which utilizes the client-server framework described with respect to FIG. 26 as well as the statically/dynamically linked third party applets of FIG. 25.
• FIG. 27 illustrates an embodiment of a smart remote control 100 according to the present inventive concepts within an operating environment that may be utilized with the headphones 110 as described herein. It will be understood that the inputs provided by the headphones 110 as described herein can also provide the functions of the smart remote control, so that the systems and operations described herein can be carried out without the smart remote control 100, but rather only through use of the headphones 110.
  • the smart remote control 100 may be communicatively coupled to an electronic device 30 by one or more communication paths 200 A-n. In some embodiments, the smart remote control 100 may be physically separate from the electronic device 30.
  • the communication paths 200A-n may include, for example, WiFi, USB, IEEE 1394, Bluetooth, Bluetooth Low-Energy, electrical wiring, and/or various forms of radio, though the present inventive concepts are not limited thereto.
  • the communication paths 200A-n may be used simultaneously and, in some embodiments, in coordination with one another.
  • the smart remote control 100 may exchange data and/or requests with the electronic device 30.
  • the electronic device 30 may additionally be connected to headphones 10 via communication paths 20 A-n.
  • the communication paths 20 A-n may include, for example, WiFi, USB, IEEE 1394, Bluetooth, Bluetooth Low-Energy, electrical wiring, and/or various forms of radio, though the present inventive concepts are not limited thereto.
  • the communication paths 20A-n may be used simultaneously and, in some embodiments, in coordination with one another.
  • the headphones 10 may exchange data and/or requests with the electronic device 30.
  • the electronic device 30 may be in further communication with an external server 40 through a network 125.
  • the network 125 may be a large network such as the global network more commonly known as the Internet.
  • the electronic device 30 may be connected to the network 125 through intermediate gateways such as the network gateway 35.
  • the electronic device 30 may be connected to the network gateway 35 through various means.
  • the network gateway 35 may be a radio-based telecommunication gateway, such as a base station, and the electronic device 30 may communicate with the network gateway 35 via radio communication such as that commonly used in mobile telephone networks.
• the network gateway 35 may be a network access point, and the electronic device 30 may communicate with the network gateway 35 via a wireless network (“WiFi”).
  • the network gateway 35 may further communicate with the network 125 via a communication method that is similar or different than the one used between the electronic device 30 and the network gateway 35.
• the communication paths described herein are not intended to be limiting. One of skill in the art will recognize that there are multiple communication methods by which the electronic device 30 may be connected to the other devices described herein.
• the electronic device 30 may communicate with the server to exchange information, data, and/or requests.
  • the electronic device 30 may share data provided by the smart remote control 100 and/or the headphones 10 with the server 40.
  • the electronic device 30 may retrieve instructions and/or data from the server 40 responsive to input received from the smart remote control 100.
  • the electronic device 30 may be communicatively coupled to a connected device 34.
  • the connected device 34 can be any connected device that supports an associated application running in an operating environment of the electronic device 30.
  • the electronic device 30 may exchange data and/or control the connected device 34 responsive to input received from the smart remote control 100. Though illustrated as being connected to the connected device 34 through the network gateway 35, this illustration is not intended to be limiting.
  • the electronic device 30 may directly connect to the connected device 34 via similar communication paths as described with respect to communications paths 200 A-n and 20A-n.
  • a path between the electronic device 30 and the connected device 34 may include, for example, WiFi, USB, IEEE 1394, Bluetooth, Bluetooth Low-Energy, electrical wiring, and/or various forms of radio, though the present inventive concepts are not limited thereto.
  • the communications paths 20A-n may be different communications paths than the communications paths 200A-n. That is to say that, in some embodiments, the electronic device 30 may communicate with the smart remote control 100 via different communication paths than with the headphones 10, the connected device 34, and/or the server 40. In some embodiments, the electronic device 30 may communicate with the smart remote control 100 via substantially similar communication paths as the headphones 10, the connected device 34, and/or the server 40.
  • the input received from the smart remote control 100 may be transmitted to the electronic device 30.
  • the input provided by smart remote control 100 may be used to interact with applications running on the electronic device 30 so as to control operations of the headphones 10, the server 40 and/or the connected device 34.
  • FIG. 28A illustrates a high-level block diagram showing an example architecture of a control device, such as smart remote control 100 as described herein, and which may implement the operations described herein. It will be understood that the headphones 110 can provide the functions of the smart remote control 100 in some embodiments.
• the smart remote control 100 may include one or more processors 610 and memory 620 coupled to an interconnect 630.
  • the interconnect 630 may be an abstraction that represents any one or more separate physical buses, point to point connections, or both connected by appropriate bridges, adapters, or controllers.
• the interconnect 630 may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called “Firewire.”
  • the processor(s) 610 may control the overall operation of the smart remote control 100. As described herein, the one or more processors 610 may be configured to respond to input provided to the smart remote control 100 and transfer that input to the electronic device 30. In certain embodiments, the processor(s) 610 accomplish this by executing software or firmware stored in memory 620.
• the processor(s) 610 may be, or may include, one or more programmable general purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), trusted platform modules (TPMs), or a combination of such or similar devices.
  • the memory 620 is or includes the main memory of the smart remote control 100.
  • the memory 620 represents any form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices.
  • the memory 620 may contain code 670 containing instructions according to the techniques disclosed herein.
  • a network adapter 640 may be connected to the processor(s) 610 through the interconnect 630.
  • the network adapter 640 may provide the smart remote control 100 with the ability to communicate with remote devices, including the electronic device 30, over a network and may be, for example, an Ethernet adapter, a Bluetooth adapter, etc.
  • the network adapter 640 may also provide the smart remote control 100 with the ability to communicate with other computers.
  • the code 670 stored in memory 620 may be implemented as software and/or firmware to program the processor(s) 610 to carry out actions described above. In certain embodiments, such software or firmware may be initially provided to the smart remote control 100 by downloading it from a remote system through the smart remote control 100 (e.g., via network adapter 640). Though referenced as a single network adapter 640, it will be understood that the smart remote control 100 may contain multiple network adapters 640 that may be used to communicate over multiple types of networks.
  • One or more input device(s) 660 may also be connected to the processor(s) 610 through the interconnect 630.
• the input device(s) 660 may receive input from one or more sensors coupled to the smart remote control 100.
  • the input device(s) 660 may include touch-sensitive sensors and/or buttons. Though illustrated as a single element, the smart remote control 100 may include multiple input devices 660.
• the input device(s) 660 may communicate via the interconnect 630 with the memory 620, the processors 610, and/or the network adapter(s) 640 to store, analyze, and/or communicate the input received by the input device(s) 660 to the smart remote control 100, the electronic device 30, and/or another device.
  • FIG. 28B illustrates a high-level block diagram showing an example architecture of an electronic device, such as electronic device 30, as described herein, and which may implement the operations described herein.
  • the electronic device 30 may include one or more processors 710 and memory 720 coupled to an interconnect 730.
  • the interconnect 730 may be an abstraction that represents any one or more separate physical buses, point to point connections, or both connected by appropriate bridges, adapters, or controllers.
• the interconnect 730 may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called “Firewire.”
  • the processor(s) 710 may control the overall operation of the electronic device 30. As described herein, the one or more processors 710 may be configured to receive input provided from the smart remote control 100 and execute operations of a common application programming interface (API) framework responsive to that input. In certain embodiments, the processor(s) 710 accomplish this by executing software or firmware stored in memory 720.
  • the processor(s) 710 may be, or may include, one or more programmable general purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), field- programmable gate arrays (FPGAs), trusted platform modules (TPMs), or a combination of such or similar devices.
  • the memory 720 is or includes the main memory of the electronic device 30.
  • the memory 720 represents any form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices.
  • the memory 720 may contain code 770 containing instructions according to the techniques disclosed herein.
  • the network adapter(s) 740 may provide the electronic device 30 with the ability to communicate with remote devices, including the smart remote control 100, the connected device 34 (see FIG. 1) and/or the server 40 (see FIG. 1), over a network and may include, for example, an Ethernet adapter, a Bluetooth adapter, etc.
  • the network adapter(s) 740 may also provide the electronic device 30 with the ability to communicate with other computers.
  • the code 770 stored in memory 720 may be implemented as software and/or firmware to program the processor(s) 710 to carry out actions described above.
  • such software or firmware may be initially provided to the electronic device 30 by downloading it from a remote system (e.g., via network adapter 740).
  • the mass storage device 750 may contain the code 770 for loading into the memory 720.
  • the mass storage device 750 may also contain a data repository for storing configuration information related to the operation of the electronic device 30 and/or the smart remote control 100. That is to say that the mass storage device 750 may maintain data used to configure and/or operate the smart remote control 100. This data may be stored in the mass storage device 750 of the electronic device 30 and communicated to the smart remote control 100 via, for example, the network adapter 740.
  • the headphones 110 can receive input from the smart remote control 100 for interaction with connected devices using the cross-platform SDK described above.
  • the remote control application may include a cross platform SDK that allows users to interact with third party applications that include artificial intelligence platforms, such as, for example, Siri, Cortana, Google Voice, Watson, etc.
  • the remote control application may include a software development kit (SDK) to facilitate development and/or interaction with the API of the remote control application.
  • FIG. 29 illustrates another embodiment for a cross-platform API capable of receiving input at the electronic device 30 from the smart remote control 100 for interaction with connected devices.
  • third party applications may communicate directly with the API framework without requiring the presence of third-party applets within the remote control application.
  • the third party applications can dynamically access functionality of the API framework without a pre-existing third party applet.
  • the API framework may be provided as a client-server framework handling requests sent from the third party applications.
  • the remote control application may recognize the existence of third party applications within the device operating system which do not have a current connection to the remote control application.
  • the unconnected third party application may represent a newly-added connected device. Responsive to this detection, the remote control application may initiate communication with the third party application and/or prompt the user to perform actions to integrate the third party application. The communication with the third party application may take place over the API framework.
  • communication between the remote control application and respective ones of the third party applications may be unidirectional or bidirectional, and may be initiated by the remote control application or the third party application.
  • FIG. 29 illustrates an embodiment in which input provided at the smart remote control 100 is provided to the electronic device 30 for operation of further devices in communication with electronic device 30, such as headphones 10, connected device 34, and/or server 40.
  • the smart remote control 100 may have an input sensor 107.
  • the input sensor 107 may be a touch sensitive control, such as a capacitive and/or resistive sensor.
  • the input sensor 107 may detect a touch of the user on the input sensor 107.
  • the input sensor 107 may be a proximity sensor capable of sensing input provided proximate to, but not necessarily touching, the input sensor 107.
  • the input sensor 107 may be one or more buttons.
  • the input sensor 107 may be a video camera or microphone when the headphones 110 function as the remote.
  • the input sensor 107 may be configured to detect a single touch of a user on or near the input sensor 107. In some embodiments, the input sensor 107 may be configured to detect a“swipe” comprising a sequential series of contacts across or near the input sensor 107. In some embodiments, the input sensor 107 may be configured to detect a series of touches and/or movements that comprise a gesture. Systems and methods for detecting user input comprising touches and gestures are described in U.S. Patent Application
  • the input received from the input sensor 107 may be provided to the electronic device 30.
  • the electronic device 30 may determine that the input is to be used to control an additional device.
  • the additional device may be a connected device 34, an external server 40, and/or headphones 10, though the present inventive concepts are not limited thereto.
  • the electronic device may be capable of controlling a plurality of connected devices 34 simultaneously in response to input data.
  • the electronic device 30 may control the further devices, such as connected device 34, external server 40, and/or headphones 10 in multiple ways.
  • the electronic device 30 may process the input data from the input sensor 107 and responsively operate portions of a third party application.
  • the electronic device 30 may pass on the input data from the input sensor 107 to the third party application, for the third party application to process.
  • the electronic device 30 may pass on the input data directly to the further device, such as connected device 34, external server 40, and/or headphones 10.
  • the electronic device 30 may determine which further device and/or third party application to provide the input based on the contents of a data repository.
  • the data repository may contain configuration data and preferences data.
  • the electronic device 30 may analyze the input first and then, based on the configuration data and/or preferences data, provide the input to the third party application and/or further device, such as the connected device 34, an external server 40, and/or headphones 10.
• though the third party application may communicate with a further device, such as the connected device 34, an external server 40, and/or headphones 10, it will be understood that not all input data must be communicated to an additional device.
  • the input data provided from the input sensor 107 may be communicated to a third party application that controls operations of the electronic device 30.
  • the third party application may control a volume of the electronic device 30.
  • the configuration data may indicate that certain input should be provided to a particular third party application and/or further device based on the type of input provided. For example, the configuration data may indicate that if a particular input is received, it is to be provided to a particular third party application. For example, the configuration data may indicate that a vertical swipe of the input sensor 107 is to advance a track of music currently playing. Upon receipt of such an input from the input sensor 107, the electronic device 30 may indicate to a third party application for playing music that a track-advance command has been received. The third party application for playing music may advance to a different music track and transmit the new music track to the headphones 10.
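• The configuration-driven routing described above might be sketched as a simple lookup table (the gesture names, targets, and commands are illustrative):

```python
CONFIG = {
    "swipe_vertical": ("music_app", "advance_track"),
    "gesture_s":      ("sharing_app", "share_with_server"),
    "gesture_up":     ("thermostat_app", "increase_temperature"),
}

def dispatch(gesture: str, apps: dict) -> None:
    target, command = CONFIG.get(gesture, (None, None))
    if target in apps:
        # Indicate to the third party application which command was received.
        apps[target](command)

# A vertical swipe reaches the music application as a track-advance command.
dispatch("swipe_vertical", {"music_app": print})
```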
  • the configuration data may indicate that a complex s-shaped gesture received at the input sensor 107 is to share a particular piece of data with an external server 40.
  • the electronic device 30 may indicate to a third party application for sharing data that a message is to be sent to the external server 40.
  • the third party application for sharing data may transmit the message to the external server 40 and the external server 40 may process the message.
  • the gesture may also be recognized by the video camera on the headphones 110.
  • the configuration data may indicate that a gesture shaped as an up-arrow received at the input sensor 107 is to increase a temperature of a connected device 34 comprising a networked thermostat.
  • the electronic device 30 may indicate to a third party application controlling the connected device 34 that a temperature change is needed.
  • the third party controlling the connected device 34 may transmit an appropriate communication, which may be proprietary to the connected device 34, to increase the current temperature.
  • the configuration data may also indicate additional ways in which the electronic device 30 may determine which third party application and/or further device is to receive communication in response to the input data from the input sensor 107.
  • the third party application and/or device that will receive the communication in response to the input data from the input sensor 107 depends on which external devices are in communication with the electronic device 30.
• a particular up-arrow gesture may be associated with the initiation of noise cancelling if headphones 10 are detected as being connected to the electronic device 30. If headphones 10 are not detected, the up-arrow gesture may be associated with an increase in temperature for a connected device 34, such as a networked thermostat, if the connected device 34 is in communication with the electronic device 30.
  • the up-arrow gesture may be associated with increasing a volume of the electronic device 30.
• the electronic device 30 may dynamically change what operations are performed in response to the input data from the input sensor 107 as conditions on the electronic device 30 change.
  • the third party application and/or device which receives the communication in response to the input data from the input sensor 107 may depend on which third party applications are currently operating on the electronic device 30 independently of any connected devices. For example, a forward swipe gesture received as input from the input sensor may be provided to a music application to advance a music track if a third party music application is running, and may be provided to a phone application to drop a current call if a call is currently active on the electronic device 30.
  • the third party application and/or device which receives the communication in response to the input data from the input sensor 107 may depend on location of the electronic device 30.
  • the electronic device 30 may include functionality configured to determine the location of the electronic device 30.
  • the electronic device 30 may have a GPS sensor or other circuit capable of determining a current location. The electronic device 30 may use this current location to further differentiate which third party application may receive data corresponding to the input provided from the input sensor 107.
• the electronic device 30 may determine that a particular gesture received from the input sensor 107 is to be provided to a third party application associated with a connected device 34 including a thermostat. If the electronic device 30 determines that the electronic device 30 is currently located remote from the home of the user of the electronic device 30, the electronic device 30 may determine that the particular gesture received from the input sensor 107 is to be discarded or, in some embodiments, provided instead to an external server 40.
  • the external server 40 may be configured to remotely connect to the thermostat at the house of the user of the electronic device 30.
  • the third party application and/or device which receives the communication in response to the input data from the input sensor 107 may depend on a determined speed of the electronic device 30.
  • the electronic device 30 may include functionality configured to determine motion and/or speed of the electronic device 30.
  • the electronic device 30 may have an accelerometer sensor or other circuit capable of determining motion of the electronic device 30. The electronic device 30 may use this determined speed to further differentiate which third party application may receive data corresponding to the input provided from the input sensor 107.
  • the electronic device 30 may determine that a particular gesture received from the input sensor 107 is to be provided preferentially to a third party application associated with the operation of a vehicle. For example, if moving quickly, a gesture interpreted as an up-arrow may preferentially be provided to a third party application associated with increasing the volume of an automobile sound system. If the electronic device 30 determines that the electronic device 30 is currently moving at a speed less than a particular threshold, the electronic device 30 may determine that the particular gesture received from the input sensor 107 is to be preferentially provided to a third party application associated with operation of the electronic device 30 and/or other connected device. For example, if not moving or moving slowly, the gesture interpreted as an up-arrow may preferentially be provided to a third party application associated with increasing the volume of the electronic device 30 and/or headphones 10 connected to the electronic device 30.
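• The context-dependent routing of a single up-arrow gesture might be sketched as follows; the speed threshold and target names are assumptions:

```python
DRIVING_SPEED_MPS = 8.0   # assumed threshold separating driving from walking

def route_up_arrow(context: dict) -> str:
    if context.get("speed_mps", 0.0) >= DRIVING_SPEED_MPS:
        # Moving quickly: prefer the vehicle application.
        return "car_audio_app: increase volume"
    if context.get("headphones_connected"):
        return "headphones: enable noise cancelling"
    if context.get("thermostat_connected"):
        return "thermostat_app: increase temperature"
    return "device: increase volume"

print(route_up_arrow({"speed_mps": 12.0}))
print(route_up_arrow({"headphones_connected": True}))
```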
  • the preference data on the electronic device 30 may indicate that certain input should be provided to a particular third party application and/or further device based on a user and/or system preference.
• the preference data may indicate that a certain destination has priority if the electronic device 30 has multiple further devices and/or third party applications to which data associated with the input data from the input sensor 107 may be sent.
  • the preference data may also indicate a particular mapping for a gesture to a particular operation by the electronic device 30.
  • the preference data may, in some embodiments, override the configuration data.
  • the preference data may be provided as part of the input data.
  • the input data provided by the user at the smart remote control 100 may include two portions: a first portion that identifies a particular device and/or third party application, and a second portion that identifies additional input to be forwarded to that application.
  • a first motion on an input sensor 107 of the smart remote control 100 may indicate that the next input is to be provided to a texting third party application
  • a second motion on the input sensor 107 of the smart remote control 100 may input the particular command, such as the sending of a preformatted text message, to be sent to the texting third party application.
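• The two-portion input might be handled with a small state machine, sketched below with hypothetical gesture and application names:

```python
class TwoPartInput:
    def __init__(self):
        self.pending_target = None

    def on_gesture(self, gesture: str):
        if self.pending_target is None:
            # First portion: identify the target third party application.
            self.pending_target = {"circle": "texting_app"}.get(gesture)
            return None
        # Second portion: forward the command to that application.
        target, self.pending_target = self.pending_target, None
        return (target, {"tap": "send_preformatted_text"}.get(gesture))

router = TwoPartInput()
router.on_gesture("circle")
print(router.on_gesture("tap"))   # ('texting_app', 'send_preformatted_text')
```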
  • the preference data may be kept for a particular user.
  • the preference data may be accessed by the electronic device 30 in response to a particular smart remote control 100 and/or an identification of a particular user using the smart remote control 100.
  • the electronic device 30 may be capable of managing multiple smart remote controls 100, and preference data may be maintained for each of the smart remote controls 100.
  • the preference data may be based on a particular unique value that is associated with the respective smart remote controls 100 that is passed to the electronic device 30 during communication with the smart remote control 100.
  • this unique value may include a serial number of the smart remote control 100, and/or an address of the smart remote control 100 on one of the communications paths 200A-n (see FIG. 1).
  • the electronic device 30 may be able to access an RFID associated with the smart remote control 100 to determine a unique identity for the smart remote control 100.
  • the smart remote control 100 may have other inputs which allow a specific user to be identified.
  • the smart remote control 100 may have a fingerprint sensor.
  • the fingerprint sensor may allow a user of the smart remote control 100 to identify themselves to the electronic device 30 and access features of the smart remote control 100.
  • the electronic device 30 may use a fingerprint retrieved via smart remote control 100 to identify the user of the smart remote control 100 so as to load a particular set of preference data for the user.
  • the fingerprint sensor of the smart remote control 100 may be used as an additional identification and/or security device for the electronic device 30.
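  • a minimal sketch of per-user preference loading, assuming a lookup keyed by the remote control's unique value and an optional fingerprint-resolved user name (the keys and mappings below are invented for illustration):

      # Hypothetical sketch: per-user preferences override per-remote defaults.
      PREFERENCES = {
          ("SN-001", "alice"): {"up_arrow": "thermostat_up"},
          ("SN-001", None):    {"up_arrow": "device_volume_up"},
      }

      def load_preferences(remote_id, user=None):
          return (PREFERENCES.get((remote_id, user))
                  or PREFERENCES.get((remote_id, None), {}))

      print(load_preferences("SN-001", user="alice"))  # fingerprint-identified user
      print(load_preferences("SN-001"))                # device-level default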
  • FIGs. 30-34 illustrate example embodiments of a smart remote control 100 according to the present inventive concepts.
  • the smart remote control 100 may be embodied as a separate stand-alone device.
  • the input sensor 107 may be located on one or both sides of the smart remote control 100.
  • the configurations of the input sensor 107 may differ depending on which side of the smart remote control 100 input is received on. For example, a particular gesture on a first side of the smart remote control 100 may be interpreted separately and/or differently from the same gesture on a second side of the smart remote control 100.
  • the smart remote control 100 as illustrated in FIG. 30 may include a battery.
  • the battery may be charged via a wired connection to the smart remote control 100 and/or wirelessly.
  • FIG. 31 illustrates an example embodiment of the smart remote control 100 in which the smart remote control 100 is incorporated as part of a phone case.
  • the electronic device 30 may be the phone contained within the phone case, but the present inventive concepts are not limited thereto.
  • the smart remote control 100 may be coupled to the phone so as to receive power from the phone and/or may have a separate battery. In some embodiments, the battery used to power the smart remote control 100 may provide additional charging for the phone.
  • FIG. 32 illustrates an example embodiment of the smart remote control 100 in which the smart remote control 100 is incorporated as a set of earbuds.
  • the smart remote control 100 may be inline, or otherwise connected with, a wire of the earbuds.
  • the smart remote control 100 may be integrated into the earbud itself.
  • the smart remote control 100 may have a separate battery and/or may receive power over the wire of the earbuds.
  • the smart remote control 100 may automatically communicate with an electronic device 30 to which the earbuds are connected, but the present inventive concepts are not limited thereto.
  • the earbuds may also have all of the functions associated with the headphones 110 including hot keys, biosensors, and all other sensors described herein.
  • FIG. 33 illustrates an example embodiment of the smart remote control 100 in which the smart remote control 100 is incorporated with an audio jack.
  • the smart remote control 100 may be configured to be inserted into a standard audio jack, such as a 3.5mm headphone jack commonly used on some phones, though the present inventive concepts are not limited thereto.
  • the smart remote control 100 may have a separate battery and/or may receive power over the audio jack.
  • the smart remote control 100 may automatically communicate with an electronic device 30 to which it is connected through the audio jack, but the present inventive concepts are not limited thereto.
  • FIG. 34 illustrates an example embodiment of the smart remote control 100 in which the smart remote control 100 is incorporated with a DC power connector.
  • the DC power connector may be configured to insert into a cigarette lighter receptacle in an automobile.
  • the smart remote control 100 may have a separate battery and/or may receive power from the DC power connector.
  • the smart remote control 100, when used in an automobile, may automatically communicate with a nearby electronic device 30, such as a personal phone of a driver of the automobile, to control a sound system of the automobile, but the present inventive concepts are not limited thereto.
  • the smart remote control 100 may include a pivot point 910 to allow a face of the smart remote control 100 to be tilted for convenient access.
  • FIG. 35 illustrates an embodiment in which the electronic device 30 may provide input to an external device based on input from a smart remote control 100.
  • the electronic device 30 may receive input from an input sensor 107 of a smart remote control 100. As described herein, this input may be communicated over communications paths 200A-n between the smart remote control 100 and the electronic device 30.
  • the operations may continue at operation 1020, in which the electronic device 30 accesses a data repository to identify a user input pattern associated with the input received from the input sensor 107.
  • the user input pattern may be a gesture performed by a user at the smart remote control 100.
  • the operations may continue at operation 1030, in which the electronic device 30 identifies a third party application, an external device and/or a third party application associated with an external device that corresponds with the user input pattern.
  • the external device may be, for example, a connected device 34, external server 40, and/or headphone 10, as described herein.
  • the operations may continue at operation 1040, in which the electronic device 30 provides data associated with the input received from the smart remote control 100 to the third party application, the external device and/or the third party application associated with the external device.
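  • the four operations can be summarized in a short sketch; the repository and transport interfaces below are assumed placeholders, not the specification's API:

      # Hypothetical sketch of operations 1010-1040.
      def handle_remote_input(raw_input, pattern_repository, target_by_pattern, send):
          # 1010: input received from the input sensor 107 over a path 200A-n.
          # 1020: access the data repository to identify the user input pattern.
          pattern = pattern_repository.match(raw_input)
          if pattern is None:
              return  # unrecognized input may simply be discarded
          # 1030: identify the third party application and/or external device.
          target = target_by_pattern.get(pattern)
          if target is None:
              return
          # 1040: provide data associated with the input to that target.
          send(target, {"pattern": pattern, "raw": raw_input})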
  • the headphones, methods, and systems described herein can be utilized to provide applications configured, for example, to deliver particular solutions. Accordingly, the systems, devices, and methods shown in the figures herein can provide an underlying framework for those solutions. For example, in some embodiments according to the inventive concept, the method illustrated, for example, by FIG. 4 using a companion app to stream live audio/video can provide the basic framework for a particular application, some of which is described herein below in greater detail.
  • the operations described herein are carried out by a native application that is resident on the headphones 110 running, for example, on a Snapdragon microprocessor, as shown for example in FIGs. 3 and 10.
  • the operations can be carried out by an application that is resident on a mobile device, such as a smartphone.
  • the operations can be carried out by a combination of applications which operate on multiple platforms across the network.
  • the inputs provided to the headphones 110 can be provided by audio commands via the microphones included on the headphones 110. Accordingly, native applications within the headphones 110 or applications resident elsewhere can be utilized to translate audio commands which can then be executed as part of the embodiments described herein.
  • the headphones 110 can provide a base platform for implementation of the personal assistant for the user.
  • the personal assistant can respond to queries regarding the user's calendar, weather, events, etc.
  • the personal assistant implemented by the headphones 110 can determine that the user is scheduled for an upcoming trip including a long air segment.
  • the personal assistant can download a suggested playlist of audio selections for listening during that air segment.
  • the personal assistant can receive feedback from the user regarding the suitability of, or the user's reaction to, the playlist.
  • the personal assistant can be utilized to schedule requested events, such as doctor's appointments, auto repair appointments, etc.
  • the headphones 110 can operate with remote servers that provide the user's schedule, personal information, or other information utilized to anticipate needs or desires, as well as remote servers that are utilized to fetch information associated with events to be supported, such as airline schedules, hotel reservations, etc.
  • the headphones 110 can support an application (such as a preloaded native application configured for VOIP call or message setup) that enables call set up or message set up for particular applications.
  • the user may speak a command indicating that a phone call is to be initiated among a group of recipients.
  • the application, operating within the headphones 110 or remotely, can set up the call with the group by accessing the user's contacts list to determine contact numbers for the individuals, including, in some embodiments according to the inventive concept, those individuals identified by a particular group (i.e., such as the engineering group).
  • the headphones 110 can utilize the application operating thereon to set up a call with those members of the engineering team who are identified in the user's contact list, using the numbers associated with those members.
  • the same basic functionality can be provided through messaging rather than voice.
  • those calls may be logged, recorded, and indexed for content.
  • the calls can be translated to other languages preferred by particular group members.
  • sensors can be included in the headphones 110, utilizing the native application or a remotely supported application, to monitor the user's biometric functions (such as heart rate, blood pressure, oxygen levels, movements, etc.). Still further, the same basic operations can be provided via in-ear headphones rather than over the ear or on the ear headphones. In such embodiments, the in-ear headphones can support the same basic functions (such as hot keys, capacitive touch surfaces, the biometric sensors described above, etc.). Other sensors may also be utilized.
  • the earbuds/the headphones 110 can include a native application that provides meditation coaching to the user, or analytics that record movements or activities on the part of the user, which can then be fed back to the user for later use.
  • the headphones 110 may support an education environment wherein users/students may access remote applications or embedded applications such as Rosetta Stone, wherein the user can learn a foreign language through voice interaction through the headphones 110 and a remote server. Accordingly, when the user is learning a foreign language, the foreign language prompts or lessons can be provided to the user via the headphones 110 from the remote server, whereupon the user may provide audio responses during the lesson which are then forwarded either to the native application embedded in the headphones 110 or the remote server that supports the application.
  • the camera can be used to live stream a user undergoing reading instruction, where a remote teacher uses the streamed video to monitor the student's progress and correct the student where needed.
  • these same arrangements may be utilized to support a group of students which are learning collaboratively.
  • individual users may be able to interact with selected other individual users to collaborate on particular points of interest in a lesson.
  • a teacher or instructor may be able to selectively interact with only a group of students that need particular assistance whereas the remainder of the students may proceed with the lesson.
  • such implementations may be provided across a plurality of headphones in communication with the server, each conducting communications to/from the headphones 110 to provide the audio instruction as part of the educational environment as well as the audio responses from the students. Further, inputs may also be provided via the touch sensitive surface of the headphones 110.
  • the educational environment may also include the provisioning of live streaming video from students (such as during a lab or experiment) so that the instructor can monitor their progress or correct for misunderstandings during the lesson.
  • the live streaming can be stored for future reference by the instructor or by the students who wish to review the lessons after the fact.
  • the headphones 110 can be utilized to provide a remote presence by which users can act as local observers for remote actors who can provide guidance (via audio) to the local user wearing the headphones 110.
  • live video streaming can be provided to the remote actor whereupon audio instructions can be provided to the local user who could then act on instructions given by the remote actor.
  • the local user may act under the instructions of a remote physician to examine certain aspects of a patient's physiology or symptoms.
  • a native application can be used to process an image (including a symptomatic area), and relevant databases or libraries are accessed to match the image to a known condition.
  • the video streamed may be zoomed using voice or touch input using the capacitive touch surface.
  • the headphones 110 can be linked to an artificial intelligence that is configured to associate particular visual symptoms with particular conditions which may be suggested to the wearer remotely.
  • the user may be directed to aim the cameras at a different portion of the body to gather additional information, or an audio signal may be played to the user indicating the likely condition (e.g., chicken pox), which may in turn generate a message from the headphones 110 to a telemedicine registered doctor having a specialization in the particular condition.
  • remote experts can guide local users who are tasked with a procedure or assembly that would otherwise be error prone or too lengthy without the guidance of the remote actor.
  • a remote technician may assist a local user in the setup of a computer system or the resolution of a software issue.
  • Figure 37 is a schematic representation of a telemedicine system 3700 including the Headphones 110 as described herein.
  • Figure 37 also illustrates that the Headphones 110 are wirelessly coupled to a system 3715 which can provide an artificial intelligence service configured to process images and/or audio provided by the Headphones 110 to determine a possible diagnosis of a subject 3750 based on the image data and/or audio data in some embodiments according to the invention.
  • the Headphones 110 can include a plurality of video cameras each of which can sample and generate a live video stream that can be provided to the system 3715 via a wireless connection 3720. It will be also understood that the Headphones 110 can include a plurality of microphones that are configured to receive audio signals 3705 which then can be streamed to the system 3715 via the wireless connection 3720.
  • the wireless connection 3720 can be any type of wireless interface described herein.
  • the Headphones 110 can also include internal speakers that generate audio 3725 for the wearer.
  • the Headphones 110 can be worn by a local user to support operation in the telemedicine system 3700 in some embodiments according to the invention.
  • the local user can be a third party that is assisting with an examination of the subject 3750 and acting under the direction of a remote user 3735, such as a doctor or other medical professional.
  • the local user can be a doctor that is examining the subject 3750 or performing surgery.
  • the doctor may utilize the Headphones 110 to sample live video (or static images) as well as audio 3705 for storage on a remote system 3740, such as a system that would store medical records or insurance data.
  • the doctor may utilize the headphones 110 to record a diagnosis derived by the doctor which in turn is transmitted to the system 3740 for storage thereon.
  • the live video (or static images) as well as audio 3705 can be generated during a surgical procedure, which can be stored.
  • the local user can be a third party that employs the Headphones 110 under the instruction of the remote user 3735 by listening to the audio signals 3725 that are provided by the remote user 3735.
  • the remote user 3735 may instruct the local user to pan in a certain direction so that a particular part of the anatomy is recorded by the video 3710.
  • the remote user 3735 can relay questions to the local user that can be repeated to the subject 3750. The responses from the subject 3750 can be relayed to the remote user 3735 via the audio signals 3705 or provided directly via the microphones.
  • the local user can provide additional commentary on the subject 3750 while operating under the control of the remote user 3735.
  • all of the data provided via the Headphones 110 can be recorded on the system 3740.
  • the data may also be provided to a system 3730 accessed by the remote user 3735.
  • the remote user 3735 may utilize the system 3730 to assist in a diagnosis based on the data provided by the Headphones 110.
  • each of the systems shown in Figure 37 can be interfaced to the Headphones 110 via an SDK or API as described herein.
  • the system 3740 can include a portion thereof or a front end that provides translation of audio data to text for storage by the system 3740.
  • the local user can be the subject 3750 who can perform a self-exam using the Headphones 110.
  • the subject 3750 may act as the third party described above to provide information to the remote user 3735 and may operate under the instructions thereof via the audio 3725 to, for example, direct the video 3710 to the area of interest and to provide audio feedback 3705 to the remote user 3735 or system 3715.
  • the system 3715 can provide a diagnosis of the subject 3750 based on the audio and/or video provided from the Headphones 110.
  • the system 3715 may access a plurality of medical databases and/or medical expert systems storing repositories of images and symptoms associated with particular conditions.
  • the system 3715 can utilize those remote systems to determine a likely diagnosis for the condition observed by the Headphones 110.
  • the system 3715 can operate in an autonomous mode to provide feedback to the local user such as a likely diagnosis associated with the symptoms presented by the video and/or audio.
  • the system 3715 may receive audio and/or video from the Headphones 110 depicting the condition of the subject 3750 whereupon the system 3715 accesses the remote systems to determine the most likely diagnosis for the symptoms presented.
  • the audio feedback can be provided to the Headphones 110 so that the local user can determine the best course of action based on the feedback provided by the system 3715. For example, if the feedback from the system 3715 is a particular condition, the system 3715 may present several options to the local user on how to proceed, such as routing a call to a doctor having a specialization in the area most closely associated with the probable diagnosis, taking further steps to investigate the condition, calling local emergency services, or requesting further information regarding the subject 3750. One such option-selection step is sketched below.
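  • one hedged way to picture the option-selection step; the condition names, confidence cutoff, and specialty table are illustrative assumptions only:

      # Hypothetical sketch: turn a probable diagnosis into next-step options.
      SPECIALIST_BY_CONDITION = {"chicken pox": "dermatology"}
      EMERGENCY_CONDITIONS = {"stroke", "cardiac arrest"}  # assumed set

      def options_for(diagnosis, confidence):
          options = []
          if diagnosis in EMERGENCY_CONDITIONS:
              options.append("call local emergency services")
          if confidence < 0.5:
              options.append("take further steps to investigate the condition")
              options.append("request further information regarding the subject")
          else:
              specialty = SPECIALIST_BY_CONDITION.get(diagnosis, "general practice")
              options.append(f"route a call to a doctor specializing in {specialty}")
          return options

      print(options_for("chicken pox", confidence=0.8))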
  • the system 3715 can include a component which provides translation of audio to/from the Headphones 110 such that the system 3715 can support a local user regardless of the native language spoken by the local user. Accordingly, when the local user speaks to the system 3715, the system recognizes the native language of the local user and translates audio information sent to the Headphones 110 into the native language of the local user.
  • the video 3710 can be used to recognize particular prescription medication 3755 that may be associated with the subject 3750.
  • a video image (or a static image) can be provided to the system 3715, whereupon the remote systems can be accessed to determine possible side effects of the prescription medication 3755 which may be associated with the condition of the subject 3750.
  • the system 3715 can determine whether a potential interaction has occurred between the prescription medications 3755 (based on, for example, the live video). The determination can be provided to the local user by the audio 3725.
  • the system 3715 may provide the local user with additional instructions to gather information on the prescription medications 3755 or to ask the subject 3750 for additional information regarding the usage of the prescription medications 3755.
  • the remote user 3735 may include a plurality of remote users 3735, among which are specialists having a particular background associated with particular conditions which may be exhibited by the subject 3750. Accordingly, when a particular remote user 3735 determines that the condition of the subject 3750 may be associated with a particular condition, the remote user 3735 may refer the treatment of the subject 3750 to one of the other remote users 3735 having a specialization in the area most likely associated with the condition of the subject 3750. Still further, the local user may ask for a second opinion from another of the remote users 3735.
  • the Headphones 110 may be utilized by visually impaired users to provide assistance in self-examination/diagnosis in combination with the system 3715 providing artificial intelligence services.
  • a visually impaired user may wear the Headphones 110 as the local user.
  • the audio signals 3725 can be provided by the system 3715 to prompt the local user (i.e. the visually impaired local user) to pan the video 3710 in the direction of the affected area that the system 3715 wishes to sample.
  • the audio signals 3725 can therefore be tightly coupled to provide feedback to the local user so that the video 3710 adequately samples the affected area.
  • Headphones 110 can include local sensors that are configured to determine the status of the local user wearing the Headphones 110 (such as heart rate, SP02, etc.).
  • the Headphones 110 can produce the audio 3725 either locally or under the control of the remote system 3715 to provide a customized hearing test for the local user, under the supervision of the remote user 3735 or the system 3715 in an autonomous mode.
  • the local user can provide audio feedback to the system 3715 or the remote user 3735 to determine the results of the hearing test.
  • the doctor acting as the local user can record a surgical procedure using the video 3710 and/or the audio 3705 which is then stored in the remote system 3740.
  • video, image data, and/or audio data can be regularly sampled and stored on the remote system 3740 for comparison to one another over a longer period of time.
  • the local user may periodically do a self-examination to record the same areas of the body, which are then stored on the remote system 3740 for later access. After a particular period of time when enough data has been sampled, the system 3715 may provide a diagnosis based on progressive changes exhibited by the stored data.
  • the system 3740 can be accessed by remote operators to transcribe audio data recorded by doctors acting as the local user.
  • the doctor may dictate the impressions derived from the examination which are stored on the system 3740 and later transcribed by the remote operators.
  • Figure 38 is a schematic representation of a plurality of headphones 110 operatively coupled to a symptom aggregation system 3805 in some embodiments according to the invention.
  • the system 3805 can receive and send information to each of a plurality of headphones 110 which may be distributed among a wide geographic area.
  • the headphones 110 are operatively coupled to the system 3805 by the internet and each may reside in a different geographic region including different countries or portions of the world.
  • each of the headphones 110 can be configured to provide live video and/or audio streaming to the system 3805.
  • the system 3805 may enable live streaming of the headphones remotely. In other words, the system 3805 may determine to activate live streaming of selective ones of the headphones based on data received from the headphones.
  • the system 3805 can monitor the video/audio streams from the headphones 110 for the occurrence of symptoms in the general population over a wide geographic area.
  • remote users may wear the headphones 110 in day to day activity, where the system 3805 receives live video and/or audio from the headphones and analyzes that video and/or audio to detect symptoms which may be associated with a particular condition, and especially conditions which are communicable.
  • the system 3800 may be utilized to monitor the occurrence and spread of contagious diseases over a wide geographical region.
  • the live streaming from the headphones can be used for early detection of the outbreak of certain conditions which may be geographically limited.
  • the system 3805 may analyze the respective live streams from headphones 110A and 110B to detect whether members of the population in that region are exhibiting symptoms of a particular condition. Once a condition is recognized, the system 3805 can notify operators or supervisory system 3735 to take remedial action. For example, in some embodiments according to the invention, the supervisory system 3735 may activate the headphones 110A and 110B to provide more constant live streaming from the headphones in that region (i.e., and not limited to simply headphones 110A and 110B). Still further, the supervisory system 3735 may control the system 3805 to enable the live streaming from the headphones in that region more frequently.
  • warning indicators can be provided to the headphones 110 in their respective geographic region. For example, once the system 3805 determines that an outbreak may have occurred in the region in which headphones 110A and 110B are being used, the system 3805 can dispatch audio warnings to the headphones 110A and 110B, as well as any other headphones in the geographic region, to take particular steps to avoid exposure or to receive treatment.
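  • a rough sketch of the regional dispatch, assuming the system 3805 keeps a registry of headphone locations; the registry layout and the distance test are simplifications invented here:

      # Hypothetical sketch: warn every headphone near a suspected outbreak.
      import math

      def within_km(a, b, radius_km):
          # Equirectangular approximation of distance between (lat, lon) pairs.
          dx = (a[1] - b[1]) * math.cos(math.radians((a[0] + b[0]) / 2)) * 111.32
          dy = (a[0] - b[0]) * 110.57
          return math.hypot(dx, dy) <= radius_km

      def dispatch_warning(registry, outbreak_center, radius_km, send_audio):
          for headphone_id, location in registry.items():
              if within_km(location, outbreak_center, radius_km):
                  send_audio(headphone_id,
                             "Possible outbreak in your area; take precautions.")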
  • the headphones 110A can include sensors such as heart rate sensors, SP02 sensors, temperature sensors, etc. that monitor physical parameters of the wearer which can then be forwarded to the system 3805 and supervisory system 3735 for further processing in response to a suspected outbreak.
  • the system 3805 can be coupled to the systems 3715, 3730, and 3740 shown in Figure 37 so that the video and/or audio collected from the headphones 110 in Figure 38 can be archived and subject to processing by the artificial intelligence system 3715.
  • the functionality of the artificial intelligence system 3715 and the system 3805 can be combined into a single system.
  • the system 3805 can have access to the remote systems described above in reference to Figure 37 to provide access to medical databases for assistance in diagnosing a particular condition captured by the live streaming of the headphones 110. It will be further understood that the supervisory system 3735 can be monitored by doctors or other medical professionals who can intervene to control the system 3805 in issuing particular instructions or controls to the headphones 110.
  • the headphones 110 using the local or remote application can support augmented shopping where the user wears the headphones 110 into a commercial outlet while shopping for a particular product or while simply browsing all products.
  • the video cameras located on the headphones 110 can be used to stream live video to a remote server which can be used to identify particular products as seen by the user.
  • the remote applications can identify the products and provide information related to competing products, including price, performance, physical dimensions, as well as views of those products, so that the user may make a more informed decision regarding which product may suit their needs better.
  • the commercial outlet or retailer may utilize the video stream to determine which products the users are more interested in.
  • the headphones 110 along with a native or remote application may support services for the visually or hearing impaired.
  • the headphones 110 may utilize the cameras located thereon as a "set of eyes" for the user, and the video therefrom can be streamed to a remote server for image processing, wherein particular objects can be identified and the user warned of their presence.
  • the camera may stream video to locate a crosswalk on a street, and further may be utilized to determine if traffic is stopped before prompting the wearer to proceed through the crosswalk.
  • the headphones 110 can be utilized to provide haptic feedback to the user using some of the same techniques described above in reference to the visually impaired environment. For example, the headphones 110 may let the user stream audio using the microphones thereon to identify the presence of objects which would otherwise not be readily apparent to the user. Still further, the headphones 110 may provide haptic feedback to the user as to the presence of those objects and, moreover, may provide haptic feedback in a directional format so that the user is made aware of not only the presence but also the location of the object relative to the user.
  • in some embodiments according to the inventive concept, the headphones 110 along with the native or remote application can provide a wireless payment system. For example, the headphones 110 may include an NFC and Bluetooth interface which may be utilized to pay wirelessly in response to, for example, voice commands or touch commands on the capacitive touch surface.
  • the headphones 110 along with the native application and/or remote application can be utilized to provide a motion controlled gaming environment where for example the headphone cameras are used to track devices located in the gaming environment, such as drum sticks or other motion controllers manipulated by the wearer of the headphones.
  • the video cameras can provide additional accuracy in determining the location, movement, orientation of those objects in the gaming environment which may provide a more realistic experience.
  • the video can also be used for motion tracking of the user which can be used to increase the accuracy of other devices used during gaming, such as the motion controller.
  • the video can also be used to provide additional information regarding the actions taken by the player where, for example, the player uses drum sticks with accelerometers to accurately track movement of the drum sticks, whereas the cameras in the headphones 110 can be used to track the movement of the player's head.
  • data can be transmitted between the drum sticks and the headphones 110.
  • the streamed video can also be rendered on a display of the gaming action for a more realistic experience.
  • the video of the gaming action can also be streamed to a video server, such as Twitch.
  • feedback from the object manipulated by the user can be provided to the headphones 110 which may in turn provide an audio feedback signal to the user.
  • the video cameras may be utilized to determine further information regarding movement of the objects manipulated by the user such as the location of the object relative to other items in the environment.
  • the headphones 110 along with the native or remote application can be utilized to provide voice activated searching, whereupon the user may speak a particular command such as "Okay Muzik search", whereupon the application converts the audio to a text based search which is then submitted to the remote server.
  • the audio information is transmitted from the headphones to a mobile device or server which translates the audio information to the text which is then forwarded for searching.
  • the headphones 110 operating with the native or remote applications can be utilized to operate connected devices such as lights, door locks, etc.
  • the user may speak a particular command (such as "Okay Muzik") followed by a voice command configured to carry out a particular function associated with a particular device.
  • the audio information can be translated by the native application to text data or alternatively the audio information can be transmitted to the remote application or server for translation to text.
  • the translated text is then forwarded to servers which are configured to determine the nature of the command that is intended (such as turn on my lights). That particular command string or instruction is returned to the location associated with the headphones 110 or user, whereupon the command is directed to the particular device identified by the remote server.
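  • the round trip might be sketched as follows, with transcribe() and resolve_intent() standing in for the native or remote translation services (all names here are assumptions, not the specification's API):

      # Hypothetical sketch: wake phrase, speech-to-text, intent, dispatch.
      def handle_utterance(audio, transcribe, resolve_intent, devices):
          text = transcribe(audio)            # native app or remote translation
          if not text.lower().startswith("okay muzik"):
              return                          # ignore speech without the wake phrase
          command_text = text[len("okay muzik"):].strip()
          intent = resolve_intent(command_text)   # e.g. {"device": "lights",
                                                  #       "command": "turn_on"}
          device = devices.get(intent.get("device"))
          if device is not None:
              device.execute(intent["command"])   # e.g. turn on the user's lights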
  • the headphones 110 can provide an application that implements what is sometimes referred to as a "chatbot".
  • the chatbot may be implemented in support of a calling or messaging environment wherein the user interacts with a remote calling or messaging system using the local chatbot which is intended to simulate conversation with an intelligent entity and can operate in real time in response to queries by the user.
  • the chatbot can be supported by an automated on-line assistant such as one utilized both for customer engagement, customer support, call direction, or the like.
  • the headphones 110 and the native application as well as the remote application can be provided in support of a visually impaired user by using, for example, the video cameras to identify products while shopping and provide audio feedback to the wearer such as cost, product characteristics, costs relative to other products, warranty information, location of other related items, etc.
  • the video cameras can be utilized to identify coupons for products that are examined by the user.
  • the camera can be used to read braille or to interpret sign language used by the user. For example, the user may sign in view of the camera, where a native or remote application translates the signs to text, email, or audio.
  • the headphones 110 including the native and/or remote application can support a customer service environment wherein the user may request information about a particular product that has been purchased or is being considered for purchase.
  • the user may contact the customer service environment as an initial step in exploring the applicability of a particular product, which may then be followed up by direct contact from a remote actor using audio communication to the headphones 110.
  • the video cameras can be utilized in a spatial relation environment (such as interior design, construction, etc.) where the user is visualizing items or relationships which may be virtual.
  • a native application or remote application may respond by overlaying virtualized objects into the scene that is streamed from the headphones 110.
  • the headphones 110 can be utilized to calendar a meeting with a particular person or group of persons.
  • the user may indicate that a meeting is to be calendared for a group of people at a particular time and day whereupon the application resident on the headphones 110 or remote from the headphones 110 may respond by forwarding an invitation to each of the members of the group which can be followed up by a reminder forwarded to each of those members closer to the actual scheduled time/date.
  • the headphones 110 and native and remote applications can be utilized to provide enhanced sensory awareness (such as enhanced vision or hearing) using the video cameras and microphones included with the Headphones 110.
  • the video streamed by the headphones 110 can be processed to identify particular objects where the movement of objects therein may be of particular interest to the user.
  • the user may be somehow impaired and therefore the video stream is processed to identify moving objects nearby the user which may otherwise raise safety concerns.
  • the user may be visually impaired and therefore enhanced hearing is provided by the microphones to similarly warn the user about objects in the environment.
  • both the cameras and the microphones can be used to identify objects in the environment which may be of particular interest to the user. It will be further understood that the processing used to recognize the objects can be done natively in the headphones 110 or on a remote server whereupon the processed information is returned to the headphones 110 upon completion.
  • video from the headphones 110, along with a native application or remote application, can be streamed to groups associated with a particular end point server, such as Facebook, so that a group of viewers may observe the streamed video.
  • the end point server may not otherwise incorporate a filter upon content which may be provided.
  • native voice over IP calling applications can be preloaded on the headphones 110, which may enable the user to make low cost or free calls, as well as send low cost or free messages to individuals or groups in response to voice commands.
  • a native application can provide foreign language translations such that the foreign language can be translated in real time to the user's native language.
  • the user may wear the headphones 110 around the neck wherein the earcups are rotated to point upward in the direction of the foreign language speakers.
  • the foreign language audio is received by the microphones on the headphones 110 which is then converted to the native language of the user.
  • the headphones 110 can be connected to a cloud backend that is preloaded with cognitive services used for speech to text, text to speech, image recognition, facial recognition, language translation, searching, and bots, as well as other types of artificial intelligence services.
  • a user may operate as a "DJ" that generates a playlist to which other users may subscribe or listen in on.
  • the DJ user could generate playlists and issue an invitation to other users or followers so that those users may hear the music included in the playlist.
  • data may be transmitted to the users' headphones so that the audio content can be indexed directly to where the DJ user is listening, so that both the DJ user and the subscribing users can listen to the music at essentially the same point.
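  • one way to picture the indexing, assuming each update carries the DJ's track, position, and a send timestamp; the player interface below is invented for illustration:

      # Hypothetical sketch: keep subscribers indexed to the DJ's position.
      import time

      def make_sync_update(track_id, position_s):
          return {"track": track_id, "position": position_s, "sent_at": time.time()}

      def apply_sync_update(player, update, max_drift_s=0.5):
          expected = update["position"] + (time.time() - update["sent_at"])
          if update["track"] != player.current_track:
              player.play(update["track"], start_at=expected)
          elif abs(player.position() - expected) > max_drift_s:
              player.seek(expected)  # re-index to where the DJ is listening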
  • the earcups of the headphones 110 are removable and include unique identifiers so that the type of cushion can be determined by the headphones 110. Accordingly, when on-ear cushions are placed on the headphones 110, the music equalization can be set to a predetermined configuration, whereas when over-ear cushions are coupled to the headphones 110, the equalization can be changed to a more optimized setting.
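  • a minimal sketch of the cushion-dependent equalization; the identifier values and band gains are made up for illustration:

      # Hypothetical sketch: pick an EQ preset from the cushion's identifier.
      EQ_PRESETS = {
          "on_ear":   [0.0, 1.5, 2.0, 1.0, -1.0],  # per-band gain in dB
          "over_ear": [1.0, 0.5, 0.0, 0.5, 1.5],
      }
      CUSHION_TYPE_BY_ID = {0x01: "on_ear", 0x02: "over_ear"}

      def apply_equalization(cushion_id, set_eq_bands):
          cushion_type = CUSHION_TYPE_BY_ID.get(cushion_id)
          if cushion_type is not None:
              set_eq_bands(EQ_PRESETS[cushion_type])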
  • the headphones 110 may be used in analog mode such that an audio cable can be used to connect the headphones 110 to the Mobile Device 130 while also streaming live video from the headphones 110. Accordingly, the video and the analog audio can be provided separately from one another but essentially concurrently.
  • the headphones 110 can automatically download features from a remote server upon request by the user or upon request for a particular function that is not supported in the present configuration. Accordingly, when a user requests a particular function which is not supported, the headphones 110 may prompt the user for authorization to download a version of an application which supports the requested feature.
  • the headphones 110 can monitor and learn the behavior of the user, which then can be utilized by an artificial intelligence to provide suggestions relevant to the user based on interests, to call transportation services by reference to a location system associated with the headphones 110, to monitor biometric readings of the user, or to monitor activities of the user which can be associated with levels of stress, such as frequency of phone calls, frequency of calendar appointments, non-movement of the user, etc.
  • the headphones 110 can be incorporated as part of a system where users subscribe to the paid or ad supported model where the headphones 110 can be provided, along with all software, for a monthly payment.
  • the user may provide a down payment which may entitle the user to a monthly fee for all services and hardware.
  • the user may opt for an ad supported model wherein the video camera on the headphones 110 is used to capture local information which can, in turn, be used to provide advertising which is tailored to the user based on data collected by the headphones 110.
  • the user operating under the ad supported model would review products every day in a commercial outlet or hear live ads from an advertiser to offset the cost of the subscription.
  • the user may look at a particular product using the headphones 110, whereupon the object is scanned and uploaded to the cloud for processing by cognitive software. The remote server then uses, for example, audio feedback to identify the product to the user, whereupon the user acknowledges whether the provided feedback correctly identifies the product and a live advertisement is played to the user.
  • FIG. 36 is a schematic representation of a series of screens presented on the mobile device 130 running an application configured to connect the headphones 110 to the application for syncing in some embodiments according to the invention.
  • a user may choose to sync their mobile device running the application shown to the headphones 110.
  • the headphones 110 can be synced to any device that is associated with a screen such as a TV, tablet, AR/VR system, smart watch, etc.
  • the user can select an app from among the services that they wish to link to the headphones 110.
  • the users may enter passwords or choose other settings, whereupon the user can interact with the selected app using voice commands. For example, the user may speak "Facebook live, start" to start the Facebook Live application, or speak "Spotify play Drake" to begin playing music from Spotify to the headphones 110, or "messenger, Fred, I will be home in 30 minutes" to send a message to Fred using Messenger, or speak "Instagram, take picture" to take a picture using the Instagram application which is linked to the application.
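  • the "app, command" convention might be parsed as sketched below; the linked-service list is illustrative, and matching longer names first avoids prefix collisions:

      # Hypothetical sketch: split "<app>, <command>" voice strings.
      LINKED_APPS = {"facebook live", "spotify", "messenger", "instagram"}

      def parse_voice_command(text):
          lowered = text.lower()
          for app in sorted(LINKED_APPS, key=len, reverse=True):
              if lowered.startswith(app):
                  return app, text[len(app):].lstrip(" ,")
          return None, text

      print(parse_voice_command("Spotify play Drake"))
      # ('spotify', 'play Drake')
      print(parse_voice_command("messenger, Fred, I will be home in 30 minutes"))
      # ('messenger', 'Fred, I will be home in 30 minutes')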
  • applications running on the headphones 110 in the background can be enabled in response to voice commands and can perform features and actions described herein in reference to FIGs. 1-36, as well as monitor behavior of the user based on the sensor inputs coupled to the headphones.
  • a particular application running in the background may be configured to periodically ask questions of the user, whereupon the responses can be forwarded to a remote server (or processed by a native application) to monitor the user's behavior and habits to determine the likelihood that particular products may be of interest to the user.
  • the application may ask questions associated with polling or make recommendations regarding health or wellness based on biometric sensor input or monitored sensors associated with the headphones 110.
  • the headphones 110 may communicate to a remote server that the user participates in meditation at a particular time, such as before work, and make further note that the user's performance at work should be monitored to determine if the meditation provides any objective benefits, such as more alert behavior, more collaboration, etc., compared to users who do not meditate or who practice some other behaviors such as listening to music.
  • the learned behavior accumulated by the headphones 110 can identify certain idiosyncrasies associated with the user and suggest particular applications for the user's benefit or, alternatively, new applications.
  • the systems, methods, and devices described herein can take the form factor of a Head-worn Computer complete with an operating system as described herein and as depicted, for example, in Figure 10.
  • all of the functionality of a conventional mobile device, such as a smart phone, and its accompanying applications can be provided by the Head-worn Computer system.
  • the Head-worn Computer can operate as part of a subscription based service where the user pays the monthly fee in exchange for the functionality described herein such as calling, live video streaming, music streaming, telemedicine capabilities, access to education classes, accessibility support for the hearing and visually impaired, motion controlled gaming, etc.
  • the Head-worn Computer can provide the platform for a mobile communications system that provides unlimited calling and messaging along with other enhanced services such as group calling for teens, group messaging for teens, group listening to streaming services, etc. It will be further understood that support for the mobile plan can be provided through an SDK configured to support specific applications such as Facebook Messenger and WhatsApp.
  • live streaming of video can be configured for ingestion by social media services such as Facebook, Twitter, Snapchat, YouTube, Instagram, and Twitch. Other services may also be used.
  • live streaming of audio can be provided from the Headphones 110 or the Head-worn Computer system in conjunction with services such as Spotify, YouTube Music, Tidal, iHeartRadio, Pandora, SoundCloud, Apple Music, and Shazam. Other audio services may also be used.
  • the calling applications described herein and provided by the Headphones 110 or the Head-worn Computer can be configured to operate with applications such as Skype, Slack, Facebook Workplace, Twilio, WhatsApp, G Talk, Twitch, Line, and WeChat. Other calling applications may also be supported.
  • the messaging applications described herein and provided by the Headphones 110 or the Head-worn Computer can be configured to operate with applications such as Facebook Messenger, WhatsApp, Skype, WeChat, Line, and Google. Other messaging applications may also be supported.
  • the Headphones 110 or the Head-worn Computer can be configured to support health and wellness applications such as the brand Jordan or Puma, motion tracking, sleep tracking, meditation, stress management, telemedicine, and WebMD (utilized for identifying potential illnesses). Other health and wellness applications may also be supported.
  • the Headphones 110 or the Head-worn Computer can be configured to support education applications such that class lessons can be recorded and made available online; live streaming or offline streaming can be provided on demand for remote locations; and language translation can also be provided, along with camera identification of historical or art objects, general image recognition, voice control, reading of braille, and text to speech. Other education type applications may also be supported.
  • the Headphones 110 or the Head-worn Computer can be configured to support accessibility type applications such as sign language control, wherein the video camera can be used to identify particular signs as part of sign language (which can then provide the basis for control of the headphones or the Head-worn Computer); functionality to replace what is commonly referred to as a seeing eye dog to assist the visually impaired in safely traveling through the environment; custom hearing tests with tools to diagnose hearing issues; predictive noise cancelation; access to emergency services; and detection of abduction, which can automatically activate the camera and GPS associated with the Headphones 110 and the Head-worn Computer system.
  • the Headphones 110 or the Head-worn Computer can be configured to provide business to business type applications which can, for example, connect teams using live video; group calls (including recording calls, taking notes, linking to calendars, contacts, and sharing call notes or voice recordings); group messaging; customer service integration (where it may access customer service for a particular product that is seen by the video cameras, or for the Headphones or the Head-worn Computer itself); construction; interior design; mapping applications; access to news; personal calendars; and a personal assistant where, for example, a best price can be obtained by viewing the product using the video cameras.
  • FIG. 39 is a block diagram of a wearable computer system 3900 including at least one integrated projector 3901 in some embodiments according to the invention.
  • the wearable computer system 3900 may be, in some embodiments according to the invention, audio/video enabled headphones capable of live streaming video to a remote server with at least one integrated projector 3901 for providing an immersive augmented reality experience for the wearer of the computer system 3900.
  • the computer system 3900 includes at least one projector 3901 operatively coupled to the microprocessor which can be used to provide projected video output onto an arbitrary surface.
  • the computer system 3900 can be utilized to provide an immersive augmented reality experience for the user as described, for example, in reference to Figures 20 - 23.
  • the computer system 3900 can be equipped with sensors 5 that can be utilized to provide positional data for the computer system 3900 as it moves through an environment.
  • the movement of the wearable computer 3900 may be tracked using the sensors 5 so that the user may be provided with a more realistic experience by determining, for example, head movement or movement of the user's body within the environment, which can be used to alter the perspective of the video shown via the projector 3901.
  • the feature 80 can be used as a reference by the computer system 3900 to determine positional data within the environment as described above in reference to, for example, Figure 22. Still further, the computer system 3900 may be operatively coupled to a GPS system as shown in Figure 39 to provide geographic positional information to the computer system 3900 as it moves beyond the local environment which may be out of range of the feature 80. It will be further understood that although one feature 80 is shown in Figure 39, additional features may also be used for reference by the wearable computer system 3900. It will be further understood that the sensors 5 can also include emitting devices such as sonar or lidar, radar, or other sensors that can be utilized by the computer system 3900 to determine (at least partially or incrementally) the positional data for the wearable computer system 3900.
  • the computer system 3900 can receive augmentation data from a plurality of sources for combination with other information provided by, or to, the computer system 3900 and projected via the projector 3901 for viewing by the wearer of the computer system 3900.
  • the augmentation data may be provided by a gaming application and combined with the positional data determined by the computer system 3900 which can be rendered by the computer system 3900 and projected by the projector 3901 for viewing by the user during game-play.
  • the rendering of the combined data can be modified so that the projector 3901 provides a more realistic view of the perspective provided to the user.
  • the computer system 3900 can also receive data from a mobile device (or an application executing on a mobile device) for display by the projector 3901.
  • the mobile device may provide a representation of a video output which would normally be provided on a display of the mobile device.
  • the computer system 3900 can relay the display information received from the mobile device to the projector 3901 for display on an arbitrary surface.
  • the wearable computer system 3900 can be used to generate a large format virtual display from a relatively small format display integrated with a mobile device.
  • the limitations associated with a relatively small screen provided by the mobile device can be improved by projecting the display of the mobile device to a larger format so that the user of the computer system 3900 may view the display more clearly without the need for a large format electronic device (such as a monitor).
  • the computer system 3900 may be used to provide a convenient large format display regardless of the format provided by the mobile device.
  • the mobile device can be any device that provides a video output for reproduction via the projector 3901.
  • multiple mobile devices may be in communication with the computer system 3900 which may be then combined onto a single composite display that is provided by the projector 3901 onto the arbitrary surface.
  • the arbitrary surface can be any surface that is suitable for a display of an image thereon and can be any size that is desired for the display.
  • the surface can be the back of an airplane seat or a piece of paper or the user's hand. It will be further understood that the surface can have an arbitrary orientation relative to the user.
  • the projector 3901 may be adjustable to compensate for the orientation of the surface relative to the user so that the image projected onto the surface may be substantially rectangular.
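  • as a hedged sketch of that compensation, the correction can be treated as a planar homography fitted from the four observed corners of a projected test rectangle; the corner coordinates below are invented, and in practice they would come from the on-board camera:

      # Hypothetical sketch: fit a homography that pre-warps the frame so the
      # projected image lands substantially rectangular on a tilted surface.
      import numpy as np

      def homography(src, dst):
          """Direct linear transform solving dst ~ H @ src for a 3x3 H."""
          rows = []
          for (x, y), (u, v) in zip(src, dst):
              rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
              rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
          _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
          H = vt[-1].reshape(3, 3)
          return H / H[2, 2]

      ideal = [(0, 0), (1280, 0), (1280, 720), (0, 720)]       # desired corners
      observed = [(40, 12), (1250, 0), (1280, 700), (0, 720)]  # skewed corners
      H = homography(observed, ideal)  # warp each frame by H before projecting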
  • the mobile device can be an electronic watch or other accessory that includes a small format display. In some embodiments, the mobile device can be an electronic device that does not include a display.
  • the computer system 3900 can include at least one camera (which can provide still images and/or video images) which may be combined with data that is to be projected onto the surface.
  • the camera may be used to sample the surrounding environment, and a projected image can be generated based on the captured image augmented with an overlay of the augmentation data shown in Figure 39.
  • the camera may be independently adjustable to sample the appropriate scenes despite the orientation of the computer system 3900 relative to the surface on which the image is to be projected.
  • Figure 39 shows that various accessories can be wirelessly coupled to the computer system 3900.
  • the accessory can be the electronic devices associated with a gaming system such as a wand, drumsticks, or generic device which can be used to participate in an electronic game.
  • the accessory can be a set of drumsticks which are configured to provide the functionality described in US Patent Application Number 15/090,175 (the '175 application), entitled Interactive Instruments and Other Striking Objects, filed April 4, 2016, the entire disclosure of which is incorporated herein by reference.
  • an image of a virtual drum set may be generated and projected onto a surface via the projector 3901. The user may then utilize the drum sticks to play the virtual drum set as described in the 'XXX application.
  • Other accessories may also be used.
  • an API can be provided to access the computer system 3900.
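The application does not define the shape of such an API. Purely as a hypothetical illustration, an accessory-facing interface might resemble the stub below; every class, method, and parameter name here is invented.

```python
# Hypothetical accessory-facing API for the computer system 3900; all names
# are invented for illustration and do not appear in the filing.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class WearableSystemAPI:
    address: str                                  # e.g., a Bluetooth or IP address
    _motion_handlers: List[Callable] = field(default_factory=list)

    def project(self, frame) -> None:
        """Send a rendered frame to the integrated projector."""
        raise NotImplementedError("transport layer omitted in this sketch")

    def on_motion(self, handler: Callable) -> None:
        """Register a callback for accessory motion events (e.g., drumstick strikes)."""
        self._motion_handlers.append(handler)
```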
  • Figure 40 is a schematic representation of an earcup of the computer system 3900, implemented as a set of headphones equipped with cameras and the projector 3901, in some embodiments according to the invention.
  • the projector lens 409 can be located on a movable bezel 809 which rotates so that the projector 3901 can be oriented up or down relative to the user's placement of the wearable computer system on the head. Accordingly, the surface on which the image is to be projected can be more conveniently located by rotating the projector lens 409 to compensate for the orientation of the earcup relative to the surface.
  • Figure 41 is a schematic diagram illustrating various sources of augmentation data which can be overlaid or combined with images to be projected.
  • the augmentation data can be data provided by a gaming system such as scenes rendered as part of a first person shooter application which may include remote participants in the game that are competing with the user of the computer system 3900.
  • the augmentation data can be provided by a remote server which can provide various types of data to be overlaid with images that can be generated by the camera included with the wearable computer system 3900.
  • the remote server may provide anatomical data that can be projected onto a body of a patient so that the user may view the relative positions of internal organs when viewing the patient.
  • the camera may sample an image of the patient, whereupon the remote server provides the anatomical data for augmentation; the processor in the wearable computer system 3900 registers the image data relative to the augmentation data so that the internal anatomical images are overlaid correctly onto the image of the patient and the organs appear in the proper positions.
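The filing does not name a registration algorithm; feature-based alignment is one common possibility. The sketch below, offered only under that assumption, aligns an overlay authored against a reference image to the live camera frame using ORB features and a RANSAC homography:

```python
# Registration sketch (assumed approach): align `overlay`, authored against
# `reference`, to the live camera `frame`, then blend the warped overlay so
# the anatomy lands in the proper position for projection.
import cv2
import numpy as np

def register_and_overlay(frame, overlay, reference):
    orb = cv2.ORB_create(1000)
    kp_r, des_r = orb.detectAndCompute(reference, None)
    kp_f, des_f = orb.detectAndCompute(frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_r, des_f), key=lambda m: m.distance)[:100]
    src = np.float32([kp_r[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = frame.shape[:2]
    warped = cv2.warpPerspective(overlay, H, (w, h))
    return cv2.addWeighted(frame, 0.6, warped, 0.4, 0)
```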
  • the user may stand in front of a mirror and sample an image of themselves using the camera.
  • the remote server may provide augmentation data that represents clothing, which can be overlaid and rendered with the image sampled from the mirror so that the projected image combines the clothing data with the sampled image, allowing the user to view themselves as if the clothes were being worn.
  • registration of the user's image can be provided by the wearable computer so that the overlaid clothing can be properly rendered onto the image of the user.
  • the color, size, style, tailoring, and the like can be changed by the user whereupon the augmentation data representing the clothing may be modified to provide the changes selected.
  • the clothing can be associated with an electronic catalogue that the user can refer to when selecting clothing for viewing.
  • the clothing can be associated with a hardcopy catalogue that the user can refer to when selecting clothing for viewing, wherein the camera can be used to sample an image of the product or a product code, which can be used to request the corresponding augmentation data from the remote server.
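As a hypothetical sketch of that flow, a printed product code could be decoded from a camera frame and used to request the matching overlay; the server URL and response format below are invented:

```python
# Hypothetical catalogue flow: decode a printed product code from a camera
# frame and request the matching clothing overlay from a remote server.
# The endpoint URL and the response format are assumptions.
import cv2
import requests

def fetch_overlay_for_code(frame, server_url="https://example.com/api/overlays"):
    data, _, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    if not data:
        return None  # no product code visible in this frame
    resp = requests.get(server_url, params={"product": data}, timeout=5)
    resp.raise_for_status()
    return resp.content  # encoded overlay image, ready for registration/rendering
```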
  • the augmentation data can include construction information, such that an inspection of a building could be performed by sampling a video of the building and overlaying the images with the construction blueprints so that an inspector can view internal components without opening the walls. Again, proper registration would occur between the augmentation data that comprises the blueprints and the sampled image of the interior of the building so that the components included in the blueprints are shown in the proper positions relative to the sampled image.
  • Figure 42A is a schematic representation of the computer system 3900 as a pair of headphones generating a projection 4205 onto an arbitrary surface 4201 at an arbitrary orientation relative to the system 3900.
  • the projector lens is movable relative to the earcup on the headphones so that the projection can be viewed with an appropriate aspect ratio despite the arbitrary orientation of the surface relative to the headphones.
  • Figure 42B is an alternative view of the headphones shown in Figure 42A including multiple projectors: one on one of the earcups and another on the center of the headband. Still further, Figure 42B shows that the camera can be located on the opposite earcup relative to the first projector. As further shown in Figure 42B, projection field 1 can be oriented onto the surface for viewing along with the image provided by projection field 2 so that the two projection fields completely align with one another on the surface. Accordingly, the first and second projectors can be used to provide different components of the same image so that, for example, a three dimensional image may be generated by the system 3900. Still further, the camera on the opposite earcup can sample the image generated by the overlaid first and second projection fields for transmission to a remote server.
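Superimposing the two fields is again a plausible application of the keystone pre-warp sketched earlier: each projector would be given its own homography to the same target rectangle on the surface. As an assumed usage of that earlier sketch:

```python
# Assumed usage of keystone_correct() from the earlier sketch: pre-warp each
# projector's frame to the same target rectangle so the two fields align.
# left_eye_frame, right_eye_frame, and the per-projector corner estimates
# are placeholders supplied by the rendering and camera pipelines.
left_out = keystone_correct(left_eye_frame, corners_seen_for_projector_1)
right_out = keystone_correct(right_eye_frame, corners_seen_for_projector_2)
# Projecting left_out and right_out simultaneously lands both fields on the
# same rectangle, e.g., for a two-channel (three dimensional) presentation.
```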
  • embodiments described herein may be embodied as a method, data processing system, and/or computer program product. Furthermore, embodiments may take the form of a computer program product on a tangible computer readable storage medium having computer program code embodied in the medium that can be executed by a computer.
  • the computer readable media may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, and VB.NET; conventional procedural programming languages such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider), in a cloud computing environment, or offered as a service such as Software as a Service (SaaS).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which, when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Telephone Function (AREA)

Abstract

A head mounted system can include a video camera configured to provide image data. A wireless interface circuit can be configured to receive augmentation data from a remote server. A processing circuit can be coupled to the video camera, where the processing circuit can be configured to register the image data with the augmentation data and to combine the image data with the augmentation data to provide augmented image data. A projector circuit, coupled to the processing circuit, can be configured to project the augmented image data from the head mounted system onto a surface.
PCT/US2017/056986 2016-10-17 2017-10-17 Système informatique vestimentaire audio/vidéo à projecteur intégré WO2018075523A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780077088.9A CN110178159A (zh) 2016-10-17 2017-10-17 具有集成式投影仪的音频/视频可穿戴式计算机系统
EP17863227.9A EP3526775A4 (fr) 2016-10-17 2017-10-17 Système informatique vestimentaire audio/vidéo à projecteur intégré

Applications Claiming Priority (16)

Application Number Priority Date Filing Date Title
US201662409177P 2016-10-17 2016-10-17
US62/409,177 2016-10-17
US201662412447P 2016-10-25 2016-10-25
US62/412,447 2016-10-25
US201662415455P 2016-10-31 2016-10-31
US62/415,455 2016-10-31
US201662424134P 2016-11-18 2016-11-18
US62/424,134 2016-11-18
US201662429398P 2016-12-02 2016-12-02
US62/429,398 2016-12-02
US201662431288P 2016-12-07 2016-12-07
US62/431,288 2016-12-07
US201762462827P 2017-02-23 2017-02-23
US62/462,827 2017-02-23
US201762516392P 2017-06-07 2017-06-07
US62/516,392 2017-06-07

Publications (2)

Publication Number Publication Date
WO2018075523A1 WO2018075523A1 (fr) 2018-04-26
WO2018075523A9 true WO2018075523A9 (fr) 2019-06-13

Family

ID=62019390

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/056986 WO2018075523A1 (fr) 2016-10-17 2017-10-17 Système informatique vestimentaire audio/vidéo à projecteur intégré

Country Status (3)

Country Link
EP (1) EP3526775A4 (fr)
CN (1) CN110178159A (fr)
WO (1) WO2018075523A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11083344B2 (en) 2012-10-11 2021-08-10 Roman Tsibulevskiy Partition technologies
EP3803796A4 (fr) 2018-05-29 2021-06-23 Curiouser Products Inc. Appareil d'affichage vidéo réfléchissant pour formation et démonstration interactives, et procédés d'utilisation associés
US11465030B2 (en) 2020-04-30 2022-10-11 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11167172B1 (en) 2020-09-04 2021-11-09 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8965460B1 (en) * 2004-01-30 2015-02-24 Ip Holdings, Inc. Image and augmented reality based networks using mobile devices and intelligent electronic glasses
US9361729B2 (en) * 2010-06-17 2016-06-07 Microsoft Technology Licensing, Llc Techniques to present location information for social networks using augmented reality
US9977496B2 (en) * 2010-07-23 2018-05-22 Telepatheye Inc. Eye-wearable device user interface and augmented reality method
US9348141B2 (en) * 2010-10-27 2016-05-24 Microsoft Technology Licensing, Llc Low-latency fusing of virtual and real content
KR20160084502A (ko) * 2011-03-29 2016-07-13 퀄컴 인코포레이티드 로컬 멀티-사용자 협업을 위한 모듈식 모바일 접속된 피코 프로젝터들
JP2012253483A (ja) * 2011-06-01 2012-12-20 Sony Corp 画像処理装置、画像処理方法、およびプログラム
US8952869B1 (en) * 2012-01-06 2015-02-10 Google Inc. Determining correlated movements associated with movements caused by driving a vehicle
US20150170418A1 (en) * 2012-01-18 2015-06-18 Google Inc. Method to Provide Entry Into a Virtual Map Space Using a Mobile Device's Camera
US20130339859A1 (en) * 2012-06-15 2013-12-19 Muzik LLC Interactive networked headphones
KR102065687B1 (ko) * 2012-11-01 2020-02-11 아이캠, 엘엘씨 무선 손목 컴퓨팅과 3d 영상화, 매핑, 네트워킹 및 인터페이스를 위한 제어 장치 및 방법
GB201314984D0 (en) * 2013-08-21 2013-10-02 Sony Comp Entertainment Europe Head-mountable apparatus and systems
JP2016532510A (ja) * 2013-08-30 2016-10-20 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Dixon磁気共鳴イメージング
WO2015027286A1 (fr) * 2013-09-02 2015-03-05 University Of South Australia Système et procédé de simulation de formation médicale
US9747007B2 (en) * 2013-11-19 2017-08-29 Microsoft Technology Licensing, Llc Resizing technique for display content
WO2015179877A2 (fr) * 2014-05-19 2015-11-26 Osterhout Group, Inc. Interface utilisateur externe pour ordinateur porté sur la tête
US20180276896A1 (en) * 2014-11-07 2018-09-27 Pcms Holdings, Inc. System and method for augmented reality annotations
WO2016113693A1 (fr) * 2015-01-14 2016-07-21 Neptune Computer Inc. Appareils, procédés, et systèmes de plate-forme de traitement de données et de commande portable
US9779554B2 (en) * 2015-04-10 2017-10-03 Sony Interactive Entertainment Inc. Filtering and parental control methods for restricting visual activity on a head mounted display

Also Published As

Publication number Publication date
EP3526775A1 (fr) 2019-08-21
EP3526775A4 (fr) 2021-01-06
WO2018075523A1 (fr) 2018-04-26
CN110178159A (zh) 2019-08-27

Similar Documents

Publication Publication Date Title
US20220337693A1 (en) Audio/Video Wearable Computer System with Integrated Projector
CN109600678B (zh) 信息展示方法、装置及系统、服务器、终端、存储介质
US9038127B2 (en) Physical interaction with virtual objects for DRM
KR102184272B1 (ko) 글래스 타입 단말기 및 이의 제어방법
US8963956B2 (en) Location based skins for mixed reality displays
US20160188585A1 (en) Technologies for shared augmented reality presentations
US20100060713A1 (en) System and Method for Enhancing Noverbal Aspects of Communication
US20230290259A1 (en) Virtual, augmented and extended reality system
WO2018075523A9 (fr) Système informatique vestimentaire audio/vidéo à projecteur intégré
KR20190004088A (ko) 생체신호연동 가상현실 교육 시스템 및 방법
JPWO2014156388A1 (ja) 情報処理装置、通知状態制御方法及びプログラム
CN106804000A (zh) 直播回放方法及装置
WO2021043121A1 (fr) Procédé, appareil, système et dispositif de changement de visage d'image et support de stockage
CN104509089A (zh) 信息处理设备、信息处理方法以及程序
KR20170012979A (ko) 영상 공유 서비스를 위한 전자 장치 및 방법
US10778826B1 (en) System to facilitate communication
CN107409131A (zh) 用于无缝数据流送体验的技术
US20220392175A1 (en) Virtual, augmented and extended reality system
US12010155B2 (en) Operating system level management of group communication sessions
JP7385733B2 (ja) 仮想カメラと物理的カメラとの位置同期
CN111512639A (zh) 带互动显示屏的耳机
CN108140045A (zh) 在增强和替代通信系统中支持感知和对话处理量
KR102512855B1 (ko) 정보 처리 장치 및 정보 처리 방법
US20230351711A1 (en) Augmented Reality Platform Systems, Methods, and Apparatus
EP3965369A1 (fr) Appareil de traitement d'informations, procédé de traitement d'informations et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17863227; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2017863227; Country of ref document: EP; Effective date: 20190517)