CN114270296A - Hardware architecture for modular eyewear systems, devices, and methods - Google Patents

Hardware architecture for modular eyewear systems, devices, and methods

Info

Publication number
CN114270296A
Authority
CN
China
Prior art keywords
user
temple
sensors
processor
eyewear
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080049359.1A
Other languages
Chinese (zh)
Inventor
埃内斯托·卡洛斯·马丁内兹·维拉潘多
张惠权
罗国华
苏超明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Solos Technology Ltd
Original Assignee
Solos Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from U.S. patent application No. 16/926,722 (US11709376B2)
Application filed by Solos Technology Ltd
Publication of CN114270296A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 11/00 - Non-optical adjuncts; Attachment thereof
    • G02C 11/10 - Electronic devices other than hearing aids
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 2200/00 - Generic mechanical aspects applicable to one or more of the groups G02C1/00 - G02C5/00 and G02C9/00 - G02C13/00 and their subgroups
    • G02C 2200/02 - Magnetic means

Abstract

Systems, apparatuses, and methods are taught for providing reconfigurable components for eyewear devices. A reconfigurable component for use in an eyewear device includes an embedded electronic system. The embedded electronic system is configured for wireless communication and for processing of sensor signals, and is embedded in a component of the eyewear device. A plurality of sensors are embedded in one or more of the reconfigurable component and the eyewear device. The plurality of sensors are in electrical communication with the embedded electronic system. The embedded electronic system further includes a processor configured to receive the outputs from the plurality of sensors and to determine a system control parameter based on the output of at least one of the plurality of sensors.

Description

Hardware architecture for modular eyewear systems, devices, and methods
Cross Reference to Related Applications
This patent application is a continuation of U.S. patent application serial No. 16/711,340, entitled "modular eyewear system, apparatus, and method," filed December 11, 2019. This patent application claims the benefit of priority from U.S. provisional patent application No. 62/778,709, entitled "modular eyewear system with interchangeable frames and temples with embedded electronics for mobile audiovisual augmentation and augmented reality," filed in December 2018. U.S. provisional patent application No. 62/778,709 is incorporated herein by reference in its entirety. This patent application also claims priority to U.S. provisional patent application No. 62/873,889, entitled "wearable device apparatus, system, and method," filed July 13, 2019. U.S. provisional patent application No. 62/873,889 is hereby incorporated by reference in its entirety. U.S. patent application No. 16/711,340, entitled "modular eyewear system, apparatus and method," is hereby incorporated by reference in its entirety.
Technical Field
The present invention relates generally to eyewear devices, and more particularly to apparatus, methods, and systems for providing information to a user through a modular eyewear device.
Background
The pace of modern life is fast. Individuals are often pressed for time and frequently find themselves in situations where both hands are occupied and information is out of reach. This can cause problems. Currently available eyewear, such as prescription glasses worn for reading or prescription sunglasses, is expensive and not easily reconfigurable to meet the needs of different users. This too can cause problems. Personalized sound delivery is usually accomplished with closed in-ear devices called headsets or earbuds. Such devices block the ear canal and can prevent the user from hearing far-field sounds. This also can cause problems. These problems therefore call for technical solutions.
Drawings
The invention may best be understood by referring to the following description and accompanying drawings that are used to illustrate various embodiments of the invention. The present invention is illustrated by way of example in the embodiments and is not limited by the figures of the accompanying drawings, in which like references indicate similar elements.
Fig. 1 is a schematic diagram of a modular reconfigurable eyewear system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a reconfigurable assembly for an eyeglass apparatus according to an embodiment of the invention.
Fig. 3 is a schematic diagram of a plurality of reconfigurable components for an eyeglass apparatus according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of another reconfigurable modular eyewear system in accordance with embodiments of the present invention.
Fig. 5 is a perspective view and a top view of the modular eyewear system from Fig. 4 in accordance with an embodiment of the present invention.
Fig. 6A is a schematic diagram of a system architecture for a modular eyeglass apparatus according to an embodiment of the present invention.
Fig. 6B is a schematic diagram of a wireless network corresponding to the system architecture for the modular eyeglass device of Fig. 6A according to an embodiment of the present invention.
Fig. 7 is a schematic diagram of another system configuration for the modular eyeglass apparatus of Fig. 4, according to an embodiment of the present invention.
Fig. 8 is a block diagram of a temple insert module according to an embodiment of the present invention.
Fig. 9 is a schematic view of a modular eyeglass apparatus incorporating a rear neck module assembly in accordance with an embodiment of the present invention.
Fig. 10 is a perspective view of a rear neck module assembly configured with a wearable device according to an embodiment of the present invention.
Fig. 11 is a schematic view of interlocking coupling of a temple arm to a temple arm in accordance with an embodiment of the present invention.
Fig. 12 is a schematic view of a rear neck module assembly coupled to electronics contained in a temple, according to an embodiment of the present invention.
Fig. 13 is a schematic view of a rear neck module assembly combined with temple electronics according to an embodiment of the present invention.
Fig. 14 is a schematic view of a user interface on a rear neck module assembly according to an embodiment of the present invention.
Fig. 15 is a block diagram of a rear neck electronics unit according to an embodiment of the present invention.
Fig. 16 is a schematic block diagram of an embodiment according to the present invention.
Fig. 17 is a schematic block diagram of a wake-up control according to an embodiment of the present invention.
Fig. 18 is a state diagram of button operation according to an embodiment of the present invention.
Fig. 19 is a state diagram illustrating operation of a touch sensor according to an embodiment of the present invention.
Fig. 20 is a state diagram illustrating the operation of a proximity sensor according to an embodiment of the present invention.
Figs. 21A-21D are schematic diagrams of the positions of buttons according to embodiments of the present invention.
Detailed Description
In the following detailed description of the various embodiments of the invention, reference is made to the accompanying drawings in which like references indicate similar elements, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. In other instances, well-known circuits, structures and techniques have not been shown in detail in order not to obscure the understanding of this description. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
In one or more embodiments, methods, devices, and systems are described that provide modularity of eyewear systems for users. As described in the following embodiments, various combinations and configurations of electronic devices are incorporated into eyewear apparatus. Certain electronic devices are configured to be removably coupled to the eyewear apparatus. In some embodiments, the configuration of the electronics is built into the eyewear apparatus. In other embodiments, a rear neck module assembly is removably coupled with the eyeglass apparatus. In various embodiments, the modular reconfigurable eyewear device provides information to a user through the eyewear device. As described in the embodiments, the information includes streaming audio in the form of music, and it also includes biometric parameters of the user (e.g., physiological biometrics, biomechanics, etc.), such as, but not limited to: heart rate, breathing rate, posture, step count, cadence, etc. The information also includes information of interest to the user, such as, but not limited to, information about a vehicle the user is operating, for example: bicycle revolutions per minute (RPM), engine parameters such as RPM, oil pressure, coolant temperature, wind speed, water depth, airspeed, etc. In various embodiments, information is presented to the user through, for example, an audio broadcast heard by the user, a video broadcast played on a display on the eyewear device viewed by the user, or an image projected onto the user's pupils. Thus, "information" has a broad meaning within the scope of the embodiments taught herein.
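The routing of information items to a presentation channel described above can be sketched as simple dispatch logic. The following is an illustrative sketch only; the type names, channel names, and routing rules are assumptions made for this example, not taken from the patent.

```python
# Illustrative sketch: routing information items to a presentation
# channel on the eyewear device, either an audio broadcast through the
# projection speakers or a visual display through a HUD/microdisplay.
# All names and routing rules here are hypothetical.

AUDIO_TYPES = {"heart_rate", "breathing_rate", "cadence", "engine_rpm"}
VISUAL_TYPES = {"posture", "route_map", "water_depth"}

def present(info_type, value):
    """Return a (channel, message) pair for one information item."""
    message = f"{info_type.replace('_', ' ')}: {value}"
    if info_type in AUDIO_TYPES:
        return ("audio", message)
    if info_type in VISUAL_TYPES:
        return ("hud", message)
    return ("audio", message)  # default to the open-ear audio channel

print(present("heart_rate", 142))     # ('audio', 'heart rate: 142')
print(present("posture", "upright"))  # ('hud', 'posture: upright')
```

In a real device the dispatch would depend on which modules (speakers, HUD) are currently attached; the point here is only that one embedded system can fan information out to more than one presentation channel.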
Fig. 1 illustrates a modular reconfigurable eyewear system according to an embodiment of the present invention. Referring to fig. 1, a modular eyeglass apparatus shown in perspective is indicated generally at 100. The modular eyeglass apparatus 100 has a frame 102. In various embodiments, the frame 102 is ophthalmically configured to provide rim portions that support the lenses 120 and 122. The lenses 120 and 122 may provide any functionality provided by an eyewear device, such as, but not limited to, safety glass lenses, prescription lenses, sunglass lenses, welding glass lenses, and the like. The eyeglass apparatus can also comprise a single lens rather than the dual lenses shown in the figures. In some embodiments, a nose pad is provided to cushion the contact area with the user's nose. In some embodiments, the nose pad is made of a flexible material, such as silicone rubber.
The temple 104 (left temple) and the temple 114 (right temple) are coupled to the frame 102. The temples 104 and 114 may be flexibly coupled to the frame 102 by hinges as shown in the figures, or the temples 104 and 114 may have a fixed orientation relative to the frame 102.
In various embodiments, one or more of the temples (104 and 114) and the frame 102 may be mounted with electronics as described below. In the view of 100, left temple 104 is disposed on left Temple Insert Module (TIM) 106, and right temple 114 is disposed on right Temple Insert Module (TIM) 116. The Temple Insert Modules (TIMs) are described more fully below in connection with the figures.
With continued reference to fig. 1, a modular eyeglass apparatus is shown at 130 in perspective view. Frame 132 is ophthalmically configured to surround lens 140 and lens 142 with rims to secure lens 140 and lens 142 thereto. The brow web 146 is secured to the frame 132 by assembly fasteners, adhesives, etc. in a variety of ways. The temple 144 includes a left temple connector 152 that rotatably couples with the left frame connector 150. Together, left frame connector 150 and left temple connector 152 form a rotatable mechanical and electrical connection between frame 132 and left temple 144, thereby providing one or more electrical pathways to connect frame 132 to left temple 144. Similarly, right temple 134 is rotatably coupled to frame 132 by a right hinge assembly 148.
It is noted that in some embodiments, the modular eyewear apparatus is configured such that each temple arm is removable from its hinge by an electrical/mechanical connector having one or more electrical contacts, which are not shown for clarity of illustration. These electrical contacts may be made using, for example, pins, points, pads, slots, contact means, and the like. For example, a line denoted by 154 demarcates the boundary where the right temple connector fits with the right temple 134. Similarly, a line indicated by 156 borders the assembly of left temple connector 152 with left temple 144.
The temples are interchangeable with the eyeglass apparatus by providing electrical/mechanical connectors between the temples, e.g., 134, 144, and the frame 132. This feature allows the user to interchange one temple with another. Different temples may be configured with different electronics to provide different functions as described herein. Either temple may be configured to accommodate a variety of electronics configurations. For example, in one or more embodiments, the right interchangeable temple houses electronic components that may include one or more of a biometric sensor, a biomechanical sensor, a vehicle sensor, an environmental sensor, a temperature sensor, an acoustic sensor, a motion sensor, a light sensor, a touch sensor, a proximity sensor, a speed sensor, an acceleration sensor, a rotation sensor, a magnetic field sensor, a Global Positioning System (GPS) receiver, a cable, a microphone, a micro-speaker, a power supply (battery), a camera, a microdisplay, a head-up display (HUD) module, a multi-axis inertial measurement unit, and a wireless communication system. It should be noted that TIMs may also include the electronic components and sensors described above. In various embodiments, one or more wireless communication systems are provided that use, for example: Near Field Communication (NFC) in the industrial-scientific-medical (ISM) 13.56 MHz band, the Adaptive Network Topology (ANT) and ANT+ wireless standards, wireless communication using the Bluetooth standard, the Bluetooth Low Energy (BLE) standard, wireless communication using the Wi-Fi standard, and wireless communication using mobile phone standards, such as 3G, 4G, LTE, 5G, etc., or other wireless standards. In some embodiments, electrical pathways from the electronics exit the temple bar via the sheath cavity, then enter the temple bar sheath and continue into the brow bar sheath cavity.
The right interchangeable temple includes a hinge connector 148, the hinge connector 148 secured to the brow bar 146 and the frame 132.
In one or more embodiments, the right interchangeable temple arm is secured to the front of the frame by a hinge connector that allows power and data to be transmitted to the left interchangeable temple arm through the modular brow web. The hinge connector mechanically interlocks with the frame and allows a power/data connection through electrical pin conductors. In one or more embodiments, the hinge connector senses the open state of the device when in the worn, open orientation, allowing power or data transfer. When in the closed position (temples folded inward), the hinge connector, in combination with signals received from one or more proximity sensors and a motion sensor, allows the system to sense the user-device interaction status and turn off power or data transmission. This function reduces power consumption when the device is folded and stowed, and allows the device to power on automatically when the user puts it on his or her head. In addition to switchable data or power transmission, the hinge connector may provide flexible circuits and wired micro-connectors for stable, uninterrupted power and/or data transmission.
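The power-gating decision described above can be sketched as follows. This is a hypothetical illustration: the exact combination of hinge, proximity, and motion signals is an assumption for this example, and the patent text does not specify the precise logic.

```python
# Hypothetical sketch of the wear-aware power gating described above.
# The hinge state gates everything: when the temples are folded, power
# and data transfer are off; when the hinge is open, proximity and
# motion signals confirm the device is actually being worn before
# power is enabled. The simple combination used here is an assumption.

def power_enabled(hinge_open, proximity_near, motion_detected):
    """Return True only in the worn, unfolded state."""
    if not hinge_open:              # temples folded inward: stowed
        return False
    # An open hinge alone is not proof of wear; require sensor evidence.
    return proximity_near or motion_detected

assert power_enabled(True, proximity_near=True, motion_detected=False)
assert not power_enabled(False, proximity_near=True, motion_detected=True)
assert not power_enabled(True, proximity_near=False, motion_detected=False)
```

The design choice worth noting is that the hinge switch acts as a hard gate, so a stowed device draws no transmission power regardless of what the proximity or motion sensors report.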
In some embodiments, it is convenient to route electrical pathways within the volume of the brow web 146. In some embodiments, the brow web 146 is configured to provide a channel along its length within which to route electrical pathways. Thus, the brow web 146 provides one or more sheaths, channels, etc., along its length in which electrical pathways and sensors may be incorporated. Examples of electrical pathways are, but not limited to: wires, printed circuit boards, flexible printed circuit boards, and the like. In various embodiments, one or more sensors are preferably mounted to the brow web 146 to form an electrical subassembly for the frame 132. In some embodiments, additional electrical pathways from the frame 132 are connected with electrical pathways contained in the brow web 146. In some embodiments, a flexible electronic circuit is attached to the underside of the brow web's top surface and exits the brow web via the left and right sheath cavities. Alternatively, or in combination, a fully embedded flexible electronic device may be cast into the brow web, with integrated contact points brought out of the brow web near the hinges. These integrated contact points on both sides of the brow web allow for the transmission of data or power when in contact with the integrated contact points of the left and right temples. In addition to facilitating connection with the electronics, the brow web may also conceal an optional pupil module via the mounting flange and allow the user to view the microdisplay through the brow web pupil aperture.
In a similar manner, the left temple is configured as a left interchangeable temple, connected to the front frame of the eyeglasses by a left hinge connector. In various embodiments, the left interchangeable temple arm may contain the same electronics configuration/function as the right interchangeable temple arm, or the interchangeable temple arm may contain a different electronics configuration and a different function.
With continued reference to fig. 1, left temple 164 and right temple 174 are configured as shown at 160. Each of the temples 164 and 174 contains electronics configured to provide information and functionality to a user of the eyewear system. As shown at 160, the left temple 164 has a temple door (not shown) that is removed to expose an electronic component, indicated at 186. The temple door protects the electronics from environmental exposure and hazards. The temple door is fastened to the temple assembly using mechanical fasteners suitable for the particular application. In some embodiments, the temple door provides a water-intrusion rating via an IP code from the International Protection Marking, IEC standard 60529. The power source (battery) is indicated at 184, and the audio speaker and port are indicated at 182. The audio speaker and port 182 is typically located at the end of the temple, and in some embodiments is an integrated directional projection speaker that can privately direct sound to the user's ear. The projecting stereo speakers may communicate various audio signals to the user, such as, but not limited to, voice prompts, streaming music, intelligent audio aids, data, and the like. Notably, the projection speaker is not designed to obstruct the user's ear. The user can therefore hear far-field sounds, and far-field sound quality is not degraded as it is with currently available earbud headphones that block the user's ear.
The right temple 174 also has electronic components (not shown) disposed thereon, which are contained within the right temple 174. The right temple has an audio speaker disposed thereon, which may be an integrated directional projection speaker, with an audio speaker port 184. In one or more embodiments, right temple 174 is configured to receive an external component 190 that includes a microdisplay component 192. Similarly, the left temple may also be configured for the external component 190 and the microdisplay component 192.
In various embodiments, the microdisplay component, e.g., 192, is a head-up display (HUD) Pupil™ optical module that houses the optics, electronics, and microdisplay that constitute the optical system. The pupil mechanism may also house cables, flexible circuit boards, or wires that exit the housing and enter the contact paths of the electronics. In one or more embodiments, these electrical pathways are connected to one side of right temple 174 to provide the user with a transparent head-up display external attachment that augments the visual components of the mobile reality experience.
In one or more embodiments, wiring from the brow web is hidden in the left and right sheaths of the temple arms and enters the left and right temple arms via the sheath cavities, thereby protecting the wiring from the environment. The vicinity of the contact point path may also house a motion mechanism for customizing the interpupillary distance of the head-up microdisplay module.
In various embodiments, the front frame section, such as 102 or 132 (fig. 1) or any similar structure in the following figures, and the right and left temple sections, such as 104, 114, 134, 144, 164, 174 (fig. 1) or any similar structure in the following figures, may be part of a set of interchangeable front frame and temple sections, each having the same or different devices, accessories, capabilities, and/or combinations of functions. A desired combination may include at least one of an electronic board, a microphone, a speaker, a battery, a camera, a head-up display module, a wireless Wi-Fi radio, a GPS chipset, an LTE cellular radio, a multi-axis inertial measurement unit, a motion sensor, a touch sensor, light and proximity sensors, and the like. Further, the electronics may allow a user to wirelessly connect to at least one of: cellular services, smart phones, smart watches, smart bands, mobile computers, and sensor peripherals. The electronics may further provide the following capabilities: see-through augmented reality images through a modular head-up display (HUD), and stereo audio content, voice, and audio notifications, including music, through one or more integrated projecting micro-speakers. The front frame electrical contact means and the temple arm electrical contact means may comprise electrical contact points located within or near the respective hinge connectors for removable electrical contact with each other, for electrically transmitting at least one of power, electrical signals, and data between the temple portion and the front frame portion when the contacts are in the electrically closed position. In some embodiments, the front frame hinge connector, the front frame electrical contact means, the temple hinge connector, and the temple electrical contact means, when assembled together, may form an electromechanical hinge, hinge connector, assembly, or device.
In some embodiments, the system power may be turned off by folding the temple portion to the storage position, thereby disconnecting the contact points.
In some embodiments, at least one of the front frame electronics and temple arm electronics may include a battery, a camera, a head-up display module, a controller, digital storage electronics, a CPU, a projection micro-speaker, a microphone, a wireless Wi-Fi radio, a GPS chipset, an LTE cellular radio, a multi-axis inertial measurement system or unit, and at least one of a motion, touch, light, proximity, temperature, and pressure sensor, and the like.
In some embodiments, at least one temple may include a Temple Insert Module (TIM) containing selected temple electronics mounted thereon. In other embodiments, a neck smart cord is electrically connected to a rear neck electronic module. The neck smart cord has right and left connectors or connector ends for mechanically and/or electrically interconnecting the rear neck electronic module with the right and left temples of the eyeglass apparatus.
Figure 2 illustrates a reconfigurable assembly for an eyeglass apparatus, according to an embodiment of the present invention. Referring to fig. 2 at 200, a temple 202 has an engagement portion 204 disposed thereon. In the description of the present embodiment, the temple insert module, indicated by 210, is referred to as a "TIM" and is configured to removably couple with the engagement portion 204 of the temple 202. TIM 210 is mounted in temple 202 as indicated by arrows 212a and 212b. In various embodiments, the engagement portion 204 is achieved by a mechanical connection, such as, but not limited to: press fit, clips, mechanical interlocks, hook and loop, magnetic surfaces, external clips on the temple, flanges and mechanical connections to the temple, and the like. In other embodiments, the engagement portion 204 and the TIM 210 utilize magnetic surfaces to magnetically secure the TIM 210. The profile of the engagement portion shown at 204, as well as the profile of any engagement portion shown elsewhere in the figures presented herein, is for illustration only and does not constitute a limitation on embodiments of the invention. In the view shown at 200, the TIM 210 is only mechanically coupled to the temple 202, and there is no electrical connection between the TIM 210 and the temple 202.
In some embodiments, a speaker and speaker port 214, which may be a miniature projection speaker, is disposed on the TIM 210. The speaker provides information to the user through audio broadcasting. It is noted that the speaker provided herein is a speaker located outside the user's ear and therefore not inserted into the user's ear as in an insert-type ear plug. The TIM 210 is configured with electronic components including a processor, memory, a power source, and one or more wireless communication protocols that enable the TIM 210 to wirelessly communicate 222 with one or more devices 220. The device 220 may be an external sensor such as, but not limited to: a biosensor or vehicle sensor, a local user device, a network node, such as a wireless router, a remote network or a remote user device, such as a mobile phone accessed through a network. The various sensors, networks, and remote devices are described more fully below in connection with the accompanying figures.
Following the structure shown at 200 in fig. 2, in some embodiments, a second temple and a second TIM are provided. Two TIMs in such a system may participate in wireless communication with the devices 220 and with each other as needed to provide a degree of designed functionality to the user. For example, in one embodiment, the left TIM includes wireless network capabilities sufficient to communicate with the remote device 220 using a first network protocol. In addition, the left TIM and the right TIM have wireless network capabilities that support communication with each other using a second network protocol. The first network protocol has a greater range than the second, because the distance between the left TIM and the remote device is greater than the separation distance (nominally the width of the user's head) between the left TIM and the right TIM; using the shorter-range second protocol for the TIM-to-TIM link conserves power. The structure indicated by 200 is referred to as truly wireless because there is no wired connection between the left TIM and the right TIM. In one or more embodiments, an audio stream is provided from a user device to a first TIM using the first wireless network. A second wireless audio stream is then provided from one TIM to the other TIM using the second wireless network, so that the audio stream reaches each of the left and right projection speakers of the eyewear device.
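The two-hop audio path described above can be sketched as follows. This is a hypothetical illustration only: the class and method names are assumptions, and the real links would be specific radio protocols (e.g., Bluetooth for the device hop) rather than method calls.

```python
# Hypothetical sketch of the two-hop "truly wireless" audio path: the
# first (longer-range) link carries an audio frame from the user device
# to one TIM, which relays it over the second (short-range) link to the
# peer TIM so that both projection speakers receive the stream.

class TempleInsertModule:
    def __init__(self, side):
        self.side = side
        self.peer = None
        self.received = []

    def pair(self, other):
        """Establish the second (TIM-to-TIM) wireless link."""
        self.peer = other
        other.peer = self

    def receive_from_device(self, frame):
        """First hop: frame arrives over the longer-range protocol."""
        self.received.append(frame)
        if self.peer is not None:
            self.peer.relay(frame)   # second hop: short-range link

    def relay(self, frame):
        self.received.append(frame)

left, right = TempleInsertModule("left"), TempleInsertModule("right")
left.pair(right)
left.receive_from_device("audio-frame-0")
print(left.received, right.received)  # both TIMs hold 'audio-frame-0'
```

The relay structure is what removes the need for any wire across the frame: only one TIM ever talks to the user device, and the peer gets its copy over the head-width link.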
As described in connection with the figures herein, the temples and TIMs described at 200 provide reconfigurable components for eyewear apparatus. The front end 206 of the temple 202 is engaged with the frame of an eyewear apparatus as described above, with or without a connector between the temple and the frame. Thus, depending on the intended design of the eyeglasses, the temple 202 may attain a fixed position relative to the frame, or the temple may be rotatably coupled to the frame.
Referring to 250 in fig. 2, temple 252 has an engagement portion 254 disposed thereon. The temple insert module TIM at 260 is configured to removably couple with engagement portion 254 of temple 252. TIM 260 is mounted in temple 252 as indicated by arrows 262a and 262b. In various embodiments, the engagement portion 254 is implemented by a combination of electrical and mechanical connections. The mechanical connection may be as described above in connection with 210/204, for example, but not limited to: press fit, clips, mechanical interlocks, hook and loop, and the like. In other embodiments, the engagement portion 254 and the TIM 260 utilize magnetic surfaces to magnetically secure the TIM 260. For purposes of illustration, a plurality of electrical contacts 280 are provided, but this is not meant to be limiting. Electrical contacts 280 mate with corresponding electrical contacts in temple 252 to provide an electrical connection with one or more electrical pathways (not shown) in temple 252. Electrical pathways within temple 252 facilitate electrical connection between TIM 260 and one or more sensors 272 and 274, which sensors 272 and 274 may also represent a source of signals provided to a display. The sensors 272 and 274 may be acoustic sensors, such as microphones, or any of the sensors described herein for use in conjunction with electronic components configured with eyewear apparatus. In one or more embodiments, one or more of 272 and 274 provide a signal to a display, such as a HUD.
In some embodiments, a speaker and speaker port 264, which may be a miniature projection speaker, is provided on the TIM 260. The speaker provides information to the user through an open-ear audio broadcast.
Following the structure shown by 250 in fig. 2, in some embodiments, a second temple and a second TIM are provided as shown in fig. 3 below. Two TIMs in such a system participate in wireless communication between the devices 220 and themselves as needed to provide a degree of design functionality to the user. As described in connection with the figures herein, the temples and TIMs described at 250 provide reconfigurable components for eyewear apparatus. For example, the front end 256 of the temple 252 engages the frame of an eyeglass apparatus as described above, with or without a connector between the temple and the frame. Thus, depending on the intended design of the eyeglasses, the temple arm 252 may attain a fixed position relative to the frame or the temple arm may be rotatably attached to the frame.
Figure 3 shows a plurality of reconfigurable components for an eyewear apparatus generally at 300, in accordance with an embodiment of the present invention. Referring to 300 in fig. 3, the left reconfigurable assembly 250 from fig. 2 is shown with an accompanying right reconfigurable assembly for an eyeglass apparatus. Right temple 352 has an engagement that is not shown in fig. 3, but is similar to engagement 254 of left temple 252. Temple insert module TIM at 360 is configured to removably couple with an engagement portion of temple 352. TIM360 is coupled to temple 352 as indicated by arrows 362a and 362 b. In various embodiments, the engagement of the temple 352 is achieved through a combination of electrical and mechanical connections. This mechanical connection may be provided in connection with 210/204 as described above, such as but not limited to: press fit, clips, mechanical interlocks, hook and loop, and the like. In other embodiments, the engagement of temple 352 and TIM360 utilize magnetic surfaces to magnetically secure TIM 360. For purposes of illustration, a plurality of electrical contacts 380 are provided, but this is not meant to be limiting. The electrical contacts 380 mate with corresponding electrical contacts in the temple arm 352 to provide an electrical connection with one or more electrical pathways (not shown) in the temple arm 352. Electrical pathways within temple 352 facilitate electrical connection between TIM360 and one or more sensors 372 and 374. The sensors 372 and 374 may be acoustic sensors, such as microphones or any of the sensors or displays described herein, for use in conjunction with electronic components configured with the eyewear apparatus. 
In various embodiments, TIM 360 is configured with electronic components including a processor, memory, a power source, and one or more wireless communication systems using protocols that enable TIM 360 to wirelessly communicate with one or more devices 220, as illustrated by the wireless transmission at 222. Further, TIM 360 and TIM 260 may be configured with wireless communication capabilities that allow wireless communication between the TIMs, such as the wireless transmission shown at 382. In some embodiments, a speaker and speaker port 364 are disposed on the TIM 360; the speaker may be a miniature projection speaker. The speaker provides information to the user through audio broadcast.
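As a rough illustration of the component inventory just described, a TIM could be modeled as follows. This is a minimal sketch; all field names and example values are hypothetical, since the specification names the components (processor, memory, power source, radios, optional speaker) but prescribes no particular data model.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical data model of a TIM's on-board electronics; names and
# values are illustrative, not taken from the specification.
@dataclass
class TempleInsertModule:
    processor: str
    memory_mb: int
    power_source: str
    radios: List[str] = field(default_factory=list)
    has_speaker: bool = False

    def can_reach(self, protocol: str) -> bool:
        # A TIM can use a wireless link only if it carries that radio.
        return protocol in self.radios

left_tim = TempleInsertModule("mcu", 8, "li-ion", ["ble", "wifi"], has_speaker=True)
print(left_tim.can_reach("ble"))
```

A configuration like this captures the modularity described above: swapping one TIM for another changes the `radios` inventory, and hence which wireless links the eyewear can support.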
Referring to view 390 in FIG. 3, electrical schematics for TIM 260 and TIM 360 are shown. The TIM 260 is electrically coupled to the sensor 272 by electrical path 394. Similarly, the TIM 260 is electrically coupled to the sensor 274 via electrical path 392. The connectivity shown between the TIM 260 and the sensors constitutes a left temple electrical schematic 384. It is noted that the left temple electrical schematic 384 may be more or less complex than illustrated. Accordingly, the left temple electrical schematic is provided for illustration only and is not meant to be limiting.
Similarly, the TIM 360 is electrically coupled to the sensor 372 via electrical path 398. The TIM 360 is electrically coupled to the sensor 374 through electrical path 396. The connectivity shown between the TIM 360 and the various sensors constitutes a right temple electrical schematic 386. It is noted that the right temple electrical schematic 386 may be more or less complex than illustrated. Accordingly, the right temple electrical schematic is provided for illustration only and is not meant to be limiting.
The two TIMs in such a system communicate wirelessly with the devices 220 and with each other as needed to provide the intended design functionality to the user. For example, in one embodiment, the left TIM includes wireless network capabilities sufficient to communicate with the remote device 220 using a first network protocol. In addition, the left TIM and the right TIM have wireless network capabilities to support the wireless communication shown at 382, which may be performed using a second network protocol that is different from the protocol used at 222. To conserve power, the second network protocol (382) may have a shorter range, and thus a lower power budget, than the first network protocol (222): the separation between the left TIM 260 and the right TIM 360 is nominally the width of the user's head, whereas the separation between the left TIM 260 and the remote device 220 may be as great as the distance to a mobile phone cell tower.
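The range-versus-power trade described above amounts to a simple selection rule: for each link, use the lowest-power protocol whose range still covers the separation distance. The following Python sketch is illustrative only; the protocol names, ranges, and power figures are assumptions, not values from the specification.

```python
# Illustrative link-budget figures (assumed, not from the patent).
PROTOCOL_RANGE_M = {"nfc": 0.1, "ble": 10.0, "wifi": 50.0, "lte": 10_000.0}
PROTOCOL_POWER_MW = {"nfc": 1.0, "ble": 10.0, "wifi": 500.0, "lte": 2000.0}

def pick_protocol(link_distance_m: float) -> str:
    """Choose the lowest-power protocol whose range covers the link."""
    candidates = [p for p, r in PROTOCOL_RANGE_M.items() if r >= link_distance_m]
    return min(candidates, key=PROTOCOL_POWER_MW.get)

print(pick_protocol(0.18))   # TIM-to-TIM hop: roughly the width of a head
print(pick_protocol(30.0))   # TIM-to-remote-device hop across a room
```

Under these assumed figures, the head-width TIM-to-TIM hop resolves to a short-range, low-power protocol, while the hop to the remote device requires a longer-range radio, matching the power-conservation rationale above.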
FIG. 4 illustrates another reconfigurable modular eyewear system in accordance with embodiments of the present invention. Referring to fig. 4, one or more of the sensors, power supply components, and computing units are distributed throughout the eyewear apparatus, including throughout the frame, such as 402. Frame 402 is ophthalmically configured to surround lens 440 with a rim to secure lens 440 thereto. The left temple 404 and the right temple 414 are coupled to the frame 402 to form an eyeglass apparatus. The left temple 404 is configured with an engagement portion, indicated at 408. Left Temple Insert Module (TIM) 406 is configured to engage with engagement portion 408 as described above, thereby providing a mechanical and electrical connection between TIM 406 and temple 404. Similarly, as shown, right TIM 426 engages the engagement portion of right temple 414. The TIM 406 includes an audio speaker and audio port, represented by 410, and the TIM 426 includes an audio speaker and audio port, represented by 430. In various embodiments, audio speakers 410 and 430 are projection speakers. The eyewear apparatus includes a plurality of sensors or displays 462, 464, 466, 468 and 470 that are integrated into an electrical pathway that passes from the left temple 404 through the frame 402 to the right temple 414. In various embodiments, there may be more sensors or fewer sensors than shown in FIG. 4. The sensors and the sensor locations shown in FIG. 4 are provided as examples only and do not constitute a limitation on embodiments of the invention. As described above in connection with the previous figures, at least one of the temple insert modules 406 and/or 426 has a set of electronics provided thereon necessary to provide the wireless connection 222 to the device 220.
In the eyeglass apparatus 400, a high-level view of an electrical path schematic is shown at 480. Referring to 480, left TIM 406 and right TIM 426 are electrically coupled to sensors 462, 464, 466, 468 and 470 by electrical path elements 482, 484, 486, 488, 490 and 492. An electrical path element, such as 484, is electrically connected to the sensor 464. The components shown at 480 collectively provide a modular, reconfigurable set of components for the eyeglass apparatus. In one or more embodiments, one or more acoustic sensors are located in at least one of the frame 402, the left temple 404, and the right temple 414. Thus, the acoustic sensor may be located anywhere on the temple (left or right) or frame of the eyewear apparatus.
Fig. 5 shows perspective and top views of the modular eyewear system from fig. 4, generally at 500, in accordance with an embodiment of the present invention. Referring to fig. 5, a modular eyeglass apparatus is shown at 502 in perspective view. The modular nose pads 504 are removably coupled with the modular eyeglass apparatus 502 as shown at 506. The modularity of the nose pads allows the user to replace the nose pads to improve the fit between the eyeglasses and the user's nose and facial structure. Greater comfort can be achieved by modularizing the nose pads of the eyeglass apparatus. In addition, other sensors, such as biosensors, may be provided in the nose pads.
Fig. 6A illustrates a system architecture for a modular eyeglass apparatus, in accordance with an embodiment of the present invention, generally at 600. Referring to fig. 6A, in various embodiments, the modular reconfigurable eyewear apparatus may include more than one wireless communication system. In various embodiments, eyewear device 602 has a high-level block diagram structure as shown at 604. In various embodiments, the eyewear device 602 is configured to communicate with the wireless sensor 640 and the mobile device 670. The wireless sensor 640 may comprise a single sensor or a plurality of sensors, and may include, without limitation, any one or more of the sensors listed herein. For example, wireless sensors 640 may include biometric or biomechanical sensors configured for use with a user, or sensors configured for use with a vehicle or building. Some non-limiting examples of biosensors are: heart rate monitors, perspiration sensors, temperature sensors, etc. Some non-limiting examples of vehicle sensors are: speed sensors, acceleration sensors, global positioning system signals, vehicle engine parameters, wind speed indicators, and the like. Some non-limiting examples of sensors used with buildings are: thermostat temperature readings, water pressure values, etc. Some non-limiting examples of vehicles are: a scooter, bicycle, car, boat, yacht, ship, airplane, military vehicle, wing suit, and the like. In some embodiments, data is received from a special purpose network at 640 and/or 616. An example of a special purpose network, given for illustration and not for limitation, is the National Marine Electronics Association (NMEA) NMEA 2000 network, which is designed for vessels such as yachts (power or sail). NMEA 2000, also known in the art as "NMEA 2k" or "N2K", is standardized as International Electrotechnical Commission (IEC) 61162-3.
NMEA 2000 is a plug-and-play communications standard for interfacing with marine sensors and display units in ships, boats, yachts, etc. The mobile device 670 may be any one or more of the mobile devices listed herein without limitation. For example, the mobile device may be a mobile phone, a watch, a bracelet, a tablet, a laptop, a desktop, a car computer, etc.
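The sensor taxonomy above (biometric, vehicle, building) suggests a common tagged-reading shape once data from any of these sources reaches the eyewear. A hypothetical Python model follows; the field names and units are illustrative assumptions, not part of the specification.

```python
from dataclasses import dataclass

# Hypothetical tagged reading from an external wireless sensor; the
# "domain" field distinguishes the sensor categories named in the text.
@dataclass
class SensorReading:
    domain: str    # e.g. "biometric", "vehicle", or "building"
    name: str
    value: float
    unit: str

def summarize(reading: SensorReading) -> str:
    """Render a reading as a short line suitable for audio or display output."""
    return f"{reading.name}: {reading.value:g} {reading.unit}"

hr = SensorReading("biometric", "heart rate", 72, "bpm")
sog = SensorReading("vehicle", "speed over ground", 6.4, "kn")
print(summarize(hr))
print(summarize(sog))
```

Tagging each reading with its domain lets downstream code (for example, the audio broadcast path) format or route heart-rate data differently from, say, an NMEA-sourced speed value, without caring which radio delivered it.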
The eyewear device 602 has a high-level structure, represented at 604, that includes a speaker 606, a central processing unit 608, a power supply 610, an acoustic sensor 612, a storage device 614, and a wireless communication system 616. The wireless communication system 616 may include, for example, one or more of a near field communication system 618, a wireless communication system utilizing a bluetooth communication protocol 620, a wireless communication system utilizing a Wi-Fi communication protocol at 624, and a mobile phone communication protocol 622. The wireless communication protocol denoted at 622 by LTE is given only as an example of a mobile phone communication protocol and does not constitute a limitation on embodiments of the present invention. Those skilled in the art will recognize that one or more antennas are included in the wireless communication system block 616, but are not shown for clarity.
The wireless sensor 640 has a high-level architecture, indicated at 642, that includes one or more sensors 644 and a wireless communication system 646. The wireless communication system 646 may be a low data rate communication system such as a near field communication system, BLE, ANT +, or the like. Alternatively, the wireless communication system 646 may be provided as a higher data rate system as required by the sensor 644.
The mobile device 670 has a high-level architecture, represented by 672, that includes a central processing unit 674, a power supply 676, a memory 678, and one or more wireless communication systems, shown in block 680. The mobile device 670 may optionally be configured to reach a remote network as shown by the cloud 689. Wireless communications block 680 may include one or more of a wireless communications system, such as a near field communications system 682, a wireless communications system utilizing a bluetooth communications protocol 684, a wireless communications system utilizing a Wi-Fi communications protocol at 686, and a mobile phone communications protocol at 688. The wireless communication protocol, denoted by LTE at 688, is given only as an example of a communication system for mobile devices and does not constitute a limitation of embodiments of the present invention. Those skilled in the art will recognize that one or more antennas are included in wireless communication system blocks 680 and 642, but are not shown for clarity.
In some embodiments, the wireless sensor system 642 and eyewear device 602 are initially configured by the mobile device 670 and a user of the mobile device user interface, as shown by pathways 652a and 652b. In operation, the eyewear apparatus 602 wirelessly receives data over a suitable wireless communication system, such as the near field communication system 618, as shown at 650. The wireless data obtained from the wireless sensor system 642 may be communicated to the user device 670/672 via another wireless communication system, as shown at 654. The wireless communication shown at 654 may be accomplished over a higher data rate channel using, for example, the bluetooth protocol at 620/684, the Wi-Fi network protocol at 624/686, or the mobile phone communication protocol shown at 622/688. In various embodiments, the data transmitted from the eyewear device 602 may be stored and analyzed on the user device 670 for use in different applications.
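The store-and-forward flow just described, with low-rate packets arriving from the sensor at 650 and higher-rate bursts leaving for the user device at 654, can be sketched as follows. The class and method names are illustrative assumptions, and a simple queue stands in for the two radio links.

```python
from collections import deque

class EyewearBridge:
    """Minimal sketch of the eyewear relaying sensor data to a user device.
    Names and structure are illustrative, not from the specification."""

    def __init__(self):
        self.uplink_buffer = deque()

    def on_sensor_packet(self, packet: dict) -> None:
        # Receive one packet from the low-data-rate sensor link (650).
        self.uplink_buffer.append(packet)

    def flush_to_device(self) -> list:
        # Burst all buffered packets to the mobile device over the
        # higher-rate channel (654), emptying the buffer.
        burst, self.uplink_buffer = list(self.uplink_buffer), deque()
        return burst

bridge = EyewearBridge()
bridge.on_sensor_packet({"sensor": "heart_rate", "bpm": 71})
bridge.on_sensor_packet({"sensor": "heart_rate", "bpm": 73})
print(len(bridge.flush_to_device()))  # both packets forwarded in one burst
print(len(bridge.flush_to_device()))  # buffer now empty
```

Buffering and bursting suits the asymmetry in the text: the near-field link trickles data in continuously, while the higher-rate link can move an accumulated batch quickly and then idle to save power.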
Fig. 6B illustrates a wireless network, generally at 690, corresponding to the system architecture for the modular eyewear system of fig. 6A, in accordance with an embodiment of the present invention. Referring to fig. 6B, the wireless communication block 616 may be connected to a plurality of devices as shown. For example, one or more wireless sensors 640 may be connected to the wireless communication module 616 using a low data rate near field communication network as shown at 618. One or more user devices 670 may communicate wirelessly with the wireless communication block using a bluetooth communication protocol as shown at 620. One or more wireless nodes, such as Wi-Fi node shown at 692, may communicate wirelessly as shown at 624 with wireless communication block 616. One or more remote networks 694 can wirelessly communicate with the wireless communication block 616 using a cellular communication protocol, as shown at 622. Accordingly, the reconfigurable eyewear apparatus may incorporate one or more wireless communication systems shown at 690. The eyewear device may be reconfigured for different wireless communications by, for example, replacing one TIM module with another. Alternatively, one or more temples may be interchangeable with the frames described above to provide customized functionality to the eyeglass apparatus.
Fig. 7 illustrates another system configuration for the modular eyeglass apparatus of fig. 4, generally at 700, in accordance with an embodiment of the present invention. Referring to fig. 7, the wireless communication module 616 of the eyewear device 602 may be configured to communicate directly over a mobile phone network without requiring a user device to act as an intermediary. For example, in 700, the eyewear device 602 is configured to communicate with the remote device 702 via the wireless communication system 622, where the remote device 702 may be a mobile phone, connecting directly with the remote device 702 over an external network as shown by the cloud 704. No intermediary user mobile device is required to support such a communication link. This configuration allows a user of the eyeglass device to place a call from the eyeglass device with the aid of an interface, such as a voice interface, one or more tactile interfaces like buttons, or the like. The voice interface provides command and control of a telephone call by converting a user's voice signals into commands that the apparatus uses to operate the wireless network for the telephone call. Examples of such commands are, but not limited to: selecting a caller, making a call, turning up the volume, turning down the volume, ending the call, etc.
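The voice commands listed above amount to a mapping from recognized phrases to call-control actions. A minimal dispatch sketch follows; the exact phrases, handler names, and call-state fields are assumptions for illustration, not the patent's command set.

```python
# Hypothetical call state; fields are illustrative.
call_state = {"volume": 5, "in_call": False}

def make_call(state):
    state["in_call"] = True
    return "calling"

def end_call(state):
    state["in_call"] = False
    return "ended"

def volume_up(state):
    state["volume"] = min(10, state["volume"] + 1)
    return f"volume {state['volume']}"

def volume_down(state):
    state["volume"] = max(0, state["volume"] - 1)
    return f"volume {state['volume']}"

# Recognized phrase -> call-control action (assumed phrases).
COMMANDS = {
    "make a call": make_call,
    "end the call": end_call,
    "turn up the volume": volume_up,
    "turn down the volume": volume_down,
}

def handle_utterance(text: str) -> str:
    handler = COMMANDS.get(text.strip().lower())
    return handler(call_state) if handler else "unrecognized"

print(handle_utterance("Make a call"))
print(handle_utterance("turn up the volume"))
```

In a real device the `text` argument would come from the speech-to-text conversion described elsewhere in this document; here it is passed in directly so the dispatch logic stands alone.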
Fig. 8 shows a block diagram of a Temple Insert Module (TIM) generally at 800, according to an embodiment of the present invention. Referring to fig. 8, a TIM, as used in the description of the present embodiment, may be based on a device, such as a computer, in which embodiments of the present invention may be used. The block diagram is a high-level conceptual representation that may be implemented in various ways and with various architectures. Bus system 802 interconnects Central Processing Unit (CPU) 804 (also referred to herein as a processor), Read Only Memory (ROM) 806, Random Access Memory (RAM) 808, memory 810, audio 822, user interface 824, and communications 830. The RAM 808 also may represent Dynamic Random Access Memory (DRAM) or other forms of memory. In various embodiments, user interface 824 may be a voice interface, a touch interface, a physical button, or a combination thereof. It should be understood that a memory (not shown) may be included in the central processor block 804. The bus system 802 may be, for example, one or more buses such as a system bus, Peripheral Component Interconnect (PCI), Advanced Graphics Port (AGP), Small Computer System Interface (SCSI), Institute of Electrical and Electronics Engineers (IEEE) 1394 (FireWire), Universal Serial Bus (USB), universal asynchronous receiver/transmitter (UART), Serial Peripheral Interface (SPI), Inter-Integrated Circuit (I2C), and so forth. The central processor 804 may be a single, multiple, or even a distributed computing resource. The memory 810 may be a flash memory or the like. It should be noted that a TIM may include some, all, more, or a rearrangement of the components in the block diagram, depending on the actual implementation of the TIM. Thus, many variations of the system of FIG. 8 are possible.
A connection to one or more wireless networks 832 is obtained through the communications (COMM) block 830, which enables the TIM 800 to communicate wirelessly with local sensors, local devices, and remote devices on remote networks. In some embodiments, 832/830 provides access to a remote speech-to-text conversion system, which may be located, for example, at a remote cloud-based location. Blocks 832 and 830 are flexible in implementation; they represent wireless communication systems and may represent various forms of telemetry, General Packet Radio Service (GPRS), ethernet, Wide Area Network (WAN), Local Area Network (LAN), internet connection, Wi-Fi, WiMAX, ZigBee, infrared, bluetooth, near field communication, mobile phone communication systems such as 3G, 4G, LTE, 5G, etc., and combinations thereof. In various embodiments, a touch interface is optionally provided at 824. Signals from one or more sensors are input to the system via 829 and 828. Global Positioning System (GPS) information is received and input to the system at 826. The audio 822 may represent a speaker, such as a projection speaker or a projection micro-speaker as described herein.
In various embodiments, different wireless protocols are used in the network to provide the system described in the above figures, depending on the hardware configuration. One non-limiting example of a technique for wireless signal transmission is the bluetooth wireless technology standard, which is also commonly referred to as the IEEE 802.15.1 standard. In other embodiments, a wireless signaling protocol known as Wi-Fi is used, which uses the IEEE 802.11 standard. In other embodiments, a zigbee communication protocol based on the IEEE 802.15.4 standard is used. These examples are given for illustration only and do not constitute a limitation on the different embodiments. Transmission Control Protocol (TCP) and Internet Protocol (IP) are also used in different embodiments. Embodiments are not limited to the data communication protocols listed herein and are readily usable with other data communication protocols not specifically listed herein.
In various embodiments, the components in the system, as well as the systems described in the previous figures (e.g., Temple Insert Module (TIM)), are implemented in an integrated circuit device, which may include an integrated circuit package containing an integrated circuit. In some embodiments, the components in the system and the system are implemented in a single integrated circuit die. In other embodiments, components in the system and the system are implemented in more than one integrated circuit die of an integrated circuit device, which may include a multi-chip package containing the integrated circuit.
Fig. 9 illustrates a modular eyeglass apparatus equipped with a nape module assembly, in accordance with an embodiment of the present invention. Referring to fig. 9, at 900, the nape module assembly is mounted to a pair of passive eyeglasses. Passive eyeglasses are eyeglasses that contain no electronics. Alternatively, the eyewear may be active eyewear, configured with electronic components encapsulated in one or more temples or Temple Insert Modules (TIMs), as described herein. The eyewear has a frame 902 containing lenses 906. The left temple 904 and the right temple 914 are attached to the frame 902. The nape module assembly includes a nape electronics pod (ePOD) 924, a left temple interlock 920, a right temple interlock 922, a left smart cord 926, and a right smart cord 928. The left smart cord 926 electrically and mechanically couples the ePOD 924 to the left temple interlock 920, and the right smart cord 928 electrically and mechanically couples the ePOD 924 to the right temple interlock 922.
Left temple interlock 920 contains an acoustic cavity, an audio speaker, and an acoustic port. The acoustic port of the left audio speaker is denoted by 930. The left smart cord 926 contains electrical conductors that provide audio signals to audio speakers contained within the left temple interlock 920. In one or more embodiments, the audio speaker included in the left temple interlock is a miniature projection speaker. Similarly, the acoustic port of the right audio speaker is identified with 932. The right smart cord 928 contains an electrical conductor that provides an audio signal to an audio speaker contained within the right temple interlock 922. In one or more embodiments, the audio speaker included in the right temple interlock is a miniature projection speaker.
In various embodiments, the ePOD 924 contains an electronic unit. The electronic unit contains the electronic components and functionality described herein for the Temple Insert Module (TIM). In other words, the electronic unit is a TIM for the mechanical and electrical packaging of the nape module assembly.
Electronic units with different electronic configurations and functions can be exchanged in and out of the ePOD in a manner similar to the way different TIMs are exchanged in and out of temples of the eyeglass apparatus.
At 950, a length adjustment is provided to shorten or lengthen the right and left smart cords. A nape electronics pod (ePOD) 954 is configured with a left smart cord 956 and a right smart cord 958 leading from the same end of the ePOD 954. This configuration of smart cords 956 and 958 allows slider 960 to move away from or toward the ePOD. Moving the slider 960 away from the ePOD 954 shortens the available free length of the smart cords 956/958. Moving the slider 960 toward the ePOD 954 increases the available free length of the smart cords 956/958.
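The slider behavior reduces to simple arithmetic: the free cord length is what remains of the cord beyond the slider's offset from the ePOD. A small sketch under assumed dimensions (the patent specifies no cord length):

```python
# Assumed total smart-cord length; illustrative only.
CORD_LENGTH_CM = 40.0

def free_length(slider_offset_cm: float) -> float:
    """Free cord length remaining past the slider, given the slider's
    distance from the ePOD along the cord."""
    return max(0.0, CORD_LENGTH_CM - slider_offset_cm)

print(free_length(5.0))   # slider near the ePOD: most of the cord is free
print(free_length(25.0))  # slider moved away: free length shortened
```

The `max` clamp simply prevents a nonsensical negative length when the slider reaches the end of its travel.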
In one or more embodiments, in operation, when the nape module assembly is mounted on the eyewear device, the user is wearing the eyewear device, and the assembly is in the "on" state, audio data is streamed to the electronic unit in the ePOD 924 and directed to the left and right speakers for broadcast to the user.
Fig. 10 illustrates, generally at 1000 in perspective view, a nape module assembly configured with a wearable device, in accordance with an embodiment of the present invention. Referring to fig. 10, a first sensor 1050 is shown on the ePOD 924. A second sensor 1052 is shown incorporated into the right temple interlock 922. A third sensor 1054 is shown incorporated into left temple interlock 920. The sensors 1050, 1052, and 1054 may be any of the sensors previously described herein for TIMs or directly for electronics built into the temple.
In the embodiment shown in fig. 10, each temple interlock module, i.e., 920 and 922, includes a through hole into which a temple of the eyewear is inserted. In this embodiment, temple interlock modules 920 and 922 are made of a flexible material, such as an elastomer or rubber, that permits sufficient elongation for a temple to be inserted therein. For example, left temple interlock 920 comprises a through-hole 1040 into which left temple 904 is inserted, and right temple interlock 922 includes a through-hole 1042 into which right temple 914 is inserted. The temple interlocks 920 and 922 are positioned on a pair of compatible eyeglasses such that the speaker ports 930 and 932 are positioned in front of and proximate to the user's ears. Compatible eyeglasses are eyeglasses that accommodate the mechanical attachment provided by the temple interlocks.
Figure 11 illustrates coupling a temple interlock to a temple, according to an embodiment of the present invention, generally at 1100 and 1150. Referring to fig. 11, a magnetic temple interlock is shown at 1100. The magnetic temple interlock includes a magnetic region 1108 on the temple 1102 of the eyeglass apparatus. Temple interlock 1104 has a corresponding magnetic region 1106. In operation, magnetic regions 1106 and 1108 are brought together; their mutual attraction provides a clamping force between temple interlock 1104 and temple 1102. The port of the acoustic cavity containing the speaker is denoted by 1110.
Another clamping method is shown at 1150. Temple interlock 1152 includes a slot 1158 between a first side 1156a and a second side 1156b of a flexible material. The geometry of 1158, 1156a, and 1156b forms a U-shape into which a temple of an eyewear apparatus can be inserted. The elasticity of the material of temple interlock 1152 provides a removable coupling between the temple interlock 1152 and the temple (not shown) of the eyeglasses. The acoustic port of the acoustic cavity housing the speaker is denoted by 1154.
Fig. 12 illustrates coupling a nape module assembly to electronics in a temple in accordance with an embodiment of the present invention, generally at 1200. Referring to fig. 12, the nape module assembly is coupled to electronics contained in the temple. A portion of the nape module assembly is shown with a nape electronics pod (ePOD)1220, a left smart cord 1222, and a left temple interlock 1210. As previously mentioned, any electronics contained in the temple arm may be directly contained in the temple arm without the need for a temple arm insert module (TIM). Alternatively, the electronics contained in the temple may be electronics that are part of a TIM, optionally as shown at 1204. In either case, a plurality of electrical contacts are provided on temple 1202, as shown at 1206. A corresponding number of electrical contacts 1208 are provided in the left temple interlock 1210. A mechanical interlock is provided between the temple 1202 and the left temple interlock 1210 to enable the connection between 1210 and 1202 to be removably coupled. In one or more embodiments, a magnetic coupling is provided near or at 1206/1208 to provide a detachable coupling.
Fig. 13 illustrates a schematic diagram, generally at 1300, of combining a nape module assembly with temple electronics, according to an embodiment of the present invention. Referring to FIG. 13, an outline of the eyeglass apparatus is shown at 1302. The outline 1302 includes a frame, a left temple, and a right temple; the eyewear apparatus 1302 includes electronics and/or electrical pathways in all three. In the system shown in the figure, electrical pathway 1308 extends between the left and right temples of eyewear device 1302.
The eyewear apparatus includes a left Temple Insert Module (TIM) 1304 located at the left temple and a right Temple Insert Module 1306 located at the right temple. A nape module assembly configured with an electronic unit (ePOD) is indicated at 1310. The left smart cord 1312 provides an electrical path between the ePOD 1310 and the left TIM 1304. The right smart cord 1314 provides an electrical path between the ePOD 1310 and the right TIM 1306. In various embodiments, both the left TIM 1304 and the right TIM 1306 are configured with one or more wireless communication network systems that allow wireless communication between the left TIM 1304 and the right TIM 1306, as depicted at 1316. Remote device 1320 represents one or more wireless sensors or wireless user devices, as described above in connection with the preceding figures. Wireless communication 1322 is accomplished between the remote device 1320 and at least one of the left TIM 1304, the right TIM 1306, and the ePOD 1310. The above description of all electronic system functions for a TIM applies to an ePOD, such as the ePOD 1310.
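Because the remote link 1322 may terminate at any of the left TIM, the right TIM, or the ePOD, some arbitration between the nodes is implied. One plausible, purely illustrative policy, not prescribed by the specification, is to assign the power-hungry remote link to whichever node has the most battery remaining:

```python
# Illustrative link arbitration for the topology at 1300; the policy and
# battery figures are assumptions, not part of the specification.
def pick_uplink_node(battery_pct: dict) -> str:
    """Return the node that should hold the power-hungry remote link."""
    return max(battery_pct, key=battery_pct.get)

print(pick_uplink_node({"left_tim": 40, "right_tim": 55, "epod": 80}))
```

Because all three nodes are electrically or wirelessly interconnected (via the smart cords and link 1316), whichever node wins arbitration can relay traffic on behalf of the others.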
In some embodiments, the left temple arm is not electrically connected to the right temple arm, such as where electrical path 1308 is removed from the electrical schematic shown in 1300.
FIG. 14 illustrates a user interface on a nape module assembly, generally at 1400, in accordance with an embodiment of the present invention. Referring to fig. 14, a nape electronics pod (ePOD) is indicated by 1402. The ePOD 1402 has a display interface 1404. In different embodiments, the display interface 1404 can be implemented in various ways. In some embodiments, the user interface is a tactile surface button. In some embodiments, the user interface is implemented using a touch screen, such as a capacitive touch screen that presents one or more controls to the user. In some embodiments, the user interface communicates information to a user. In other embodiments, the user interface communicates information to a person viewing the user interface 1404 from behind a user wearing the ePOD 1402. Examples of such information are, but are not limited to: emoticons, emotional states, icons, etc., as shown at 1406.
FIG. 15 illustrates a block diagram of a nape electronics unit, generally at 1500, in accordance with an embodiment of the present invention. Referring to fig. 15, as used in the description of the present embodiment, the nape electronics unit may be based on a device, such as a computer, in which embodiments of the present invention may be used. The block diagram is a high-level conceptual representation that may be implemented in various ways and with various architectures. The bus system 1502 interconnects a Central Processing Unit (CPU) 1504 (also referred to herein as a processor), a Read Only Memory (ROM) 1506, a Random Access Memory (RAM) 1508, a memory 1510, audio 1522, a user interface 1524, and communications 1530. The RAM 1508 may also represent Dynamic Random Access Memory (DRAM) or other forms of memory. In various embodiments, the user interface 1524 may be a voice interface, a touch interface, a physical button, or a combination thereof. It should be understood that a memory (not shown) may be included in the central processor block 1504. The bus system 1502 may be, for example, one or more of a system bus, Peripheral Component Interconnect (PCI), Advanced Graphics Port (AGP), Small Computer System Interface (SCSI), Institute of Electrical and Electronics Engineers (IEEE) 1394 (FireWire), Universal Serial Bus (USB), universal asynchronous receiver/transmitter (UART), Serial Peripheral Interface (SPI), Inter-Integrated Circuit (I2C), and the like. The central processor 1504 may be a single, multiple, or even a distributed computing resource. The memory 1510 may be a flash memory or the like. It should be noted that a nape electronics unit may include some, all, more, or a rearrangement of the components in the block diagram, depending on the actual implementation of the unit. Thus, many variations of the system of FIG. 15 are possible.
A connection to one or more wireless networks 1532 is obtained through communications (COMM) 1530, which enables the nape electronics unit 1500 to communicate wirelessly with local sensors, local devices, and remote devices on remote networks. In some embodiments, 1532/1530 provides access to a remote speech-to-text conversion system, which may be located, for example, at a remote cloud-based location. Blocks 1532 and 1530 are flexible in implementation; they represent wireless communication systems and may represent various forms of telemetry, General Packet Radio Service (GPRS), ethernet, Wide Area Network (WAN), Local Area Network (LAN), internet connection, Wi-Fi, WiMAX, ZigBee, infrared, bluetooth, near field communication, mobile phone communication systems such as 3G, 4G, LTE, 5G, etc., and combinations thereof. In various embodiments, a touch interface is optionally provided at 1524. An optional display is provided at 1520. Signals from one or more sensors are input to the system via 1529 and 1528. At 1526, Global Positioning System (GPS) information is received and input to the system. The audio 1522 may represent a speaker, such as a projection speaker or a projection micro-speaker as described herein.
In various embodiments, different wireless protocols are used in the network to provide the system described in the above figures, depending on the hardware configuration. One non-limiting example of a technique for wireless signal transmission is the bluetooth wireless technology standard, which is also commonly referred to as the IEEE 802.15.1 standard. In other embodiments, a wireless signaling protocol known as Wi-Fi is used, which uses the IEEE 802.11 standard. In other embodiments, a zigbee communication protocol based on the IEEE 802.15.4 standard is used. These examples are given for illustration only and do not constitute a limitation on the different embodiments. Transmission Control Protocol (TCP) and Internet Protocol (IP) are also used in different embodiments. Embodiments are not limited to the data communication protocols listed herein and are readily usable with other data communication protocols not specifically listed herein.
In various embodiments, the components in the system and the systems (e.g., the back neck electronics unit) described in the previous figures are implemented in an integrated circuit device, which may include an integrated circuit package containing an integrated circuit. In some embodiments, the components in the system and the system are implemented in a single integrated circuit die. In other embodiments, components in the system and the system are implemented in more than one integrated circuit die of an integrated circuit device, which may include a multi-chip package containing the integrated circuit.
In various embodiments, the description of the embodiments provided herein provides reconfigurable components for a head-wearable device. Reconfigurable components for head-wearable devices include, but are not limited to: removable temples, removable Temple Insert Modules (TIMs), a nape module assembly, an electronic pod (ePOD) for the nape module assembly, and a removable electronic unit for the ePOD.
FIG. 16 shows a schematic block diagram of a system architecture, generally at 1600, according to an embodiment of the invention. Referring to fig. 16, embodiments of the present invention are applied in a system architecture customized for a head-wearable device, such as, but not limited to, smart glasses or other eyewear or head-mounted devices. The system architecture described in connection with the following figures is used in conjunction with the reconfigurable components of the head-wearable device described above, including, but not limited to: removable temples, removable Temple Insert Modules (TIMs), a nape module assembly, an electronic pod (ePOD) for the nape module assembly, and a removable electronic unit for the ePOD, among others.
Referring to fig. 16, a Mobile Communication Unit (MCU) with a Digital Signal Processor (DSP), as shown in various embodiments, is shown at 1602. The system 1600, in various embodiments, includes one or more of the following sub-modules:
Central Processing Unit (CPU) + DSP chip 1634.
Voice wake-up chip 1608.
A USB-type (Universal Serial Bus) magnetic pogo pin connector 1610 for battery charging and signal paths to a computer or mobile device.
A multi-function button 1614.
A bi-color light source, such as Light Emitting Diodes (LEDs) (e.g., red and blue) 1620.
Stereo power amplifier 1630 to drive 2 speakers 1628.
Two microphones 1604/1606.
The sensors:
Touch sensor 1616.
Proximity sensor 1618.
6-axis sensor (3-axis accelerometer + 3-axis gyroscope) 1622.
3-axis magnetometer 1624.
In some embodiments, the core of the hardware architecture is a single chip containing a CPU, a digital signal processor, and a Bluetooth RF module. The CPU and DSP (1634) may be separate chips or integrated into a single chip. In some embodiments, the DSP is integrated into the CPU to reduce cost. Note that the list of sub-modules given above is for illustration only, and embodiments of the invention may contain more or fewer sub-modules than listed above. The interfaces shown herein, such as General Purpose Input/Output (GPIO), Inter-Integrated Circuit (I2C), Universal Asynchronous Receiver Transmitter (UART), etc., are shown by way of example and are not meant to limit embodiments of the present invention. Embodiments of the present invention are readily implemented with different interface and bus standards.
The tasks of the CPU include task scheduling, GPIO control, sensor control with data acquisition and processing, Bluetooth Radio Frequency (RF) management, and power management. In various embodiments, the DSP handles signal processing tasks such as noise cancellation, echo cancellation, crosstalk correction, and other time-consuming signal processing algorithms. A Bluetooth (BT) radio module 1632 handles the BT wireless communication protocol. In some embodiments, a cellular communication module is added in addition to Bluetooth module 1632 to provide direct cellular telephone communication from head-wearable embedded system 1600. In other embodiments, Bluetooth module 1632 is replaced with a cellular communication module. Thus, system 1600 may have many different wireless communication configurations. The system 1600 may be configured with various sensors, such as those listed below. One of ordinary skill in the art will recognize that the system 1600 may be configured with more or fewer sensors than those listed below:
LED 1620 — via GPIO output pin.
Button 1614 — via GPIO input pin.
Touch sensor 1616 — via GPIO input and output pins.
Voice wake-up chip 1608-via GPIO and UART.
Proximity sensor 1618-via the I2C bus.
Microphone 1604/1606& speaker 1628 via analog-to-digital converter (ADC)1640 and digital-to-analog converter (DAC) 1636.
6+3 axis sensors-via the I2C bus.
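The interface assignments listed above can be collected into a configuration table. The sketch below uses the reference designators from the figure; the dictionary key names and the helper function are illustrative only, not part of the patent:

```python
# Hypothetical summary of the interface assignments listed above.
# Reference designators (1620, 1614, ...) follow FIG. 16; bus names
# (GPIO, UART, I2C, ADC/DAC) are as described in the text.
PERIPHERAL_BUS_MAP = {
    "led_1620": ("GPIO", "output"),
    "button_1614": ("GPIO", "input"),
    "touch_sensor_1616": ("GPIO", "input/output"),
    "voice_wake_chip_1608": ("GPIO+UART", "bidirectional"),
    "proximity_sensor_1618": ("I2C", "bidirectional"),
    "microphones_1604_1606": ("ADC_1640", "input"),
    "speakers_1628": ("DAC_1636", "output"),
    "imu_6_axis_plus_magnetometer": ("I2C", "bidirectional"),
}

def peripherals_on_bus(bus_prefix):
    """Return the peripherals attached to a given bus (matched by name prefix)."""
    return sorted(name for name, (bus, _direction) in PERIPHERAL_BUS_MAP.items()
                  if bus.startswith(bus_prefix))
```

Such a table makes it easy to audit which devices share a bus, e.g. that both the proximity sensor and the inertial sensors sit on the I2C bus.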
FIG. 17 shows a schematic block diagram for wake-up control according to an embodiment of the present invention. Referring to FIG. 17, the system 1702 enters a sleep mode 1704 to conserve battery power. A voice wake-up chip, e.g., 1608, is used to detect a wake-up word 1710 (e.g., "SOLOS" spoken by a user and received on a microphone, e.g., 1604). If a wake word 1710 is detected at 1608, the CPU/system is powered on and transitioned to a "run" state as shown at 1706.
The embedded speech recognition system 1608 is utilized in conjunction with the systems 1700, 1702, or 1600 shown in fig. 16 and 17. Examples of embedded speech recognition systems used in embodiments of the present invention include, but are not limited to, embedded speech recognition systems from NUANCE, such as VoCon Hybrid, or embedded speech recognition systems from other manufacturers. The data sheet for NUANCE VoCon Hybrid is included herein as Appendix 1.
Referring also to fig. 16-17, in operation, when idle, the system 1702 is in a "sleep" mode, whereas the DSP VoCon 1608 is always in an "on" state, so it can detect the wake-up word 1710 via input from the connected microphone 1604. When the DSP VoCon system 1608 detects the wake word 1710, a wake control signal 1714 is sent to change the system 1702 from the "sleep" mode 1704 to the "run" mode 1706.
The embedded speech recognition system 1608 is used to process the wake word 1710 and also to command and control the head wearable device according to commands extracted from the user's speech.
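The sleep/run wake-word flow described in connection with FIG. 17 can be sketched as a tiny state machine. The class and method names below are illustrative, not taken from the patent; "SOLOS" is the example wake word from the text:

```python
SLEEP, RUN = "sleep", "run"

class WakeController:
    """Minimal sketch of the FIG. 17 flow: the wake-word detector
    (e.g., 1608) stays on while the main system sleeps, and a detected
    wake word transitions the system to the 'run' state."""
    WAKE_WORD = "solos"  # example wake word from the text

    def __init__(self):
        self.state = SLEEP

    def on_audio(self, word):
        # The detector listens even while the system 1702 is asleep.
        if self.state == SLEEP and word.lower() == self.WAKE_WORD:
            self.state = RUN  # corresponds to wake control signal 1714
        return self.state
```

Once in the run state, further audio would be handled by the full embedded speech recognizer rather than the wake-word detector.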
In various embodiments, the components in the systems shown in fig. 16 and/or 17 are implemented in an integrated circuit device, which may include: an integrated circuit package containing an integrated circuit. In some embodiments, the components in the system are implemented in a single integrated circuit die. In other embodiments, the components in the system are implemented in more than one integrated circuit die of an integrated circuit device, which may include: a multi-chip package containing the integrated circuit.
In various embodiments, a six (6) axis sensor, shown at 1624, and a three (3) axis sensor, shown at 1622, are used to perform navigation, tracking, and electronic compass functions. In one embodiment, for illustration only and not meant to be limiting as such, the sensors collect data (at a rate of 10 samples/second) for the following nine (9) axes, which can be used by the Mobile Communication Unit (MCU) 1602 to calculate the motion of the user's head. Data for the nine (9) axes includes, but is not limited to, acceleration data from accelerometers measured along the X, Y, and Z axes; gyroscope data from the X, Y, and Z axes; and magnetometer data from the X, Y, and Z axes. It will be understood by those skilled in the art that the X, Y, and Z axes represent an orthogonal coordinate system. In some embodiments, these sensor data are used to track the motion trajectory of a user wearing the head wearable device.
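The 10-samples/second collection described above can be illustrated with a minimal gyroscope-integration sketch. The function name and the use of degrees/second are assumptions; a real head-motion tracker would additionally fuse the accelerometer and magnetometer axes to correct gyroscope drift:

```python
SAMPLE_RATE_HZ = 10          # 10 samples/second, per the example above
DT = 1.0 / SAMPLE_RATE_HZ    # time step between samples

def integrate_gyro(gyro_samples_dps):
    """Integrate per-axis angular rate (deg/s) into a cumulative head
    rotation estimate (deg) about the X, Y, and Z axes."""
    angle = [0.0, 0.0, 0.0]
    for gx, gy, gz in gyro_samples_dps:
        angle[0] += gx * DT
        angle[1] += gy * DT
        angle[2] += gz * DT
    return angle
```

For example, one second of samples at a constant 30 deg/s yaw rate integrates to roughly a 30-degree head turn about the Z axis.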
FIG. 18 illustrates a state diagram generally at 1800 for button operations according to an embodiment of the invention. The buttons, such as 1614 in FIG. 16, are designed to support multi-use functionality. In one or more embodiments, the functionality supported by a single button, touch slider, and proximity sensor is illustrated by way of example. Buttons, touch sensors, and proximity sensors may provide other functions. A different number of buttons, sliders, and sensors may also be provided in various embodiments to provide the desired functionality for a given head wearable device. The functional and state diagrams depicted herein are provided as examples only and are not meant to limit embodiments of the present invention. The power on/off function 1802/1804 is as follows:
a) the power "on" state is entered 1802 by pressing the "power button" for a short predetermined time (two (2) seconds in one or more embodiments) until the "on" indicator light is activated. In one or more embodiments, the "on" indicator light is a blue light. In the "on" state 1802, the user may ask the system, via voice command, what the battery power level is. For example, the system may be configured to return battery power levels quantized to different granularities. One example is to quantify three (3) battery charge levels, i.e., low, medium, high. Other quantifications are possible and this example of using three battery charge levels is given for illustration only and is not meant to be limiting. Alternatively, the system may be configured to notify the user of the current battery power level via a machine-generated audible voice message.
b) The power "off" state is entered at 1804 by long-pressing the "power button" for a predetermined time (three (3) seconds in one or more embodiments) until the "off" indicator light is activated. In one or more embodiments, the "off" indicator light is a red light. The system may be configured to notify the user that the system is powering down to the "off" state via a machine-generated audible voice message.
The pairing/unpairing function 1806/1807 with a device such as a mobile phone is as follows:
i. Pressing the "power button" for longer than the time required to enter the off state causes the system to "pair" with the mobile device. Given here for purposes of example, five (5) seconds is a suitable time to configure the system for "pairing," noting that 5 seconds is longer than the time (three (3) seconds) required to bring the system to an off state. Thus, in operation, the user presses the "power button" until the blue and red lights flash alternately, which pairs the system with the mobile device, as shown at 1806. After pairing, the system plays a machine-generated voice prompt to tell the user that pairing was successful. At 1807, the system may be changed to an "unpaired" state by pressing the power button for longer than is required for pairing; for example, a press of eight (8) seconds may place the system in the "unpaired" state. Different times may be selected, and the predetermined times given herein are provided by way of example only and do not constitute a limitation on embodiments of the invention.
After pairing is successful, the user can play music at 1808 and make phone calls at 1810 using a wireless connection between the head wearable device and a mobile phone or MCU (generally referred to as a device). If the connection fails, the head wearable device, such as smart glasses, will automatically turn off after a preset time. In one embodiment, the preset time is, illustratively, three (3) minutes.
When music is played through the smart glasses at 1808, a short press of the button advances to the next song at 1812. In one embodiment, for illustration only and not meant to be limiting as such, the short button press for advancing to the next song may be required to last less than two (2) seconds.
When there is an incoming call during music playing or when in an idle state, a short press of the button for less than a predetermined time will answer the call at 1814. For purposes of illustration, and not meant to be limiting as such, the predetermined time for a short button press is less than 2 seconds. If the user presses the button for more than the predetermined time, the call is rejected at 1816.
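The press-duration behavior described above can be sketched as a simple duration classifier. The thresholds below are the example values from the text (2 s power toggle, 5 s pair, 8 s unpair); the function and state names are hypothetical, and short-press meaning (next song versus answer call) depends on the current state as described above:

```python
def classify_power_button_hold(seconds):
    """Classify a power-button hold by its duration, using the example
    thresholds from the text. All thresholds are illustrative and may
    differ per embodiment."""
    if seconds >= 8:
        return "unpair"        # state 1807
    if seconds >= 5:
        return "pair"          # state 1806: blue and red lights flash
    if seconds >= 2:
        return "power_toggle"  # power on 1802 / power off 1804
    return "short_press"       # next song 1812 or answer call 1814
```

A firmware implementation would typically measure the hold time with a timer started on button-down and classify on button-up.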
FIG. 19 illustrates a state diagram generally at 1900 illustrating touch sensor operation in accordance with an embodiment of the invention. In one or more embodiments, the touch sensor is configured to have concurrent "slider" and "click" functionality. One example of a touch sensor is 1616 in fig. 16.
In one or more embodiments, by way of example and not by way of limitation, a touch sensor supports two functions: touch slider operation, and single- and double-click operations. In other embodiments, the touch sensor may be configured to support more or fewer functions than those described above. Referring to fig. 19, a state diagram corresponding to touch sensor operation is shown. Volume control by the slider 1930 is provided on the eyeglass device 1940. The slider 1930 is configured to receive a "swipe" input that is registered by the user sliding a finger along the sensor area. In this example, the volume control has eight (8) levels (level 1 … level 8). A "quick slide" of the touch slider increases or decreases the volume by one step according to the direction of the slide. For example, if the current volume is at level 3, sliding 1952/1954 quickly forward increases the volume 1904, changing it from level 3 to level 4, where 1 represents the minimum volume and 8 represents the maximum volume. Likewise, a quick backward slide from level 3 decreases the volume 1906, changing it from level 3 to level 2.
A "slow slide" applied by the user to the touch slider continuously increases or decreases the volume. For example, if the current volume is at level 3, slowly sliding forward to the end of the touch sensor area increases the volume 1904, raising it to level 8, the maximum. Sliding forward slowly to the middle of the touch sensor area increases the volume 1904 to an intermediate level, e.g., level 4.
If the volume is at level 4, sliding 1962/1964 slowly backward to the end of the touch sensor area decreases the volume 1906, changing it from level 4 to level 1.
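The slider behavior above can be sketched as two small functions. The function names are illustrative; the position-to-level mapping for a slow slide (finger end position mapped onto the full level range) is an assumption consistent with the "slide to the end → maximum" example:

```python
MIN_LEVEL, MAX_LEVEL = 1, 8   # eight volume levels, as in the example

def quick_slide(level, forward):
    """A quick slide changes the volume by one step in the slide direction,
    clamped to the valid range."""
    step = 1 if forward else -1
    return max(MIN_LEVEL, min(MAX_LEVEL, level + step))

def slow_slide(fraction_of_sensor):
    """A slow slide sets the volume by the finger's end position along the
    sensor area (0.0 = rear end, 1.0 = front end), regardless of the
    current level."""
    target = MIN_LEVEL + round(fraction_of_sensor * (MAX_LEVEL - MIN_LEVEL))
    return max(MIN_LEVEL, min(MAX_LEVEL, target))
```

Clamping at both ends means repeated forward quick slides at level 8 leave the volume unchanged.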
From the incoming call state 1920, sliding 1952/1954 forward results in the incoming call being answered at 1924. From the incoming call state 1920, sliding 1962/1964 backward results in rejection of the incoming call at 1922.
A double click by the user on the touch slider area 1930 of the smart glasses 1940 is used to activate or deactivate the wake-up chip. In one or more embodiments, if the voice wake-up is in the "off" state, double-clicking the touch slider area changes the voice wake-up to the "on" state 1910. Similarly, if the voice wake-up is in the "on" state 1910, double-clicking the touch slider area changes the voice wake-up to the "off" state 1902.
In various embodiments, a single tap is used to play music by changing the system to music "play" state 1902 or to music "pause" state 1908, where music play is paused while awaiting further input from the user. While in the "pause" state, a subsequent click by the user returns control to the "play" state and resumes music play at 1902. Additional state changes are made by a single click thereafter to transition back and forth between "play" and "pause" as desired.
Fig. 20 illustrates a state diagram, generally at 2000, of proximity sensor operation, in accordance with an embodiment of the present invention. In various embodiments, a proximity sensor (e.g., 1618 in fig. 16) is used to detect whether the user is wearing the head-wearable device or has taken off the head-wearable device (glasses). In one example, for purposes of illustration only and not meant to be limiting as such, the logic for sensor operation is as follows. If the output of the proximity sensor is "1", the user is wearing the glasses 2002. If the output of the proximity sensor is "0", the user has removed the glasses 2004.
The logic is configured to "turn off" the music playback when the user takes the glasses off the user's head. For example, if the user has worn the glasses 2002, the music will be controlled to "play" at 2006 by any of the sensors described above that control the music playing function. If the user takes off the glasses 2004, the proximity sensor will output "0", when the "0" output continues for more than a predetermined time, the music will "stop" and the system will enter a "pause" state 2008 for music playing. In one or more embodiments, the predetermined time required for music to stop playing is five (5) seconds. If the user puts on the glasses again, the music will revert to the "play" state 2006, in which case the output of the proximity sensor becomes "1".
If the glasses are off the user's head for more than a predetermined time, the system is powered down to an "off" state at 2010. In one or more embodiments, the predetermined time required to shut down the system when the user is not wearing the glasses is ten (10) seconds or more. One of ordinary skill in the art will recognize that the predetermined times given above are examples, and that different predetermined times may be used in various embodiments. The time chosen in the examples given herein is not meant to be limiting.
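The proximity logic above can be sketched as a small decision function. The 5-second and 10-second thresholds are the example values from the text; the function and state names are illustrative:

```python
PAUSE_AFTER_S = 5       # music pauses after 5 s off-head (example value)
POWER_OFF_AFTER_S = 10  # system powers off after 10 s off-head (example value)

def system_state(worn, seconds_off_head):
    """Sketch of the FIG. 20 logic: proximity output '1' means the glasses
    are worn; a sustained '0' first pauses music, then powers the system
    down once the longer threshold is exceeded."""
    if worn:
        return "play"                      # state 2006
    if seconds_off_head >= POWER_OFF_AFTER_S:
        return "off"                       # state 2010
    if seconds_off_head >= PAUSE_AFTER_S:
        return "pause"                     # state 2008
    return "play"                          # brief removal is ignored
```

Requiring the "0" output to persist before acting debounces momentary sensor dropouts, e.g. when the user adjusts the glasses.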
Fig. 21A to 21D show positions of the touch sensor and the multifunction button. Referring to fig. 21A, the eyeglass apparatus is shown in perspective view in a rearward direction by 2102. The multi-function button 2106 is shown positioned on the underside of the right temple 2108. Touch sensor area 2104 is shown located on the outer surface of right temple 2108. Both the multi-function button 2106 and the touch sensor area 2104 provide the functionality described above in connection with the figures.
Referring to fig. 21B, the eyeglass apparatus 2252 is shown in a front perspective view at 2200 and in a side view at 2275. A multi-function button 2276 is shown positioned on the underside of right temple 2278 of eyewear device 2252. Touch sensor area 2254 is shown positioned on an outer surface of right temple 2278. Both the multifunction button 2276 and the touch sensor area 2254 provide the functionality described above in connection with the figures.
Referring to fig. 21C, an eyeglass apparatus is shown in side view by 2300. Touch sensor area 2304 is shown as a rectangular area on the outside of right temple 2378. It is noted that the touch sensor region 2304 can be shapes other than rectangular; the rectangle 2304 is for illustration only and is not meant to be limiting. The user slides the finger 2306 forward or backward to control the volume, and initiates clicks, as described above in connection with previous figures.
Referring to fig. 21D, an eyeglass apparatus 2402 is shown in side view by 2400. Multi-function button 2404 is shown on the underside of right temple 2478. The user presses button 2404 with finger 2406 to control functions in the eyewear system in 2402 as described above in connection with the previous figures.
It is noted that the multi-function buttons and/or touch sensor areas may be located elsewhere on the head wearable device, such as, for example, on an outer surface of the temple arm, on a top surface of the temple arm, or on the left temple arm.
For example, in one or more embodiments, the multi-function button is located on a top surface of the right temple arm. In use, the user grasps the right temple with two fingers, one finger resting on the bottom surface of the temple and the other pressing against the top surface of the temple. For example, in an application setting, a user places the thumb of the right hand against the underside of the right temple and the middle finger of the right hand on the top surface of the right temple, thereby grasping the right temple. The user can then operate the multi-function button with the index finger of the right hand. A multi-function button arranged in this way is positioned behind the plane of the front frame so that the button is aligned with, and operable by, the user's index finger when the temple arm is grasped as described above.
In various embodiments, this arrangement of the multi-function button is easier to operate while the user is performing an activity such as cycling.
In some embodiments, raised, recessed, or otherwise shaped alignment marks are formed in the temple arms to serve as alignment positions for the user's one or more fingers relative to the position of the multi-function buttons. Placing the multi-function button at a specified distance from the alignment position allows the user to quickly find the multi-function button when the eyeglasses are worn on the user's head. In some embodiments, alignment is provided when a user grasps the temple at its connection to the front frame with the thumb and middle finger.
While the multi-function button is shown on the right temple, the multi-function button may also be located on the left temple of the eyewear apparatus. A left-handed user may prefer the multi-function button and touch sensor to be located on the left temple, while a right-handed user may prefer them on the right temple. Thus, embodiments of the present invention may be configured with either button/temple arrangement.
Gesture detection
In various embodiments, the hardware architecture includes the touch sensor and multi-axis motion sensor described above. In some embodiments, a nine (9) axis motion sensor is used. Sensor data is used to detect various head gestures. Once a head gesture is detected, the system takes the corresponding action. In one example, when a call comes in, the user may nod his or her head up and down to answer the call. Similarly, shaking the head from left to right or right to left is understood by the system as rejecting the call. For example, with a phone call coming in, the user may place his or her finger on the touch sensor and then nod his or her head to answer the call, or place a finger on the touch sensor and shake his or her head to reject the call.
In some embodiments, a multi-axis sensor is used for gesture detection of a user. In various embodiments, the sensors collect accelerometer data, gyroscope data, and magnetometer data, which are then transmitted to the system for processing using software algorithms running on a Central Processing Unit (CPU), DSP, etc., as described in connection with the above figures. In some embodiments, the sensor is configured with three orthogonal axes. The data is processed using one or more of the following: software algorithms, a CPU, and a DSP, to detect the pose of the user's head. In various embodiments, when the user's head is held in a poor posture for an extended time, a voice message is generated and broadcast to the user through a speaker. Such communication allows the user to take corrective action and improve posture.
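A toy classifier illustrating the nod/shake distinction described above is sketched below. The axis assignments (nod as rotation about the side-to-side pitch axis, shake as rotation about the vertical yaw axis) and the energy threshold are assumptions for illustration, not details taken from the patent:

```python
def detect_head_gesture(gyro_samples_dps, threshold_dps=60.0):
    """Classify a window of gyroscope samples (deg/s, as (x, y, z) tuples)
    as a nod, a shake, or no gesture. Assumes X is the pitch axis and
    Z is the yaw axis; a production detector would use pattern matching
    over time rather than a simple energy comparison."""
    pitch_energy = sum(abs(gx) for gx, _gy, _gz in gyro_samples_dps)
    yaw_energy = sum(abs(gz) for _gx, _gy, gz in gyro_samples_dps)
    if max(pitch_energy, yaw_energy) < threshold_dps:
        return "none"   # too little motion to count as a gesture
    return "nod" if pitch_energy > yaw_energy else "shake"
```

Combined with the touch sensor as described above, the gesture result could then answer or reject an incoming call.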
Audio content
As described above, in various embodiments, the head-wearable device is used in conjunction with a mobile device to facilitate a phone call with a system configured in the head-wearable device (eyewear device). Content, such as music, may be streamed to the head wearable device through the mobile device. The content stream may also originate from the "cloud," i.e., the internet or local area network, and be streamed to the head wearable device. Further, a local storage device configured with the system (fig. 16) or on the head-wearable device for use by the system (fig. 16) may be used to provide a source of content that is played for the user through a speaker in conjunction with the head-wearable device. Thus, in various embodiments, content is played for a user through a head wearable device in conjunction with a mobile device or in a stand-alone configuration without a mobile device.
Answering telephone
Various ways of answering a call may be used in various embodiments. Telephone calls can be answered and/or terminated using a voice interface that uses a local speech recognition system to receive calls using control words such as "answer" and control words such as "goodbye" to end the call. Alternatively, the phone may be answered using one or more physical sensors such as "touch sensors" and/or "buttons". Alternatively, the phone may be answered by analyzing the head pose using data output from an accelerometer, gyroscope, or the like. It is noted that a combination of one or more of the above (i.e., speech recognition, sensor output, and gesture recognition) can be combined to answer a telephone call. Similarly, one or more of the above may be combined to provide a user with the selection and/or playback of audio content through a system incorporated into the head wearable device.
Command and control
System control — the system is woken from a sleep state using a wake-up word, e.g., the wake-up word "SOLOS". Content control — "play running music," "skip song," "turn up volume," "turn down volume," and so on. Telephone control — for example: "call 'name'" (where a telephone number corresponding to the name may be selected, for example, from an address book), "turn up volume," and "turn down volume." Information control — for example: "internet browsing," checking the "temperature," checking the "weather," "navigation," etc. These examples are provided for illustration only and do not constitute a limitation on embodiments of the invention.
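The command categories above might be organized as a dispatch table mapping recognized utterances to control categories and actions. All phrases are taken from the examples above, but the table structure and handler names are illustrative assumptions:

```python
# Hypothetical mapping from recognized utterances to the control
# categories described above; handler names are illustrative only.
COMMAND_TABLE = {
    "play running music": ("content", "play_playlist"),
    "skip song": ("content", "next_track"),
    "turn up volume": ("content", "volume_up"),
    "turn down volume": ("content", "volume_down"),
    "navigation": ("information", "start_navigation"),
    "weather": ("information", "read_weather"),
}

def dispatch(utterance):
    """Route a recognized utterance to its (category, action) pair."""
    return COMMAND_TABLE.get(utterance.lower().strip(), ("unknown", None))
```

An unrecognized utterance falls through to an "unknown" category, which a real system might answer with a voice prompt asking the user to repeat the command.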
Magnetometer
In various embodiments, the magnetometer is incorporated into a head wearable device. In some embodiments, the magnetometer is a three-axis magnetometer. A magnetometer in the head wearable device is used for navigation to determine the orientation of the user with respect to the earth's magnetic field. A magnetometer mounted in a head-wearable device, when worn by a user, has a pointing direction fixed relative to the user's head and consistent with the user's motion. Thus, the output from the head-wearable device's magnetometer provides a more useful signal than that of a magnetometer incorporated into the user's mobile phone, since the mobile phone is not necessarily aligned with the direction in which the user is facing.
In various embodiments, for example, the output of the magnetometer is applied in an application that displays the user's direction on a map and rotates the digital map relative to north as the user changes direction.
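A minimal sketch of the heading computation such an application might use is shown below. It assumes the device is held level (only the horizontal magnetometer components are used); a full implementation would tilt-compensate using the accelerometer. Function names are illustrative:

```python
import math

def heading_degrees(mx, my):
    """Compass heading in [0, 360) degrees (0 = magnetic north) from the
    horizontal magnetometer components, assuming a level device."""
    return math.degrees(math.atan2(my, mx)) % 360.0

def map_rotation(heading):
    """Rotate the digital map opposite to the user's heading so the map
    stays aligned with the direction the user faces."""
    return (-heading) % 360.0
```

For example, a user facing 90 degrees east would see the map rotated 270 degrees so that east on the map points "up" on the display.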
Accelerometer
In various embodiments, one or more accelerometers are provided in a head-wearable device. In some embodiments, a three-axis accelerometer is provided.
Gyroscope
In various embodiments, one or more gyroscopes are provided in a head wearable device. In some embodiments, a three-axis gyroscope is provided.
Batteries housed in one or more temples
In various embodiments, one or more batteries are provided to power the system, and the one or more batteries are housed within the space of one or more temples. The batteries are constructed with chemistries such as lithium-ion, or other battery chemistries, to support longer life and allow multiple charge cycles over the life of the battery. Depending on the anticipated power requirements of the head wearable device, temples of several different sizes may be provided for different applications. For example, some athletic activities may require six hours or more of "on" time from the system; in this case, a larger temple houses a longer-life battery. A smaller temple houses a smaller battery having a shorter useful life between charging cycles. The head wearable device is configured for various uses, such as athletic activity, business use, and home use. Some non-limiting examples of athletic activities are cycling, running, skiing, rowing, hiking, and the like.
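The battery-sizing trade-off above reduces to a back-of-the-envelope runtime estimate: capacity divided by average current draw. The figures in the usage comment are hypothetical, not taken from the patent:

```python
def runtime_hours(capacity_mah, avg_draw_ma):
    """Rough battery-life estimate: rated capacity (mAh) divided by the
    average current draw (mA). Ignores conversion losses, temperature
    effects, and cell aging, all of which reduce real runtime."""
    return capacity_mah / avg_draw_ma

# e.g., a sports-oriented temple needing >= 6 h of "on" time:
# a hypothetical 120 mAh cell at a 20 mA average draw gives 6.0 h.
```

In practice a designer would also derate the cell (e.g., plan for 80% of rated capacity) before choosing the temple size.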
System distribution throughout head wearable device
In one or more embodiments, the electronics systems of the head-wearable device are distributed among the left temple, the front frame, and the right temple. In one or more embodiments, the left temple arm houses a battery, one or more microphones, and at least one speaker. The right temple arm houses a battery, system electronics, one or more microphones, and at least one speaker. In some embodiments, the electrical connection between the system components (temple and front frame) is provided in the form of a detachable connector. In some embodiments, the connectors may be hinged.
For the purposes of discussion and understanding of the various embodiments, it is understood that various terms are used by those skilled in the art to describe techniques and methods. Furthermore, in the description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments. It will be apparent, however, to one skilled in the art that the embodiments may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the various embodiments. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the present invention.
Some portions of the description may be presented in terms of algorithms and symbolic representations of operations on data bits within, for example, a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of acts leading to a desired result. The acts are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, may refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
An apparatus for performing the operations herein may implement the present invention. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. The computer program may be stored on a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, hard disks, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), Random Access Memories (RAMs), Dynamic Random Access Memories (DRAMs), electrically programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), flash memory, magnetic or optical cards, RAID, or any type of media suitable for storing electronic instructions local to or remote from a computer.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the required method. For example, any of the methods according to the embodiments may be implemented by hard-wired circuitry, by programming a general-purpose processor, or by any combination of hardware and software. Those skilled in the art will appreciate that embodiments may be practiced with computer system configurations other than those described, including: handheld devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, Digital Signal Processing (DSP) devices, set top boxes, network personal computers, minicomputers, mainframe computers, and the like. The embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
The methods herein may be implemented using computer software. If written in a programming language conforming to a recognized standard, sequences of instructions designed to implement the methods can be compiled for execution on a variety of hardware platforms and for interface to a variety of operating systems. In addition, embodiments are not described with reference to any particular programming language. It should be appreciated that a variety of programming languages may be used to implement the embodiments described herein. Additionally, it is common in the art to speak of software, in one form or another (e.g., program, procedure, application, driver, etc.), as taking an action or causing a result. Such expressions are merely a shorthand way of saying that execution of the software by a computer causes the processor of the computer to perform an action or produce a result.
It is to be understood that various terms and techniques are used by those skilled in the art to describe communications, protocols, applications, implementations, mechanisms, and the like. One such technique is to describe an implementation in terms of an algorithm or mathematical expression. That is, while a technique may be implemented as, for example, executing code on a computer, it may be more aptly and succinctly conveyed as a formula, algorithm, or mathematical expression. Thus, those of ordinary skill in the art will recognize that a block implementing the addition function A + B = C, in hardware and/or software, takes two inputs (A and B) and produces one summed output (C). Thus, the use of a formula, algorithm, or mathematical expression as a description is to be understood as having a physical embodiment of at least hardware and/or software (e.g., a computer system in which the techniques of the present invention may be practiced and realized as embodiments).
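For illustration only (not part of the patent text), the addition block described above can be sketched as a software function; the name `addition_block` is purely illustrative:

```python
def addition_block(a, b):
    """A software realization of the A + B = C addition block:
    two inputs (a, b), one summed output (c). In hardware, the
    same block would be an adder circuit."""
    c = a + b
    return c

print(addition_block(2, 3))  # 5
```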
A non-transitory machine-readable medium is understood to include any mechanism for storing information (e.g., program code, etc.) in a form readable by a machine (e.g., a computer). For example, a machine-readable medium, synonymously referred to as a computer-readable medium, includes read-only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical, or other forms of information storage other than a propagated signal (e.g., a carrier wave, infrared signal, digital signal, etc.); and the like.
As used in this specification, the phrase "one embodiment" or "an embodiment" or similar language means that the feature being described is included in at least one embodiment of the present invention. References to "one embodiment" in this description do not necessarily refer to the same embodiment; however, the embodiments are not mutually exclusive. Nor does "one embodiment" imply that there is only one embodiment of the invention. For example, features, structures, acts, etc. described in connection with one embodiment may be included in other embodiments. Thus, the invention may include various combinations and/or integrations of the embodiments described herein.
While the invention has been described in terms of several embodiments, those skilled in the art will recognize that the invention is not limited to the embodiments described, but can be practiced with modification and alteration within the spirit and scope of the appended claims. The description is thus to be regarded as illustrative instead of limiting.
Appendix 1
[The tables of Appendix 1 are reproduced in the original publication as images: Figure GDA0003526204770000261 and Figure GDA0003526204770000271.]

Claims (24)

1. A reconfigurable assembly for use in an eyewear apparatus, comprising:
an embedded electronic system configured for wireless communication and processing of sensor signals, the embedded electronic system being embedded in a component of the eyewear apparatus; and
a plurality of sensors embedded in one or more of the reconfigurable component and the eyewear apparatus, the plurality of sensors in electrical communication with the embedded electronic system; the embedded electronic system further comprises:
a processor configured to:
receiving outputs from the plurality of sensors; and
determining a system control parameter based on an output of at least one of the plurality of sensors.
2. The reconfigurable assembly of claim 1, wherein the processor extracts user head motion data from the plurality of sensors, the user head motion data being used by the system as system control parameters.
3. The reconfigurable assembly of claim 2, wherein the system control parameters are used to answer an incoming call.
4. The reconfigurable assembly of claim 2, wherein the system control parameters are used to reject incoming calls.
5. The reconfigurable assembly of claim 2, wherein the system control parameters are used to control music playback.
6. The reconfigurable assembly according to claim 2, wherein at least one of the plurality of sensors is an accelerometer, at least one is a gyroscope, and at least one is a magnetometer.
7. The reconfigurable assembly of claim 6, wherein the plurality of sensors further comprises:
a sensor configured to measure acceleration along three mutually orthogonal axes x, y and z;
a sensor configured for gyroscope output along three mutually orthogonal axes x, y and z; and
a sensor configured for magnetometer output along three mutually orthogonal axes x, y and z.
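For illustration only: the nine-axis sensing of claim 7 (acceleration, angular rate, and magnetic field, each along mutually orthogonal x, y, and z axes) can be modeled as a simple data record. The class and field names are illustrative, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    """One nine-axis reading: acceleration (m/s^2), angular rate
    (rad/s), and magnetic field (uT), each along mutually
    orthogonal x, y, z axes."""
    accel: tuple  # (ax, ay, az)
    gyro: tuple   # (gx, gy, gz)
    mag: tuple    # (mx, my, mz)

    def axes_count(self):
        # Three orthogonal axes per sensor, three sensors.
        return len(self.accel) + len(self.gyro) + len(self.mag)

# A device at rest: gravity on z, no rotation, some ambient field.
sample = ImuSample(accel=(0.0, 0.0, 9.81),
                   gyro=(0.0, 0.0, 0.0),
                   mag=(22.0, 5.0, -40.0))
```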
8. The reconfigurable assembly of claim 1, wherein the embedded electronic system further comprises:
an embedded speech recognition system configured to receive an audio signal from a microphone and to change the embedded electronic system to an operational state when a wake-up word is detected.
9. The reconfigurable assembly of claim 8, wherein the embedded speech recognition system is configured to receive audio signals from the microphone and the processor to facilitate wireless voice communication when a command is recognized by the embedded speech recognition system.
10. The reconfigurable assembly of claim 8, wherein the embedded speech recognition system is configured to receive audio signals from the microphone to facilitate content control when a command is recognized by the embedded speech recognition system.
11. The reconfigurable assembly of claim 8, wherein the embedded speech recognition system is configured to receive audio signals from the microphone to facilitate information control when a command is recognized by the embedded speech recognition system.
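For illustration only: the wake-word behaviour of claims 8-11 (standby until the wake word is detected, then routing recognized commands to voice communication, content control, or information control) could be sketched as a small state machine. The wake word, command names, and state names are assumptions:

```python
class WakeWordGate:
    """Minimal sketch of a wake-word gated speech front end.

    In standby, all audio is ignored until the wake word arrives
    (claim 8: change to an operational state). Once operational,
    recognized commands are dispatched (claims 9-11)."""

    def __init__(self, wake_word="hello"):
        self.wake_word = wake_word
        self.state = "standby"

    def on_audio(self, recognized_text):
        if self.state == "standby":
            if recognized_text == self.wake_word:
                self.state = "operational"
            return None
        commands = {"call": "voice_communication",
                    "play": "content_control",
                    "weather": "information_control"}
        return commands.get(recognized_text)
```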
12. A reconfigurable assembly for use in an eyewear apparatus, comprising:
a temple insert module removably coupled with an interface of a temple, the temple configured for use with the eyewear apparatus, the temple insert module further comprising:
an embedded electronic system configured for wireless communication and processing of sensor signals, the embedded electronic system being embedded in a component of an eyewear device; and
a plurality of sensors embedded in one or more of the temple insert module, the temple, and the eyewear apparatus, the plurality of sensors in electrical communication with the embedded electronic system, the embedded electronic system further comprising:
a processor configured to:
receiving outputs from the plurality of sensors; and
determining a system control parameter based on an output of at least one of the plurality of sensors.
13. The reconfigurable assembly of claim 12, further comprising:
a multi-function button, wherein the processor is configured to receive a signal from the multi-function button, the processor configured to determine a length of time that a user pressed the multi-function button, the length of time being used by the processor to begin one or more of:
a.) pairing with a mobile device;
b.) unpairing from the mobile device;
c.) playing content;
d.) selecting the next song;
e.) making a telephone call;
f.) answering a telephone call; and
g.) rejecting a telephone call.
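For illustration only: claim 13 maps the length of a multi-function-button press to an action. The patent specifies only that press duration selects among the actions; the time bands below are illustrative assumptions:

```python
def button_action(press_seconds):
    """Map multi-function-button hold time to a hypothetical
    action. The boundaries (0.5 s, 2 s, 5 s) are assumptions,
    not values from the patent."""
    if press_seconds >= 5.0:
        return "pair_mobile_device"   # long hold: pairing/unpairing
    if press_seconds >= 2.0:
        return "reject_call"
    if press_seconds >= 0.5:
        return "answer_call"
    return "play_or_next_song"        # short tap
```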
14. The reconfigurable assembly of claim 12, further comprising:
a touch sensor, wherein the processor is configured to receive signals from the touch sensor, the processor is configured to process a click input from a user to a touch sensor surface or a slide input from the user to the touch sensor surface, the processor is configured to initiate one or more click actions upon receiving a click input, the click actions comprising:
a.) playing the content;
b.) pausing the playing of the content;
c.) voice-controlled audio input; and
the processor is configured to initiate one or more of the following swipe actions after receiving a swipe input from a user, the swipe actions including:
d.) answering the incoming call;
e.) refusing to answer the incoming call;
f.) turning down the volume; and
g.) turn up the volume.
15. The reconfigurable assembly of claim 14, wherein the click input to switch between playing content and pausing playing content is a single click input.
16. The reconfigurable assembly of claim 14, wherein the click input to initiate voice control is a double click input.
17. The reconfigurable assembly of claim 14, wherein a short slide input results in an incremental change in volume.
18. The reconfigurable assembly of claim 14, wherein a long slide input results in a continuous change in volume.
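For illustration only: claims 14-18 distinguish single and double clicks and short and long slides on the touch sensor. A dispatch function might look like the following; the 0.5 short/long boundary and the action names are assumptions:

```python
def touch_action(kind, count=1, length=0.0):
    """Dispatch a touch input to a hypothetical action.

    kind: "click" or "slide"; count: number of clicks;
    length: slide distance as a fraction of the sensor surface.
    """
    if kind == "click":
        # Claim 15: single click toggles play/pause;
        # claim 16: double click starts voice control.
        return "toggle_play_pause" if count == 1 else "voice_control"
    # Claim 17: a short slide steps the volume incrementally;
    # claim 18: a long slide changes it continuously.
    if length < 0.5:
        return "volume_step"
    return "volume_continuous"
```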
19. The reconfigurable assembly of claim 12, further comprising:
a proximity sensor, the processor configured to receive signals from the proximity sensor, the processor configured to determine a presence of a user from the proximity sensor output and perform one of the following actions:
a.) pausing playback of content if the user has removed the eyewear device from the user's head;
b.) resuming content playback when the user puts the eyewear device back on the user's head;
c.) turning off content playback after the eyewear device has remained off the user's head for a first predetermined amount of time; and
d.) turning off the eyewear device after the eyewear device has remained off the user's head for a second predetermined amount of time.
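For illustration only: claim 19's proximity-sensor behaviour (pause on removal, resume on re-wear, stop playback after a first timeout, power off after a second) could be tracked with a small monitor. The timeout values are assumptions; the patent leaves both "predetermined amounts of time" unspecified:

```python
class WearMonitor:
    """Sketch of a proximity-driven wear-state monitor.

    stop_after:      first predetermined time (stop playback)
    power_off_after: second predetermined time (power off device)
    Both defaults are illustrative assumptions."""

    def __init__(self, stop_after=60.0, power_off_after=300.0):
        self.stop_after = stop_after
        self.power_off_after = power_off_after
        self.removed_at = None  # timestamp when glasses came off

    def update(self, worn, now):
        if worn:
            resumed = self.removed_at is not None
            self.removed_at = None
            return "resume_playback" if resumed else "playing"
        if self.removed_at is None:
            self.removed_at = now
            return "pause_playback"
        elapsed = now - self.removed_at
        if elapsed >= self.power_off_after:
            return "power_off_device"
        if elapsed >= self.stop_after:
            return "stop_playback"
        return "paused"
```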
20. A method of providing information to a user via an eyewear device, comprising:
receiving, at a processor, output from a plurality of sensors, the processor included within a temple insert module included within a temple of the eyewear apparatus, the plurality of sensors configured with the eyewear apparatus;
determining a system control parameter using at least one output from the plurality of sensors; and
controlling at least one function of the eyewear device using the system control parameters.
21. The method of claim 20, wherein one of the plurality of sensors is a button and the processor is configured to associate a duration of time that the button is pressed by a user with a system action, the system action being one or more of:
pairing with a mobile device;
unpairing from the mobile device;
playing the content;
selecting a next song;
dialing a call;
answering a telephone call; and
rejecting the telephone call.
22. The method of claim 20, wherein one of the plurality of sensors is a touch sensor and the processor is configured to receive signals from the touch sensor, wherein the processor is configured to process a click input from a user to a touch sensor surface or a slide input from the user to the touch sensor surface, wherein the processor is configured to initiate one or more click actions upon receiving a click input, wherein the click actions comprise:
pairing with a mobile device; and
unpairing from the mobile device.
23. The method of claim 22, wherein the processor is configured to initiate one or more slide actions upon receiving a slide input to the touch sensor surface, the slide actions comprising:
answering the incoming call;
refusing to answer the incoming call;
reducing a volume of an audio broadcast from the eyewear device; and
increasing the volume of the audio broadcast from the eyewear device.
24. The method of claim 20, wherein one of the plurality of sensors is a proximity sensor, wherein the processor is configured to receive a signal from the proximity sensor and determine a presence of a user from the signal and perform actions further comprising:
pausing content playback if the user has removed the eyewear device from the user's head;
resuming content playback when the user puts the eyewear device back on the user's head;
turning off content playback after the eyewear device has remained off the user's head for a first predetermined amount of time; and
turning off the eyewear device after the eyewear device has remained off the user's head for a second predetermined amount of time.
CN202080049359.1A 2019-07-13 2020-07-13 Hardware architecture for modular eyewear systems, devices, and methods Pending CN114270296A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201962873889P 2019-07-13 2019-07-13
US62/873,889 2019-07-13
US16/926,722 2020-07-12
US16/926,722 US11709376B2 (en) 2018-12-12 2020-07-12 Hardware architecture for modularized eyewear systems, apparatuses, and methods
PCT/IB2020/000831 WO2021044219A2 (en) 2019-07-13 2020-07-13 Hardware architecture for modularized eyewear systems apparatuses, and methods

Publications (1)

Publication Number Publication Date
CN114270296A true CN114270296A (en) 2022-04-01

Family

ID=74853450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080049359.1A Pending CN114270296A (en) 2019-07-13 2020-07-13 Hardware architecture for modular eyewear systems, devices, and methods

Country Status (4)

Country Link
JP (2) JP7352004B2 (en)
CN (1) CN114270296A (en)
GB (1) GB2600562B (en)
WO (1) WO2021044219A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114153078A (en) * 2021-11-26 2022-03-08 美特科技(苏州)有限公司 Intelligent glasses and camera device thereof

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
US20220299792A1 (en) * 2021-03-18 2022-09-22 Meta Platforms Technologies, Llc Lanyard for smart frames and mixed reality devices

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
US8094858B2 (en) 2009-04-27 2012-01-10 Joseph Adam Thiel Eyewear retention device
GB2476033A (en) 2009-12-04 2011-06-15 Marcus Lewis Personal audio equipment device
EP2439580A1 (en) 2010-10-01 2012-04-11 Ophtimalia Data exchange system
JP6021582B2 (en) 2012-10-24 2016-11-09 オリンパス株式会社 Glasses-type wearable device and front part of glasses-type wearable device
KR102209512B1 (en) 2014-06-30 2021-01-29 엘지전자 주식회사 Glasses type mobile terminal
JP2017092628A (en) 2015-11-06 2017-05-25 セイコーエプソン株式会社 Display device and display device control method
US20170015260A1 (en) * 2015-07-13 2017-01-19 LAFORGE Optical, Inc. Apparatus And Method For Exchanging And Displaying Data Between Electronic Eyewear, Vehicles And Other Devices
US9910298B1 (en) * 2017-04-17 2018-03-06 Vision Service Plan Systems and methods for a computerized temple for use with eyewear
CN109032384B (en) * 2018-08-30 2021-09-28 Oppo广东移动通信有限公司 Music playing control method and device, storage medium and wearable device
CN109407858A (en) 2018-09-29 2019-03-01 深圳前海格物致知科技有限公司 A kind of intelligent glasses

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN114153078B (en) * 2021-11-26 2024-01-30 美特科技(苏州)有限公司 Intelligent glasses and camera device thereof

Also Published As

Publication number Publication date
WO2021044219A2 (en) 2021-03-11
JP2024001031A (en) 2024-01-09
GB2600562A (en) 2022-05-04
JP2022543738A (en) 2022-10-14
WO2021044219A3 (en) 2021-06-03
GB202115401D0 (en) 2021-12-08
GB2600562B (en) 2023-09-20
JP7352004B2 (en) 2023-09-27

Similar Documents

Publication Publication Date Title
US20240085724A1 (en) Modularized eyewear systems, apparatus, and methods
TWI821208B (en) Sound reproduction device
KR101231026B1 (en) Wireless interactive headset
EP3525031B1 (en) Display device
JP2024001031A (en) Hardware architecture for modularized eyeglass system, apparatus, and method
US10191289B2 (en) Modular accessories for head-mountable device
US11835798B2 (en) Eyewear systems, apparatuses, and methods for providing assistance to a user
TW201415113A (en) Adapter for eyewear
US10321217B2 (en) Vibration transducer connector providing indication of worn state of device
US20230324709A1 (en) Hardware architecture for modularized eyewear systems, apparatuses, and methods
WO2018186062A1 (en) Headphones
CN114079838A (en) Audio control method, equipment and system
GB2476064A (en) Video recording spectacles with removable camera
JP7295253B2 (en) Personalized Directional Audio for Head-Worn Audio Projection Systems, Devices, and Methods
CN201156128Y (en) Bone conduction sport eyeglass
US20240094556A1 (en) Eyewear systems, apparatus, and methods for providing assistance to a user
US11871174B1 (en) Personalized directional audio for head-worn audio projection systems, apparatuses, and methods
WO2023185698A1 (en) Wearing detection method, and related apparatus
CN103873998B (en) Electronic equipment and sound collection method
KR20170060475A (en) Mobile terminal and method of controlling the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination