WO2002045044A1 - Integrated method and system for communication

Integrated method and system for communication

Info

Publication number
WO2002045044A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
micro
eye
light
eyewear
Prior art date
Application number
PCT/US2001/044068
Other languages
French (fr)
Inventor
Ronald D. Blum
Original Assignee
Smartspecs, L.L.C.
Priority date
Filing date
Publication date
Priority to US 09/723,290
Application filed by Smartspecs, L.L.C. filed Critical Smartspecs, L.L.C.
Publication of WO2002045044A1 publication Critical patent/WO2002045044A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 11/00 Non-optical adjuncts; Attachment thereof
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 Alarms responsive to a single specified undesired or abnormal operating condition and not elsewhere provided for
    • G08B 21/02 Alarms for ensuring the safety of persons
    • G08B 21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms

Abstract

A method and apparatus for communication are provided. The method includes directing (Fig. 6) a light into the eye of a user (10); measuring the reflection of the light directed at the eye of the user (10); determining whether the user's eye is open or closed based upon the results of that measurement; and activating an alarm when the user's eye is closed for a predetermined length of time. The apparatus includes: a head-mounted eyewear frame; a light source supported by the frame and positioned on the frame so that the light source can send a ray of light at an eye of the user; a light detector supported by the frame and positioned on the frame to detect a reflection of the ray of light sent to the user's eye by the light source; an audio speaker supported by the frame; and a microprocessor (26) supported by the frame, wherein the microprocessor (26) is adapted to generate an alarm signal when the light detector senses that the user's eye has been closed for a predetermined period of time.

Description

Integrated Method and System for Communication

Related Applications

This is a continuation-in-part of U.S. Application Serial No. 09/615,763, filed on July 13, 2000, which claims the benefit of: U.S. Provisional Application No. 60/144,728, filed July 20, 1999; U.S. Provisional Application No. 60/150,544, filed August 25, 1999; and U.S. Provisional Application No. 60/164,873, filed November 12, 1999. All of the above-listed applications are incorporated herein by reference.

Field of the Invention

The present invention relates to methods and systems for communication. More particularly, the present invention concerns an integrated communication system capable of providing visual and audible information to a user while the user is participating in other activities.

Background of the Invention

The integration of computers into our daily lives has increased substantially over the last several decades. Mainframe computers, once the only source of computing power, have evolved into desktop units, laptop units, Personal Digital Assistants (PDAs), and, now, wearable computers. With each of these evolutionary steps, computers have become smaller and more easily accessible.

Wearable computers are computers that suspend a visual display in the field of view of a user. Known wearable computers comprise a headset physically connected to a processing unit. These headsets provide visual information to a user in the form of a full- or quarter-screen computer display. As can be imagined, these existing display units are somewhat large, unwieldy, and bulky, and they rely on an exposed wire connecting them to a battery pack and/or processor worn at the waist. Their size is due in part to the power and data transfer rates required to reproduce a quarter or more of the display screen of a typical computer monitor. These large power and data transfer requirements have kept the size and cost of wearable computers high.

Concomitantly, over these same decades, wireless communication networks have been created and have, understandably, shaped the daily lives of individuals throughout society. Wireless networks have connected people in ways never before thought possible. Previously inaccessible individuals now have access to one another and to information in a more timely and accurate fashion through the use of cellular phones.

Cellular phones, which transmit and receive information over today's wireless networks, have become smaller and less costly while continuing to offer greater range and more services. Not only may these known cellular phones be used in wireless telephony, but they may also be used in other applications. For example, certain known cellular phones currently offer Internet access.

Despite this increase in available services and reduction in size, known cellular phones continue to suffer from the same infirmities as their predecessors. For example, in order to view information from the Internet displayed on a known cellular phone, a user must hold the phone at some distance from their eyes, within their field of vision, and must divert their attention to the phone's display to see the displayed information. Consequently, users are unable to perform other tasks while viewing the information displayed on the cellular phone. While it is foreseeable that the use of wearable computers and wireless telephones will continue to increase, under their current design paradigms they will nevertheless suffer from the conceptual shortcomings experienced today.

Summary of the Invention

In accordance with the present invention, an integrated method and apparatus for communication is provided. A method of notifying a user is presented in one embodiment. This method includes directing a light into the eye of a user; measuring the reflection of the light directed at the eye of the user; determining whether the user's eye is open or closed based upon the results of the measurement of the reflection of the light directed at the eye of the user; and activating an alarm when the user's eye is closed for a predetermined length of time.
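The four steps of this method can be sketched in code. The following is an illustration only: the reflectance threshold, the sample format, and the assumption that a closed eyelid returns a stronger reflection than the open eye are hypothetical choices made for this sketch, not details taken from this application.

```python
CLOSED_REFLECTANCE = 0.6  # assumed: a closed eyelid reflects more of the light
ALARM_AFTER_S = 2.0       # the "predetermined length of time"

def eye_is_closed(reflectance):
    """Step 3: classify the eye as open or closed from the measured reflection."""
    return reflectance >= CLOSED_REFLECTANCE

def alarm_times(samples):
    """Steps 1-4 over a stream of (time_s, reflectance) samples.

    Returns the times at which the alarm would activate, i.e. whenever
    the eye has remained closed for at least ALARM_AFTER_S seconds.
    """
    fired = []
    closed_since = None
    for t, reflectance in samples:
        if eye_is_closed(reflectance):
            if closed_since is None:
                closed_since = t                 # eye just closed; start timing
            elif t - closed_since >= ALARM_AFTER_S:
                fired.append(t)                  # step 4: activate the alarm
                closed_since = None              # re-arm for the next episode
        else:
            closed_since = None                  # eye opened; reset the timer
    return fired
```

Note that an ordinary blink, lasting well under the threshold, resets the timer and never triggers the alarm; only a sustained closure does.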

In a second embodiment eyewear for a user is provided. The eyewear in this embodiment includes: a head-mounted eyewear frame; a light source supported by the frame and positioned on the frame so that the light source can send a ray of light at an eye of the user; a light detector supported by the frame and positioned on the frame to detect a reflection of the ray of light sent to the user's eye by the light source; an audio speaker supported by the frame; and a microprocessor supported by the frame wherein the microprocessor is adapted to generate an alarm signal when the light detector senses that the user's eye has been closed for a predetermined period of time.

Brief Description of the Drawings

Fig. 1 is a side perspective view of eyewear worn by a user in accordance with an embodiment of the present invention.

Fig. 2.0 is a front perspective view of the eyewear of Fig. 1 in accordance with an embodiment of the present invention.

Fig. 2.1 is an exemplary view of a line of characters as seen by a user looking into the micro-optical display mounted on the eyewear illustrated in Fig. 2.0.

Fig. 2.2 is a front perspective view of a Processor Communication Control and Storage Unit (PCCSU) in accordance with an alternative embodiment of the present invention.

Fig. 2.3 is a view of an Integrated Communication System being worn by a user in accordance with another alternative embodiment of the present invention.

Fig. 3 is a comparison of a full VGA field of view, a quarter VGA field of view, and the field of view of an exemplary micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 4.0 is a view of a single line of text as displayed by a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 4.1 illustrates multiple rows of text of various lengths filling the field of view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 4.2 is a view of a picture displayed in the circular field of view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 4.3 is a picture having a rectangular field of view as displayed in a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 5.0 is a view of two lines of text displayed on a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 5.1 is a view of a single line of text displayed on a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 5.2 is a view of two lines of text displayed on a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 5.3 is a view of three lines of text displayed on a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 6 is a front perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 7 is a front perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 8 is a front perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 9 is a side view of the eyewear from Fig. 8.

Fig. 10 is a front perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 11 is an enlarged view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 12 is an enlarged view of a rib cage structure from the micro-optical display of Fig. 11.

Fig. 13 is an enlarged view of the rib cage structure of Fig. 12 showing some of the components that may be placed within the rib cage structure.

Fig. 14 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 15.0 is an enlarged perspective view of a micro-optical display positioned near a user's eye in accordance with another alternative embodiment of the present invention.

Fig. 15.1 is an enlarged view of axis markings located on the micro-optical display of Fig. 15.0.

Fig. 16.0 is an enlarged perspective view of a micro-optical display in accordance with an alternative embodiment of the present invention.

Fig. 16.1 is a mirror adjustment chart as utilized for the micro-optical display of Fig. 16.0 in accordance with an alternative embodiment of the present invention.

Fig. 17 is an enlarged perspective view of a micro-optical display positioned near a user's eye in accordance with another alternative embodiment of the present invention.

Fig. 18.0 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 18.1 is a side view of a micro-optical display positioned near a user's eye in accordance with another alternative embodiment of the present invention.

Fig. 18.2 is a side view of a micro-optical display positioned near a user's eye in accordance with another alternative embodiment of the present invention.

Fig. 18.3 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 18.4 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 18.5 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 18.6 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 18.7 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 18.8 is a side view of a micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 19 is a front perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 20 is a front perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 20.1 is a front enlarged view of an eyewire and micro-optical display in accordance with another alternative embodiment of the present invention.

Fig. 20.2 is a front enlarged view of a notched lens in accordance with another alternative embodiment of the present invention.

Fig. 21 is a side view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 22 is an enlarged perspective view of a micro-optical display mounted on a lens in accordance with another alternative embodiment of the present invention.

Fig. 23 is a rear view of eyewear frames in accordance with another alternative embodiment of the present invention.

Fig. 24 is a front perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 25.0 is a side view of the eyewear from Fig. 24 in accordance with another alternative embodiment of the present invention.

Fig. 25.1 is an enlarged cutaway view of a lens in accordance with another alternative embodiment of the present invention.

Fig. 26 is an inside view of eyewear as seen by a user in accordance with another alternative embodiment of the present invention.

Fig. 27 is a side perspective view of eyewear in accordance with another alternative embodiment of the present invention.

Fig. 28 illustrates an emergency response system in accordance with another alternative embodiment of the present invention.

Fig. 29 is a front perspective view of a charging data link unit in accordance with another alternative embodiment of the present invention.

Detailed Description

Fig. 1 is an illustration of the head of a user 10 wearing eyewear 11 in accordance with an embodiment of the present invention. The eyewear 11 may be conventional in appearance, having an eyewire 14, lens 18, and temple 15. As can be seen, the eyewear 11 in this embodiment has a micro-optical display 12 located on the lens 18 of the eyewear 11. The micro-optical display 12 in this embodiment is positioned to be within the field of view of the user's eye 17, so that, at the convenience of the user 10, the user 10 can look into the micro-optical display and view the information 13 displayed therein. By positioning the micro-optical display within the field of view of the user 10, the user 10 can access information from the display while participating in other activities.

The information 13 displayed to the user 10 may contain a variety of data including stock quotes, e-mails, driving directions, and other types of data.

The micro-optical display 12 may be integrated with or separately attached to the eyewear 11. The optics in the micro-optical display 12 may be designed to project the image of one or more lines of characters out into space so that the characters appear in the field of view of the user.

Fig. 2.0 is a more detailed front perspective view of the eyewear from Fig. 1. As is evident, the eyewear 11 in this embodiment is configured such that it appears to be a conventional pair of eyeglasses. Alternatively, as will be discussed below, the eyewear 11 may also be configured as sport glasses, safety goggles, or any other type of eyewear that can be worn by a user.

The eyewear 11 in this embodiment is part of a versatile communication system. Working in conjunction with a Processor, Communication, Control, and Storage Unit (PCCSU shown in Fig. 2.2), the eyewear 11 is able to send and receive wireless calls over a wireless network. In addition, the eyewear 11 may also receive data from the wireless network and display that data on the micro-optical display. In use, a user of the eyewear 11 would converse with someone else over a wireless network using the microphone 20 and the speaker 24 mounted on the eyewear 11. The antenna 23 located on the eyewear may be used to communicate with the PCCSU or, alternatively, may be used to communicate directly over a wireless network.

As can be seen, the eyewear 11 in Fig. 2.0 contains a microphone 20, a micro-optical display 12, solar cells 22, an antenna 23, speakers 24, a transceiver 25, a microprocessor 26, a lens 29, a bridge 28, and a battery 27. These solar cells 22, located on the temple 15 of the eyewear 11, may be utilized to charge the battery 27 and may also be utilized to directly power the components of the eyewear 11. The microprocessor 26 is mounted in one of the ear rests of the eyewear 11. This microprocessor 26 may be in communication with each of the components and may be utilized to receive information from the transceiver 25 and then display it on the micro-optical display 12. The transceiver 25 may be used to perform transmitting and receiving functions for the eyewear 11. The transceiver 25 may be utilized to communicate with the PCCSU and may also be utilized to communicate directly over a wireless network. The lens 29 in this embodiment contains an opening or notch allowing the micro-optical display 12 to traverse through the lens and extend beyond the outside face and the inside face.

Fig. 2.1 is a view of the viewing area of the micro-optical display 12 of Fig. 2.0. As can be seen, a row of alphanumeric characters is displayed by the micro-optical display 12 in this embodiment. This row of characters may be displayed by the micro-optical display 12 such that they appear to the user to be projected into space at any distance comfortable to the user, from 4 inches to infinity. While one line of characters is shown in Fig. 2.1, additional lines of characters may also be displayed in the micro-optical display 12. Examples of these additional lines of display, as well as the methods in which these various lines may be displayed, are discussed in more detail below.

Fig. 2.2 is a front perspective view of the Processor, Communication, Control, and Storage Unit (PCCSU) mentioned above. The PCCSU 220 may act as a communication link between a wireless network and the transceiver located in the eyewear. The PCCSU 220 may also act as a wireless telephone without utilizing the eyewear mentioned above. The PCCSU 220 may be the size of a belt pager and may be worn or otherwise supported by the user.

Fig. 2.2 reveals that the PCCSU 220 may have a microphone 221, a high-speed infrared data link 223, a speaker 232, a display 231, a 3-D stack chip processor 230, an RF strip antenna 229, RF transceiver electronics 227, a fuel cell pack 226, nonvolatile memory 225, a recharge input plug 224, an on-off power switch 223, and display function keys 222. Regarding these components of the PCCSU, the high-speed infrared data link 223 of the PCCSU 220 may be used for sending and receiving data to and from external data transfer units. The speaker 232 may be used for providing audible alarms, speech, music, combinations thereof, and other types of audio information to the user. The display 231, which may be a liquid crystal display, may be used for providing the status of the PCCSU 220 as well as for providing other information to the user. The 3-D stack chip processor 230 contained within the PCCSU 220 may be used to carry out the processing functions of the PCCSU 220. The 3-D stack microprocessor may also control the internal management functions of the PCCSU 220. These functions would include: the allocation of power to the displays, earphones, and microphones; data storage in memory; and transmission and conversion of data between audio, video, electronic digital, and radio-frequency formats. The RF strip antenna 229 may be used to communicate with the eyewear as well as to communicate over wireless networks. The VRA adjustments 228 are used for the separate adjustment of voice and of background audio so as to enhance a user's listening experience. The fuel cell pack 226 may be used to power the PCCSU 220. The nonvolatile memory 225 may be used to contain the necessary execution software as well as any other information that has been downloaded into the PCCSU 220. The recharge input plug 224 may be utilized to recharge the fuel cells with direct or alternating current and to provide the required power to operate the PCCSU 220 without the use of batteries. The fuel cell pack 226 may comprise bulk units stored in the PCCSU 220 or, alternatively, may be thin-film batteries applied to the surface of the PCCSU 220. The fuel cell pack 226, or any other means of providing power, may also be a separate unit, wearable by the user under the shirt collar, on the belt, or at any other location on the body, and connected to the PCCSU 220 and the eyewear by a direct conductive link. The display function keys 222 may be utilized to control the information that is displayed on the display 231 and on the micro-optical display of the eyewear. These display function keys can include: a stock quote key; a sports score key; a telephone book key; a time, temperature, or direction key; and a location or global position key. Lastly, the microphone 221 may be used to receive voice activation commands from the user rather than requiring tactile input from the user.

While the PCCSU is shown as a separate unit, other configurations and embodiments are foreseeable. These other embodiments include electrically wiring or integrating the PCCSU functions into the eyewear. These other embodiments may also include eliminating the memory storage unit of the PCCSU and operating the PCCSU without a substantial amount of non-volatile storage. By removing the memory storage unit, the size and power consumption requirements of the PCCSU can be greatly reduced. Alternatively, rather than having the transceiver, which performs the functions of a wireless telephone, located in the PCCSU, the PCCSU may instead be connected to a wireless communication device that provides the requisite wireless communication functions. Moreover, in still other alternative embodiments the PCCSU may utilize the user's belt as an antenna, and the PCCSU may be separable from the eyewear, allowing the eyewear to communicate directly over a wireless network. Still further, while the command and control of the system may be through the use of a touch pad or buttons on the PCCSU, the PCCSU may also be commanded in a hands-free manner through voice commands. The voice commands may be deciphered by voice recognition software incorporated into the PCCSU during its manufacture, or alternatively the PCCSU may be "trained" to correlate certain user utterances with specific commands and functions. Artificial neural network co-processors could also be used to "train" the system to recognize the user's voice commands, thereby simplifying the software.

Fig. 2.3 is a front view of a user wearing an Integrated Communication System (ICS) in accordance with an alternative embodiment of the present invention. In Fig. 2.3, the user 336 is shown wearing eyewear 337 and a PCCSU 330. As can be seen, the eyewear 337 is supported by the user's head and is styled to look like a conventional pair of eyeglasses. The eyewear contains an eyewire 332, a temple 331, and a micro-optical display 333. The PCCSU 330 is shown supported by a strap 334; however, it could be supported by the user's belt, carried in the user's pocket or purse, or supported by any other means. As discussed above, the PCCSU 330 may feed data to the eyewear 337 to be displayed on the micro-optical display 333. The PCCSU 330 feeds this data to the eyewear 337 via the wireless link indicated by arrow 338.

The ICS described above may provide various features and services to the user. By way of example only, the micro-optical display may be configured to provide the local time, date, stock quotes, GPS position, and schedule of the user. It may also be configured to display telephone book information stored in the PCCSU, teleprompter information, voice mail, and pager messages. Moreover, a user who is hearing impaired may receive translations of the speech occurring around them on the micro-optical display, much like the closed captioning systems employed on public broadcast television. Similarly, the ICS may provide a translation feature wherein the PCCSU would translate the conversation from an individual into a language that the user could understand. Once translated, the conversation could then be displayed on the micro-optical display.

Fig. 3 is a comparative view of: a full VGA field of view 30; a quarter VGA field of view 32; and a micro-optical display field of view 31. As can be seen, the micro-optical display field of view 31 is smaller than both the full VGA field of view 30 and the quarter VGA field of view 32 in this embodiment. As noted above, an advantage of the micro-optical display is that the amount of power required to operate it is less than the amount of power required to drive a larger QVGA field of view 32 and a full VGA field of view 30. In addition, the data exchange requirements for displaying information on a micro-optical display may be less than the data exchange requirements for full VGA and QVGA displays.
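The scale of that advantage can be made concrete with a back-of-envelope pixel count. The micro-display raster below (a single 30-character line at 8x8 pixels per character) is an assumed geometry chosen for illustration; the patent does not specify one.

```python
full_vga    = 640 * 480          # 307,200 pixels
quarter_vga = 320 * 240          # 76,800 pixels
micro       = (30 * 8) * 8       # assumed: 30 chars at 8x8 px each = 1,920 pixels

# At the same frame rate and bit depth, pixel count is proportional to
# the data-transfer rate and, roughly, to the display drive power:
print(full_vga // micro)     # 160: full VGA moves ~160x the data
print(quarter_vga // micro)  # 40:  quarter VGA moves ~40x the data
```

Even under different raster assumptions the ratio stays two orders of magnitude in the micro-display's favor, which is the point the paragraph above is making.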

The field of view shown in Fig. 3 replicates a 20-inch viewing distance from the user to a monitor. Using this scale, a 30-character-wide micro-optical display field of view would be approximately 6.8 degrees in diameter. The display may also be other sizes, such that the field of view available to the user may be greater or smaller.
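The 6.8-degree figure follows from simple angular-size geometry. In the sketch below, only the 20-inch distance and the 30-character line come from the text; the per-character width is an assumed value chosen to be consistent with the stated result.

```python
import math

DISTANCE_IN = 20.0   # stated equivalent viewing distance
N_CHARS     = 30     # characters across the display line
CHAR_W_IN   = 0.079  # assumed character width at that distance

line_width = N_CHARS * CHAR_W_IN   # about 2.37 inches
# Angular diameter of the line as seen from the viewing distance:
fov_deg = math.degrees(2 * math.atan(line_width / (2 * DISTANCE_IN)))
print(round(fov_deg, 1))   # 6.8, matching the diameter quoted above
```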

Figs. 4.0 through 4.3 show various field of view configurations for the micro-optical display 12 of Fig. 1. As can be seen in Fig. 4.0, a single row of text, 30 characters wide, may be displayed in the micro-optical display. Alternatively, as is shown in Fig. 4.1, multiple rows of text, of various lengths, can be displayed in the micro-optical display or, as shown in Fig. 4.2, an image within a circular field of view may also be displayed in the micro-optical display. Moreover, as shown in Fig. 4.3, an image with a rectangular field of view may also be displayed within the micro-optical display field of view. As is evident, the micro-optical display is a versatile display component providing various display options.

Figs. 5.0-5.3 are enlarged views of the various methods in which the micro-optical display can display alphanumeric characters. In Fig. 5.0 a first method is displayed wherein the first line of characters 512 remains stationary while the second line of characters 513 scrolls across the field of view of the user from right to left as indicated by the arrow 510. When the entire second line of characters 513 has scrolled across the display, the second line could move up as indicated by arrow 511 and displace the first line of characters 512. The stationary first line of characters 512 could then contain the first 30 characters of the second line of characters 513, and the second line of characters 513 could then display the next line of text in the scrolling fashion discussed above. While a defined string of 30 characters is illustrated in this embodiment, the number of characters in each line may be increased or decreased to a number suitable to the user; a range of 15 to 45 characters is believed to be the optimum character display length for the micro-optical display.
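The scroll-then-displace behavior of Fig. 5.0 can be modeled as a frame generator. This is a simplified sketch of the display logic described above, not the patent's actual drive electronics; the (top, bottom) tuple representation of a frame is an invention of this example.

```python
def frames(message, width=30):
    """Yield (top_line, bottom_line) display frames.

    The bottom line scrolls each width-sized chunk of the message in
    from the right, one column per frame; once a chunk has fully
    scrolled across, it moves up and displaces the top line.
    """
    top = " " * width
    chunks = [message[i:i + width] for i in range(0, len(message), width)]
    for chunk in chunks:
        strip = " " * width + chunk.ljust(width)   # blank lead-in, then the chunk
        for i in range(width + 1):
            yield top, strip[i:i + width]          # slide one column to the left
        top = chunk.ljust(width)                   # completed line displaces the top
    # (arrow 510: the column slide; arrow 511: the final displacement upward)
```

With `width=30` this reproduces the 30-character behavior of Fig. 5.0; shrinking or growing `width` covers the 15-to-45-character range mentioned above.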

Alternatively, as can be seen in Fig. 5.1, a single line of characters 514 can be utilized to display the entire message. In Fig. 5.1 the single line of characters 514 would move from right to left and would contain the entire message. The rate of scroll of these characters can be controlled remotely by an adjustment on the PCCSU or on the user's glasses. This continually scrolling format can be useful for displaying real-time data such as stock quotes and other similar real-time information streams.

Similarly, as can be seen in Fig. 5.2, in circumstances where the lines of characters are less than the display length of the micro-optical display, the characters would not need to scroll across the screen but, rather, would simply be displayed in the row and would be displaced by the next row of characters. In Fig. 5.2 the first line of characters is noted at 532 and the second line of characters is noted at line 533.

Fig. 5.3 illustrates another alternative embodiment. In this embodiment, a three-line display 585 is illustrated. The bottom line 581 scrolls from right to left for the user as indicated by arrow 570. Then, when all the data to be scrolled on the bottom line has been displayed, the bottom line 581 displaces the middle line 582, which in turn displaces the top line 583. In other words, the bottom line 581 would be scrolling from right to left in the field of vision of the user while the middle line 582 and the top line 583 would be scrolling up the field of vision of the user.

In each of these various alternative embodiments the rate of scroll of the data across the display and the rate of scroll up and down the display may be adjusted and controlled by the user through controls located on the eyewear. The user's preferential settings may then be stored by the eyewear or the PCCSU for later retrieval and use. Moreover, not only can these settings be stored for a single user, but the settings may also be stored for several users, who may choose their predetermined preferences when they utilize the present invention. The micro-optical display may also display text in any language, and may scroll from right to left, left to right, or vertically as required by the displayed language. Moreover, some languages may be read either left to right or right to left (e.g., Chinese characters may be written in either direction); in these cases the user may select the preferred direction of scrolling.

The size of the characters displayed on the micro-optical display may also be adjusted in accordance with another alternative embodiment of the present invention. In this alternative embodiment, rather than displaying the characters on a single display line, the characters are displayed across several display lines such that fewer individual characters may be displayed at one time, but those characters that are displayed may be two or three times as large as regularly sized characters. Like the other preferential settings discussed above, the user of this alternative embodiment may select this display option as one of her preferential settings.

Furthermore, the display may also be adjusted to allow the speed of the scrolling text to correspond with the speed and pauses of the coinciding input. In other words, in another alternative embodiment the microprocessor controlling the information displayed on the micro-optical display can modulate the display information to coincide with the natural pauses of the audible speech being displayed on the micro-optical display. These various display options allow the display to provide the information in the most comfortable and easily comprehensible manner to the user.
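One way to implement that modulation is sketched below, under the assumption that a speech recognizer supplies per-word timings; the patent does not prescribe a mechanism, and the `(word, start, end)` input format is hypothetical.

```python
def display_schedule(word_timings):
    """Convert recognizer word timings into display hold times.

    word_timings: list of (word, start_s, end_s) tuples (assumed input
    format). Each word is held on the display until the next word
    begins, so pauses in the speech become pauses in the scrolling text.
    """
    schedule = []
    for i, (word, start, end) in enumerate(word_timings):
        if i + 1 < len(word_timings):
            hold = word_timings[i + 1][1] - start   # spans any silence after the word
        else:
            hold = end - start                      # last word: hold for its own duration
        schedule.append((word, hold))
    return schedule
```

A 0.6-second silence between two words thus yields a 0.6-second-longer hold, keeping the displayed text aligned with the natural cadence of the speech.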

Fig. 6 is a perspective front view of an alternative embodiment of the present invention. As can be seen in Fig. 6, a micro-optical display 62 is mounted on a lens 691 positioned within the eyewire 69 of the eyewear 693. In addition, a battery 61, a microprocessor 60, a miniature transmitter-receiver 67, a speaker 66, and an antenna 65 are all located on the temple 692 of the eyewear 693. A directional microphone 64 is also mounted on the eyewire 69 of the eyewear 693. The micro-optical display 62 in this alternative embodiment may be placed in the field of vision of either eye of a user or, alternatively, as will be discussed below, can be placed in the field of vision of both of the user's eyes. In so doing, a user can be provided information in a reference mode when information is provided to one eye, or in a three-dimensional mode when information is provided to both eyes.

Due to the extremely small diameter 63 of the micro-optical display 62 (6 mm in this embodiment), the micro-optical display 62 must be properly aligned within the user's field of vision across the user's various angles of gaze. Similarly, when using two micro-optical displays, as is illustrated below, the inter-pupillary distance of the user (the distance between the pupils of the user's eyes) must be taken into account in order to properly position the micro-optical displays 62 in the user's line of sight.

Fig. 7 is a front perspective view of another alternative embodiment of the present invention. As can be seen, a memory storage unit 79 has also been placed on the temple of the eyewear 70 of Fig. 7, and a remote wire 78 is shown connecting the micro-optical display 74 to the miniature transmitter-receiver 77. Also illustrated in Fig. 7 are the microphone 76, the battery 71, the microprocessor 73, and the antenna 75.

Fig. 8 illustrates a front perspective view of the eyewear 840 in accordance with another alternative embodiment of the present invention. In this embodiment, the microphone previously shown coupled to the eyewire in Fig. 7 is now extendable such that it may be positioned more closely to a user's mouth. As can be seen in Fig. 8, the microphone 815 may be positioned on an extension 820 which may be coupled to the eyewire 825. Buttons 810 are located on the transceiver 830, which is located on the temple 835 of the eyewear 840. In use, and as required, the user would push the buttons 810 to extend or retract the microphone 815, moving it closer to or further away from their mouth.

Fig. 9 is a side view of the embodiment shown in Fig. 8. As can be seen, the extended microphone 815 is attached to an extension 820 which is attached to the eyewire 825. Also visible in Fig. 9 are the ear 920 of the user, the eye 925 of the user, and buttons 810 for controlling the movement of the microphone 815.

Alternatively, the microphone, instead of being coupled to the eyewire, may also be placed on various other locations of the eyewear. For example, in Fig. 10, the microphone 1000 is positioned on the temple 1010.

Fig. 11 is an enlarged perspective view of a micro-optical display 1100 in accordance with an alternative embodiment of the present invention. The front 1130 of the micro-optical display is clearly illustrated, along with the back 1140, the soft skin 1110, the micro-character display 1150, the third optical element 1120, the markings 1180, the diameter 1170, and the length 1160. The diameter 1170 and length 1160 may vary according to the individual application of the micro-optical display 1100. In the embodiment illustrated in Fig. 11, the length 1160 is 7 mm and the diameter 1170 is 6 mm.

The third optical element 1120 in this embodiment may be a lens which can be used to correct astigmatisms or other visual impairments of a user. Consequently, in use, a user looking into the front 1130 of the micro-optical display 1100 would not need the additional assistance of the corrective lenses of their eyewear, as the image displayed by the micro-optical display 1100 would correct many visual impairments of the user. The markings 1180, which are also clearly evident in Fig. 11, are used to set the user's astigmatic axis. The manner in which these markings 1180 are used is discussed in more detail below.

Fig. 12 is a perspective view of a collapsible safety cage structure 1210 that acts as a structural support for the micro-optical display 1100 of Fig. 11. The collapsible safety cage structure 1210 may be made from metal, plastic, or some other rigid material. In Fig. 12, the safety cage structure 1210 has a back 1140, a front 1130, rigid rings 1240, and support members 1230. The support members 1230 contain angle notches 1200.

As the micro-optical display can be located at various positions on the user's eyewear and, consequently, at various positions near the user's eye, the collapsible safety cage structure 1210 reduces the risk of injury to the user. As can be seen, collapsible angle notches 1200 are positioned throughout the support members 1230. These notches 1200 provide points of increased stress which can fail during axial impact. When axial compressive forces are placed on the collapsible safety cage structure 1210, the support members 1230 can fold or otherwise deform at these notches 1200. This tendency to collapse under axial loads reduces the amount of energy that can be transmitted through the structure 1210 to a user's eye, as some of the energy from an impact would be dissipated during the deformation and collapse of the safety cage structure 1210. In use, if the micro-optical display was forced towards a user's eye, the safety cage structure 1210 would compress, acting as a cushion, thereby reducing or possibly eliminating the transfer of damaging impact forces to the user's eye. In certain embodiments, the safety cage structure may be made of materials that elastically deform during impact.

Fig. 13 is a perspective view of the collapsible safety cage structure 1210 of Fig. 12 shown containing additional elements. In Fig. 13, a micro-character display 1300 is mounted near the perimeter of the safety cage structure 1210. This micro-character display 1300 generates the image that will ultimately be viewed by the user. The micro-character display differs from conventional displays as it does not need to recreate the numerous resolution lines associated with conventional scanning displays. The micro-character display in this embodiment may be a Light Emitting Diode (LED) display, a transparent Liquid Crystal Display (LCD), or any other character generator.

The micro-character display 1300 of the embodiment in Fig. 13 is shown in optical communication with a first optical element 1320, a second optical element 1310, and a third optical element 1120. In this particular embodiment, the first optical element 1320 is a 2 mm x 3 mm elliptical mirror, the second optical element 1310 is a concave aspheric mirror having a radius of 20 mm, and the third optical element 1120 is an astigmatism-correcting lens. These three optical elements reflect, focus, and manipulate the image generated by the micro-character display 1300 to provide an enlarged and mostly focused view of that image for the user.

Presbyopic correction is designed into the micro-optical display device by collimating the rays of light to infinity. A simple adjustment of the location of the back mirror 1310 corrects the sphere power required for hyperopes and myopes.

The above-noted optical elements may be made of plastic, glass, metal, ceramic or other appropriate materials which will permit them to reflect or focus light as required in their particular application. The skin of the micro-optical device may be made of plastic, rubber, fabric, or some other flexible material, and the cage may be made of metal, ceramic or other rigid material. For best image contrast, the micro-optical display skin should be opaque or reflective to external light, while the interior surface of the skin should be dark or black such that it will absorb errant or stray light.

Fig. 14 is a side view of a micro-optical display 1490 in accordance with another alternative embodiment of the present invention. The micro-optical display 1490 in Fig. 14 contains a micro-character display 1440 which itself contains an LCD array 1430. The LCD array 1430 in this embodiment is rectangular in shape and 60 by 800 microns in size, although it may be other sizes. The pixel spacing for the LCD array 1430 may be between 10 and 12 microns in this embodiment. Therefore, the LCD array 1430 may be placed within a 1.5 mm by 1.5 mm space.

The micro-optical display 1490 also contains a first optical element 1420, which may be a flat elliptical mirror having ellipse diameters of 2 mm and 3 mm, and a second optical element 1480, which may be a concave mirror with a radius of 20 mm and a diameter of 5 mm. In this exemplary embodiment, the micro-optical display 1490 has a diameter 1410 of 6.0 mm and a length 1470 of 7.0 mm. The length 1450 of the micro-character display 1440 in this embodiment is 1.5 mm.

The LCD array 1430 utilizes ambient light to illuminate the image that will be displayed for the user. By using ambient light, the amount of power required to drive the display can be reduced. The micro-optical display device in this embodiment does not utilize a beam splitter, thereby maximizing available illumination and allowing for the use of only ambient illumination.

The dimensions of the micro-optical display may be modified in various other embodiments. For example, for micro-character displays employing 10 micron pixel spacing and having up to three rows of characters, each 30 characters in length, wherein mirrors are used as the optical elements within the micro-optical display, the length of the micro-optical display may be 7.0 mm and the diameter may be 6.0 mm. By comparison, in lens-based designs, where focusing and enlarging lenses are utilized as optical elements within the micro-optical display, the micro-optical display height may be reduced to 4.5 mm and the diameter may be reduced to 4.0 mm. Similarly, if prisms and lenses were employed as the optical elements, the length of the micro-optical display may be reduced to 4.5 mm and the diameter of the micro-optical display may be reduced to 4.0 mm. In addition, while a micro-character display 1440 is discussed above, it is worthy of note that, if desired by the user, the light guides may be adapted to accommodate a larger micro-character display.

The LCD display 1430 may be made from transmissive LCDs and reflective LCDs. When reflective LCDs are utilized, supplemental illumination may be required. For displays employing up to three rows of 30 characters each, the pixel-to-pixel spacing may be reduced to 5 microns, the length of the micro-optical display may be 4.5 mm, the diameter of the micro-optical display may be 4.0 mm, and the power consumption may be reduced to 10 mW. For similarly sized micro-character displays utilizing illuminated transmissive LCDs, the size and power consumption would be similar, while the pixel-to-pixel spacing would be 10 microns. Comparatively, for similarly sized micro-character displays utilizing ambient lighting, the power consumption and the pixel-to-pixel spacing would be similar but the micro-optical display would be larger in length and in diameter.

The LCD may also be made from nematic LCDs and ferro-electric LCDs. Ferro-electric LCDs are advantageous because they allow for closer pixel spacing per unit area, which results in a higher image resolution.

Fig. 15.0 is an enlarged view of a micro-optical display 1515 in accordance with another alternative embodiment of the present invention. As can be seen in Fig. 15.0, the micro-optical display 1515 is coupled to the rear concave surface of the lens 1514 closest to the user's eye and is in the direct line of vision of the eye 1512 of the user. The micro-optical display 1515 is marked with an astigmatic axis 1511 connecting two axis markings 1513. In this particular embodiment, this astigmatic axis 1511 may be set to correct a user's astigmatic lens axis. It should be further noted that the proper astigmatic powered lens is provided by way of lens customization and the astigmatic axis 1511 is set for the user's needs. Also, the user's sphere power is provided by way of adjusting the distance between the optical elements incorporated in the micro-optical display 1515. Furthermore, some or all of the sphere power could be incorporated into the lens which provides the astigmatic power.

Fig. 15.1 is an enlarged view of the axis markings 1513 of Fig. 15.0. As can be seen, they are marked with different degrees ranging from 0 to 180 and are marked around the entire circumference of the micro-optical display 1515. Regarding the astigmatic correction provided by the micro-optical display, a sag of 1 diopter cylinder power over the 6.0 mm diameter is 0.009 mm or 9 microns. Consequently, for a 10 diopter cylinder, the sag thickness would be 0.090 mm or 90 microns.

Fig. 16.0 is an enlarged view of a micro-optical display 1640 positioned near an eye 1612 of a user in accordance with an alternative embodiment of the present invention. As can be seen, the astigmatic corrector lens 1610 (a refractive or diffractive lens), having the proper astigmatic axis setting 1615, is aligned with the eye 1612 of the user, and diopter markings 1630 are present on the side of the micro-optical display 1640 having a micro-character display 1660 attached. While a lens is used to correct for the astigmatism of the user in this embodiment, any optical element may be used to correct for the astigmatism.
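The sag figures quoted above (9 microns per diopter of cylinder over the 6.0 mm diameter) can be reproduced with the standard thin-lens sagitta approximation. Note that the refractive index of 1.5 used here is an assumption for illustration; the specification does not state the lens material.

```python
# Thin-lens sagitta approximation: sag = y^2 * P / (2 * (n - 1)),
# where y is the semi-diameter in meters, P the cylinder power in
# diopters, and n the lens refractive index (assumed 1.5 here).
def cylinder_sag_mm(diameter_mm, power_diopters, n=1.5):
    y = (diameter_mm / 2) / 1000.0           # semi-diameter in meters
    sag_m = (y ** 2) * power_diopters / (2 * (n - 1))
    return sag_m * 1000.0                    # sag back in millimeters

print(cylinder_sag_mm(6.0, 1))   # ~0.009 mm (9 microns), as stated
print(cylinder_sag_mm(6.0, 10))  # ~0.090 mm (90 microns)
```

With n = 1.5, the formula reproduces both quoted values, which suggests the sag grows linearly with cylinder power over this small aperture.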

Fig. 16.1 is a sphere power chart providing the mirror adjustment in millimeters for various sphere diopter settings. As can be seen, the mirror adjustment ranges from 0 mm to 1 mm as the sphere power increases from 0 to 10 diopters. The information provided in this chart is calculated based upon one embodiment wherein a micro-optical display has a 6-mm diameter and a 7-mm length. This mirror adjustment 1650 allows for customized correction of the user's spherical error. Therefore, the proper combination of the astigmatic correction lens 1610, the astigmatic axis setting 1615, and the sphere power adjustment 1650 will allow the user to see the image of the micro-character display 1660 clearly.
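The 0 mm to 1 mm range of the chart is consistent with a simple Newtonian defocus model. Taking the concave mirror's focal length as half its 20 mm radius (10 mm, per the mirror described for Fig. 13), displacing it by Δ from the collimating position introduces roughly Δ/f² diopters of vergence. The sketch below is a plausibility check under that assumed model, not a reproduction of the inventor's calculation.

```python
# Newtonian defocus estimate: sphere power P ~ delta / f^2, where delta
# is the mirror displacement from the collimating position and f the
# mirror focal length (both in meters). f = R/2 = 10 mm for the 20 mm
# radius concave mirror described in the specification (assumption).
def mirror_shift_mm(power_diopters, focal_length_mm=10.0):
    f = focal_length_mm / 1000.0              # focal length in meters
    return power_diopters * f ** 2 * 1000.0   # displacement in mm

for p in (0, 5, 10):
    print(p, "D ->", mirror_shift_mm(p), "mm")
# 10 D requires ~1 mm of travel, matching the chart's 0-1 mm range
```

Under this model the relationship is linear, which agrees with the chart's proportional sweep from 0 mm at 0 diopters to 1 mm at 10 diopters.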

Fig. 17 is another alternative embodiment of the present invention. As can be seen, the micro-character display 1700 is positioned on top of the micro-optical display 1720. Alternatively, the micro-character display 1700 could be positioned in numerous other positions on or within the micro-optical display 1720 as required by the individual application.

Fig. 18.0 is another alternative embodiment of the present invention wherein the micro-character display 1810 is illustrated with the optional illumination system 1800. In this embodiment, as discussed above, rather than using ambient light to illuminate the micro-character display 1810, the optional illumination system 1800 provides additional light to illuminate the micro-character display. In so doing, the micro-optical display may be used in lighting conditions ranging from complete darkness, in which additional light from light source 1800 is utilized, to low-level, moderate, or bright ambient light.

Figs. 18.1-18.8 provide various positions and configurations of the micro-optical display in accordance with other alternative embodiments of the present invention. As can be seen in Fig. 18.1, the micro-optical display does not need to be in direct axial alignment with the eye of the user.

Comparatively, as can be seen in Fig. 18.2, the micro-optical display may be in direct axial alignment with the eye of the user. Thus, in order to accommodate the various relative locations between the user's eye and the micro-optical display, the micro-optical display may contain or be in optical communication with various optical elements. Some of these various configurations are illustrated in Figs. 18.3-18.8.

In Fig. 18.3, three different optical elements are used, a prism 1830, a curved mirror 1832, and a flat mirror 1831. These three optical elements manipulate and enlarge an image generated by the micro-character display 1834 such that the image may be readily viewed by the user.

Similarly, in Fig. 18.4, the large mirror 1840 is tilted to direct the display image of the micro-optical display toward a user's eye, which is not shown. In Fig. 18.5, both mirrors, 1840 and 1870, are tilted in order to align the display of the micro-optical display. In Fig. 18.6, a large mirror 1860 is utilized to guide the alignment of the display image towards the user's eye. In Fig. 18.7, a flat alignment mirror 1870 is used to align the display and, in Fig. 18.8, a focus element 1880 is used to align the display with the user's eye. As is evident, then, these and numerous other configurations of optical elements, both within and outside of the micro-optical display, may be used in accordance with various embodiments of the present invention.

Figs. 19 and 20 are front perspective views of alternative embodiments of the present invention. Rather than utilizing the light source coupled directly to the micro-character display as described in the above embodiments, the embodiment viewed in Fig. 19 uses a light source 1910 coupled to the eyewire 1915 of the eyewear 1920 to illuminate the micro-optical display 1912. As can be seen, in order to accomplish this, a fiber optic line 1911 may connect the light source 1910 with the micro-optical display.

In Fig. 20, an alternative embodiment of Fig. 19, the micro-optical display 2022 may be coupled to the eyewire 2023 as opposed to being suspended in the lens as in Fig. 19. The light source 2020 is coupled to the eyewire 2023 and sends light along fiber optic line 2021 to illuminate the micro-character display resident in the micro-optical display 2022. In this preferred embodiment, the lens may be notched or grooved to create an opening that accepts the micro-optical display.

Fig. 20.1 is an enlarged view of an eyewire 2011 and a micro-optical display 2012 in accordance with a preferred alternative embodiment. As can be seen in this embodiment, the micro-optical display 2012 may be coupled to the eyewire 2011 and may be supported by it. In this embodiment, a spectacle lens (not shown) would be placed into the eyewire 2011 around the micro-optical display. By mounting the micro-optical display 2012 directly to the eyewire 2011, a spectacle lens can be interchangeably mounted to the eyewire. In other words, a user can have a prescription spectacle lens inserted into the eyewire 2011 and then, later, should their prescription change, the original spectacle lens can be readily removed and the new spectacle lens can be placed within the eyewire 2011. As previously noted, the corrective lens power of the micro-optical display can be independent of the spectacle lens. Fig. 20.2 is a front view of a spectacle lens 2024 that has been notched to fit around a micro-optical display (not shown).

Fig. 21 is a side view of another alternative embodiment of the present invention. As can be seen in Fig. 21, a microprocessor 2170 and battery 2160 are coupled to a temple 2115 of eyewear 2100. The micro-optical display 2140, in Fig. 21, has a micro-character display 2130 coupled to it. In addition, the micro-optical display 2140 is positioned within the spectacle lens 2110. In this embodiment, the micro-optical display 2140 protrudes from both the rear surface 2181 and the front surface 2182 of the spectacle lens 2110.

Fig. 22 is a side view of another alternative embodiment of the present invention wherein the micro-optical display 2211 is mounted completely on the inside face of the spectacle lens 2210. As can be seen, the micro-optical display device 2211 is coupled to a pivot ball 2212 that is coupled to a mounting pad 2213. This pivot ball 2212 acts as a ball joint for the micro-optical display 2211 and allows the micro-optical display 2211 to be positioned in various positions of alignment relative to the eye 2214 of the user. The micro-optical display 2211 is in communication with the mounting pad 2213, which has two mounting pins 2216. These mounting pins 2216 releasably anchor the mounting pad 2213 to the lens and also provide the necessary signal and power connections for the micro-optical display 2211. Consequently, as required by the user, the micro-optical display 2211 may be relocated to other positions on the lens by unplugging the mounting pad 2213 and re-plugging it into pin openings at a different location on the face of the spectacle lens 2210.

Fig. 23 is the inside view of eyewear in accordance with another alternative embodiment of the present invention. As can be seen, a sliding track 2320 is present in the eyewires 2310 of the eyewear 2350. The sliding track 2320 provides a guide for the micro-optical display 2330 to slide back and forth within. By providing the sliding track 2320 for the micro-optical display 2330, a user can more comfortably align and position the micro-optical display 2330 for their viewing.

Fig. 24 is a front perspective view of another alternative embodiment of the present invention. Two micro-optical displays 2410 are utilized in this embodiment. As mentioned earlier, by providing two micro-optical displays, three-dimensional images may be projected into the field of view of the user. These two micro-optical displays 2410 may be rigidly connected to extenders 2400 and the extenders 2400 may be coupled to the temples 2440 of the eyewear 2420. Buttons 2430 may be integral parts of the transceiver 2460 which can be coupled to the temple 2440 of the eyewear 2420. These buttons 2430 may be used to control the movement of the extenders 2400 and, consequently, the position of the micro-optical displays 2410 between the lens and the user's eye.

Fig. 25.0 is a side view of the embodiment illustrated in Fig. 24. The connection of the extenders 2400 to the temple 2440 is clearly seen in this view. In Fig. 25.0, the extender 2400 is coupled to the temple 2440 of the eyewear and the micro-optical display 2410 is suspended behind the spectacle lens 2500, closest to the eye 2501. Alternatively, while the micro-optical display 2410 is shown between the spectacle lens 2500 and the user, it may also be placed in front of the lens such that the viewer would view the micro-optical display 2410 through the spectacle lens 2500. If the user were to view the micro-optical display 2410 through the spectacle lens 2500, the micro-optical display would not need to correct the refractive error of the user. Comparatively, and as discussed above, if the user were to directly view the images in the micro-optical display 2410, the micro-optical display 2410 would need to correct for the vision impairments of the user. Therefore, if the micro-optical display is located behind the lens or adjacent to the lens, such that the user does not look through the spectacle lens prior to looking into the micro-optical display, the user's refractive error needs to be corrected by the micro-optical display.

Fig. 25.1 is another alternative embodiment of the present invention. In this embodiment, rather than utilizing the micro-optical display discussed above, a display 2510, containing organic light emitting diodes, is coupled to the outside surface 2515 of the spectacle lens 2511. This display 2510 is in optical communication with a lens 2512 located on the inside surface 2516 of the lens 2511. This lens 2512 may contain an electro-active layer 2513. An advantage of utilizing the organic light emitting diodes in this embodiment is that the display may be transparent until it is activated.

As noted, the display 2510 in Fig. 25.1 is fixed to the outside surface 2515 of the lens 2511. This may be accomplished by attaching the display 2510 through a variety of available bonding techniques known to those of skill in the art. The electro-active diffractive layer 2513, which is located on the inside surface 2516 of the lens 2511, may be approximately the same size as or larger than the display 2510. The display 2510 operates with a conventional fixed focal length lens 2512 in contact with the diffractive layer 2513 which, when activated through an applied voltage, works in combination with the lens 2512 to magnify and project the image into the field of view of the user.

Fig. 26 is a rear view of eyewear 2630 in accordance with yet another alternative embodiment of the present invention. In this embodiment, light detectors 2610 are placed on the user's side of the eyewires 2620 along with light sources 2600. These light sources 2600 and light detectors 2610 work in tandem to provide various services to the wearer of the eyewear 2630. This eyewear may contain lenses that assist the wearer in driving a motor vehicle. For example the lenses may be treated such that they grow lighter or darker depending on how much ambient light exists and is passing through the lenses into the eyes of the wearer. This treatment may include chemical methods that react to varying degrees of radiation and electrical methods that apply a voltage across the lens or use electricity in some other fashion to adjust the translucence of the lenses.

In use, the light source 2600, which is aimed at the user's eye, is activated. Light reflected by the user's eye from the light source 2600 will then be detected by the detectors 2610. When the user's eye is closed, the amount of light reflected back to the detectors 2610 will be different from when the user's eye is opened. In some cases, the wavelength of light may be such that the eye is more reflective than the lid. In other cases and at other wavelengths, the reverse may obtain. It is also possible to reflect light at grazing incidence (Snell's angle) so that total internal reflection occurs at the eye.

Utilizing this information, the eyewear can determine when the user's eyes are opened and closed. This information is then compared to reference information stored in the eyewear's microprocessor. Such information allows for normal blinking but may not allow for more prolonged lid closure, which could be a precursor to sleep. One foreseeable application of this apparatus would be to utilize it when a user is driving. Should a user fall asleep while driving, the eyewear's microprocessor in this embodiment could initiate an audible alarm or could alternatively flash an intense light using light sources 2600 to awaken the user. The user (or wearer, as used interchangeably throughout the application), wearing the eyewear, may adjust the eyewear to change the period of acceptable eye closure versus the period of unacceptable eye closure. In other words, the wearer may adjust the time his eyes may remain shut before the eyewear interprets the closure as a period of falling asleep rather than a normal blinking cycle. In the alternative, the eyewear may be preset for lid closure response by the manufacturer and not adjustable by the wearer.
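The closure-versus-blink logic described above can be sketched as a simple timing loop. The sampling period and the 0.4-second closure limit are hypothetical, user-adjustable values; they stand in for whatever the microprocessor and light detectors actually provide.

```python
# Sketch of the closed-eye timing logic: normal blinks (short closures)
# are ignored, while a closure that persists past the configured limit
# raises an alarm. Consecutive closed samples are counted as integers
# to avoid floating-point drift in the accumulated time.
def detect_drowsiness(samples, sample_period_s=0.05, max_closure_s=0.4):
    """samples: sequence of booleans, True = eye detected closed."""
    limit = round(max_closure_s / sample_period_s)  # e.g. 8 samples
    closed = 0
    alarms = []
    for i, is_closed in enumerate(samples):
        if is_closed:
            closed += 1
            if closed >= limit:
                alarms.append(i)   # alarm would fire at this sample
                closed = 0         # reset after alerting the wearer
        else:
            closed = 0             # eye opened: a normal blink ended
    return alarms

blink = [True] * 4   # 0.2 s closure: a normal blink, no alarm
doze = [True] * 10   # 0.5 s closure: exceeds the 0.4 s limit
print(detect_drowsiness(blink + [False] + doze))  # -> [12]
```

In the apparatus, the alarm event would trigger the audible tone, the intense light, or one of the vehicle-side alerts rather than simply being recorded.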

In addition, rather than sounding an audible alarm associated with the eyewear, the eyewear may be in communication with the passenger vehicle being operated by the user and an alarm may be generated by the passenger vehicle. For instance, the passenger vehicle may initiate various types of alarms including sounding its horn, turning on its interior lights, or raising the volume of its radio. Alternatively, the passenger vehicle may move or shake the driver's seat, turn on the air conditioning, or perhaps disengage its speed control. As can be seen, the passenger vehicle may perform any number of automated tasks to stimulate the wearer and alert him that he may be falling asleep. The passenger vehicle described herein may be any vehicle that can be operated by a user, including sailboats, construction equipment, military vehicles, automobiles, aircraft, and spacecraft.

The eyewear may also have other alarm mechanisms designed to awaken or alert the wearer. For example, the eyewear may generate an electric shock or vibrate to alert the wearer. Consequently, as can be seen, numerous methods may be employed to awaken or alert the wearer once it is determined that the wearer may be falling asleep.

Benefits other than the stimulation of a driver also exist. For example, this embodiment may also be used to alert individuals who need to stay alert and vigilant. These individuals include: pilots, ship captains, astronauts, construction workers, military personnel, surveillance patrol personnel, shift workers, and production line employees. By monitoring the eye movements of these individuals and determining if they may be falling asleep, this embodiment of the invention may assist them in more accurately and more safely performing their assigned tasks and duties. Specifically, for example, a surveillance sentry in the military or a civilian night watchman in the middle of their shift may be alerted that they are at risk of falling asleep and that they should take precautionary measures to stay awake so that they may fulfill the duties of their patrol. Similarly, quality review employees, responsible for monitoring a production line, can also benefit from this embodiment as they may be alerted that they are at risk of falling asleep in the middle of their shift. Consequently, numerous applications of this embodiment of the present invention are plausible.

In addition to alerting the user, the eyewear may also be utilized by a global positioning system to pinpoint the wearer's location and to communicate with a third party such as an emergency rescue facility. This third party may be alerted of the user's status, i.e., that the user of the eyewear is falling asleep and is in jeopardy of being involved in an accident. Likewise, rather than the eyewear being in communication with a global positioning system, the passenger vehicle may, instead, be in communication with a global positioning system that can pinpoint the wearer's location and send help or alert a third party, should the wearer fall asleep. This third party could include emergency response personnel as well as a monitoring group specifically formed to monitor the eyewear. Finally, for the purposes of the invention, the eyewear could be any frame structure of some type that holds the light transmission means or the light detector means. Such eyewear could be attached to a helmet, hat, eyeglasses, etc.

Fig. 27 provides a view of another alternative embodiment of the present invention. As can be seen in Fig. 27, eyewear 2795, which resembles sports goggles, may also be employed in the present invention. In Fig. 27, the eyewear 2795 contains: a speaker 2770, a battery strip 2730, a camera 2780, a microphone 2790, a micro-optical display 2700, an expandable/retractable arm 2785, and a motor 2710.

Fig. 28 illustrates an emergency response system in accordance with another alternative embodiment of the present invention. As can be seen, the PCCSU 2830 in this alternative embodiment is in wireless communication (as depicted by arrow 2870) with the remote wireless sensor 2810 located on the body of a user 2800. The PCCSU 2830 is also in wireless communication with 911 emergency services 2840 (as depicted by arrow 2860) and with the eyewear 2850 (as depicted by arrow 2860). In use, should the remote wireless sensor 2810 sense that a user is having heart troubles or perhaps has fallen, the PCCSU 2830 would detect an appropriate signal and would then take appropriate steps to resolve the situation.

If the PCCSU 2830 detected that the user 2800 had fallen, the PCCSU 2830 would determine if the user was still conscious by sounding an audible signal or by flashing some lights in the eyewear 2820. If no response was detected, the PCCSU would contact emergency services 2840. Similarly, if the PCCSU 2830 were to receive a signal alerting it of a heart attack in the user, potentially through sensors located near the user's chest or temple, the PCCSU 2830 would immediately contact emergency services 2840 via a wireless network. In so doing, a wearer would have access to emergency services, such as 911, even if she had become unconscious or had fallen and were otherwise unable to contact the emergency services. The PCCSU 2830 in this medical embodiment may possess GPS capabilities and notify 911 of the exact location of the user.

Fig. 29 is a front perspective view of a charging data link unit 2930 in accordance with another alternative embodiment of the present invention. In Fig. 29, a charging unit 2930 having a high-speed infrared data link 2950 is shown charging the PCCSU 2910, the eyewear 2940, and the additional earpieces 2960, which contain additional batteries. When a user needs to charge the batteries in the eyewear or, alternatively, to download data to the PCCSU or the eyewear, the user would place the eyewear 2940 and the PCCSU 2910 into the charging data link unit 2930. Then, as required, the battery located either in the complete eyewear or just the spare temple would be charged. Further, the appropriate data could be transferred to the PCCSU and the eyewear through the high-speed data link 2950.

Numerous other embodiments of the present invention are also possible. For example, the light sources in the eyewear may be utilized to silently signal the arrival and urgency of an incoming message. This alert may then be received either visually, through the micro-optical display, or audibly, through speakers or earphones. Similarly, various sensors may be incorporated into the ICS to provide navigational or other real-time functions. For example, a user would be able to discern his or her position using the Global Positioning System. Moreover, the ICS could interact with the automobile of the user so that directions, the speed of the vehicle, or other vehicle warning lights could be displayed by the micro-optical display.

In addition, rather than utilizing the ICS to find the user's global position, the ICS may also be utilized to find the user's position in smaller areas. For example, prior to entering a commercial store, a list of items to be purchased may be entered into the PCCSU's memory. Then, upon entering the store, the PCCSU would begin to communicate with a device, located in or at the commercial store, which can communicate with the PCCSU. Through this interactive communication between the PCCSU and the store's device, the user would be guided through the store, up and down the aisles or through the departments, in the most efficient manner. Furthermore, as the user approaches each item on the list, the user would be alerted to his or her proximity to the item by a signal to collect the item.
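The in-store guidance exchange described above can be illustrated with a short sketch: the PCCSU holds the shopping list, the store's device supplies item locations, and the user is signaled on approach. All function names, the aisle-number layout, and the routing rule (a simple sort by aisle) are hypothetical simplifications, not taken from the patent.

```python
def plan_route(shopping_list, store_layout):
    """Order the list items by aisle for an efficient walk through the store.

    store_layout maps item name -> aisle number (hypothetically supplied by
    the store's device). Items the store cannot locate go at the end.
    """
    located = [item for item in shopping_list if item in store_layout]
    missing = [item for item in shopping_list if item not in store_layout]
    return sorted(located, key=lambda item: store_layout[item]) + missing

def proximity_alert(user_aisle, next_item, store_layout):
    """Signal the user to collect the next item when in its aisle."""
    return store_layout.get(next_item) == user_aisle
```

For example, with a layout of `{"bread": 1, "eggs": 2, "milk": 3}`, the route visits bread, then eggs, then milk, with unlocatable items deferred to the end.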

In still another alternative embodiment, an earphone aid, which attaches to the user's ear, is utilized to assist the user in hearing audible information. Furthermore, a video camera may be integrated into or coupled to the eyewear, allowing video and/or still images to be taken. These images would then be transmitted to the PCCSU and then re-transmitted over a wireless network. The camera in this embodiment may be placed in any location within or on the eyewear.

While manual controls have been described above, the various embodiments of the present invention may also be controlled through voice commands or through other interactive commands. For example, the light detectors can be utilized to detect a series of blinks, which can then control various aspects of the present invention, including the micro-optical display and the PCCSU. Similarly, portions of the eyewear or the PCCSU may be powered down to conserve energy when particular functions are not needed. Moreover, while each of the above embodiments has included eyewear containing a full spectacle lens, the eyewear embodied in the present invention is not so limited: the eyewear may also include multi-focal lenses, half-eye frames, minimal frames without lenses, and any other configuration wherein the eyewear can be mounted to the face of the user such that a micro-optical display can be positioned in the viewing range of the user.
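The blink-based control described above requires distinguishing deliberate blinks from both ordinary flickers and the prolonged closures that indicate drowsiness. The following sketch is one hypothetical way to do that from the light detector's eye-state samples; the sample-count thresholds and the command mapping are illustrative assumptions, not specified by the patent.

```python
def count_deliberate_blinks(samples, min_len=2, max_len=5):
    """Count closed-eye runs whose length (in detector samples) suggests a
    deliberate blink rather than a flicker (too short) or drowsiness
    (too long -- handled by the doze alarm instead).

    samples: sequence of booleans, True meaning the eye reads as closed.
    """
    blinks, run = 0, 0
    for closed in list(samples) + [False]:  # sentinel flushes a trailing run
        if closed:
            run += 1
        else:
            if min_len <= run <= max_len:
                blinks += 1
            run = 0
    return blinks

def blink_command(samples):
    """Map a blink count to a hypothetical eyewear command, if any."""
    commands = {2: "toggle_display", 3: "answer_call"}
    return commands.get(count_deliberate_blinks(samples))
```

A two-blink sequence would thus toggle the micro-optical display, while a single flicker or a long drowsy closure triggers no command.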

In still another embodiment, the lenses contained within the eyewear may be electro-active powered lenses which may change power and/or tint electro-actively. In this alternative embodiment the user is able to actuate changes in the lenses' power or color through voice commands or manual switching. Thus the present invention provides for a versatile Integrated Communication System. The above-disclosed embodiments are illustrative of the various ways in which the present invention may be practiced. Other embodiments can be implemented by those skilled in the art without departing from the spirit and scope of the present invention.
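The core doze-alarm method running through the embodiments above (direct a light at the eye, measure its reflection, classify the eye as open or closed, and alarm when closure persists for a predetermined length of time) can be sketched as follows. The reflectance threshold, the sample count standing in for the "predetermined length of time," and all names are illustrative assumptions; the patent does not specify numeric values.

```python
CLOSED_REFLECTION_THRESHOLD = 0.6  # hypothetical: a closed lid reflects
                                   # the directed ray differently than the
                                   # open eye, shifting the detector reading
ALARM_AFTER_SAMPLES = 3            # hypothetical "predetermined length"

def eye_closed(reflection):
    """Classify one normalized reflectance reading from the light detector."""
    return reflection > CLOSED_REFLECTION_THRESHOLD

def monitor(reflections):
    """Return True when any run of closed-eye readings reaches the
    predetermined length, i.e. the microprocessor should generate the
    alarm signal (audible, visual, vibratory, or electrical)."""
    run = 0
    for reading in reflections:
        run = run + 1 if eye_closed(reading) else 0
        if run >= ALARM_AFTER_SAMPLES:
            return True
    return False
```

In the eyewear of Fig. 6, `monitor` would run continuously on the frame-mounted microprocessor, with the alarm signal routed to the audio speaker or another stimulator.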

Claims

What Is Claimed Is:
1. A method of notifying a user comprising: (a) directing a light into the eye of a user; (b) measuring the reflection of the light directed at the eye of the user; (c) determining, with the measurement of step (b), whether the user's eye is open or closed; and (d) activating an alarm when the user's eye is closed for a predetermined length of time.
2. The method of claim 1 wherein the alarm is an audible alarm.
3. The method of claim 1 wherein the alarm is a visual alarm.
4. The method of claim 1 wherein the alarm is a spike of electrical energy sent into the user.
5. The method of claim 1 wherein the alarm is a vibrating alarm.
6. The method of claim 5 wherein the vibrating alarm is located within a seat of a passenger vehicle.
7. The method of claim 1 further comprising: detecting the location of a passenger vehicle being operated by the user through the use of a global positioning system; and, notifying a third party of the location of the user and the status of the user.
8. The method of claim 7 wherein the third party communicates with emergency response personnel.
9. A system for awakening a user comprising: a light source; a light detector, the light detector positioned to receive reflected light, generated by the light source, that has been reflected off of the eye of the user; and, a microprocessor in communication with the light detector and in communication with the light source, the microprocessor adapted to generate an alarm signal when the microprocessor determines that the amount of reflected light detected by the light detector indicates that the user's eye is closed for a period of time.
10. The system of claim 9 further comprising: an audio speaker, the audio speaker in communication with the microprocessor, the audio speaker adapted to generate an audible sound capable of awakening the user.
11. The system of claim 9 wherein the light source is adapted to generate a visual alarm capable of awakening the user.
12. The system of claim 9 further comprising: a head mounted frame, the head mounted frame connected to the light source and the light detector.
13. The system of claim 9 further comprising: an electrical stimulator, the electrical stimulator in communication with the microprocessor and adapted to generate a spike of electrical energy sent into the body of the user.
14. The system of claim 9 further comprising: a passenger vehicle, the passenger vehicle in communication with the microprocessor and the passenger vehicle adapted to stimulate the user upon receiving an instruction from the microprocessor.
15. The system of claim 9 wherein the microprocessor is further adapted to communicate with a global positioning system to determine the location of the user.
16. The system of claim 9 wherein the microprocessor is further adapted to communicate over a wireless network to a third party to provide the status of the user to the third party.
17. Eyewear for a user comprising: a head-mounted eyewear frame; a light source supported by the frame, the light source positioned on the frame to be able to send a ray of light at an eye of the user; a light detector supported by the frame, the light detector positioned to detect a reflection of the ray of light sent to the user's eye by the light source; an audio speaker supported by the frame; and a microprocessor supported by the frame, wherein the microprocessor is adapted to generate an alarm signal when the light detector senses that the user's eye has been closed for a predetermined period of time.
18. The eyewear of claim 17 further comprising: a spectacle lens coupled to the frame, the spectacle lens adapted to regulate the amount of ambient light passing through the lens and entering the user's eye.
19. The eyewear of claim 17 wherein the microprocessor is further adapted to communicate with a third party.
20. The eyewear of claim 17 further comprising: an electrical power source supported by the frame; and, a capacitor in communication with the electrical power source, the capacitor adapted to quickly discharge an electrical charge into the user upon receiving a signal associated with the user's eye being closed for a predetermined period of time.
PCT/US2001/044068 2000-11-28 2001-11-26 Integrated method and system for communication WO2002045044A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US72329000A true 2000-11-28 2000-11-28
US09/723,290 2000-11-28

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU3047402A AU3047402A (en) 2000-11-28 2001-11-26 Integrated method and system for communication

Publications (1)

Publication Number Publication Date
WO2002045044A1 true WO2002045044A1 (en) 2002-06-06

Family

ID=24905623

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/044068 WO2002045044A1 (en) 2000-11-28 2001-11-26 Integrated method and system for communication

Country Status (3)

Country Link
AR (1) AR031435A1 (en)
AU (1) AU3047402A (en)
WO (1) WO2002045044A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4953111A (en) * 1987-02-12 1990-08-28 Omron Tateisi Electronics Co. Doze detector
US5729619A (en) * 1995-08-08 1998-03-17 Northrop Grumman Corporation Operator identity, intoxication and drowsiness monitoring system and method
US5867587A (en) * 1997-05-19 1999-02-02 Northrop Grumman Corporation Impaired operator detection and warning system employing eyeblink analysis

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1501061A1 (en) * 2003-07-23 2005-01-26 Politechnika Warszawska Device for detecting reduced vigilance condition
EP1913566A1 (en) * 2005-08-11 2008-04-23 Sleep Diagnostics Pty Ltd Alertness sensing spectacles
EP1913566A4 (en) * 2005-08-11 2010-01-20 Sleep Diagnostics Pty Ltd Alertness sensing spectacles
US8678581B2 (en) 2010-04-13 2014-03-25 Pixeloptics, Inc. Attachable electro-active lens systems
WO2011130374A1 (en) * 2010-04-13 2011-10-20 PixelOptics Attachable electro-active lens systems
CN103534739A (en) * 2011-05-26 2014-01-22 费利切·欧金尼奥·阿格罗 Anti-sleep glasses
US20140303690A2 (en) * 2011-05-26 2014-10-09 Felice Eugenio Agro Anti-sleep glasses
WO2012160205A1 (en) * 2011-05-26 2012-11-29 Agro Prof Felice Eugenio Anti-sleep glasses
CN103927007B (en) * 2014-04-09 2017-08-08 惠州Tcl移动通信有限公司 Intelligent device and method for detecting fatigue of users on basis of intelligent device
CN103927007A (en) * 2014-04-09 2014-07-16 惠州Tcl移动通信有限公司 Intelligent device and method for detecting fatigue of users on basis of intelligent device
EP2946980A1 (en) * 2014-05-23 2015-11-25 Valeo Vision Driving-assistance device including driving-assistance spectacles
FR3021282A1 (en) * 2014-05-23 2015-11-27 Valeo Vision Driver assisting device comprising driving helmets
US9635222B2 (en) 2014-08-03 2017-04-25 PogoTec, Inc. Wearable camera systems and apparatus for aligning an eyewear camera
US9823494B2 (en) 2014-08-03 2017-11-21 PogoTec, Inc. Wearable camera systems and apparatus and method for attaching camera systems or other electronic devices to wearable articles
US10185163B2 (en) 2014-08-03 2019-01-22 PogoTec, Inc. Wearable camera systems and apparatus and method for attaching camera systems or other electronic devices to wearable articles
US9628707B2 (en) 2014-12-23 2017-04-18 PogoTec, Inc. Wireless camera systems and methods
US9930257B2 (en) 2014-12-23 2018-03-27 PogoTec, Inc. Wearable camera system
US10348965B2 (en) 2014-12-23 2019-07-09 PogoTec, Inc. Wearable camera system
US10241351B2 (en) 2015-06-10 2019-03-26 PogoTec, Inc. Eyewear with magnetic track for electronic wearable device
US10341787B2 (en) 2015-10-29 2019-07-02 PogoTec, Inc. Hearing aid adapted for wireless power reception

Also Published As

Publication number Publication date
AR031435A1 (en) 2003-09-24
AU3047402A (en) 2002-06-11

Similar Documents

Publication Publication Date Title
US10185147B2 (en) Enhanced optical and perceptual digital eyewear
US8696113B2 (en) Enhanced optical and perceptual digital eyewear
US4869575A (en) Headwear-mounted periscopic display device
US7310072B2 (en) Portable communication display device
US5003300A (en) Head mounted display for miniature video display system
US5714967A (en) Head-mounted or face-mounted image display apparatus with an increased exit pupil
CN1047711C (en) Image display means
US6356392B1 (en) Compact image display system for eyeglasses or other head-borne frames
US9101459B2 (en) Apparatus and method for hierarchical object identification using a camera on glasses
US8079713B2 (en) Near eye display system
US5079416A (en) Compact see-through night vision goggles
US8531355B2 (en) Unitized, vision-controlled, wireless eyeglass transceiver
TWI599796B (en) Wearable device having an input and an output structure
US7118212B2 (en) Image display device
US6452572B1 (en) Monocular head-mounted display system
EP0509090B1 (en) Head mounted video display
EP1143326A2 (en) Computer display optimizer
US8587514B2 (en) Device for controlling an external unit
US8970962B2 (en) Visor heads-up display
JP2910111B2 (en) Eyeglass-type retina direct display device
US5812100A (en) Image display apparatus
KR101977433B1 (en) Wearable device with input and output structures
US6005536A (en) Captioning glasses
US7798638B2 (en) Eyeglasses with integrated video display
US20120019662A1 (en) Eye gaze user interface and method

Legal Events

Date Code Title Description
AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
WWW Wipo information: withdrawn in national office

Country of ref document: JP

NENP Non-entry into the national phase in:

Ref country code: JP