WO2018066962A1 - Smart Glasses - Google Patents

Smart Glasses

Info

Publication number
WO2018066962A1
WO2018066962A1 (PCT/KR2017/011065)
Authority
WO
WIPO (PCT)
Prior art keywords
image
lens
smart glasses
main
mirror
Prior art date
Application number
PCT/KR2017/011065
Other languages
English (en)
Korean (ko)
Inventor
문명일
Original Assignee
엠티스코퍼레이션(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020170117369A (published as KR20180037887A)
Application filed by 엠티스코퍼레이션(주)
Priority to US16/339,373 (granted as US11378802B2)
Publication of WO2018066962A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/02Simple or compound lenses with non-spherical faces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0112Head-up displays characterised by optical features comprising device for generating colour display
    • G02B2027/0114Head-up displays characterised by optical features comprising device for generating colour display comprising dichroic elements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility

Definitions

  • An embodiment of the present invention relates to smart glasses, and more particularly, to smart glasses having a display device.
  • A head-mounted display (HMD) refers to a digital device that is worn on the head like glasses and displays a virtual image at a position close to the eyes.
  • HMDs have evolved beyond simple display functions into devices combined with augmented reality (AR) technology.
  • An HMD provides various conveniences through a screen in front of the user's eyes: it can communicate with an external digital device to output the corresponding content, receive user input on behalf of that device, and operate in conjunction with it.
  • Augmented reality is a technology that shows 3D virtual objects superimposed on the real world; it refers to a hybrid virtual reality that fuses the real and virtual environments.
  • This hybrid virtual reality is used in fields such as the military, entertainment, medicine, education, film, architectural design, and tourism, and is gradually being applied to real life beyond the imaginary stage depicted in science fiction and film.
  • Virtual reality systems include window systems, mirror systems, vehicle-based systems, and augmented reality systems. These systems are basically divided into output devices and input devices.
  • Output devices allow users of virtual reality systems to perceive sight, hearing, touch, and motion through their sensory channels. They include visual, auditory, and tactile display devices as well as motion feedback and display devices. Representative visual display hardware includes HMDs and smart glasses.
  • Presence originally refers to the sense of being in a certain environment; in this sense, remote presence (telepresence) allows a user to experience being present in a remote environment by means of communication media.
  • Conventional smart glasses supply images to only one eye, either the left or the right, so the user watches multimedia content with a single eye; as a result, dizziness and fatigue increase rapidly even after a short viewing time.
  • An object of the present invention is to provide smart glasses that reflect the multimedia content of a single display unit in both directions, magnify it through magnifying lenses, and reflect it to both eyes so that the same content is observed with both eyes, and that can selectively implement augmented reality and virtual reality using 3D content as needed.
  • Another object of the present invention is to provide smart glasses with which the multimedia content implemented on the display can be watched or interacted with simultaneously by both eyes.
  • Another object of the present invention is to provide smart glasses that can implement augmented reality and virtual reality together in a head-up display device.
  • Smart glasses according to an aspect of the present invention include: a main frame having a spectacle-frame form; a support frame coupled to a central portion of the main frame; a first display supported by the support frame and displaying a first image; a second display supported by the support frame and displaying a second image; a first mirror reflecting the first image; a second mirror reflecting the second image; a first main lens configured to provide a first main image from the central portion to the inside of the main frame based on the first image reflected by the first mirror; and a second main lens configured to provide a second main image from the central portion to the inside of the main frame based on the second image reflected by the second mirror.
  • The smart glasses may include a first auxiliary lens disposed between the first mirror and the first main lens, and a second auxiliary lens disposed between the second mirror and the second main lens.
  • The first auxiliary lens and the second auxiliary lens enlarge the first image and the second image, respectively.
  • the first or second auxiliary lens may include a plurality of aspherical lenses.
  • The first main lens and the second main lens are formed of plate-shaped reflective and transmissive members and are disposed inclined at a predetermined angle to the central axes of the first and second auxiliary lenses, respectively.
  • the thickness of the first main lens or the second main lens is several millimeters or less.
  • the smart glasses may further include a communication and power board accommodated in the left or right frame portion of the main frame and connected to any one or more of the first and second displays.
  • The communication and power board is connected by wire to an external control and power supply device and relays signals and data between the control and power supply and the first and second displays.
  • both the application processor and the battery included in the control and power supply are disposed outside the smart glasses.
  • The first display and the second display have a display interface for control.
  • the smart glasses further include a front camera installed on the central outer surface of the main frame to photograph the front and connected to the control and power supply through the communication and power board.
  • the front camera includes a first front camera and a second front camera spaced apart at regular intervals.
  • the smart glasses further include a rear camera or sensor installed on the inner side of the central portion of the main frame to photograph the rear side and connected to the control and power supply through the communication and power board.
  • the rear camera or sensor detects the eyelid of the user.
  • The smart glasses further include an actuator or driver that is housed in at least one of the left frame portion and the right frame portion of the main frame and operates in response to signals from the rear camera, the sensor, or the control and power supply.
  • The smart glasses further include a speaker that is housed in at least one of the left frame portion and the right frame portion of the main frame and outputs an alarm in response to a signal from the rear camera, the sensor, or the control and power supply.
  • The control and power supply includes a control device and a power supply device; at least one of them is supported by the main frame or mounted on an external device, and is connected to the communication and power board via a communication and power line.
  • the main frame having a spectacle frame form;
  • a support frame coupled to the central portion of the main frame;
  • a first display supported by the support frame and outputting a first image in a first direction;
  • a first mirror that reflects the first image in a second direction perpendicular to the first direction;
  • a first main lens reflecting the first image reflected by the first mirror toward the eye of the user;
  • a first convex lens disposed between the first display and the first main lens to adjust a focal length of the virtual image of the first image viewed through the first main lens;
  • a second display supported by the support frame and outputting a second image in a first direction;
  • a second mirror that reflects the second image in a direction perpendicular to the first direction and opposite to the reflection direction of the first image;
  • a second main lens reflecting the second image reflected by the second mirror toward the user's eye;
  • a second convex lens installed between the second display and the second main lens to adjust a focal length of the virtual image of the second image viewed through the second main lens.
  • the first convex lens may be disposed between the first display and the first mirror, and the second convex lens may be disposed between the second display and the second mirror.
  • the first convex lens may be disposed between the first mirror and the first main lens, and the second convex lens may be disposed between the second mirror and the second main lens.
  • the first mirror and the second mirror may be a concave mirror.
  • the first main lens and the second main lens may be formed of plate-shaped reflective and transmissive members, and may be disposed inclined at a predetermined angle to the central axes of the first and second auxiliary lenses, respectively.
  • a main frame shaped to hook over one ear of the user and extending in length from that ear to the front of the user's adjacent eye;
  • a support frame coupled to the front portion of the main frame corresponding to one eye front;
  • a display supported by the support frame and outputting an image in a first direction;
  • a mirror that reflects the image in a second direction orthogonal to the first direction;
  • a main lens reflecting the image reflected from the mirror to the user's eyes;
  • a convex lens installed between the display and the main lens to adjust a focal length of a virtual image of an image viewed through the main lens.
  • the convex lens may be disposed between the display and the mirror, or between the mirror and the main lens.
  • One or two display units disposed at the center of the front surface of the smart glasses provide independent images to both eyes, which offers the advantage of a wide viewing angle.
  • The user's eyelids may be recognized through a rear camera or a sensor, and vibrations or alarms may be generated through an actuator, driver, or speaker housed in the device according to the recognition result, thereby warning a user, especially a drowsy driver, of danger.
  • Heat generated during long use of the device can be effectively discharged through the heat dissipation structure or heat sink disposed in the upper front portion of the main frame, providing operational stability and reliability for the smart glasses.
  • When the battery is removed from the smart glasses and power is instead supplied from an external control and power supply device, the weight of the smart glasses can be effectively reduced, increasing wearing comfort and ease of use.
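The eyelid-based warning described above reduces to a simple debounce over the detector output. A minimal sketch of that logic, with the rear camera or sensor abstracted as a stream of booleans; the function name and frame threshold are illustrative assumptions, not values from the patent:

```python
# Minimal sketch of the drowsiness-warning logic: eyelid detection itself
# (via the rear camera or sensor) is abstracted as a boolean stream.
# CLOSED_FRAMES_THRESHOLD is an assumed illustrative value.

CLOSED_FRAMES_THRESHOLD = 15  # e.g. ~0.5 s of closed eyes at 30 fps

def drowsiness_alarm(eye_closed_stream, threshold=CLOSED_FRAMES_THRESHOLD):
    """Yield True (fire the actuator/speaker alarm) whenever the eyes have
    been detected closed for `threshold` consecutive frames."""
    closed_run = 0
    for closed in eye_closed_stream:
        closed_run = closed_run + 1 if closed else 0
        yield closed_run >= threshold

# Example: 5 open frames then 20 closed frames in a row.
frames = [False] * 5 + [True] * 20
alarms = list(drowsiness_alarm(frames))
print(alarms.index(True))  # first frame index at which the alarm fires -> 19
```

In a real device the boolean stream would come from per-frame eyelid classification, and a True output would drive the vibration motor or speaker named in the embodiments.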
  • FIG. 1 is a perspective view of smart glasses according to an embodiment of the present invention.
  • FIG. 2 is a side view of the smart glasses of FIG. 1.
  • FIG. 3 is a perspective view illustrating an operating state of the smart glasses of FIG. 1.
  • FIG. 4 is a side view of the smart glasses of FIG. 3.
  • FIG. 5 is an exploded perspective view of the smart glasses of FIG. 1.
  • FIG. 6 is an exploded perspective view of smart glasses according to another embodiment of the present invention.
  • FIG. 7 is a perspective view of smart glasses according to another embodiment of the present invention.
  • FIG. 8 is a front view of the smart glasses of FIG. 7.
  • FIG. 9 is a partial projection front view for describing an internal structure of the smart glasses of FIG. 8.
  • FIG. 10 is a right side view of the smart glasses of FIG. 7.
  • FIG. 11 is a partial projection right side view for explaining the internal structure of the smart glasses of FIG. 10.
  • FIG. 12 is a partially exploded perspective view for explaining the components of the smart glasses of FIG. 7.
  • FIGS. 13 and 14 are diagrams for describing a display path of a display image of the smart glasses of FIG. 7.
  • FIG. 15 is a block diagram of a control board that may be employed in the smart glasses of FIG. 7.
  • FIG. 16 is a block diagram illustrating another control board and a power supply device that may be employed in the smart glasses of FIG. 7.
  • FIG. 17 is a perspective view of a smart glasses and a control system according to another embodiment of the present invention.
  • FIG. 18 is a block diagram illustrating a main configuration of the smart glasses of FIG. 17.
  • FIG. 19 is a block diagram illustrating an operation of a control apparatus of smart glasses according to another embodiment of the present invention.
  • FIGS. 20 and 21 are flowcharts for explaining the operation of the control device of the smart glasses according to another embodiment of the present invention.
  • FIGS. 22 to 25 are diagrams illustrating modifications of the optical system that may be employed in the smart glasses of FIG. 7.
  • FIG. 26 is a perspective view of smart glasses according to another embodiment of the present invention.
  • FIG. 1 is a perspective view of smart glasses according to an embodiment of the present invention.
  • FIG. 2 is a side view of the smart glasses of FIG. 1.
  • FIG. 3 is a perspective view illustrating an operating state of the smart glasses of FIG. 1.
  • FIG. 4 is a side view of the smart glasses of FIG. 3.
  • FIG. 5 is an exploded perspective view of the smart glasses of FIG. 1.
  • The smart glasses 100 include a first frame 50 and a second frame 51; the first frame 50 may be hinged to the second frame 51 through the first hinge portion 82 and the second hinge portion 83.
  • The first frame 50 may be used in a state where it is positioned above the spectacle lenses 137a and 138a.
  • the first frame 50 may include a display device, a mirror, a lens, and the like, and may reflect and enlarge an image of the display device to the user's eyes.
  • When a user wants to implement augmented reality or virtual reality using content transmitted from an external digital device, as shown in FIGS. 3 and 4, the user rotates the first frame 50 downward from above the spectacle lenses 137a and 138a so that a predetermined portion of it is positioned in front of the lenses; the contents of the smart glasses can then be used.
  • the multimedia content received from the external digital device may be output as it is, or may be combined or converted on the display unit 10.
  • The display unit 10 may be configured by selecting, according to the environment or the user's needs, any one of display devices such as a thin-film-transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, liquid crystal on silicon (LCoS), or digital light processing (DLP).
  • the content implemented in the display unit 10 is reflected by the reflector 20 in both directions.
  • the reflector 20 may be configured as an X-Prism.
  • the reflecting surface of the reflecting unit 20 where the content is reflected may be provided with a half mirror coating that can simultaneously reflect and transmit.
  • The reflecting unit 20 is not limited to an X-prism; any optical element capable of simultaneously reflecting and transmitting, such as a beam splitter, a half mirror, or a combination thereof, may be used.
  • the content reflected by the reflector 20 is enlarged while passing through the magnifying lens 30.
  • the magnification lens unit 30 may be configured with a plurality of aspherical lenses to enlarge the content and to eliminate spherical aberration and the like.
  • three aspherical lenses are used to enlarge the content and eliminate spherical aberration.
  • the three aspherical lenses may include a combination of optical elements in the form of one-sided convex lens, two-sided convex lens, and one-sided convex lens when viewed on the optical path starting from the display device. Techniques for designing using aspherical lenses to magnify content and eliminate spherical aberration are well known and are not described in detail here.
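As noted above, aspheric magnifier design is well known; for completeness, such surfaces are conventionally described by the standard even-asphere sag equation. A minimal sketch, where the curvature c, conic constant k, and higher-order coefficients are illustrative placeholders rather than values from this patent:

```python
# Standard even-asphere sag equation, as conventionally used to describe the
# aspherical lens surfaces mentioned above. All parameter values here are
# illustrative placeholders, not design values from the patent.
import math

def asphere_sag(r, c, k, coeffs=()):
    """Surface height z(r) at radial distance r:
    z = c*r^2 / (1 + sqrt(1 - (1+k)*c^2*r^2)) + sum(a_i * r^(2i+4))."""
    z = c * r**2 / (1 + math.sqrt(1 - (1 + k) * c**2 * r**2))
    for i, a in enumerate(coeffs):
        z += a * r ** (2 * i + 4)  # higher-order aspheric terms r^4, r^6, ...
    return z

# With k = 0 and no higher-order terms this reduces to a sphere of radius 1/c:
print(round(asphere_sag(1.0, 0.1, 0.0), 6))  # -> 0.050126 (= 10 - sqrt(99))
```

Choosing the conic constant and the r^4, r^6, ... coefficients is what allows a stack of such lenses to cancel spherical aberration while magnifying the display image.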
  • the enlarged content passing through the magnification lens unit 30 is reflected in both eyeball directions by the reflection and transmission unit 40.
  • The reflective and transmissive portion 40 not only reflects the enlarged content toward the eyeballs but also transmits light for realizing virtual reality and augmented reality, so its reflective surface may be provided with a half-mirror coating that reflects and transmits simultaneously.
  • a transparent prism is used as the reflective and transmissive portion 40.
  • The reflective and transmissive portion 40 is not limited to a transparent prism; any optical element capable of simultaneously reflecting and transmitting, such as a beam splitter or a half mirror, may be used.
  • The display unit 10, the reflecting unit 20, the magnifying lens unit 30, and the reflecting and transmitting unit 40 are seated in the first frame 50, specifically in the front case 52 provided in the first frame 50.
  • the display unit 10 receives the multimedia content from the main printed circuit board (PCB) 70 through the flexible printed circuit board (FPCB) 60.
  • the battery 61 is connected to the FPCB 60 to supply necessary power to the display unit 10.
  • the FPCB 60 is connected to transmit the multimedia content from the main PCB 70 to the display unit 10, and is connected to the battery 61 to supply power to the smart glasses 100a.
  • the main PCB 70 may be equipped with a wireless communication module (not shown) and a controller (not shown).
  • the wireless communication module receives multimedia content from an external digital device, and the controller can control driving of the main PCB 70 and the display unit 10.
  • The first frame 50 includes a first side case 80 and a second side case 81, and is hinged to the second frame 51 through the first hinge portion 82 and the second hinge portion 83.
  • The first frame 50 may rotate, via the first and second hinge parts, between a position above the spectacle lenses 137a and 138a and a position in front of them.
  • the main PCB 70 and the battery 61 may be seated on the inner side of the first side case 80 and the second side case 81.
  • The first frame 50 may further include a camera unit 90.
  • The second frame 51 may further include a power on/off button 91 and an indicator 92 such as a light-emitting diode (LED).
  • the operating principle of the smart glasses of this embodiment is as follows.
  • When the user wants to implement augmented reality or virtual reality using the smart glasses, the user rotates the first frame 50 of the smart glasses, located above the spectacle lenses 137a and 138a, downward so that it is positioned in front of the spectacle lenses.
  • the controller of the main PCB 70 may receive multimedia content from an external digital device through the wireless communication module and transmit the multimedia content to the display unit 10.
  • When the image content is output from the display unit 10, it is reflected in both directions by the reflector 20 and transmitted to the magnifying lens unit 30.
  • the reflector 20 may include an X-Prism that transmits one content in both directions.
  • the enlarged content passing through the magnification lens unit 30 is reflected by the reflection and transmission unit 40 and transmitted to both eyes.
  • the distance from the magnifying lens unit 30 to the enlarged image may be adjusted as necessary. That is, the length of the transmission path may be adjusted so that the image enlarged by the magnifying lens unit 30 is not a real image but a virtual image and is visible to the user through smart glasses.
  • The distance li from the magnifying lens unit 30 to the virtual image perceived by the user or driver is determined by the focal length f of the magnifying lens unit 30 and the distance lo from the display unit 10 to the magnifying lens unit 30.
  • These distances satisfy the thin-lens relation of Equation 1: 1/f = 1/lo − 1/li, where li is the (positive) distance to the virtual image, which lies on the same side of the lens as the display.
  • smart glasses may be designed and manufactured so that an enlarged image is formed at a desired position of a user.
  • For example, 20-inch content can be implemented about 2.4 meters from the smart glasses. Providing 20-inch content at roughly that distance allows the user to conveniently implement and use augmented reality or virtual reality as needed.
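The thin-lens relation 1/f = 1/lo − 1/li behind this example can be checked numerically. A minimal sketch, where the focal length, display-to-lens distance, and microdisplay diagonal are assumed illustrative values (the patent does not state actual design parameters), chosen so the virtual image lands near the quoted 2.4 m and 20 inches:

```python
# Thin-lens sketch of the virtual-image geometry described above.
# f (focal length of the magnifying lens unit), lo (display-to-lens distance)
# and the display diagonal are illustrative values, not taken from the patent.

def virtual_image_distance(f_mm, lo_mm):
    """Distance li to the virtual image, from 1/f = 1/lo - 1/li (valid for lo < f)."""
    return lo_mm * f_mm / (f_mm - lo_mm)

def magnification(f_mm, lo_mm):
    """Lateral magnification m = li / lo of the virtual image."""
    return virtual_image_distance(f_mm, lo_mm) / lo_mm

f, lo = 25.0, 24.74           # mm; lo placed just inside the focal length
li = virtual_image_distance(f, lo)
m = magnification(f, lo)
display_diag_mm = 5.3         # assumed microdisplay diagonal (~0.21 inch)

print(round(li / 1000, 2), "m")                    # -> 2.38 m to the virtual image
print(round(display_diag_mm * m / 25.4, 1), "in")  # -> 20.1 in virtual diagonal
```

Because lo is slightly less than f, a small change in lo moves the virtual image a long way, which is how the transmission-path length adjustment described above sets the apparent image distance.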
  • FIG. 6 is an exploded perspective view of smart glasses according to another embodiment of the present invention.
  • The smart glasses according to the present exemplary embodiment may include, in the center frame 51c of the second frame 51, main lenses 137 and 138 that also serve as spectacle lenses, instead of the spectacle lenses 137a and 138a.
  • The main lenses 137 and 138 may function as bidirectional reflectors that reflect the image of the display unit 10 toward the user's eyes while transmitting the scene in front of the eyes.
  • In this case, the above-described reflection and transmission unit 40 may be replaced by a reflecting member that performs only a reflecting function.
  • the smart glasses of the present embodiment are substantially the same as the smart glasses of the embodiment described above with reference to FIGS. 1 to 5 except for using the main lenses 137 and 138 instead of the glasses lenses 137a and 138a.
  • A repeated description of the elements and their associations is therefore omitted.
  • FIG. 7 is a perspective view of smart glasses according to another embodiment of the present invention.
  • FIG. 8 is a front view of the smart glasses of FIG. 7.
  • FIG. 9 is a partial projection front view for describing an internal structure of the smart glasses of FIG. 8.
  • FIG. 10 is a right side view of the smart glasses of FIG. 7.
  • FIG. 11 is a partial projection right side view for explaining the internal structure of the smart glasses of FIG. 10.
  • FIG. 12 is a partially exploded perspective view for explaining the components of the smart glasses of FIG. 7.
  • the smart glasses 100 may include a main frame 110, a first display 131, a second display 132, a first mirror 133, a second mirror 134, and a first main lens. 137 and a second main lens 138.
  • the smart glasses 100 may further include a first auxiliary lens 135 and a second auxiliary lens 136.
  • The smart glasses 100 may include a controller including a processor (see 260 of FIG. 16), a display interface 160 for the first and second displays 131 and 132, and a flexible printed circuit board (FPCB) 164 connecting them.
  • the controller may be referred to as a control and power supply, and may be embedded or have separate communication and power boards.
  • the communications and power boards may include connectors or means or components for relaying or interconnecting communications or power sources.
  • the smart glasses 100 may include one or more front cameras 151 and one or more rear cameras 153.
  • When the front camera 151 includes a first front camera and a second front camera spaced apart from each other by a predetermined distance, the input or application of 3D content may be facilitated.
  • the front or rear camera may be a kind of sensor or function as a sensor.
  • the rear camera 153 or the sensor may be a means for detecting a user's eyelid movement or a device that performs a function corresponding to the means.
  • the smart glasses 100 may further include an actuator or driver 194 vibrating according to a signal from a camera, a sensor, or a control and power supply device, or a speaker for outputting an acoustic signal in response to the signal.
  • Driver 194 may include a vibration motor.
  • the smart glasses 100 may further include a heat dissipation structure or a heat sink for dissipating heat generated from the first display 131, the second display 132, the display interface 160, and the like. At least one of the first display 131 and the second display 132 may be referred to as a display unit.
  • The main frame 110 is formed of a rigid material, has an approximately U-shape, and takes the form of a spectacle frame.
  • The main frame 110 may be formed as a single structure; alternatively, the frame shape may be formed by coupling a left frame portion 110a and a right frame portion 110b to both ends of a center frame portion 110c.
  • The left and right sides may be reversed from the user's point of view.
  • For ease of manufacture and light weight, the main frame 110 may have a coupling structure of a support frame disposed in the middle portion, a first side frame coupled to the left side of the support frame, and a second side frame coupled to the right side.
  • the support frame may be detachably formed of the first support frame 120a and the second support frame 120b to facilitate the support, arrangement, and assembly of the display, the mirror, the auxiliary lens, and the main lens.
  • The left frame portion and the first side frame are represented by the single reference numeral 110a, and the right frame portion and the second side frame by the single reference numeral 110b.
  • the support frame 120 having a shape in which the first support frame 120a and the second support frame 120b are coupled to each other may correspond to the front case 50 of FIG. 5.
  • the display interface 160 may be inserted into the central portion of the support frame 120 and may be electrically connected to the controller through a terminal of the FPCB 164 disposed on the upper center side of the support frame 120.
  • the display interface 160 is connected to the first display 131 and the second display 132.
  • the FPCB 164 may be installed to extend from the terminal at the center front portion to the first side frame 110a and the second side frame 110b along the upper surface of the main frame 110 or the support frame 120.
  • the first display 131 and the second display 132 are disposed on both sides of the box-shaped central portion of the support frame 120 to output the first image and the second image in opposite directions.
  • The first display 131 or the second display 132 may include at least one selected from thin-film-transistor liquid crystal display (TFT LCD), organic light-emitting diode (OLED), liquid crystal on silicon (LCoS), and digital light processing (DLP) based displays.
  • The first mirror 133, supported by the left inclined support structure of the support frame 120, is installed on the front surface of the first display 131 and may reflect the first image of the first display 131, traveling in the left direction, into a first downward direction substantially orthogonal to it.
  • Similarly, a second mirror 134, supported by the right inclined support structure of the support frame 120, is installed on the front surface of the second display 132 and may reflect the second image of the second display 132, traveling in the right direction, into a second downward direction substantially orthogonal to it.
  • The first and second downward directions may be parallel to each other, separated by a distance approximately equal to the distance between the user's two eyes.
  • the first image reflected by the first mirror 133 is enlarged by the first auxiliary lens 135.
  • the first auxiliary lens 135 may be inserted into and supported in the support structure having the uneven structure of the support frame 120.
  • the first auxiliary lens 135 enlarges and transmits the first image to the first main lens 137.
  • the second image reflected by the second mirror 134 is magnified by the second auxiliary lens 136.
  • the second auxiliary lens 136 may be inserted into and supported in the support structure having the uneven structure of the support frame 120.
  • the second auxiliary lens 136 enlarges the second image and transfers the second image to the second main lens 138.
  • Viewed from the image input side, the first auxiliary lens 135 or the second auxiliary lens 136 may have a stacked or superposed arrangement of a one-side convex (other-side concave) lens, a double-convex lens, and a one-side convex lens.
  • the first and second auxiliary lenses 135 and 136 described above are used to enlarge the first image and the second image to a desired size and to remove spherical aberration.
  • the first main lens 137 is disposed under the first auxiliary lens 135.
  • the first main lens 137 may have an arrangement structure in which a relatively thin plate-like lens member is inclined with respect to the central axis of the first auxiliary lens 135.
  • the first main lens 137 may be coupled to a lower portion of the support frame 120 or the center frame portion 110c (hereinafter, simply a center frame) on the lower side of the first auxiliary lens 135.
  • the first main lens 137, acting as a beam splitter, may reflect the first image passing through the first auxiliary lens 135 at approximately right angles toward the user's eye.
  • the first image descending from the first mirror 133 in a substantially vertical direction forms an image facing the user's eye and a virtual image facing the front of the user's eye with respect to the reflective surface of the first main lens 137.
  • the distance from the reflective surface to the location of the lower image or the image plane may be about 20 meters.
  • the second main lens 138 is disposed under the second auxiliary lens 136.
  • the second main lens 138 may have an arrangement structure in which a relatively thin plate-like lens member is inclined with respect to the central axis of the second auxiliary lens 136.
  • the second main lens 138 may be coupled to the lower portion of the support frame 120 or the center frame 110c at the lower side of the second auxiliary lens 136.
  • the second main lens 138, acting as a beam splitter, may reflect the second image passing through the second auxiliary lens 136 at approximately right angles toward the user's eye.
  • the second image descending from the second mirror 134 in a substantially vertical direction forms an image facing the user's eye and a virtual image facing the front of the user's eye with respect to the reflective surface of the second main lens 138.
  • the auxiliary lenses 135 and 136 described above, with their convex surfaces, may each be installed to include three aspherical lenses in order to enlarge an image or content and eliminate spherical aberration.
  • the nose pad member 140 may be connected to the central lower side of the support frame 120.
  • the nose pad member 140 may be made of at least a portion of a relatively soft material.
  • the soft material may include synthetic resin.
  • An upper cover 113 may be coupled to the central upper side of the main frame 110, an upper front cover 114 may be coupled to the central upper front side, and a front auxiliary cover 115 may be coupled to the uppermost center of the front.
  • the upper front cover 114 may cover the first and second support frames 120a and 120b and may be coupled to the center frame 120c, together with the rear cover 112c, by fastening means such as screws or bolts.
  • the front auxiliary cover 115 may cover the exposed lens portion of the front camera 151, in which case at least a portion of the front auxiliary cover 115 may be formed of a transparent or translucent material.
  • the front auxiliary cover 115 may include an opening or a through hole for exposing the lens of the front camera 151.
  • one end of the first side frame 110a may be connected to the left side of the support frame 120, and the other end thereof may be connected to the first side connection frame 121.
  • the first side connection frame 121 may be connected to a first side end frame 122.
  • the first side end frame 122 has an inner space for accommodating the first battery 172, and an opening of the inner space, facing the second side end frame 125 described later, may be detachably closed by the first flexible cover 123.
  • at least a portion of the first side connecting frame 121 and the first flexible cover 123 contacting the user's ear and the area behind it may be formed of rubber or a soft synthetic resin material to improve wearing comfort.
  • one end of the second side frame 110b may be connected to the right side of the support frame 120, and the other end thereof may be connected to the second side connection frame 124.
  • the second side connection frame 124 may be connected to a second side end frame 125.
  • the second side end frame 125 has an inner space for accommodating the second battery 171, and an opening of the inner space, facing the first side end frame 122 or the first flexible cover 123 described above, may be detachably closed by the second flexible cover 126.
  • at least a portion of the second side connecting frame 124 and the second flexible cover 126 contacting the user's left ear and the area behind it may be formed of rubber or a soft synthetic resin material to improve wearing comfort.
  • the first battery 171 and the second battery 172 may be referred to as a power supply device; depending on the implementation, only one battery may be mounted, or no battery may be mounted.
  • the first side cover 112a may be installed on an outer surface of the first side frame 110a to accommodate the first printed circuit board 161.
  • the first printed circuit board 161 may be connected to the first battery 172 and connected to one end of the FPCB 164.
  • a second side cover 112b may be coupled to an outer surface of the second side frame 110b to accommodate a second PCB 162.
  • the second printed circuit board 162 may be connected to the second battery 171 and may be connected to the other end of the FPCB 164.
  • the front sides of the first side frame 110a and the second side frame 110b may be provided with a projection or uneven portion for securing coupling force and positioning when coupling the shield (see 180 of FIG. 17).
  • a magnet, or a metal material to which a magnet attaches, may be provided near that surface, thereby facilitating coupling with the shield and maintaining a stable coupling state.
  • the first printed circuit board 161 may include at least one port or a first connector for transmitting and receiving data or receiving power from the outside.
  • the second printed circuit board 162 may include at least one port or a second connector for transmitting and receiving data or receiving power from the outside.
  • the first or second printed circuit boards 161 and 162 may include earphone terminals.
  • the front camera 151 may include a lens exposed on the front surface of the main frame 110 or the center frame 120c. Two front cameras 151 may be installed, but the present invention is not limited thereto. The lens of the front camera 151 or a peripheral portion thereof may be protected by the front auxiliary cover 115. The front camera 151 may be electrically connected to another end or extension of the FPCB 164.
  • the rear camera 153 may include a lens exposed at the rear of the main frame 110 or the support frame 120.
  • One rear camera 153 may be installed, but is not limited thereto.
  • the rear camera 153 may be electrically connected to another end or extension of the FPCB 164.
  • the processor may be disposed on the first or second printed circuit boards 161 and 162.
  • the processor may control the components of the smart glasses 100 to operate and manage the operation or function of the smart glasses 100.
  • the processor may include an application processor (AP).
  • the display interface 160 may control timing of signals transmitted to the first display 131 and the second display 132 under the control of the processor.
  • the signal may include an image signal.
  • the display interface 160 may include a dual and 3D MIPI DSI (Mobile Industry Processor Interface Display Serial Interface) suitable for driving the first and second displays 131 and 132 and for 3D content, or a means or apparatus performing a similar function.
  • the driver 194 may be built in the first or second side frames 110a and 110b.
  • the driver 194 may be electrically connected to the first or second printed circuit boards 161 and 162 and may operate in response to signals from the front camera 151, the rear camera 153, a sensor, or the processor to generate vibration.
  • for example, vibration may be generated as a drowsiness-prevention alarm for the user according to the signal of the rear camera 153.
  • a heat dissipation structure may be installed at an upper portion of the support frame 120 or a lower portion or a portion of the upper cover 113.
  • One end of the heat dissipation structure may be connected to the display interface 160, the first display 131, the second display 132, the LCD backlight, and the like to emit heat generated in at least one of the above components.
  • the heat dissipation structure may extend along the lengthwise outline of the upper end of the center frame 120c, and a plurality of fins may be installed on at least one surface thereof to increase the surface area.
  • the heat dissipation structure may be connected to a highly thermally conductive material installed at a predetermined width along the length direction of the upper cover 113, or may be formed integrally with that material.
  • FIGS. 13 and 14 are diagrams for describing a display path of a display image of the smart glasses of FIG. 7.
  • the optical system including the first main lens 137 will be described, but the same may be applied to the optical system including the second main lens 138.
  • the first main lens 137 according to the present exemplary embodiment is disposed to be inclined at approximately 30 to 60 degrees with respect to the first auxiliary lens.
  • the first auxiliary lens includes a first one-sided convex lens 135a, a first double-sided convex lens 135b, and a first convex lens 1372.
  • One surface of the first one-side convex lens 135a has a convex surface, and the other surface has a concave surface.
  • the curvature of the concave surface of the first one-sided convex lens 135a may be set to be substantially similar to or the same as the curvature of the opposite convex surface of the first two-sided convex lens 135b.
  • the first biconvex lens 135b has a convex surface on both one surface and the other surface thereof.
  • the radius of curvature of the convex surface of one surface and the convex surface of the other surface may be the same, but is not limited thereto.
  • the first convex lens 1372 may have a convex surface facing the other surface convex surface of the first biconvex lens 135b, and an opposite surface of the convex surface may have a planar shape.
  • the first image and the second image output from the first and second displays 131 and 132 of the smart glasses according to the present embodiment are reflected by the first mirror 133 and the second mirror 134, respectively, are enlarged by the combination of convex surfaces of the first and second auxiliary lenses 135 and 136, and are then split and reflected at the reflective surfaces of the main lenses 137 and 138 and projected toward the rear and front surfaces of the main lenses 137 and 138, respectively.
  • the reflective surface may be referred to as the beam split surface.
  • the first image and the second image are enlarged by the combination of the convex surfaces of the first auxiliary lens 135 and the second auxiliary lens 136, and since these convex surfaces are formed as aspherical surfaces, spherical aberration can be eliminated.
  • the first image and the second image focused at a predetermined position on the front surfaces of the first and second main lenses 137 and 138 may form a predetermined single image plane or virtual image plane.
  • it is a matter of course that a stereoscopic (3D) image, a virtual reality image, an augmented reality image, or the like, including a multi-layered image/virtual image, may be provided.
  • each of the main lenses 137 and 138 has an empty space between it and the side frame (see 112b of FIG. 7). As such, by not disposing any components on the side surfaces of the main lenses 137 and 138, there is an advantage in that a wide viewing angle can be provided without limiting the field of view of the smart glasses wearer.
  • the content implemented in the display can be enlarged using the auxiliary lens.
  • the distance from the auxiliary lens to the enlarged image can be adjusted as necessary. That is, the size or resolution of the image enlarged by the auxiliary lens may be easily adjusted so that the user sees a virtual image, rather than a real image, beyond the smart glasses.
  • the distance li to the virtual image perceived by the user through the main lens may be adjusted by the relationship between the focal length f of the main lens and the distance lo from the display to the reflective surface of the main lens.
  • the relationship between the distances satisfies Equation 1.
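Equation 1 is referenced but not reproduced in this text; assuming it is the standard thin-lens imaging relation commonly used for such combiner optics, it presumably takes the form:

```latex
\frac{1}{l_o} + \frac{1}{l_i} = \frac{1}{f}
```

where a negative $l_i$ indicates a virtual image formed behind the reflective surface; placing the display so that $l_o$ is slightly smaller than $f$ pushes the virtual image several meters in front of the wearer.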
  • the smart glasses can be configured to present content of about 20 inches at about 2.4 meters in front of the smart glasses.
  • content of about 20 inches can be provided at about 2 meters from the smart glasses, and the user can easily implement and use augmented reality or virtual reality as needed.
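As a rough numerical check of the 2.4-meter figure above, the thin-lens relation can be sketched as follows. The focal length and display distance used here are illustrative guesses, not values taken from the patent.

```python
# Hypothetical sketch: thin-lens estimate of the virtual-image distance for a
# combiner optic like the one described. f and lo below are illustrative.

def virtual_image_distance(f_mm: float, lo_mm: float) -> float:
    """Return the virtual-image distance |li| in mm from 1/lo + 1/li = 1/f.

    With lo slightly smaller than f, li is negative (a virtual image), so the
    magnitude is returned.
    """
    li = 1.0 / (1.0 / f_mm - 1.0 / lo_mm)  # solve the thin-lens equation for li
    return abs(li)

# Illustrative numbers: a 25 mm focal-length combiner with the display's
# optical path length just inside the focal distance.
f, lo = 25.0, 24.74
li = virtual_image_distance(f, lo)
print(f"virtual image at ~{li / 1000:.1f} m")   # on the order of a few meters

# The lateral magnification of the virtual content scales with li/lo.
print(f"lateral magnification ~{li / lo:.0f}x")
```

This reproduces the qualitative behavior described: small changes in the display-to-lens distance move the virtual image plane by meters.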
  • FIG. 15 is a block diagram of a control board that may be employed in the smart glasses of FIG. 7.
  • the smart glasses 100 may include a controller 210 connected to the first and second displays (D1, D2) 131 and 132.
  • the smart glasses 100 may include a power supply unit 170 and a communication unit 190.
  • the power supply unit 170 may be referred to as a power supply device, and may include a battery or a power supply terminal.
  • the communication unit 190 may include a subsystem for wired or wireless communication. At least a portion of the power supply unit 170 and the communication unit 190 may be mounted on a printed circuit board. In that case, the printed circuit board may be referred to as a communication and power board.
  • the controller 210 may include a processor and a memory 220.
  • the controller 210 may control the operation of the power supply unit 170 and the communication unit 190 by a program stored in the memory 220 and control the operation of the smart glasses 100.
  • the controller 210 and the power supply unit 170 are integrally accommodated in the smart glasses 100, but the present invention is not limited thereto.
  • the display interface is described as a separate configuration from the controller 210, but the present invention is not limited to such a configuration, and the display interface may be implemented to be integrally formed with the controller.
  • FIG. 16 is a block diagram illustrating another control board and a power supply device that may be employed in the smart glasses of FIG. 7.
  • the smart glasses 100 may have their power supply unit installed as an auxiliary power supply unit 170 including a small battery or a single battery, while a main power supply device 240 including a large or large-capacity battery may be detachably connected from outside the smart glasses through the data and power line 192.
  • the main power supply 240 may be housed in an external device (see 300 of FIG. 17), in which case the external device 300 may monitor and manage the power supply through the display window.
  • FIG. 17 is a perspective view of a smart glasses and a control system according to another embodiment of the present invention.
  • FIG. 18 is a block diagram illustrating a main configuration of the smart glasses of FIG. 17.
  • the smart glasses 100 may be implemented to include a display interface and to arrange a control device connected to the display interface outside the smart glasses 100.
  • the smart glasses 100 are implemented to place substantially all of the battery outside the smart glasses 100.
  • the display interface may be a dual or 3D MIPI DSI (Dual & 3D Mobile Industry Processor Interface Display Serial Interface) 210a (hereinafter, simply DSI), or a means or device performing a similar function.
  • This display interface is suitable for implementing 3D content with the first and second displays 131, 132.
  • the smart glasses 100 include a first display 131, a second display 132, a display interface 160, a camera 150, and a communication and power board 164.
  • the camera 150 may include a front or rear camera, and the communication and power board 164 may be installed at the installation position of the first or second printed circuit board and may connect an external main power supply (MPS) 240 to an application processor (AP) 260.
  • the external device 300 may control and monitor the operation of the smart glasses 100 from the outside as a control and power supply device.
  • the first battery 172 and the second battery 171, together with most of the controller, the first printed circuit board 161, and the second printed circuit board 162, may be omitted. That is, the communication and power board 164, which has a relatively simple structure, is installed at the position of the second printed circuit board 162 to relay the communication and power connection between the smart glasses 100 and the external device 300, and most of the remaining electronic components can be omitted.
  • the weight of the smart glasses 100 can be considerably reduced, and there is an advantage of increasing the ease of use.
  • FIG. 19 is a block diagram illustrating an operation of a control apparatus of smart glasses according to another embodiment of the present invention.
  • the control apparatus of the smart glasses may be referred to as an application processor and may include a video interface 261, a tracking module 262, a rendering module 263, and a measurement module 264.
  • the tracking module 262, the rendering module 263, and the measurement module 264 may be connected to a user interface U / I 265.
  • the video interface 261 may receive a video image stream input from the video cameras 151 and 152 and transfer an image of a desired section to the tracking module 262.
  • the tracking module 262 may receive image data from the video interface 261 and transmit the measured image and the pose estimation result to the rendering module. In this case, the tracking module 262 may receive adjustment information including adjustment parameters and the like through the user interface 265, and recognize the marker using the marker information stored in the marker information database 266.
  • the marker serves as a medium between the real image and the virtual object.
  • the marker information may include information about the size, pattern, etc. of the marker.
  • the marker information database 266 may be replaced with marker information stored in a predetermined storage unit.
  • the rendering module 263 is responsible for creating and removing virtual objects.
  • the rendering module 263 may obtain adjustment information or the like through the user interface 265.
  • the adjustment information may include load or unload, rotation, movement (coordinate movement, etc.), scaling, and the like.
  • the rendering module 263 may interwork with an engine that supports three-dimensional (3D) modeling (in brief, the 3D modeling engine 268) or a means for performing such a function through the content resource database 267.
  • the 3D modeling engine 268 may generate a virtual object of the measured image based on a virtual reality modeling language (VRML).
  • the rendering module 263 may receive the virtual object from the content resource database 267 and render the measured image.
  • the measurement module 264 may measure and process the distance between the virtual objects, the distance and the direction between the generated coordinate systems, and the interference between the virtual objects.
  • the measurement module 264 may receive the augmented image from the rendering module 263 and provide the augmented reality image to the display device according to the object information input through the user interface 265.
  • the object information may include information about positive or negative matching, three-dimensional point, collision, and the like.
  • FIGS. 20 and 21 are flowcharts for explaining the operation of the control device of the smart glasses according to another embodiment of the present invention.
  • the control apparatus of the smart glasses may acquire a stereoscopic image through a camera (S201).
  • the controller may generate a depth map image based on the stereoscopic image.
  • the control device may detect or extract a hand image (S203).
  • the controller may extract the hand image from the depth map image.
  • the control device may extract a command corresponding to the series of hand images from the storage unit or the database (S205).
  • the controller may recognize a vector component indicated by a series of hand images extracted over a predetermined time, an image region corresponding to the start and end points of the vector component, or an object located in that image region, and may execute a preset command corresponding thereto.
  • the object may be a menu, an icon, or the like of various types, each corresponding to a preset command.
  • the control device of the smart glasses may detect the hand motion through image processing and control the image seen by the smart glasses based on the same.
  • the image control may include advancing the image forward or backward, fast-forwarding the image, rewinding the image, and performing a preset command by recognizing a touch on a specific region of the image.
  • for hand gesture recognition, the controller recognizes a hand from an input image frame, detects the hand position in the image, or recognizes or predicts the direction of hand movement, and accordingly places content on the screen or moves the current image.
  • the current image may be overlapped with another image.
  • Prediction of hand movements may include recognizing a hand, a finger, or a combination thereof.
  • Such gesture recognition may be performed based on image processing and preset gesture information.
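The gesture pipeline described above (segment a hand region, track it over a series of frames, map the resulting motion vector to a preset command) can be sketched as follows. This is an illustrative toy, not the patent's implementation: the command table, the centroid-based tracking, and the direction logic are all hypothetical stand-ins.

```python
# Illustrative sketch of steps S203-S205: from a series of segmented hand
# regions, take the centroid motion vector and map its dominant direction to a
# preset command. Hand regions are sets of (row, col) pixel coordinates.

COMMANDS = {  # hypothetical mapping from motion direction to command
    "left": "previous_image",
    "right": "next_image",
    "up": "fast_forward",
    "down": "rewind",
}

def centroid(pixels):
    """Centroid (row, col) of a set of hand pixels, or None if the set is empty."""
    if not pixels:
        return None
    ys = sum(p[0] for p in pixels) / len(pixels)
    xs = sum(p[1] for p in pixels) / len(pixels)
    return ys, xs

def recognize_gesture(hand_frames):
    """Map the centroid motion from the first to the last hand image to a command."""
    start, end = centroid(hand_frames[0]), centroid(hand_frames[-1])
    if start is None or end is None:
        return None
    dy, dx = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return COMMANDS.get(direction)

# A hand moving from the left edge toward the right edge of the frame:
frames = [{(5, 1), (5, 2)}, {(5, 5), (5, 6)}, {(5, 9), (5, 10)}]
print(recognize_gesture(frames))  # -> next_image
```

A real implementation would segment the hand from a depth map (S202) rather than receive coordinate sets directly, but the command-lookup step has this shape.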
  • according to the present embodiment, it is possible to provide a user interface using hand gestures in augmented reality, virtual reality, mixed reality, and the like, so that the various contents of the smart glasses can be used effectively and user convenience can be greatly improved.
  • for example, a user can work at a desk without a monitor, wear the smart glasses while shopping and perform payment through augmented reality, use a navigation function through augmented reality, play augmented reality games, watch virtual reality video, make video calls in augmented or virtual reality, wear the smart glasses to work in the field with augmented reality, or simply operate and use smartphone functions on the screen of the smart glasses.
  • the control apparatus of the smart glasses acquires a stereoscopic image through a camera (S201), detects or extracts a hand image based on the stereoscopic image (S203), and may perform screen mirroring of a mobile computing device such as a mobile terminal or a notebook in response to the series of hand images (S206).
  • screen mirroring refers to a function of wirelessly displaying the screen of a mobile computing device on the screen of the smart glasses without connecting a separate cable.
  • when the mobile computing device encodes its screen information and transmits it at a certain frequency, the control device or control board of the smart glasses receives and decodes the screen information and outputs it to the screen of the smart glasses through the display. The wireless scheme includes, but is not limited to, Bluetooth; other short-range wireless communication schemes may be used.
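The encode-transmit-decode flow described above can be sketched minimally. This is a hypothetical stand-in, not the patent's implementation: `zlib` stands in for a real video codec, and an in-process queue stands in for the Bluetooth (or other short-range) radio link.

```python
# Hypothetical sketch of the mirroring flow: the mobile device encodes each
# screen frame and transmits it; the glasses-side board receives, decodes, and
# hands the frame to the display.

import zlib
from queue import Queue

link = Queue()  # stand-in for the wireless channel

def mobile_transmit(frame: bytes) -> None:
    """Mobile side: encode the screen information, then send it."""
    link.put(zlib.compress(frame))

def glasses_display() -> bytes:
    """Glasses side: receive and decode a frame for output to the display."""
    return zlib.decompress(link.get())

mobile_transmit(b"screen frame pixels ...")
print(glasses_display())
```

The essential point matches the text: the glasses side only relays and decodes, so its board can remain simple.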
  • FIGS. 22 to 25 are diagrams illustrating modifications of the optical system that may be employed in the smart glasses of FIG. 7.
  • a first auxiliary lens 135, a second auxiliary lens 1372a, and a first mirror 133 are disposed between the first display 131 and the main lens.
  • the first auxiliary lens 135 and the second auxiliary lens 1372a may be disposed between the first display 131 and the first mirror 133 and sequentially arranged on the optical path.
  • the first auxiliary lens 135 and the second auxiliary lens 1372a may be convex lenses having the same shape and structure, but are not limited thereto.
  • the reciprocal of the composite focal length of the first auxiliary lens 135 and the second auxiliary lens 1372a is equal to the sum of the reciprocals of the focal lengths of the respective auxiliary lenses.
  • this composite focal length satisfies the imaging relation: its reciprocal equals the sum of the reciprocal of the distance from the first auxiliary lens 135 and the second auxiliary lens 1372a to the first display 131 and the reciprocal of the distance from those lenses to the virtual image formed via the first mirror 133.
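The composite-focal-length relation stated above (two thin lenses treated as being in contact) can be checked numerically. The focal lengths below are illustrative, not values from the patent.

```python
# 1/f_composite = 1/f1 + 1/f2 for two thin lenses in contact.

def composite_focal_length(f1_mm: float, f2_mm: float) -> float:
    """Composite focal length of two thin lenses treated as being in contact."""
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm)

# Two 100 mm lenses combine to roughly a 50 mm lens:
print(composite_focal_length(100.0, 100.0))
```

The same reciprocal then enters the imaging relation of the previous bullet, determining where the virtual image forms via the first mirror 133.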
  • a first mirror 133, a first auxiliary lens 135, and a second auxiliary lens 1372a may be sequentially disposed, in the order described, on the optical path between the first display 131 and the main lens.
  • the first auxiliary lens 135 and the second auxiliary lens 1372a may be disposed between the first mirror 133 and the main lens or between the first mirror 133 and the user's eyes.
  • the first auxiliary lens 135 and the first concave mirror 133a may be sequentially disposed, in the order described, on the optical path between the first display 131 and the main lens.
  • the first auxiliary lens 135 may be disposed between the first display 131 and the first concave mirror 133a.
  • the first concave mirror 133a combines the form and function of a convex lens and a mirror, and relays the image delivered to one of the user's two eyes.
  • the concave surface of the first concave mirror 133a corresponds to the reflective surface of the light or the image.
  • the first concave mirror 133a and the first auxiliary lens 135 may be sequentially disposed, in the order described, on the optical path between the first display 131 and the main lens.
  • the first auxiliary lens 135 may be disposed between the first concave mirror 133a and the main lens.
  • FIG. 26 is a perspective view of smart glasses according to another embodiment of the present invention.
  • the smart glasses 100h are not limited to a configuration that provides independent images to both eyes of a user, and may have a form and structure that provides an image to only one eye of the user.
  • the smart glasses 100h may be implemented as a device for providing an image to a user's left eye, but the present invention is not limited thereto.
  • the smart glasses 100h include a main frame having a spectacle frame shape, a support frame coupled to the center of the main frame, a display supported by the support frame and displaying an image, a mirror reflecting the image, and a main lens 137 reflecting the image reflected from the mirror toward the eye side.
  • the smart glasses 100h may include a main frame having a shape that is caught by one ear of a user and a length extending from the one ear to the front of one eye of an adjacent user; A support frame coupled to the front portion of the main frame corresponding to one eye front; A display supported by the support frame and outputting an image in a first direction; A mirror that reflects the image in a second direction orthogonal to the first direction; A main lens reflecting the image reflected from the mirror to the user's eyes; And a convex lens installed between the display and the main lens to adjust a focal length of a virtual image of the image viewed through the main lens.
  • An upper cover 113 may be coupled to the central upper side of the main frame, an upper front cover 114 may be coupled to the central front, and a front auxiliary cover 115 may be coupled to the upper front cover 114.
  • the nose pad member 140 may be coupled to the lower portion of the main frame.
  • the side frame may be coupled to the side of the main frame.
  • a side cover 112b accommodating a printed circuit board may be installed on the side frame.
  • a side connecting frame 124 may be connected to the distal end of the side frame, and one end of the side connecting frame may be coupled to the side end frame 125.
  • a battery is accommodated in the inner space of the side end frame 125, and its opening may be detachably covered by the flexible cover 126.

Abstract

The invention relates to smart glasses comprising a display device. The smart glasses comprise: a main frame retaining an eyeglass-frame shape; a support frame coupled to a central portion of the main frame; a first display, supported by the support frame, for displaying a first image; a second display, supported by the support frame, for displaying a second image; a first mirror for reflecting the first image; a second mirror for reflecting the second image; a first main lens for providing a first main image to an inner side of the main frame from a central portion, based on the first image reflected by the first mirror; and a second main lens for providing a second main image to the inner side of the main frame from a central portion, based on the second image reflected by the second mirror.
PCT/KR2017/011065 2016-10-05 2017-09-29 Lunettes intelligentes WO2018066962A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/339,373 US11378802B2 (en) 2016-10-05 2017-09-29 Smart eyeglasses

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20160128098 2016-10-05
KR10-2016-0128098 2016-10-05
KR10-2017-0117369 2017-09-13
KR1020170117369A KR20180037887A (ko) 2016-10-05 2017-09-13 스마트 안경

Publications (1)

Publication Number Publication Date
WO2018066962A1 true WO2018066962A1 (fr) 2018-04-12

Family

ID=61831099

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/011065 WO2018066962A1 (fr) 2016-10-05 2017-09-29 Lunettes intelligentes

Country Status (2)

Country Link
KR (1) KR102269833B1 (fr)
WO (1) WO2018066962A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110572619A (zh) * 2019-09-25 2019-12-13 冀成 一种视力辅助工具

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
KR102193581B1 (ko) 2020-02-11 2020-12-23 아이콘에이아이 주식회사 디스플레이를 구비하고 보이스 어시스턴트와 연동하는 스마트 메이크업 미러 디바이스
KR20220104507A (ko) * 2021-01-18 2022-07-26 삼성전자주식회사 메타렌즈를 구비하는 카메라 및 그를 포함하는 전자 장치
KR20220163162A (ko) * 2021-06-02 2022-12-09 삼성전자주식회사 웨어러블 전자 장치 및 그의 파워 패스 제어 방법

Citations (5)

Publication number Priority date Publication date Assignee Title
JP2000333211A (ja) * 1999-05-18 2000-11-30 Nippon Telegr & Teleph Corp <Ntt> Three-dimensional display method and head-mounted display device
KR20150062222A (ko) * 2013-11-28 2015-06-08 주식회사 이랜텍 See-through smart glasses with display position adjustment function
KR20150095342A (ko) * 2014-02-13 2015-08-21 Samsung Electronics Co., Ltd. Head-mounted display apparatus
KR20150123969A (ko) * 2007-07-26 2015-11-04 RealD Inc. Head-mounted single-panel stereoscopic image display
KR20160109021A (ko) * 2015-03-09 2016-09-21 하정훈 Augmented reality implementation apparatus

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3250236B2 (ja) * 1991-09-19 2002-01-28 Sony Corporation Video display device
JPH06105256A (ja) * 1992-09-18 1994-04-15 Olympus Optical Co Ltd Head- or face-mounted display device
JPH06331928A (ja) * 1993-05-24 1994-12-02 Sony Corporation Eyeglass-type display device
US9304319B2 (en) * 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
JP2013114022A (ja) 2011-11-29 2013-06-10 Seiko Epson Corp Polarizing device and display device
KR101598480B1 (ko) * 2014-11-10 2016-03-02 (주)그린광학 See-through head-mounted display device


Also Published As

Publication number Publication date
KR20180037909A (ko) 2018-04-13
KR102269833B1 (ko) 2021-06-29

Similar Documents

Publication Publication Date Title
WO2015053449A1 (fr) Glasses-type image display device and method for controlling same
WO2018066962A1 (fr) Smart glasses
WO2016190505A1 (fr) Glass-type terminal and method for controlling same
US11378802B2 (en) Smart eyeglasses
EP3717992A1 (fr) Device for providing augmented reality service and method of operating the same
WO2018182159A1 (fr) Smart glasses capable of processing virtual objects
WO2017022998A1 (fr) Head-mounted display optical system
WO2020138640A1 (fr) Electronic device
WO2021040076A1 (fr) Electronic device
WO2022108076A1 (fr) Method for wireless connection in augmented reality environment and electronic device therefor
WO2016182090A1 (fr) Glasses-type terminal and method for controlling same
WO2017007101A1 (fr) Smart device and method for controlling same
KR20190097894A (ko) Smart glasses capable of processing virtual objects
KR20190106485A (ko) Smart glasses capable of processing virtual objects
KR102051202B1 (ko) Smart glasses capable of processing virtual objects
WO2022092517A1 (fr) Wearable electronic device comprising display, method for controlling display, and system comprising wearable electronic device and case
WO2020022750A1 (fr) Electronic device capable of providing multiple focal points for light emitted from display
WO2021107200A1 (fr) Mobile terminal and method for controlling mobile terminal
WO2022075686A1 (fr) Electronic device and method for operating same
WO2022059893A1 (fr) Wearable electronic device comprising heat dissipation structure
WO2021242008A1 (fr) Electronic device and method for operating same
WO2016099090A1 (fr) Wearable display device
WO2020004941A1 (fr) Electronic device comprising reflective element and translucent reflective element capable of transmitting light emitted from display to lens
WO2021033790A1 (fr) Electronic device
WO2022255682A1 (fr) Wearable electronic device and power path control method thereof

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17858756

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 17858756

Country of ref document: EP

Kind code of ref document: A1