WO2015002377A1 - Display device and method of controlling the same

Display device and method of controlling the same

Info

Publication number
WO2015002377A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
display unit
adjusting
control command
transparency
Application number
PCT/KR2014/002663
Other languages
French (fr)
Inventor
Jongsoo Lee
Jinyoung YOU
Hyungyu JANG
Minjung Kim
Original Assignee
Lg Electronics Inc.
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Publication of WO2015002377A1 publication Critical patent/WO2015002377A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02 Viewing or reading apparatus
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0118 Head-up displays comprising devices for improving the contrast of the display / brilliance control visibility
    • G02B2027/0178 Eyeglass type

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Disclosed is a display device including a main body that is formed in a head-mounted manner, a display unit which is coupled to the main body and is arranged in a position corresponding to both eyes and to which visual information is output in such a manner that the visual information is superimposed on an external image that is input, and a controller that generates a control command for adjusting transparency of the display unit, in which the transparency of the display unit is adjusted according to the control command in order to adjust clearness of the external image.

Description

DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME
The present invention relates to a display device and more particularly to a display device that is capable of being head-mounted and a method of controlling the display device.
As the information society develops rapidly, the importance of a display device capable of implementing a screen with a sense of reality is being emphasized. For instance, a head-mounted display (HMD) device is being researched.
The HMD device is mainly implemented as safety goggles or a helmet. Once a user wears the HMD device, the user can see a screen in front of his or her eyes. The HMD device has been developed for realization of a sense of virtual reality. A small display such as a liquid crystal display is installed in the HMD device close to the user's two eyes, so that images can be projected onto the display. Recently, HMD devices have been widely developed for use in space development centers, reactor buildings, military agencies and medical institutions, as well as for business use, games, and so on.
Thanks to these improvements, smart glasses, one example of a head-mounted display device, are available on the market. The smart glasses, realized as a wearable device, conveniently execute functions that are executed in existing mobile terminals.
However, since the external image viewed through the smart glasses being worn is displayed along with the visual information that is output (augmented reality), situations occur in which the visual information is difficult to recognize. In addition, because content is displayed on the lenses of the smart glasses, personal content or content whose security has to be ensured may be viewable to other people.
Therefore, an object of the present invention is to provide a display device that is capable of adjusting light transmissivity of a display unit to improve user convenience and a method of controlling the display device.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there is provided a display device including a main body that is formed in a head-mounted manner, a display unit which is coupled to the main body and is arranged in a position corresponding to both eyes and to which visual information is output in such a manner that the visual information is superimposed on an external image that is input, and a controller that generates a control command for adjusting transparency of the display unit, in which the transparency of the display unit is adjusted according to the control command in order to adjust clearness of the external image.
In the display device, the controller may generate the control command for adjusting the transparency according to content that is output as the visual information.
In the display device, by adjusting the transparency according to the control command, only the visual information, and not the external image that is input, may be output to the display unit.
In the display device, the controller may generate the control command for adjusting the transparency according to an amount of light that is detected from the outside.
In the display device, the controller may generate the control command for adjusting the transparency according to a predetermined condition.
In the display device, the controller may generate the control command for adjusting the transparency according to a locked state and an unlocked state.
In the display device, the controller may generate the control command for adjusting the transparency according to a predetermined external command that is input.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there is provided a method of controlling a display device, including a step (a) of outputting visual information in such a manner that the visual information is superimposed on an external image that is input through a display unit, a step (b) of generating a control command for adjusting transparency of the display unit, and a step (c) of adjusting clearness of the external image that is output, by adjusting the transparency according to the control command, in which the display unit is coupled to a main body that is formed in a head-mounted manner and is arranged in a position that corresponds to both eyes.
In the method, the step (b) may include a step of generating the control command for adjusting the transparency according to content that is output as the visual information.
In the method, the step (c) may include a step of outputting only the visual information, and not the external image that is input, by adjusting the transparency according to the control command.
In the method, the step (b) may include a step of generating the control command for adjusting the transparency according to an amount of light that is detected from the outside.
In the method, the step (b) may include a step of generating the control command for adjusting the transparency according to a predetermined condition.
In the method, the step (b) may include a step of generating the control command for adjusting the transparency according to a locked state and an unlocked state.
In the method, the step (b) may include a step of generating the control command for adjusting the transparency according to a predetermined external command that is input.
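The control flow summarized in steps (a) to (c) can be illustrated with a minimal code sketch. The Kotlin fragment below is not part of the disclosure: the class names, the trigger set, and the lux threshold are assumptions made purely for illustration.

```kotlin
// Illustrative sketch of steps (a)-(c): the controller derives a transparency
// control command from one of the triggers named in the summary above.
// All names and the lux threshold are assumptions, not from the disclosure.

enum class Transparency { TRANSPARENT, TRANSLUCENT, OPAQUE }

sealed class Trigger {
    data class ContentType(val secure: Boolean, val fullScreen: Boolean) : Trigger()
    data class AmbientLight(val lux: Float) : Trigger()
    data class LockState(val locked: Boolean) : Trigger()
    data class ExternalCommand(val requested: Transparency) : Trigger()
}

class TransparencyController {
    // Step (b): generate the control command for adjusting the transparency.
    fun controlCommand(trigger: Trigger): Transparency = when (trigger) {
        is Trigger.ContentType ->
            if (trigger.secure || trigger.fullScreen) Transparency.OPAQUE
            else Transparency.TRANSPARENT
        is Trigger.AmbientLight ->
            if (trigger.lux > 10_000f) Transparency.TRANSLUCENT // bright daylight
            else Transparency.TRANSPARENT
        is Trigger.LockState ->
            if (trigger.locked) Transparency.OPAQUE else Transparency.TRANSPARENT
        is Trigger.ExternalCommand -> trigger.requested
    }
}

fun main() {
    // Step (c): the display unit would apply the returned level, adjusting
    // the clearness of the external image relative to the visual information.
    val controller = TransparencyController()
    println(controller.controlCommand(Trigger.ContentType(secure = true, fullScreen = false)))
}
```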
According to the present invention, the transparency of the display unit is adjusted. Accordingly, the visual information can be output on a full display, not only as an augmented reality (AR) overlay, and personal content or content whose security has to be ensured is prevented from being visible to other people.
In addition, the display unit functions as sunglasses by adjusting its transparency level according to the amount of external light. Accordingly, removable sunglasses mounted on existing smart glasses become unnecessary. Removable sunglasses reduce the amount of light uniformly; in contrast, a suitable transparency is provided that takes the amount of ambient light into account.
The locked state and the unlocked state are displayed in an intuitively recognizable manner, and thus damage such as information leakage, which may occur when the display device is stolen or lost, is prevented.
As a result, the user convenience is improved.
FIG. 1 is a block diagram illustrating a display device according to one embodiment of the present invention;
FIGS. 2A and 2B are diagrams illustrating a telecommunication system in which the display device according to the present invention can operate;
FIG. 3 is a diagram illustrating a display device according to one embodiment of the present invention;
FIG. 4 is a flowchart for describing a display device according to one embodiment of the present invention;
FIGS. 5A and 5B are diagrams illustrating an embodiment in which transparency is adjusted according to content;
FIG. 6 is a diagram illustrating an embodiment in which the transparency is adjusted according to the content;
FIG. 7 is a diagram illustrating an embodiment in which the transparency is adjusted according to an amount of light that is detected from the outside;
FIG. 8 is a diagram illustrating an embodiment in which the transparency is adjusted according to a locked state and an unlocked state; and
FIGS. 9A to 9C are diagrams illustrating an embodiment in which the transparency is adjusted according to the external command that is input.
Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. It will also be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
Description will now be given in detail according to the exemplary embodiments, with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components will be provided with the same reference numbers, and description thereof will not be repeated. A suffix "module" or "unit" used for constituent elements disclosed in the following description is merely intended for easy description of the specification, and the suffix itself does not give any special meaning or function. In describing the present invention, if a detailed explanation for a related known function or construction is considered to unnecessarily divert from the gist of the present disclosure, such explanation has been omitted but would be understood by those skilled in the art. The accompanying drawings are provided to help the reader easily understand the technical idea of the present invention, and it should be understood that the idea of the present disclosure is not limited by the accompanying drawings.
FIG. 1 is a block diagram of a display device 100 in accordance with one exemplary embodiment.
The display device 100 may comprise components, such as a wireless communication unit 110, an Audio/Video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply 190 and the like. FIG. 1 shows the display device 100 having various components, but it is understood that implementing all of the illustrated components is not a requirement. Greater or fewer components may alternatively be implemented.
Hereinafter, each component 110 to 190 is described in sequence.
The wireless communication unit 110 may typically include one or more modules which permit wireless communications between the display device 100 and a wireless communication system or between the display device 100 and a network within which the display device 100 is located. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, a location information module 115 and the like.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel.
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast managing entity may indicate a server which generates and transmits a broadcast signal and/or broadcast associated information or a server which receives a pre-generated broadcast signal and/or broadcast associated information and sends them to the mobile terminal. The broadcast signal may be implemented as a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, among others. The broadcast signal may further include a data broadcast signal combined with a TV or radio broadcast signal.
Examples of broadcast associated information may include information associated with a broadcast channel, a broadcast program, a broadcast service provider, and the like. The broadcast associated information may be provided via a mobile communication network, and received by the mobile communication module 112.
The broadcast associated information may be implemented in various formats. For instance, broadcast associated information may include Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H), and the like.
The broadcast receiving module 111 may be configured to receive digital broadcast signals transmitted from various types of broadcast systems. Such broadcast systems may include Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), Integrated Services Digital Broadcast-Terrestrial (ISDB-T) and the like. The broadcast receiving module 111 may be configured to be suitable for every broadcast system transmitting broadcast signals as well as the digital broadcasting systems.
Broadcast signals and/or broadcast associated information received via the broadcast receiving module 111 may be stored in a suitable device, such as a memory 160.
The mobile communication module 112 transmits/receives wireless signals to/from at least one of network entities (e.g., base station, an external mobile terminal, a server, etc.) on a mobile communication network. Here, the wireless signals may include audio call signal, video (telephony) call signal, or various formats of data according to transmission/reception of text/multimedia messages.
The mobile communication module 112 may implement a video call mode and a voice call mode. The video call mode indicates a state of calling while watching a callee's image. The voice call mode indicates a state of calling without watching the callee's image. The mobile communication module 112 may transmit and receive at least one of voice and image in order to implement the video call mode and the voice call mode.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal. This module may be internally or externally coupled to the display device 100. Examples of such wireless Internet access may include Wireless LAN (WLAN) (Wi-Fi), Wireless Broadband (Wibro), Worldwide Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA) and the like.
The short-range communication module 114 denotes a module for short-range communications. Suitable technologies for implementing this module may include BLUETOOTHTM, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBeeTM, Near Field Communication (NFC) and the like.
The location information module 115 denotes a module for detecting or calculating a position of a mobile terminal. An example of the location information module 115 may include a Global Position System (GPS) module.
Still referring to FIG. 1, the A/V input unit 120 is configured to provide audio or video signal input to the mobile terminal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 receives and processes image frames of still pictures or video obtained by image sensors in a video call mode or a capturing mode. The processed image frames may be displayed on a display unit 151.
The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the exterior via the wireless communication unit 110. Also, user's position information and the like may be calculated from the image frames acquired by the camera 121. Two or more cameras 121 may be provided according to the configuration of the mobile terminal.
The microphone 122 may receive an external audio signal while the mobile terminal is in a particular mode, such as a phone call mode, a recording mode, a voice recognition mode, or the like. This audio signal is processed into digital data. In the phone call mode, the processed digital data is converted into a format transmittable to a mobile communication base station via the mobile communication module 112 and then output. The microphone 122 may include assorted noise removing algorithms to remove noise generated in the course of receiving the external audio signal.
The user input unit 130 may generate input data input by a user to control the operation of the mobile terminal. The user input unit 130 may include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel, a jog switch and the like.
The sensing unit 140 provides status measurements of various aspects of the mobile terminal. For instance, the sensing unit 140 may detect a location of the display device 100, a presence or absence of user contact with the display device 100, acceleration/deceleration of the display device 100, and the like, so as to generate a sensing signal for controlling the operation of the display device 100. Other examples include sensing functions, such as the sensing unit 140 sensing the presence or absence of power provided by the power supply 190, or the presence or absence of a coupling or other connection between the interface unit 170 and an external device.
The output unit 150 is configured to output an audio signal, a video signal or a tactile signal. The output unit 150 may include a display unit 151, an audio output module 153, an alarm unit 154 and a haptic module 155.
The display unit 151 may output information processed in the display device 100. For example, when the mobile terminal is operating in a phone call mode, the display unit 151 will provide a User Interface (UI) or a Graphic User Interface (GUI), which includes information associated with the call. As another example, if the mobile terminal is in a video call mode or a capturing mode, the display unit 151 may additionally or alternatively display images captured and/or received, UI, or GUI.
The display unit 151 may be implemented using, for example, at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), an Organic Light-Emitting Diode (OLED), a flexible display, a three-dimensional (3D) display, an e-ink display or the like.
Some of such displays 151 may be implemented as a transparent type or an optically transparent type through which the exterior is visible, which is referred to as a 'transparent display'. A representative example of the transparent display may include a Transparent OLED (TOLED), and the like. The rear surface of the display unit 151 may also be implemented to be optically transparent. Under this configuration, a user can view an object positioned at the rear side of the terminal body through the region occupied by the display unit 151 of the terminal body.
Two or more display units 151 may be provided according to a configured aspect of the display device 100. For instance, a plurality of displays 151 may be arranged on one surface to be spaced apart from or integrated with each other, or may be arranged on different surfaces.
The display unit 151 may also be implemented as a stereoscopic display unit 152 for displaying stereoscopic images.
Here, the stereoscopic image may be a three-dimensional (3D) stereoscopic image, and the 3D stereoscopic image refers to an image that makes a viewer feel that the gradual depth and reality of an object on a monitor or a screen are the same as in real space. A 3D stereoscopic image is implemented by using binocular disparity. Binocular disparity refers to disparity made by the positions of two eyes. When two eyes view different 2D images, the images are transferred to the brain through the retina and combined in the brain, providing depth perception and a sense of reality.
The stereoscopic display unit 152 may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glassless scheme), a projection scheme (a holographic scheme), or the like. Stereoscopic schemes commonly used for home television receivers and the like include the Wheatstone stereoscopic scheme and the like.
The auto-stereoscopic scheme includes, for example, a parallax barrier scheme, a lenticular scheme, an integral imaging scheme, or the like. The projection scheme includes a reflective holographic scheme, a transmissive holographic scheme, or the like.
In general, a 3D stereoscopic image is comprised of a left image (a left eye image) and a right image (a right eye image). According to how left and right images are combined into a 3D stereoscopic image, the 3D stereoscopic imaging method is divided into a top-down method in which left and right images are disposed up and down in a frame, an L-to-R (left-to-right, side by side) method in which left and right images are disposed left and right in a frame, a checker board method in which fragments of left and right images are disposed in a tile form, an interlaced method in which left and right images are alternately disposed by columns and rows, and a time sequential (or frame by frame) method in which left and right images are alternately displayed by time.
Also, as for a 3D thumbnail image, a left image thumbnail and a right image thumbnail are generated from a left image and a right image of the original image frame, respectively, and then combined to generate a single 3D thumbnail image. In general, a thumbnail refers to a reduced image or a reduced still image. The generated left image thumbnail and right image thumbnail are displayed with a horizontal distance difference therebetween by a depth corresponding to the disparity between the left image and the right image on the screen, providing a stereoscopic sense of space.
A left image and a right image required for implementing a 3D stereoscopic image are displayed on the stereoscopic display unit 152 by a stereoscopic processing unit (not shown). The stereoscopic processing unit may receive a 3D image and extract the left image and the right image, or may receive a 2D image and change it into a left image and a right image.
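As a concrete illustration of one of the packing formats listed above, the sketch below splits an L-to-R (side-by-side) packed frame into separate left-eye and right-eye images, which is one operation a stereoscopic processing unit of the kind just described could perform. The frame layout and all names are assumptions, not taken from the disclosure.

```kotlin
// Illustrative sketch: split an L-to-R (side-by-side) packed frame into
// left-eye and right-eye images. The Frame layout is an assumed row-major
// ARGB pixel buffer, not a format taken from the disclosure.

data class Frame(val width: Int, val height: Int, val pixels: IntArray)

fun splitSideBySide(packed: Frame): Pair<Frame, Frame> {
    val half = packed.width / 2
    val left = IntArray(half * packed.height)
    val right = IntArray(half * packed.height)
    for (y in 0 until packed.height) {
        for (x in 0 until half) {
            left[y * half + x] = packed.pixels[y * packed.width + x]          // left half
            right[y * half + x] = packed.pixels[y * packed.width + half + x]  // right half
        }
    }
    return Frame(half, packed.height, left) to Frame(half, packed.height, right)
}
```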
Here, if the display unit 151 and a touch sensitive sensor (referred to as a touch sensor) have a layered structure therebetween (referred to as a 'touch screen'), the display unit 151 may be used as an input device as well as an output device. The touch sensor may be implemented as a touch film, a touch sheet, a touchpad, and the like.
The touch sensor may be configured to convert changes of a pressure applied to a specific part of the display unit 151, or a capacitance occurring from a specific part of the display unit 151, into electric input signals. Also, the touch sensor may be configured to sense not only a touched position and a touched area, but also touch pressure. Here, a touch object is an object to apply a touch input onto the touch sensor. Examples of the touch object may include a finger, a touch pen, a stylus pen, a pointer or the like.
When touch inputs are sensed by the touch sensors, corresponding signals are transmitted to a touch controller. The touch controller processes the received signals, and then transmits corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display unit 151 has been touched.
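The touch path just described (raw sensor change, touch controller, main controller) can be sketched as follows; the event type, the detection threshold, and the callback wiring are illustrative assumptions.

```kotlin
// Illustrative sketch of the described touch path: a raw capacitance or
// pressure change is converted into an event by the touch controller and
// forwarded to the main controller, which resolves the touched region.
// The 0.05 threshold and all names are assumptions.

data class TouchEvent(val x: Int, val y: Int, val pressure: Float)

class TouchController(private val mainController: (TouchEvent) -> Unit) {
    fun onRawChange(x: Int, y: Int, delta: Float) {
        if (delta > 0.05f) mainController(TouchEvent(x, y, delta))
    }
}

fun main() {
    val touch = TouchController { e -> println("touched region at (${e.x}, ${e.y})") }
    touch.onRawChange(120, 45, 0.3f) // simulated capacitance change
}
```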
Still referring to FIG. 1, a proximity sensor 141 may be arranged at an inner region of the display device 100 covered by the touch screen, or near the touch screen. The proximity sensor 141 may be provided as one example of the sensing unit 140. The proximity sensor 141 indicates a sensor that senses the presence or absence of an object approaching a surface to be sensed, or an object disposed near a surface to be sensed, by using an electromagnetic field or infrared rays without mechanical contact. The proximity sensor 141 has a longer lifespan and a more enhanced utility than a contact sensor.
The proximity sensor 141 may include a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and so on. When the touch screen is implemented as a capacitance type, proximity of a pointer to the touch screen is sensed by changes of an electromagnetic field. In this case, the touch screen (touch sensor) may be categorized as a proximity sensor.
Hereinafter, for the sake of brief explanation, a state in which the pointer is positioned close to the touch screen without contact will be referred to as a 'proximity touch', whereas a state in which the pointer substantially comes into contact with the touch screen will be referred to as a 'contact touch'. The position corresponding to the proximity touch of the pointer on the touch screen is the position at which the pointer faces the touch screen perpendicularly upon the proximity touch.
The proximity sensor 141 senses proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, moving status, etc.). Information relating to the sensed proximity touch and the sensed proximity touch patterns may be output onto the touch screen.
When a touch sensor is overlaid on the stereoscopic display unit 152 in a layered manner (hereinafter, referred to as 'stereoscopic touch screen'), or when the stereoscopic display unit 152 and a 3D sensor sensing a touch operation are combined, the stereoscopic display unit 152 may also be used as a 3D input device.
As examples of the 3D sensor, the sensing unit 140 may include a proximity sensor 141, a stereoscopic touch sensing unit 142, an ultrasonic sensing unit 143, and a camera sensing unit 144.
The proximity sensor 141 detects the distance between a sensing object (e.g., the user's finger or a stylus pen) applying a touch and a detection surface, by using the force of an electromagnetic field or infrared rays without mechanical contact. By using the distance, the terminal recognizes which portion of a stereoscopic image has been touched. In particular, when the touch screen is an electrostatic touch screen, the degree of proximity of the sensing object is detected based on a change of an electric field according to the proximity of the sensing object, and a touch to the 3D image is recognized by using the degree of proximity.
The stereoscopic touch sensing unit 142 is configured to detect the strength or duration of a touch applied to the touch screen. For example, the stereoscopic touch sensing unit 142 may sense touch pressure. When the pressure is strong, it may recognize the touch as a touch with respect to an object located farther away from the touch screen toward the inside of the terminal.
The ultrasonic sensing unit 143 is configured to recognize position information of the sensing object by using ultrasonic waves.
The ultrasonic sensing unit 143 may include, for example, an optical sensor and a plurality of ultrasonic sensors. The optical sensor is configured to sense light and the ultrasonic sensors may be configured to sense ultrasonic waves. Since light is much faster than ultrasonic waves, the time it takes for light to reach the optical sensor is much shorter than the time it takes for an ultrasonic wave to reach an ultrasonic sensor. Therefore, the position of the wave generation source may be calculated by using the time difference between the arrival of the ultrasonic wave and the arrival of the light, with the light serving as a reference signal.
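Under the assumption that the light arrival can serve as time zero, the source position follows from the per-sensor arrival delays. The sketch below solves the resulting circle equations by least squares for a 2D sensor layout; the sensor model, speed-of-sound default, and all names are assumptions.

```kotlin
// Illustrative sketch of the calculation described above. Light arrival is
// treated as time zero, so the distance from the wave source to sensor i is
// speedOfSound * arrivalDelay_i. Subtracting the first circle equation
// (x - xi)^2 + (y - yi)^2 = di^2 from the others gives a linear system in
// (x, y), solved here by least squares. Requires at least three sensors.

data class UltrasonicSensor(val x: Double, val y: Double, val arrivalDelay: Double)

fun locateSource(sensors: List<UltrasonicSensor>, speedOfSound: Double = 343.0): Pair<Double, Double> {
    require(sensors.size >= 3) { "need at least three sensors for a 2D fix" }
    val d = sensors.map { it.arrivalDelay * speedOfSound } // source-to-sensor distances
    val s0 = sensors[0]
    val d0 = d[0]
    var a11 = 0.0; var a12 = 0.0; var a22 = 0.0; var b1 = 0.0; var b2 = 0.0
    for (i in 1 until sensors.size) {
        val si = sensors[i]
        val ai = 2 * (si.x - s0.x)
        val bi = 2 * (si.y - s0.y)
        val ci = d0 * d0 - d[i] * d[i] + si.x * si.x - s0.x * s0.x +
                 si.y * si.y - s0.y * s0.y
        a11 += ai * ai; a12 += ai * bi; a22 += bi * bi
        b1 += ai * ci; b2 += bi * ci
    }
    val det = a11 * a22 - a12 * a12
    return Pair((b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det)
}
```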
The camera sensing unit 144 includes at least one of a camera 121, a photo sensor, and a laser sensor.
For example, the camera 121 and the laser sensor may be combined to detect a touch of the sensing object with respect to a 3D stereoscopic image. When distance information detected by a laser sensor is added to a 2D image captured by the camera, 3D information can be obtained.
In another example, a photo sensor may be laminated on the display device. The photo sensor is configured to scan a movement of the sensing object in proximity to the touch screen. In detail, the photo sensor includes photo diodes and transistors arranged in rows and columns to scan the content placed on the photo sensor by using an electrical signal that changes according to the quantity of applied light. Namely, the photo sensor calculates the coordinates of the sensing object according to the variation of light, thereby obtaining the position information of the sensing object.
The audio output module 153 may convert audio data received from the wireless communication unit 110 or stored in the memory 160 into sound and output it in a call signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like. Also, the audio output module 153 may provide audible outputs related to a particular function performed by the display device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 153 may include a speaker, a buzzer or the like.
The alarm unit 154 outputs a signal for informing about an occurrence of an event of the display device 100. Events generated in the mobile terminal may include call signal reception, message reception, key signal inputs, a touch input, etc. In addition to video or audio signals, the alarm unit 154 may output signals in a different manner, for example, using vibration to inform about an occurrence of an event. The video or audio signals may also be output via the audio output module 153, so the display unit 151 and the audio output module 153 may be classified as parts of the alarm unit 154.
A haptic module 155 generates various tactile effects the user may feel. A typical example of the tactile effects generated by the haptic module 155 is vibration. The strength and pattern of the vibration generated by the haptic module 155 can be controlled. For example, different vibrations may be combined and output, or output sequentially.
Besides vibration, the haptic module 155 may generate various other tactile effects, such as an effect of stimulation by a pin arrangement vertically moving with respect to the contacted skin, a spray force or suction force of air through a jet orifice or a suction opening, a contact on the skin, a contact of an electrode, electrostatic force, etc., and an effect of reproducing the sense of cold and warmth using an element that can absorb or generate heat.
The haptic module 155 may be implemented to allow the user to feel a tactile effect through a muscle sensation of, for example, the user's fingers or arm, as well as by transferring the tactile effect through direct contact. Two or more haptic modules 155 may be provided according to the configuration of the display device 100.
The memory 160 may store software programs used for the processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, video, etc.) that are inputted or outputted. In addition, the memory 160 may store data regarding various patterns of vibrations and audio signals outputted when a touch is inputted to the touch screen.
The memory 160 may include at least one type of storage medium including a Flash memory, a hard disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. Also, the display device 100 may be operated in relation to a web storage device that performs the storage function of the memory 160 over the Internet.
The interface unit 170 serves as an interface with every external device connected with the display device 100. For example, the interface unit 170 may receive data from an external device, deliver supplied power to each element of the display device 100, or transmit internal data of the display device 100 to an external device. For example, the interface unit 170 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.
The identification module may be a chip that stores various information for authenticating the authority to use the display device 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (referred to as an 'identifying device', hereinafter) may take the form of a smart card. Accordingly, the identifying device may be connected with the display device 100 via the interface unit 170.
When the display device 100 is connected with an external cradle, the interface unit 170 may serve as a passage to allow power from the cradle to be supplied therethrough to the display device 100 or may serve as a passage to allow various command signals inputted by the user from the cradle to be transferred to the mobile terminal therethrough. Various command signals or power inputted from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle.
The controller 180 typically controls the general operations of the mobile terminal. For example, the controller 180 performs controlling and processing associated with voice calls, data communications, video calls, and the like. The controller 180 may include a multimedia module 181 for reproducing multimedia data. The multimedia module 181 may be configured within the controller 180 or may be configured to be separated from the controller 180.
The controller 180 may perform a pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively.
Also, the controller 180 may execute a lock state to restrict a user from inputting control commands for applications when a state of the mobile terminal meets a preset condition. Also, the controller 180 may control a lock screen displayed in the lock state based on a touch input sensed on the display unit 151 in the lock state of the mobile terminal.
The power supply unit 190 receives external power or internal power and supplies appropriate power required for operating respective elements and components under the control of the controller 180.
Various embodiments described herein may be implemented in a computer-readable medium or a similar medium using, for example, software, hardware, or any combination thereof.
For hardware implementation, the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic units designed to perform the functions described herein. In some cases, such embodiments may be implemented by the controller 180 itself.
For software implementation, the embodiments such as procedures or functions described herein may be implemented by separate software modules. Each software module may perform one or more functions or operations described herein.
Software codes can be implemented by a software application written in any suitable programming language. The software codes may be stored in the memory 160 and executed by the controller 180.
Hereinafter, a communication system which is operable with the display device 100 according to the present disclosure will be described.
FIGS. 2A and 2B are conceptual views of a communication system operable with a display device 100 in accordance with the present disclosure.
First, referring to FIG. 2A, such communication systems utilize different air interfaces and/or physical layers. Examples of such air interfaces utilized by the communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS), the Long Term Evolution (LTE) of the UMTS, the Global System for Mobile Communications (GSM), and the like.
By way of non-limiting example only, further description will relate to a CDMA communication system, but such teachings apply equally to other system types.
Referring now to FIG. 2A, a CDMA wireless communication system is shown having a plurality of mobile terminals 100, a plurality of base stations (BSs) 270, base station controllers (BSCs) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a conventional Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275. The BSCs 275 are coupled to the base stations 270 via backhaul lines. The backhaul lines may be configured in accordance with any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Hence, a plurality of BSCs 275 can be included in the system as shown in FIG. 2A.
Each base station 270 may include one or more sectors, each sector having an omni-directional antenna or an antenna pointed in a particular direction radially away from the base station 270. Alternatively, each sector may include two or more different antennas. Each base station 270 may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of sector and frequency assignment may be referred to as a CDMA channel. The base stations 270 may also be referred to as Base Station Transceiver Subsystems (BTSs). In some cases, the term "base station" may be used to refer collectively to a BSC 275, and one or more base stations 270. The base stations may also be denoted as "cell sites." Alternatively, individual sectors of a given base station 270 may be referred to as cell sites.
A broadcasting transmitter (BT) 295, as shown in FIG. 2A, transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 (FIG. 1) is typically configured inside the display device 100 to receive broadcast signals transmitted by the BT 295.
FIG. 2A further depicts several Global Positioning System (GPS) satellites 300. Such satellites 300 facilitate locating the position of at least one of plural mobile terminals 100. Two satellites are depicted in FIG. 2A, but it is understood that useful position information may be obtained with greater or fewer satellites than two satellites. The GPS module 115 (FIG. 1) is typically configured to cooperate with the satellites 300 to obtain desired position information. It is to be appreciated that other types of position detection technology (i.e., location technology that may be used in addition to or instead of GPS location technology) may alternatively be implemented. If desired, at least one of the GPS satellites 300 may alternatively or additionally be configured to provide satellite DMB transmissions.
During typical operation of the wireless communication system, the base stations 270 receive sets of reverse-link signals from various mobile terminals 100. The mobile terminals 100 are engaged in calls, messaging, and other communications. Each reverse-link signal received by a given base station 270 is processed within that base station 270. The resulting data is forwarded to an associated BSC 275. The BSC 275 provides call resource allocation and mobility management functionality, including the orchestration of soft handoffs between the base stations 270. The BSCs 275 also route the received data to the MSC 280, which then provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, and the MSC 280 interfaces with the BSCs 275, which in turn control the base stations 270 to transmit sets of forward-link signals to the mobile terminals 100.
Hereinafter, description will be given of a method for acquiring location information of a mobile terminal using a wireless fidelity (WiFi) positioning system (WPS), with reference to FIG. 2B.
The WiFi positioning system (WPS) 300 refers to a wireless local area network (WLAN)-based location determination technology that tracks the location of the display device 100 using a WiFi module provided in the display device 100 and a wireless access point 320 that transmits to and receives from the WiFi module.
The WiFi positioning system 300 may include a WiFi location determination server 310, a display device 100, a wireless access point (AP) 320 connected to the display device 100, and a database 330 in which arbitrary wireless AP information is stored.
The WiFi location determination server 310 extracts the information of the wireless AP 320 connected to the display device 100 based on a location information request message (or signal) of the display device 100. The information of the wireless AP 320 may be transmitted to the WiFi location determination server 310 through the display device 100 or transmitted to the WiFi location determination server 310 from the wireless AP 320.
The information of the wireless AP extracted based on the location information request message of the display device 100 may be at least one of MAC address, SSID, RSSI, channel information, privacy, network type, signal strength and noise strength.
The WiFi location determination server 310 receives the information of the wireless AP 320 connected to the display device 100 as described above, and compares the received wireless AP 320 information with information contained in the pre-established database 330 to extract (or analyze) the location information of the display device 100.
On the other hand, referring to FIG. 2B, as an example, the wireless AP connected to the display device 100 is illustrated as a first, a second, and a third wireless AP 320. However, the number of wireless APs connected to the display device 100 may vary according to the wireless communication environment in which the display device 100 is located. When the display device 100 is connected to at least one wireless AP, the WiFi positioning system 300 can track the location of the display device 100.
Next, considering the database 330 in more detail, various information on arbitrary wireless APs disposed at different locations may be stored in the database 330.
The information on any wireless AP stored in the database 330 may be information such as MAC address, SSID, RSSI, channel information, privacy, network type, latitude and longitude coordinates, the building at which the wireless AP is located, floor number, detailed indoor location information (GPS coordinates available), the AP owner's address, phone number, and the like.
In this manner, wireless AP information and the location information corresponding to each wireless AP are stored together in the database 330. Thus, the WiFi location determination server 310 may retrieve, from the database 330, the wireless AP information corresponding to the information of the wireless AP 320 connected to the display device 100 and extract the location information matched to the retrieved wireless AP, thereby extracting the location information of the display device 100.
Furthermore, the extracted location information of the display device 100 may be transmitted to the display device 100 through the WiFi location determination server 310, thereby acquiring the location information of the display device 100.
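The lookup performed by the WiFi location determination server can be sketched as follows. The disclosure only states that the server compares the received AP information with the database; the RSSI-weighted centroid used here is one plausible estimation scheme, and all names are assumptions.

```kotlin
// Illustrative sketch of the WPS lookup: match each reported AP by MAC
// address against the pre-established database, then estimate the device
// position as an RSSI-weighted centroid of the matched AP locations.
// The weighting scheme is an assumption; stronger signals weigh more.

data class ApReport(val mac: String, val rssiDbm: Int)          // e.g., rssiDbm = -60
data class ApRecord(val mac: String, val lat: Double, val lon: Double)

fun estimateLocation(reports: List<ApReport>, db: Map<String, ApRecord>): Pair<Double, Double>? {
    var wSum = 0.0; var lat = 0.0; var lon = 0.0
    for (r in reports) {
        val rec = db[r.mac] ?: continue                  // AP not in database: skip
        val w = 1.0 / (-r.rssiDbm).coerceAtLeast(1)      // -40 dBm outweighs -80 dBm
        lat += w * rec.lat; lon += w * rec.lon; wSum += w
    }
    return if (wSum > 0) Pair(lat / wSum, lon / wSum) else null
}
```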
Hereinafter, a structure of the display device 100 of FIG. 1 will be explained in more detail.
FIG. 3 is a diagram illustrating the display device 100 according to one embodiment of the present invention.
Referring to FIG. 3, the display device 100 according to the present invention includes the main body 310, the display unit 151, and the controller 180.
The display device 100 according to the present invention is realized as a head-mounted display (HMD) device. As a specific example, the display device 100 according to the present invention is realized as smart glasses.
The main body 310 is formed in such a manner that the main body 310 can be head-mounted. For example, the main body 310 is realized as a frame of the smart glasses.
The display unit 151 is coupled to the main body 310 in such a manner that the display unit 151 is arranged in a position corresponding to the left and right eyes. Visual information is output to the display unit 151 in such a manner that the visual information is superimposed on an external image that is input.
At this time, the external image means an external environment that is viewed through the display unit 151 by a wearer of the display device 100. That is, the external image is similar to the external environment that is viewed through the lenses by a wearer of glasses.
Then, the visual information means a virtual object that is generated in the display device 100 or is input from an external device. For example, the visual information means an application, an icon corresponding to the application, content, a UI for a telephone conversation mode, and the like. The visual information is generated by the controller 180 or is input from a mobile terminal such as a smart phone.
As an embodiment, an icon for forecasting the weather may be output to the display unit 151, along with the external image.
In addition, the display unit 151 is realized as smart glass whose light transmissivity can change. Accordingly, the display unit 151 adjusts its transparency by changing the light transmissivity in a specific situation or under a specific condition.
As described above, the controller 180 controls the display device 100. Specifically, the controller 180 generates a control command for adjusting the transparency of the display unit 151. The display unit 151 then adjusts its transparency according to the control command generated by the controller 180 in order to adjust the clearness of the external image. For example, if the wearer watches movie content, the controller 180 generates a control command for setting the display unit 151 to an opaque state. The transparency is adjusted according to the control command so that the display unit 151 is in the opaque state. As a result, the external image is blocked from the wearer's view and only the movie content is visible to the wearer.
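This interaction between the controller and the display unit can be sketched as follows; the transmissivity scale and all names are illustrative assumptions, not from the disclosure.

```kotlin
// Illustrative sketch: the controller issues an opacity command when
// full-screen content such as a movie starts, and the display unit maps it
// to a light transmissivity level (1.0 = fully transparent, 0.0 = opaque).

class DisplayUnit {
    var transmissivity: Double = 1.0
        private set

    fun apply(level: Double) {
        transmissivity = level.coerceIn(0.0, 1.0) // clearness of the external image
    }
}

fun onContentStarted(fullScreen: Boolean, display: DisplayUnit) {
    display.apply(if (fullScreen) 0.0 else 1.0) // opaque for immersive content
}
```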
In addition, the controller 180 is mounted on the main body 310 of the display device 100, or the controller 180 and the main body 310 are integrally formed into one piece. As another embodiment, the controller 180 may be arranged away from the main body 310.
The camera 121 is arranged in front of at least one of the left-eye and right-eye display units 151. In addition, one camera 121 may be arranged on one side of the frame 310, or two cameras 121 may be arranged on both sides of the frame 310, respectively, to photograph an object that is outside the wearer's field of view.
The user input unit 130 is realized as a separate touch panel on one side of the frame 310, or as separate touch panels on both sides of the frame 310. In addition, the user input unit 130 may be realized as a physical key. For example, an on/off switch for a power source may be provided on one side of the frame 310.
As another embodiment, the user input unit 130 may be realized as a separate external device that connects to the main body 310. Accordingly, the user can input a specific command into the separate external device. In addition, the display unit 151 is realized as a touch screen and thus the user can directly input the control command into the display unit 151.
On the other hand, as one example of the head-mounted display device, smart glasses are available on the market. The smart glasses, realized as a wearable device, conveniently execute functions that are executed in existing mobile terminals.
However, since the external image viewed through the smart glasses being worn is displayed along with the visual information that is output (augmented reality), situations occur in which the visual information is difficult to recognize. In addition, because the content is displayed on the lenses of the smart glasses, personal content or content whose security has to be ensured may be viewable to other people.
Accordingly, the display device 100 capable of adjusting the transparency of the display unit 151 for improving user convenience and a method of controlling the display device 100 are described below referring to the accompanying drawings.
FIG. 4 is a flowchart for describing the display device 100 (refer to FIG. 1) according to one embodiment of the present invention. The display device 100 includes the main body 310, the display unit 151, and the controller 180.
Referring to FIG. 4, first, Step S410 proceeds in which the visual information is output in such a manner that the visual information is superimposed on the external image that is input through the display unit 151.
At this time, the display unit 151 is coupled to the main body 310 that is formed in such a manner that the main body 310 is mountable on the head, and is arranged in the position corresponding to the left and right eyes.
Subsequently, Step S420 proceeds in which the control command for adjusting the transparency of the display unit 151 is generated.
Thereafter, Step S430 proceeds in which the clearness of the external image is adjusted by adjusting the transparency according to the generated control command.
As described above, the controller 180 generates the control command for adjusting the transparency of the display unit 151 in the specific situation or condition, and the display unit 151 sets the transparency according to the generated control command.
As an embodiment, the controller 180 may generate the control command for adjusting the transparency according to the content that is output as the visual information. As a result, the display unit 151 outputs only the visual information, and not the external image that is input, by adjusting the transparency according to the control command.
FIGS. 5A and 5B are diagrams illustrating an embodiment in which the transparency is adjusted according to the content. FIG. 6 is a diagram illustrating another embodiment in which the transparency is adjusted according to the content.
FIG. 5A(a) illustrates a screen that is output to the display unit 151 if the user checks email in a park while wearing the display device 100. That is, an external image 510 that is an image of the park, and visual information 520 that is email, are output together.
FIG. 5A(b) illustrates the screen output to the display unit 151 in FIG. 5A(a) when viewed from the outside. It is apparent from FIG. 5A(b) that the email 520 can be seen from the outside through the transparent display unit 151.
Referring to FIG. 5B(a), if the security of the details of the email 520 is to be ensured, the display unit 151 is set to be in a translucent or opaque state. As a result, only the email 520 is output to the display unit 151, and the external image 510 is output with less clearness.
FIG. 5B(b) illustrates the display unit 151 in the translucent or opaque state of FIG. 5B(a) when viewed from the outside. It is apparent from FIG. 5B(b) that the details of the email 520 are difficult to check from the outside.
That is, if content whose security has to be ensured or whose privacy has to be protected is output, the display unit 151 is set to be in the translucent or opaque state to prevent the content from being checked from the outside.
For example, content whose security has to be ensured includes email related to company affairs, confidential matters, payment information, and so on. Content whose privacy has to be protected includes a photo album, a message received from a specific person, a personal profile, personal information, and so on.
Referring to FIG. 6(a), if the wearer of the display device 100 watches movie content while riding a subway train, the external image, which is an image of the inside of the railway carriage, and the visual information, which is the movie content, are output together. At this time, the movie content and the external image are output in such a manner that they overlap each other. Thus, the external image hinders the user from watching the movie.
Referring to FIG. 6(b), the display unit 151 is set to be in the translucent or opaque state in order to block the external image and output only the movie content.
That is, if predetermined content is output, or according to a type of content, the display unit 151 is set to be in the translucent or opaque state in order to block the external image and output only the content. In this case, as illustrated in FIGS. 5A and 5B, the content is difficult to check from the outside.
For example, if content such as a movie or a game is output, the display unit 151 is set to be in the translucent or opaque state in order to block the external image and output only the content on the entire screen.
As an embodiment, if the movie content displayed as illustrated in FIG. 6(b) is stopped or temporarily stopped, the external image may be output together as illustrated in FIG. 6(a).
As another embodiment, if a movement of the wearer is detected, the outputting of the blocked external image may be resumed. Accordingly, safety hazards that could result from blocking the external image are prevented.
On the other hand, the controller 180 generates the control command for adjusting the transparency according to an amount of light that is detected from the outside.
FIG. 7 is a diagram illustrating an embodiment in which the transparency is adjusted according to the amount of light that is detected from the outside.
Referring to FIG. 7, the display device 100 adjusts the transparency of the display unit 151 according to the amount of external light that is detected through an optical sensor or the camera 121.
For example, if strong sunlight is detected, the transparency of the display unit 151 is adjusted in order to decrease the clearness of the external image. As a result, the display device 100 serves as sunglasses that protect the wearer's eyes from the strong light.
In addition, under a proper amount of light, the wearer perceives both the external image and the visual information clearly.
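One plausible reading of this behavior is a monotonic mapping from measured ambient light to transparency; the lux thresholds and the linear interpolation below are assumptions, not values taken from the disclosure.

```python
# Illustrative light-adaptive transparency: dim the view under strong sunlight
# so the device acts like sunglasses. Thresholds are assumed, not disclosed.

def transparency_for_ambient_light(lux: float) -> float:
    DIM, BRIGHT = 1_000.0, 50_000.0   # assumed comfort range in lux
    if lux <= DIM:
        return 1.0                    # normal light: fully transparent
    if lux >= BRIGHT:
        return 0.2                    # strong sunlight: heavily dimmed
    # Linear interpolation between the two extremes.
    return 1.0 - 0.8 * (lux - DIM) / (BRIGHT - DIM)
```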
On the other hand, the controller 180 generates the control command for adjusting the transparency according to a predetermined condition.
As an embodiment, if blocking the external image would cause a safety hazard, the display unit 151 remains in the transparent state. Specifically, even when a control command for blocking the external image is input, if it is determined that the wearer is driving or walking, the external image is not blocked, and a message stating that the external image is not blocked for safety reasons is output.
As another embodiment, if the display device 100 is located in a place where the wearing of smart glasses is not allowed, the display unit 151 may be changed to the opaque state, and at the same time the display device 100 may be made to stop operating.
As another embodiment, when adult or violent content is output to an external device and the wearer is under age, the display unit 151 may be made opaque in order to prevent the underage wearer from watching that content on the external device.
As another embodiment, an application or content that corresponds to a predetermined condition may be executed or displayed on the opaque display unit 151. Specifically, if the user video-chats online with a member of the opposite sex, the display unit 151 is automatically switched to an opaque mode in order to protect privacy. In addition, if company email is checked, the display unit 151 is automatically switched to the opaque mode in order to ensure security.
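The condition-based embodiments above amount to a priority scheme in which the safety rule overrides any request to block the external image. A minimal sketch follows, with all predicate and interface names assumed for illustration.

```python
# Illustrative priority logic for condition-based transparency. Every
# predicate and the `display`/`notify` interfaces are hypothetical.

def resolve_transparency(requested: float, wearer_is_moving: bool,
                         in_restricted_area: bool, display, notify) -> None:
    if in_restricted_area:
        display.set_transparency(0.0)  # opaque where smart glasses are not allowed
        display.power_off()            # and the device stops operating
    elif requested == 0.0 and wearer_is_moving:
        # Safety overrides the blocking request while driving or walking.
        notify("External image is not blocked for safety reasons")
        display.set_transparency(1.0)
    else:
        display.set_transparency(requested)
```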
On the other hand, the controller 180 generates the control command for adjusting the transparency according to a locked state and an unlocked state.
FIG. 8 is a diagram illustrating an embodiment in which the transparency is adjusted according to the locked state and the unlocked state.
Referring to FIG. 8(a), if a user-authentication process of determining whether the user is a registered user fails, the display unit 151 is changed to the opaque state.
For example, if the user-authentication process fails, an object 810 indicating the locked state is output to the display unit 151. In addition, a warning sound, vibration, or the like is output at the same time. Accordingly, it is easily recognized from the outside whether or not the user is an authenticated user.
As another embodiment, if a given period of time elapses from when the wearer takes off the display device 100, the display unit 151 is switched to the opaque state.
Referring to FIG. 8(b), if the user is authenticated as a registered user using various authentication methods, the display unit 151 is switched to the transparent state.
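The lock-state behavior could be sketched as follows; the feedback interface and icon name are assumptions introduced for illustration.

```python
# Illustrative lock-state rule: opaque when authentication fails,
# transparent when the user is authenticated. APIs are hypothetical.

def on_authentication_result(authenticated: bool, display, feedback) -> None:
    if authenticated:
        display.set_transparency(1.0)      # unlocked: transparent state
    else:
        display.set_transparency(0.0)      # locked: opaque state
        display.show_icon("locked")        # e.g., the object 810 in FIG. 8(a)
        feedback.warn(sound=True, vibration=True)
```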
On the other hand, the controller 180 generates the control command for adjusting the transparency according to a predetermined external command that is input.
FIGS. 9A to 9C are diagrams illustrating an embodiment in which the transparency is adjusted according to the external command that is input.
Referring to FIG. 9A, the predetermined external command is input through the user input unit 130 that is arranged on one side or both sides of the frame 310.
As an embodiment, the transparency of the display unit 151 may be adjusted by applying a touch or a drag input to a separate touch panel that is arranged in the frame 310.
As an embodiment, the user can adjust the transparency of the display unit 151 by operating a physical key arranged in the frame 310.
Referring to FIG. 9B, the predetermined external command is input through contact with or connection to the external device.
As an embodiment, the display unit 151 may be switched to the opaque state if a smart phone and the display device 100 are brought into contact with each other and are connected to each other. Thereafter, the content that is output to the smart phone is output to the opaque display unit 151.
Referring to FIG. 9C, a gesture that the wearer makes toward the display unit 151 is input as the external command.
As an embodiment, if the wearer makes a hand-raising gesture toward the front side of the display unit 151, the transparency may be increased, and if the wearer makes a hand-lowering gesture toward the front side of the display unit 151, the transparency may be decreased. Likewise, the wearer can make a left-to-right sliding gesture toward the front of the display unit 151 to adjust the transparency.
At this time, the wearer can make a gesture command toward each of the left and right display units in order to make the transparency of the left display unit 151 different from the transparency of the right display unit 151.
As an embodiment, if the wearer who is viewing movie or game content on the opaque display unit 151 raises his or her head, the display unit 151 may be switched to the transparent state. In addition, if the wearer looks leftward or rightward, the movement of the pupils of the wearer's eyes is detected, and, of the left and right display units 151, the display unit 151 positioned in the direction in which the movement is detected is switched to the transparent state.
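As a final illustration, the gesture embodiments suggest independent per-eye adjustment; the gesture names and step size below are assumptions made only for this sketch.

```python
# Illustrative gesture-driven, per-eye transparency adjustment.
# Gesture names and the step size are assumed for this sketch.

STEP = 0.1

def on_gesture(gesture: str, side: str, levels: dict) -> dict:
    """Adjust transparency of the 'left' or 'right' display independently."""
    if gesture == "hand_raise":
        levels[side] = min(1.0, levels[side] + STEP)
    elif gesture == "hand_lower":
        levels[side] = max(0.0, levels[side] - STEP)
    return levels

# Example: on_gesture("hand_raise", "left", {"left": 0.5, "right": 0.5})
# -> {"left": 0.6, "right": 0.5}
```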
In addition, according to one embodiment disclosed in the present specification, the method described above may be realized by being stored as processor-readable codes in a program-stored medium. A ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device and the like are examples of the processor-readable medium, and the processor-readable medium may be realized in the form of a carrier wave (for example, a transmission over the Internet).
The configurations and methods of the embodiments according to the present invention, described above, are not applied in a limiting manner; all or some of the embodiments may be selectively combined with each other to create various modifications.
It will also be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
The preferred embodiments of the present invention provide a structure for controlling transparency of the display unit, and are applicable to various industrial fields related to the structure.

Claims (14)

  1. A display device comprising:
    a main body that is formed in a head-mounted manner;
    a display unit which is coupled to the main body and is arranged in a position corresponding to both eyes and to which visual information is output in such a manner that the visual information is superimposed on an external image that is input; and
    a controller that generates a control command for adjusting transparency of the display unit,
    wherein the transparency of the display unit is adjusted according to the control command in order to adjust clearness of the external image.
  2. The display device of claim 1, wherein the controller generates the control command for adjusting the transparency according to content that is output as the visual information.
  3. The display device of claim 1 or 2, wherein except for the external image that is input, only the visual information is output to the display unit by adjusting the transparency according to the control command.
  4. The display device of claim 1, wherein the controller generates the control command for adjusting the transparency according to an amount of light that is detected from the outside.
  5. The display device of claim 1, wherein the controller generates the control command for adjusting the transparency according to a predetermined condition.
  6. The display device of claim 1, wherein the controller generates the control command for adjusting the transparency according to a locked state and an unlocked state.
  7. The display device of claim 1, wherein the controller generates the control command for adjusting the transparency according to a predetermined external command that is input.
  8. A method of controlling a display device, the method comprising:
    a step (a) of outputting visual information in such a manner that the visual information is superimposed on an external image that is input through a display unit;
    a step (b) of generating a control command for adjusting transparency of the display unit; and
    a step (c) of adjusting clearness of the external image that is output, by adjusting the transparency according to the control command,
    wherein the display unit is coupled to a main body that is formed in a head-mounted manner and is arranged in a position that corresponds to both eyes.
  9. The method of claim 8, wherein the step (b) includes a step of generating the control command for adjusting the transparency according to content that is output as the visual information.
  10. The method of claim 8 or 9, wherein the step (c) includes a step of outputting only the visual information, except for the external image that is input, by adjusting the transparency according to the control command.
  11. The method of claim 8, wherein the step (b) includes a step of generating the control command for adjusting the transparency according to an amount of light that is detected from the outside.
  12. The method of claim 8, wherein the step (b) includes a step of generating the control command for adjusting the transparency according to a predetermined condition.
  13. The method of claim 8, wherein the step (b) includes a step of generating the control command for adjusting the transparency according to a locked state and an unlocked state.
  14. The method of claim 8, wherein the step (b) includes a step of generating the control command for adjusting the transparency according to a predetermined external command that is input.
PCT/KR2014/002663 2013-07-02 2014-03-28 Display device and method of controlling the same WO2015002377A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0077362 2013-07-02
KR1020130077362A KR20150004192A (en) 2013-07-02 2013-07-02 Display device and control method thereof

Publications (1)

Publication Number Publication Date
WO2015002377A1 true WO2015002377A1 (en) 2015-01-08

Family

ID=52143928

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/002663 WO2015002377A1 (en) 2013-07-02 2014-03-28 Display device and method of controlling the same

Country Status (2)

Country Link
KR (1) KR20150004192A (en)
WO (1) WO2015002377A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007121625A (en) * 2005-10-27 2007-05-17 Konica Minolta Photo Imaging Inc Image display device
KR20070105657A (en) * 2006-04-27 2007-10-31 (주)디오컴 Glass type monitor
JP2009092810A (en) * 2007-10-05 2009-04-30 Nikon Corp Head-mounted display device
US20110279355A1 (en) * 2009-01-27 2011-11-17 Brother Kogyo Kabushiki Kaisha Head mounted display
US20120086624A1 (en) * 2010-10-12 2012-04-12 Eldon Technology Limited Variable Transparency Heads Up Displays

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3169022A1 (en) * 2015-11-13 2017-05-17 Sony Computer Entertainment Europe Ltd. Communication method and device
US10120190B2 (en) 2015-11-13 2018-11-06 Sony Interactive Entertainment Europe Limited Entertainment device and method of communication for an entertainment device
EP3187963A1 (en) * 2015-12-31 2017-07-05 Shanghai Xiaoyi Technology Co., Ltd. Display device and method
IT202100021212A1 (en) * 2021-08-05 2023-02-05 Luxottica Srl Electronic glasses.
WO2023012014A1 (en) * 2021-08-05 2023-02-09 Luxottica S.R.L. Electronic eyeglasses

Also Published As

Publication number Publication date
KR20150004192A (en) 2015-01-12

Similar Documents

Publication Publication Date Title
WO2015026030A1 (en) Display device and method of controlling the same
WO2015002362A1 (en) Display device and control method thereof
WO2017209533A1 (en) Mobile device and controlling method thereof
WO2015053470A1 (en) Mobile terminal and control method thereof
WO2015190666A1 (en) Mobile terminal and method for controlling the same
WO2014129753A1 (en) Mobile terminal and touch coordinate predicting method thereof
WO2017014374A1 (en) Mobile terminal and controlling method thereof
US9761050B2 (en) Information provision device for glasses-type terminal and information provision method
WO2017039098A1 (en) Mobile device, wearable device and method of controlling each device
WO2015053449A1 (en) Glass-type image display device and method for controlling same
WO2015060501A1 (en) Apparatus and method for controlling mobile terminal
EP3028206A1 (en) Mobile terminal, smart watch, and method of performing authentication with the mobile terminal and the smart watch
WO2018048092A1 (en) Head mounted display and method for controlling the same
WO2015130040A1 (en) Mobile terminal and controlling method thereof
WO2015199287A1 (en) Head mounted display and method of controlling the same
WO2016056723A1 (en) Mobile terminal and controlling method thereof
EP2982042A1 (en) Terminal and control method thereof
WO2019124641A1 (en) Camera module and mobile terminal having the same
WO2019112165A1 (en) Electronic device having camera device
WO2018016731A1 (en) Electronic device and method for controlling the same
WO2014065595A1 (en) Image display device and method for controlling same
WO2014123306A1 (en) Mobile terminal and control method thereof
WO2019059466A1 (en) Digital device and biometric authentication method therein
EP3542523A1 (en) Mobile terminal and method for controlling the same
WO2014142373A1 (en) Apparatus for controlling mobile terminal and method therefor

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14820031; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14820031; Country of ref document: EP; Kind code of ref document: A1)