WO2024076007A1 - Wearable device and method for adjusting display brightness based on lens transparency - Google Patents

Wearable device and method for adjusting display brightness based on lens transparency

Info

Publication number
WO2024076007A1
WO2024076007A1 (PCT/KR2023/013284)
Authority
WO
WIPO (PCT)
Prior art keywords
light
display
wearable device
intensity
transparency
Prior art date
Application number
PCT/KR2023/013284
Other languages
English (en)
Korean (ko)
Inventor
김종아
윤희웅
이기혁
이동한
최광호
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220136904A (external priority; published as KR20240048430A)
Application filed by Samsung Electronics Co., Ltd. (삼성전자주식회사)
Publication of WO2024076007A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 Photometry, e.g. photographic exposure meter
    • G01J1/02 Details
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/04 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements, e.g. segments, using a combination of such display devices for composing words, rows or the like, in a frame with fixed character positions
    • G09G3/16 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements, by control of light from an independent source
    • G09G3/19 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of a single character by selection from a plurality of characters, or by composing the character by combination of individual elements, by control of light from an independent source using electrochromic devices
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators, characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • the present disclosure relates to a wearable device and method for adjusting the brightness of a display based on the transparency of the lens.
  • electronic devices for interaction between the real world and virtual worlds, such as virtual reality (VR), augmented reality (AR), and/or mixed reality (MR) devices, are emerging.
  • such electronic devices can be classified into closed-type devices, through which the outside cannot be seen, and see-through (transmissive) devices, through which both the external environment and the image on the display can be viewed.
  • the transmissive electronic device may include a transparent display. A user of the transmissive electronic device can view visual objects provided by the transmissive electronic device while viewing the external environment through the transparent display.
  • a wearable device may include a lens, a display, a sensor, and a processor.
  • the processor may change the intensity of the first light output from one side of the lens by changing the state of the lens based on the intensity of the first light included in the data of the sensor.
  • the processor may obtain the luminance of the display based on the changed intensity of the first light and a reference luminance associated with the second light output from the display.
  • the processor may be configured to control the display to display a screen having the obtained luminance within a display area formed on the one surface of the lens.
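  • taken together, these bullets describe a two-stage pipeline: the sensor's ambient-light reading first sets the lens state, and the attenuated ambient light plus a reference luminance then set the display luminance. A minimal sketch of that pipeline in Python, under an assumed linear attenuation model (the function names, the 500-lux pivot, and the clamping bounds are illustrative assumptions, not taken from the publication):

        def choose_transparency(ambient_lux: float) -> float:
            """Darken the lens as the surroundings get brighter (illustrative policy)."""
            return max(0.1, min(1.0, 500.0 / max(ambient_lux, 1.0)))

        def update_display(ambient_lux: float, reference_luminance: float) -> tuple[float, float]:
            # First light: external light attenuated by the lens transparency.
            transparency = choose_transparency(ambient_lux)
            attenuated_lux = ambient_lux * transparency
            # Second light: scale the display luminance from the attenuated
            # external light and the reference luminance, so the screen keeps
            # a roughly constant contrast against the see-through background.
            luminance = reference_luminance * (attenuated_lux / 500.0)
            return transparency, luminance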
  • the method may include an operation of changing, based on the intensity of the first light included in the data of the sensor, the transparency of the lens with respect to the first light by changing the state of a transparent member included in the lens, which transmits the first light directed to its first surface toward its second surface.
  • the method may include an operation of obtaining, based on the intensity of the first light attenuated by the lens and a reference luminance associated with the first light and the second light, the luminance of the display for outputting the second light through a display area formed on the second surface of the lens.
  • the method may include controlling the display based on the obtained luminance to display a screen 702 within the display area.
  • a wearable device may include a memory, a lens that transmits first light directed to a first surface toward a second surface, a display that outputs second light through a display area formed on the second surface of the lens, a sensor, and a processor.
  • the processor may change the transparency of the lens with respect to the first light based on the intensity of the first light included in the data of the sensor while the wearable device is worn by the user.
  • the processor may obtain the luminance of the display based on the intensity of the first light attenuated by the lens.
  • the processor may identify a change in the intensity of the first light included in the data from the sensor while displaying, based on the obtained luminance, a screen in the display area.
  • the processor may, based on identifying the change in the intensity of the first light, maintain the brightness by controlling the transparency using the user's profile information related to the attenuated intensity of the first light.
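  • read this way, the last two bullets form a feedback rule: when the sensed ambient intensity changes mid-session, the device re-solves for the transparency that restores the user's preferred attenuated intensity instead of changing the screen. A sketch under that assumption (the profile field name is hypothetical):

        def maintain_brightness(ambient_lux: float, profile: dict) -> float:
            """Keep the external light reaching the eye at the user's stored preference."""
            preferred_lux = profile["preferred_attenuated_lux"]  # hypothetical profile field
            # attenuated = transparency * ambient, so solve for transparency.
            transparency = preferred_lux / max(ambient_lux, 1.0)
            return max(0.0, min(1.0, transparency))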
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments.
  • FIG. 2A shows an example of a perspective view of a wearable device, according to one embodiment.
  • FIG. 2B illustrates an example of one or more hardware deployed within a wearable device, according to one embodiment.
  • FIGS. 3A and 3B show an example of the appearance of a wearable device, according to an embodiment.
  • Figure 4 shows an example block diagram of a wearable device according to one embodiment.
  • FIGS. 5A and 5B illustrate example states for adjusting transparency of a wearable device, according to an embodiment.
  • FIG. 6 shows an example graph representing the relationship between luminance, transparency, and illuminance, according to one embodiment.
  • FIGS. 7A and 7B illustrate example states in which a wearable device adjusts luminance, according to an embodiment.
  • Figure 8 is an example flowchart showing the operation of a wearable device, according to one embodiment.
  • FIGS. 9A and 9B illustrate example states in which a wearable device adjusts luminance according to user preference, according to an embodiment.
  • Figure 10 is an example flowchart showing the operation of a wearable device, according to an embodiment.
  • Figure 11 is an example flowchart showing the operation of a wearable device, according to an embodiment.
  • the term "module" used in this document includes a unit comprised of hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrated part, a minimum unit that performs one or more functions, or a part thereof.
  • a module may be comprised of an application-specific integrated circuit (ASIC).
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
  • the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • in some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • in some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • the processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations.
  • according to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121.
  • when the electronic device 101 includes both the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be set to use lower power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from, or as part of, the main processor 121.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • according to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • artificial intelligence models may be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself, where the artificial intelligence model is executed, or through a separate server (e.g., the server 108).
  • learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • an artificial neural network may be a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to the examples described above.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101. Data may include, for example, input data or output data for software (e.g., program 140) and instructions related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
  • the input module 150 may include, for example, a microphone, mouse, keyboard, keys (eg, buttons), or digital pen (eg, stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. Speakers can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • the audio module 170 may convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • the sensor module 176 may detect the operating state (e.g., power or temperature) of the electronic device 101 or the external environmental state (e.g., the user's state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high-definition multimedia interface (HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) or wireless communication.
  • according to one embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • the corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or WAN)).
  • the wireless communication module 192 may use subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196 to identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199.
  • the wireless communication module 192 may support a 5G network, after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • the wireless communication module 192 may support a high-frequency band (e.g., the mmWave band), for example, to achieve a high data rate.
  • the wireless communication module 192 may support various technologies for securing performance in high-frequency bands, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., electronic device 104), or a network system (e.g., second network 199).
  • the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency for realizing URLLC.
  • the antenna module 197 may transmit or receive signals or power to or from the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in the communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • according to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • a mmWave antenna module may include a printed circuit board; an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band); and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • at least some of the components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general-purpose input and output (GPIO), a serial peripheral interface (SPI), or a mobile industry processor interface (MIPI)) and exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • each of the external electronic devices 102 or 104 may be a device of the same or a different type from the electronic device 101.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • for example, when the electronic device 101 needs to perform a function or service, the electronic device 101 may, instead of executing the function or service on its own, request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • FIG. 2A shows an example of a perspective view of a wearable device, according to one embodiment.
  • FIG. 2B illustrates an example of one or more hardware deployed within a wearable device, according to one embodiment.
  • the wearable device 101 may have the form of glasses that can be worn on a user's body part (eg, head).
  • the wearable device 101 of FIGS. 2A and 2B may be an example of the electronic device 101 of FIG. 1 .
  • the wearable device 101 may include a head-mounted display (HMD).
  • the housing of the wearable device 101 may include a flexible material, such as rubber and/or silicone, shaped to adhere closely to a portion of the user's head (e.g., the portion of the face surrounding both eyes).
  • the housing of the wearable device 101 may include one or more straps that can be wound around the user's head, and/or one or more temples attachable to the ears of the head.
  • a wearable device 101 may include at least one display 250 and a frame 200 supporting the at least one display 250.
  • the wearable device 101 may be worn on a part of the user's body.
  • the wearable device 101 may provide augmented reality (AR), virtual reality (VR), or mixed reality (MR) that combines augmented reality and virtual reality to a user wearing the wearable device 101.
  • in response to a user's designated gesture obtained through the motion recognition cameras 260-2 and 264 of FIG. 2B, the wearable device 101 may display a virtual reality image provided by at least one optical device 282, 284 of FIG. 2B on at least one display 250.
  • At least one display 250 may provide visual information to the user.
  • at least one display 250 may include a transparent or translucent lens.
  • At least one display 250 may include a first display 250-1 and/or a second display 250-2 spaced apart from the first display 250-1.
  • the first display 250-1 and the second display 250-2 may be placed at positions corresponding to the user's left eye and right eye, respectively.
  • at least one display 250 may transmit, to the user, visual information carried by external light through a lens included in the at least one display 250, and may provide other visual information distinct from that visual information.
  • the lens may be formed based on at least one of a Fresnel lens, a pancake lens, or a multi-channel lens.
  • at least one display 250 may include a first surface 231 and a second surface 232 opposite to the first surface 231 .
  • a display area may be formed on the second side 232 of at least one display 250.
  • at least one display 250 may display, in the display area formed on the second surface 232, an augmented reality image in which a virtual image provided by at least one optical device 282 or 284 is combined with the real scene transmitted through external light.
  • according to one embodiment, the at least one display 250 may include at least one waveguide 233, 234 that diffracts light emitted from the at least one optical device 282, 284 and transmits it to the user.
  • At least one waveguide 233 or 234 may be formed based on at least one of glass, plastic, or polymer.
  • a nanopattern may be formed on the outside or at least a portion of the inside of at least one waveguide (233, 234).
  • the nanopattern may be formed based on a polygonal and/or curved grating structure. Light incident on one end of the at least one waveguide (233, 234) may propagate to the other end of the at least one waveguide (233, 234) by the nanopattern.
  • At least one waveguide 233 or 234 may include at least one of a diffractive element (e.g., a diffractive optical element (DOE), a holographic optical element (HOE)), or a reflective element (e.g., a reflective mirror).
  • at least one waveguide 233 or 234 may be disposed within the wearable device 101 to guide the screen displayed by the at least one display 250 to the user's eyes.
  • the screen may be transmitted to the user's eyes based on total internal reflection (TIR) generated within at least one waveguide (233, 234).
  • the wearable device 101 may analyze objects included in real-world images collected through a photographing camera (not shown), and may combine, with the real-world image, a virtual object corresponding to the object selected as the target of augmented reality among the analyzed objects, and display the result on at least one display 250.
  • the virtual object may include at least one of text and images for various information related to the object included in the real image.
  • the wearable device 101 can analyze objects based on multi-cameras, such as stereo cameras. For the object analysis, the wearable device 101 may perform time-of-flight (ToF) and/or simultaneous localization and mapping (SLAM) supported by multi-cameras.
  • a user wearing the wearable device 101 can watch images displayed on at least one display 250.
  • the frame 200 may have a physical structure that allows the wearable device 101 to be worn on the user's body. According to one embodiment, the frame 200 may be configured so that, when the user wears the wearable device 101, the first display 250-1 and the second display 250-2 are positioned at locations corresponding to the user's left and right eyes.
  • the frame 200 may support at least one display 250. For example, the frame 200 may support the first display 250-1 and the second display 250-2 to be positioned at positions corresponding to the user's left eye and right eye.
  • the frame 200 may include an area 220 at least partially in contact with a portion of the user's body when the user wears the wearable device 101.
  • the area 220 of the frame 200 that contacts a part of the user's body may include a part of the user's nose, a part of the user's ear, and a side part of the user's face that the wearable device 101 touches.
  • the frame 200 may include a nose pad 210 that contacts a part of the user's body. When the wearable device 101 is worn by a user, the nose pad 210 may be in contact with a portion of the user's nose.
  • the frame 200 may include a first temple 204 and a second temple 205 that are in contact with another part of the user's body that is distinct from the part of the user's body.
  • the frame 200 may include a first rim 201 surrounding at least a portion of the first display 250-1, a second rim 202 surrounding at least a portion of the second display 250-2, a bridge 203 disposed between the first rim 201 and the second rim 202, a first pad 211 disposed along a portion of the edge of the first rim 201 from one end of the bridge 203, a second pad 212 disposed along a portion of the edge of the second rim 202 from the other end of the bridge 203, a first temple 204 extending from the first rim 201 and fixed to a portion of the wearer's ear, and a second temple 205 extending from the second rim 202 and fixed to a portion of the opposite ear.
  • the first pad 211 and the second pad 212 may be in contact with a portion of the user's nose, and the first temple 204 and the second temple 205 may be in contact with a portion of the user's face and a portion of the ear.
  • the temples 204 and 205 may be rotatably connected to the rim through the hinge units 206 and 207 of FIG. 2B.
  • the first temple 204 may be rotatably connected to the first rim 201 through a first hinge unit 206 disposed between the first rim 201 and the first temple 204. .
  • the second temple 205 may be rotatably connected to the second rim 202 through a second hinge unit 207 disposed between the second rim 202 and the second temple 205.
  • the wearable device 101 may use a touch sensor, a grip sensor, and/or a proximity sensor formed on at least a portion of the surface of the frame 200 to identify an external object (e.g., a user's fingertip) touching the frame 200, and/or a gesture performed by that external object.
  • the wearable device 101 may include hardware that performs various functions (eg, hardware to be described later based on the block diagram of FIG. 4).
  • the hardware may include a battery module 270, an antenna module 275, at least one optical device 282, 284, speakers (e.g., the speakers 255-1 and 255-2), microphones (e.g., the microphones 265-1, 265-2, and 265-3), a light emitting module (not shown), and/or a printed circuit board (PCB) 290.
  • Various hardware may be placed within frame 200.
  • microphones (e.g., the microphones 265-1, 265-2, and 265-3) of the wearable device 101 may be disposed in at least a portion of the frame 200 to acquire a sound signal.
  • although three microphones are shown in FIG. 2B, the number and placement of the microphones 265 are not limited to the embodiment of FIG. 2B.
  • the wearable device 101 may identify the direction of a sound signal using a plurality of microphones disposed on different parts of the frame 200, as illustrated below.
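  • as an illustration of how a plurality of microphones enables this, the direction of a sound source can be estimated from the time difference of arrival (TDOA) between two microphones a known distance apart. A minimal sketch (the geometry and interface are illustrative assumptions, not from the publication):

        import numpy as np

        SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

        def direction_from_tdoa(left: np.ndarray, right: np.ndarray,
                                sample_rate: float, mic_distance_m: float) -> float:
            """Estimate the azimuth (radians) of a sound source from two microphone channels."""
            # Cross-correlate the channels to find the inter-microphone lag in samples.
            corr = np.correlate(left, right, mode="full")
            lag = int(np.argmax(corr)) - (len(right) - 1)
            # Convert the lag to a path-length difference, then to an arrival angle.
            path_diff_m = lag / sample_rate * SPEED_OF_SOUND
            return float(np.arcsin(np.clip(path_diff_m / mic_distance_m, -1.0, 1.0)))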
  • At least one optical device 282 or 284 may project a virtual object on at least one display 250 in order to provide various image information to the user.
  • at least one optical device 282, 284 may be a projector.
  • the at least one optical device 282 or 284 may be disposed adjacent to the at least one display 250 or may be included within the at least one display 250 as part of the at least one display 250 .
  • the wearable device 101 may include a first optical device 282 corresponding to the first display 250-1 and a second optical device 284 corresponding to the second display 250-2.
  • for example, the at least one optical device 282, 284 may include the first optical device 282 disposed at an edge of the first display 250-1 and the second optical device 284 disposed at an edge of the second display 250-2.
  • the first optical device 282 may transmit light to the first waveguide 233 disposed on the first display 250-1, and the second optical device 284 may transmit light to the second waveguide 234 disposed on the second display 250-2.
  • the camera 260 may include an imaging camera, an eye tracking camera (ET CAM) 260-1, and/or a motion recognition camera 260-2.
  • the photographing camera, the eye tracking camera 260-1, and the motion recognition camera 260-2 may be placed at different positions on the frame 200 and may perform different functions.
  • the gaze tracking camera 260-1 may output data representing the gaze of the user wearing the wearable device 101.
  • the wearable device 101 may detect the gaze from an image including the user's pupils obtained through the gaze tracking camera 260-1.
  • FIG. 2B shows an example in which the gaze tracking camera 260-1 is positioned toward the user's right eye, but the embodiment is not limited thereto; the gaze tracking camera 260-1 may be placed toward the user's left eye alone, or toward both eyes.
  • the photographing camera may capture a real image or background to be combined with a virtual image to implement augmented reality or mixed reality content.
  • the photographing camera may capture an image of a specific object that exists at a location where the user is looking, and provide the image to at least one display 250.
  • at least one display 250 may display one image in which information about the real image or background, including the image of the specific object obtained using the photographing camera, overlaps a virtual image provided through at least one optical device 282 or 284.
  • according to one embodiment, the photographing camera may be placed on the bridge 203 disposed between the first rim 201 and the second rim 202.
  • by tracking the gaze of the user wearing the wearable device 101, the gaze tracking camera 260-1 can match the user's gaze with the visual information provided on at least one display 250, realizing a more immersive augmented reality. For example, when the user looks forward, the wearable device 101 may naturally display, on at least one display 250, environmental information related to the user's front view at the location where the user is.
  • the gaze tracking camera 260-1 may be configured to capture an image of the user's pupil in order to determine the user's gaze. For example, the gaze tracking camera 260-1 may receive gaze detection light reflected from the user's pupil and track the user's gaze based on the position and movement of the received gaze detection light.
  • the eye tracking camera 260-1 may be placed at positions corresponding to the user's left and right eyes.
  • the eye tracking camera 260-1 may be placed within the first rim 201 and/or the second rim 202 so as to face the direction in which the user wearing the wearable device 101 is located.
  • the motion recognition camera 260-2 may provide a designated event for the screen provided on at least one display 250 by recognizing the movement of all or part of the user's body, such as the user's torso, hands, or face.
  • the gesture recognition camera 260-2 may recognize a user's gesture, obtain a signal corresponding to the gesture, and provide a display corresponding to the signal to at least one display 250.
  • the processor may identify a signal corresponding to the operation and perform a designated function based on the identification.
  • the motion recognition camera 260-2 may be disposed on the first rim 201 and/or the second rim 202.
  • the camera 260 included in the wearable device 101 is not limited to the eye tracking camera 260-1 and the motion recognition camera 260-2 described above.
  • the wearable device 101 may use the camera 260 disposed toward the user's FoV to identify external objects included within the FoV. The identification of an external object by the wearable device 101 may be performed based on a sensor for identifying the distance between the wearable device 101 and the external object, such as a depth sensor and/or a time-of-flight (ToF) sensor.
  • the camera 260 disposed toward the FoV may support an autofocus function and/or an optical image stabilization (OIS) function.
  • the wearable device 101 may include a camera 260 (e.g., a face tracking (FT) camera) disposed toward the face of the user wearing the wearable device 101 to obtain an image including the face.
  • the wearable device 101 may further include a light source (e.g., an LED) that radiates light toward a subject (e.g., the user's eyes, face, and/or an external object within the FoV) captured using the camera 260.
  • the light source may include an LED with an infrared wavelength.
  • the light source may be disposed in at least one of the frame 200 and the hinge units 206 and 207.
  • the battery module 270 may supply power to electronic components of the wearable device 101.
  • the battery module 270 may be disposed within the first temple 204 and/or the second temple 205.
  • there may be a plurality of battery modules 270.
  • a plurality of battery modules 270 may be disposed on each of the first temple 204 and the second temple 205.
  • the battery module 270 may be disposed at an end of the first temple 204 and/or the second temple 205.
  • the antenna module 275 may transmit a signal or power to the outside of the wearable device 101, or may receive a signal or power from the outside.
  • the antenna module 275 may be disposed within the first temple 204 and/or the second temple 205.
  • the antenna module 275 may be placed close to one surface of the first temple 204 and/or the second temple 205.
  • the speaker 255 may output an audio signal to the outside of the wearable device 101.
  • the sound output module may be referred to as a speaker.
  • the speaker 255 may be placed within the first temple 204 and/or the second temple 205 to be placed adjacent to the ear of the user wearing the wearable device 101.
  • the speaker 255 may include a first speaker 255-1 disposed within the first temple 204 so as to be adjacent to the user's left ear, and a second speaker 255-2 disposed within the second temple 205 so as to be adjacent to the user's right ear.
  • a light emitting module may include at least one light emitting device.
  • the light emitting module may emit light in a color corresponding to a specific state or emit light in an operation corresponding to the specific state. For example, when the wearable device 101 requires charging, it may repeatedly emit red light at designated times.
  • the light emitting module may be disposed on the first rim 201 and/or the second rim 202.
  • the wearable device 101 may include a printed circuit board (PCB) 290.
  • the PCB 290 may be included in at least one of the first temple 204 or the second temple 205.
  • the PCB 290 may include an interposer disposed between at least two sub-PCBs.
  • the wearable device 101 may include a flexible PCB (FPCB) for interconnecting one or more hardware components included in the wearable device 101 (e.g., the hardware shown by the different blocks of FIG. 4).
  • the wearable device 101 may include at least one of a gyro sensor, a gravity sensor, and/or an acceleration sensor for detecting the posture of the wearable device 101 and/or the posture of a body part (e.g., the head) of the user wearing the wearable device 101.
  • the gravity sensor and acceleration sensor may each measure gravitational acceleration and/or acceleration based on designated three-dimensional axes (eg, x-axis, y-axis, and z-axis) that are perpendicular to each other.
  • a gyro sensor can measure the angular velocity of each of designated three-dimensional axes (e.g., x-axis, y-axis, and z-axis).
  • At least one of the gravity sensor, the acceleration sensor, and the gyro sensor may be referred to as an inertial measurement unit (IMU).
  • the wearable device 101 may identify a user's motion and/or gesture performed to execute or stop a specific function of the wearable device 101 based on the IMU.
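  • as a simple illustration of posture detection from such sensors, the pitch and roll of the head can be estimated from a near-static accelerometer sample, treating the reading as the gravity vector along the designated x, y, and z axes (an illustrative sketch, not the device's actual algorithm):

        import math

        def posture_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
            """Estimate pitch and roll (radians) from one gravity/acceleration sample."""
            # Valid only when the device is near-static, so the accelerometer
            # reading approximates the gravity vector.
            pitch = math.atan2(-ax, math.hypot(ay, az))
            roll = math.atan2(ay, az)
            return pitch, roll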
  • FIGS. 3A and 3B show an example of the appearance of a wearable device, according to an embodiment.
  • the wearable device 101 of FIGS. 3A and 3B may be an example of the electronic device 101 of FIG. 1 .
  • FIG. 3A shows an example of the appearance of the first surface 310 of the housing of the wearable device 101, according to one embodiment, and FIG. 3B shows an example of the appearance of the second surface 320 opposite to the first surface 310.
  • the first surface 310 of the wearable device 101 may have a form attachable to a user's body part (e.g., the user's face).
  • the wearable device 101 may further include a strap for securing it on the user's body part, and/or one or more temples (e.g., the first temple 204 and/or the second temple 205 of FIGS. 2A and 2B).
  • a first display 250-1 for outputting an image to the user's left eye and a second display 250-2 for outputting an image to the user's right eye may be disposed on the first surface 310.
  • the wearable device 101 may further include rubber or silicone packing, formed on the first surface 310, to prevent interference by light (e.g., ambient light) different from the light emitted from the first display 250-1 and the second display 250-2.
  • the wearable device 101 may include cameras 260-3 and 260-4 for photographing and/or tracking both of the user's eyes, adjacent to each of the first display 250-1 and the second display 250-2. The cameras 260-3 and 260-4 may be referred to as ET cameras. According to one embodiment, the wearable device 101 may include cameras 260-5 and 260-6 for photographing and/or recognizing the user's face. The cameras 260-5 and 260-6 may be referred to as FT cameras.
  • a camera (e.g., the cameras 260-7, 260-8, 260-9, 260-10, 260-11, and 260-12) and/or a sensor (e.g., the depth sensor 330) for obtaining information related to the surrounding environment may be disposed on the second surface 320. For example, the cameras 260-7, 260-8, 260-9, and 260-10 may be disposed on the second surface 320 to recognize an external object (e.g., the external object 220 of FIG. 2) different from the wearable device 101.
  • the wearable device 101 may obtain an image and/or video to be transmitted to each of the user's eyes.
  • for example, the camera 260-11 may be disposed on the second surface 320 of the wearable device 101 to acquire an image to be displayed through the second display 250-2, corresponding to the right eye.
  • the camera 260-12 may be disposed on the second surface 320 of the wearable device 101 to acquire an image to be displayed through the first display 250-1, corresponding to the left eye.
  • the wearable device 101 may include a depth sensor 330 disposed on the second surface 320 to identify the distance between the wearable device 101 and an external object. Using the depth sensor 330, the wearable device 101 may acquire spatial information (e.g., a depth map) about at least a portion of the FoV of the user wearing the wearable device 101.
  • a microphone for acquiring sound output from an external object may be disposed on the second surface 320 of the wearable device 101.
  • the number of microphones may be one or more depending on the embodiment.
  • the wearable device 101 may include hardware (e.g., the cameras 260-11 and 260-12, and/or the depth sensor 330) for identifying body parts, including the user's hands.
  • the wearable device 101 can identify gestures that appear due to motion of body parts.
  • the wearable device 101 may provide a UI based on the identified gesture to a user wearing the wearable device 101.
  • the UI may support functions for editing images and/or videos stored in the wearable device 101.
  • the wearable device 101 may communicate with an external electronic device that is different from the wearable device 101 in order to more accurately identify the gesture.
  • the wearable device 101 of FIG. 4 may be an example of the electronic device 101 of FIG. 1 and the wearable device 101 of FIGS. 2A and 2B.
  • the wearable device 101 may include at least one of a processor 120, a memory 130, a display driving circuit 430, a lens driving circuit 440, a lens 460, a display 450, or a sensor 470.
  • the processor 120, the memory 130, the display driving circuit 430, the lens driving circuit 440, the lens 460, the display 450, and the sensor 470 may be electrically and/or operably coupled to each other by an electronic component such as a communication bus.
  • hereinafter, hardware components being operably coupled may mean that a direct or indirect connection between the hardware components is established, by wire or wirelessly, such that a second hardware component is controlled by a first hardware component.
  • however, the embodiment is not limited thereto; some of the hardware components shown in FIG. 4 (e.g., the processor 120, the memory 130, and at least a portion of a communication circuit (not shown)) may be included in a single integrated circuit, such as a system on a chip (SoC).
  • the type and/or number of hardware components included in the wearable device 101 are not limited to those shown in FIG. 4 .
  • wearable device 101 may include only some of the hardware components shown in FIG. 4 .
  • the processor 120 of the wearable device 101 may include a hardware component for processing data based on one or more instructions.
  • Hardware components for processing data may include, for example, an arithmetic and logic unit (ALU), a floating point unit (FPU), a field programmable gate array (FPGA), and/or a central processing unit (CPU).
  • the number of processors 120 may be one or more.
  • the processor 120 may have the structure of a multi-core processor, such as a dual core, quad core, or hexa core.
  • the processor 120 may include a component for processing data obtained from the sensor 470.
  • the processor 120 of FIG. 4 may include the processor 120 of FIG. 1 .
  • the memory 130 of the wearable device 101 may include hardware components for storing data and/or instructions input to and/or output from the processor 120.
  • the memory 130 may include, for example, volatile memory such as random-access memory (RAM), and/or non-volatile memory such as read-only memory (ROM).
  • the volatile memory may include, for example, at least one of dynamic RAM (DRAM), static RAM (SRAM), cache RAM, and pseudo SRAM (PSRAM).
  • the non-volatile memory may include, for example, at least one of programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), flash memory, a hard disk, a compact disc, or an embedded multimedia card (eMMC).
  • in the memory 130, one or more instructions indicating calculations and/or operations to be performed by the processor 120 on data may be stored.
  • a collection of one or more instructions may be referred to as firmware, operating system, process, routine, sub-routine and/or application (e.g., application 146 in FIG. 1).
  • for example, the wearable device 101 and/or the processor 120 may perform at least one of the operations of FIG. 8, FIG. 10, or FIG. 11 by executing a set of instructions distributed in the form of an operating system, firmware, a driver, and/or an application.
  • this may mean that the instructions are stored in a format executable by the processor 120 (e.g., a file with an extension specified by the operating system of the wearable device 101).
  • the processor 120 of the wearable device 101 may adjust the transparency of the lens 460, using the user's profile information stored in the memory 130, based on the intensity of the first light (e.g., illuminance) obtained using the sensor 470, as described later. The operation of the wearable device 101 adjusting the transparency using the user's profile information is described later with reference to FIGS. 9A and 9B.
  • the display driving circuit 430 of the wearable device 101 may be operatively coupled to the display 450.
  • the display driving circuit 430 may obtain commands from the processor 120.
  • the commands may be provided from the processor 120 through various interfaces.
  • the commands may be provided from the processor 120 to the display driving circuit 430 through a mobile industry processor interface (MIPI), a mobile display digital interface (MDDI), a serial peripheral interface (SPI), an inter-integrated circuit (I2C), and/or a compact display port (CDP).
  • each of the above commands can be used to control the display 450.
  • the commands may include commands used to set the brightness of an image displayed through the display 450. However, it is not limited to this.
  • the lens driving circuit 440 of the wearable device 101 may control the lens 460 to adjust the intensity of light passing through the lens 460.
  • the wearable device 101 may adjust the transparency of the lens 460 by reversibly changing the state of the transparent member, based on the lens driving circuit 440, using sensor data.
  • the wearable device 101 can adjust the intensity of light transmitted through the lens 460 by adjusting the transparency of the lens 460.
  • the wearable device 101 may control transparency by using the lens driving circuit 440 to adjust the power transmitted to the lens 460, thereby physically or chemically altering the transparent member.
  • the lens 460 of the wearable device 101 may include a transparent member.
  • the number of lenses 460 included in the wearable device 101 may be one or more.
  • the lens 460 may include a transparent member for transmitting light directed to the first side of the lens 460 to the second side.
  • the transparent member may include at least one of an electrochromic, suspended particle, liquid crystal, photochromic, or thermochromic device for changing the color of at least a portion of the lens 460.
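  • As a minimal sketch of how a lens driving circuit might expose such control, assuming a linear mapping from target transparency to a drive level (the text only states that transmitted power alters the member physically or chemically, so the mapping and the write_duty() hook below are assumptions):

```python
# Illustrative sketch only: a hypothetical driver that maps a target
# transparency to a drive level for an electrochromic transparent member.
# The linear mapping, the value ranges, and the write_duty() hardware hook
# are assumptions for illustration, not details taken from the patent.

class LensDriver:
    MIN_T, MAX_T = 0.10, 1.00  # assumed adjustable range (about 10% to 100%)

    def __init__(self, write_duty):
        self._write_duty = write_duty  # callback that drives the lens hardware

    def set_transparency(self, t: float) -> float:
        """Clamp the requested transparency and drive the transparent member."""
        t = min(max(t, self.MIN_T), self.MAX_T)
        self._write_duty(1.0 - t)  # assume a higher duty darkens the layer
        return t
```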
  • the display 450 of the wearable device 101 may output visualized information to the user.
  • the number of displays 450 included in the wearable device 101 may be one or more.
  • the display 450 may be controlled by the processor 120 and/or a graphic processing unit (GPU) (not shown) to output visualized information to the user.
  • the display 450 may include a flat panel display (FPD) and/or electronic paper.
  • the FPD may include a liquid crystal display (LCD), a plasma display panel (PDP), a digital mirror device (DMD), one or more light emitting diodes (LEDs), and/or micro LED.
  • the LED may include an organic LED (OLED).
  • the display 450 of FIG. 4 may include the display module 160 of FIG. 1 .
  • transmission of light may occur in at least a portion of display 450.
  • the wearable device 101 can provide a user experience related to augmented reality by providing a combination of light output through the display 450 and light passing through the display 450.
  • at least a portion of the display 450 may overlap with a portion of the lens 460.
  • the display 450 may include a lens 460.
  • while the wearable device 101 is worn on a part of the user's body, such as the head, the display 450 may have a structure that covers the user's entire field-of-view (FoV), or that radiates light toward the FoV.
  • the wearable device 101 may include other output means for outputting information in a form other than a visualized form.
  • the wearable device 101 may include at least one speaker for outputting an audio signal, and/or a motor (or actuator) for providing haptic feedback based on vibration.
  • the sensor 470 of the wearable device 101 may generate, from non-electronic information related to the wearable device 101, electrical information that can be processed by the processor 120 and/or the memory 130.
  • the electrical information generated by the sensor 470 may be stored in the memory 130, processed by the processor 120, and/or transmitted to another electronic device distinct from the wearable device 101.
  • the wearable device 101 may obtain information dependent on the state of the wearable device 101 from one or more sensors included in the sensor 470.
  • the processor 120 may identify the surrounding environment of the wearable device 101 from sensor data of one or more sensors included in the sensor 470.
  • the illumination sensor may output an electrical signal representing the intensity (or amount of light) of light reaching at least a portion of the illumination sensor exposed to the outside.
  • the illuminance sensor may output sensor data indicating the intensity of ambient light of the wearable device 101.
  • the illuminance sensor can output data indicating a change in conductivity (eg, photoelectric effect) using electrons generated by ambient light.
  • the illuminance sensor may include, for example, a CdS sensor using cadmium sulfide (CdS) as an element.
  • the wearable device 101 may adjust the luminance (or brightness) of the display 450 based on the intensity of light using an illuminance sensor. The operation of the wearable device 101 to adjust brightness based on data acquired using a sensor will be described later with reference to FIGS. 7A and 7B.
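  • As a minimal sketch of this sensor-driven adjustment, assuming hypothetical read_lux() and set_display_nits() device hooks (the slope, offset, and 10-2010 nit range below are inferred from the numeric examples later in this text, not quoted from the patent):

```python
# Illustrative sketch only: polling a hypothetical illuminance sensor and
# adjusting display luminance in proportion to the ambient light it reports.
# read_lux() and set_display_nits() are assumed device hooks; the slope,
# offset, and nit range are inferred from the examples later in this text.

def update_display_brightness(read_lux, set_display_nits,
                              slope=0.05, base_nits=10.0,
                              min_nits=10.0, max_nits=2010.0):
    lux = read_lux()  # intensity of ambient light (the first light), in lux
    nits = slope * lux + base_nits  # luminance grows with illuminance
    nits = min(max(nits, min_nits), max_nits)  # keep within the display range
    set_display_nits(nits)
    return nits
```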
  • Embodiments of wearable device 101 are not limited to the type and/or number of sensors illustrated in FIG. 4 .
  • the sensor 470 is a grip sensor that can identify contact between the wearable device 101 and an external object (e.g., a user), and/or can identify movement of the wearable device 101. It may further include a gyro sensor or an acceleration sensor.
  • the wearable device 101 may identify the intensity of ambient light (eg, illuminance) of the wearable device 101 using a sensor.
  • the wearable device 101 can adjust the transparency of the lens and the brightness of the display based on the intensity of ambient light.
  • the wearable device 101 can maintain the difference between the first light passing through the lens and the second light output from the display by adjusting transparency and luminance.
  • the wearable device 101 can improve the visibility of the user wearing the wearable device 101 by maintaining the difference.
  • FIGS. 5A to 5B illustrate example states for adjusting transparency of a wearable device, according to an embodiment.
  • the wearable device 101 may be an example of the electronic device 101 of FIG. 1 and/or the wearable device 101 of FIG. 2A.
  • FIGS. 5A to 5B show states 500 and 510 in which the wearable device 101 displays, using a display, visual objects for adjusting luminance and/or transparency within the field-of-view (FoV) 507 of the user wearing the wearable device 101.
  • the wearable device 101 may display visual objects 520 and 530 within the FoV 507 of the user 505 wearing the wearable device 101.
  • FoV 507 may include the display area of a display (e.g., display 450 of FIG. 4) and/or the FoV of a camera.
  • the wearable device 101 may control the display to display at least one screen obtained based on the FoV of a camera on at least a portion of the FoV 507.
  • at least one screen obtained based on the FoV of the camera may include information indicating the intensity of light emitted from at least one external object.
  • the visual object 520 may include a visual object 525 having a specified range for adjusting the brightness of a display (eg, display 450 in FIG. 4).
  • the visual object 525 may be represented as a slider for adjusting the brightness of the display.
  • the wearable device 101 may provide information about the luminance of the display (eg, luminance value) to the user using the visual object 525.
  • the wearable device 101 may use the visual object 525 to adjust the brightness of the display within a specified range (eg, about 10 nits to about 2010 nits).
  • the visual object 530 may include a visual object 535 for adjusting the transparency of a lens (e.g., lens 460 in FIG. 4) within a specified range (e.g., about 10% to about 100%), and/or a visual object 533 indicating transparency adjustment based on user input.
  • the visual object 535 may be represented as a slider for adjusting the transparency of the lens 460.
  • the wearable device 101 may provide information about transparency to the user using the visual object 535.
  • the visual objects 520 and 530 may be obtained based on a user interface executable by the operating system of the wearable device.
  • State 500 may mean a state in which transparency can be adjusted based on input from the user 505.
  • state 500 may include a state in which the wearable device 101 executes a manual transparency adjustment mode.
  • the wearable device 101 may include an interface for receiving the input in at least a portion of the housing of the wearable device 101.
  • wearable device 101 may receive the input based on identifying the user's 505 gesture.
  • the wearable device 101 may receive the input using an external electronic device (eg, controller) that is different from the wearable device 101.
  • the wearable device 101 may adjust the transparency of the lens 460 using a lens driving circuit (eg, the lens driving circuit 440 of FIG. 4) based on receiving the input.
  • in state 500, the wearable device 101 may adjust the luminance of the display based on sensor data (e.g., data indicating the intensity of illuminance) acquired using a sensor (e.g., sensor 470 of FIG. 4).
  • the wearable device 101 may use a sensor to identify the intensity of first light (eg, ambient light) directed to the wearable device from the external environment.
  • the wearable device 101 may obtain the luminance of the display in proportion to the intensity of the first light.
  • based on receiving an input for adjusting transparency using the visual object 535, the wearable device 101 may obtain luminance from the adjusted transparency and/or the intensity of the first light, using the data described later in FIG. 6.
  • the wearable device 101 may receive an input indicating changing the visual object 533 in state 510.
  • the input may be an example of an input indicating that transparency is automatically adjusted based on ambient light, independently of a user input for adjusting transparency.
  • state 510 may include a state in which the wearable device 101 executes an automatic transparency adjustment mode.
  • the wearable device 101 may adjust transparency based on sensor data obtained using a sensor. For example, the wearable device 101 may identify the intensity of the first light directed to the wearable device from the surrounding environment. For example, the wearable device 101 may change transparency within a specified range (eg, approximately 10% to approximately 90%) based on the intensity of the first light. The wearable device 101 may reduce transparency based on identifying that the intensity of the first light increases. The wearable device 101 may acquire luminance using data to be described later in FIG. 6 based on the increased intensity of the first light and/or the decreased transparency.
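  • As a minimal sketch of this automatic adjustment mode, assuming an inverse-proportional mapping from illuminance to transparency (the text does not specify the mapping), transparency could be derived and clamped to the roughly 10% to 90% range as follows:

```python
# Illustrative sketch only: an automatic transparency-adjustment mode that
# lowers lens transparency as ambient illuminance rises, within the roughly
# 10%-90% range mentioned above. The inverse-proportional mapping and the
# reference illuminance are assumptions for illustration.

def auto_transparency(lux: float, ref_lux: float = 5000.0,
                      t_min: float = 0.10, t_max: float = 0.90) -> float:
    if lux <= 0:
        return t_max
    t = ref_lux / lux  # brighter surroundings -> lower transparency
    return min(max(t, t_min), t_max)
```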
  • wearable device 101 may receive user input for adjusting transparency using visual object 535.
  • the wearable device 101 may obtain user profile information corresponding to the intensity of the first light in response to the received user input.
  • the profile information may include a relationship between intensity and transparency of the first light.
  • the profile information may include information between the intensity and transparency of the first light customized to the user wearing the wearable device 101. The operation of the wearable device 101 to adjust transparency based on the intensity of the first light using the user's profile information will be described later with reference to FIGS. 9A and 9B.
  • the wearable device 101 may control the transparency of the lens, in state 500, using the visual object 535 in response to an input indicating adjustment of transparency.
  • the wearable device 101 may adjust the brightness of the display using the transparency and/or the intensity of ambient light obtained through a sensor.
  • the wearable device 101 may obtain transparency corresponding to the intensity of ambient light in state 510.
  • the wearable device 101 may adjust the brightness of the display based on the intensity of ambient light and/or the obtained transparency.
  • the wearable device 101 can prevent glare to the user due to the brightness by adjusting the brightness based on the intensity and/or transparency of ambient light.
  • referring to FIG. 6, the operation of the wearable device 101 obtaining the luminance based on the relationship between the intensity of ambient light and transparency, and on the reference luminance related to the second light included in the luminance, will be described later.
  • FIG. 6 shows an example graph representing the relationship between luminance, transparency, and illuminance, according to one embodiment.
  • the wearable device in FIG. 6 may be an example of the electronic device 101 in FIG. 1 and/or the wearable device 101 in FIGS. 2A to 5 .
  • a graph 600 is shown showing the relationship between intensity, transparency, and luminance of first light (eg, ambient light).
  • a wearable device may identify information indicating the relationship between the first light directed to the first side of the lens (or display), the transparency of the lens (e.g., lens 460 in FIG. 4), and the luminance of a display (e.g., display 450 in FIG. 4).
  • the graph 600 may include information representing the above relationship.
  • the wearable device may adjust luminance in proportion to the intensity of the first light.
  • a wearable device can obtain luminance in proportion to transparency.
  • the wearable device can obtain the luminance of the display based on the first light and transparency using Equation 1.
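  • Equation 1 itself is not reproduced in this text; the following form is a reconstruction inferred from the numeric examples in FIGS. 7A to 9B (e.g., about 10000 lux at about 100% transparency yielding about 510 nits), so the exact form and coefficient values are assumptions rather than the published equation:

$$L_{\text{display}} = \alpha \cdot (E \cdot T) + L_{\text{ref}}$$

  Here, E is the intensity of the first light (illuminance, in lux), T is the transparency of the lens, L_ref is the reference luminance, and α is a scaling coefficient. The quoted examples are consistent with α ≈ 0.05 nit/lux and L_ref ≈ 10 nits.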
  • Equation 1 above is only an example to aid understanding, but is not limited thereto, and can be modified, applied, or expanded in various ways.
  • a term of Equation 1 may mean the intensity of the first light (e.g., ambient light).
  • another term of Equation 1 may mean the intensity of the first light attenuated based on the transparency of the lens.
  • the wearable device can maintain the difference between the intensity of the attenuated first light and the luminance by using the reference luminance.
  • the reference luminance may be kept substantially constant, independent of the intensity of the first light and the transparency.
  • the wearable device can provide an augmented reality service to the user based on constant visibility, independently of the surrounding environment of the wearable device.
  • the wearable device can adjust the brightness of the display in proportion to the intensity of the first light obtained from the sensor using Equation 1 above.
  • the wearable device may store data for luminance, the first light, and/or transparency mapped to the graph 600 and/or Equation 1 as a resource file (or format) included in the package of an application executable by the operating system of the wearable device.
  • a wearable device can display the data as shown in Table 1.
  • a wearable device may change the data in Table 1 by adjusting the coefficient and/or the reference luminance of Equation 1.
  • a wearable device can adjust luminance within a specified threshold (eg, approximately 2010 nits).
  • while controlling the lens based on a first transparency (e.g., about 40%), the wearable device may identify, using a sensor, a first intensity of the first light (e.g., about 10000 lux) and then a second intensity (e.g., about 400000 lux) greater than the first intensity.
  • based on identifying the second intensity, the wearable device may control a display (e.g., display 450 of FIG. 4) based on the luminance corresponding to a specified threshold.
  • while controlling the lens based on a first transparency (e.g., about 10%) and identifying the first intensity of the first light (e.g., about 10000 lux), the wearable device may receive an input indicating adjustment to a second transparency (e.g., about 80%). For example, the wearable device may control the lens based on the first transparency, independently of the received input, based on the first intensity.
  • the wearable device may control the display to display a visual object indicating a decrease in transparency. For example, this state may be included in state 510 of FIG. 5B.
  • a wearable device may provide the user with information about a designated threshold for controlling luminance by displaying a visual object.
  • Table 1 is based on data set by the user and/or the wearable device, and is not limited to the above-described embodiment. As an example, the wearable device may set a designated threshold of luminance to approximately 2010 nits or higher.
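  • As a minimal sketch (using the reconstructed form of Equation 1 above, whose coefficient, reference luminance, and threshold are inferred from this text's examples rather than quoted from Table 1), the following reproduces the luminance values quoted in FIGS. 7A to 7B:

```python
# Illustrative sketch only: the reconstructed Equation 1 applied to the
# sample points quoted in FIGS. 7A-7B. The coefficient (0.05 nit/lux), the
# reference luminance (10 nits), and the 2010-nit threshold are inferred
# from the text's examples, not taken verbatim from Table 1 or Equation 1.

ALPHA = 0.05       # nits per lux of attenuated ambient light (assumed)
L_REF = 10.0       # reference luminance, in nits (assumed)
L_MAX = 2010.0     # specified luminance threshold (about 2010 nits)

def display_luminance(lux: float, transparency: float) -> float:
    attenuated = lux * transparency  # first light attenuated by the lens
    return min(ALPHA * attenuated + L_REF, L_MAX)

# Reproduces the quoted examples:
assert display_luminance(10000, 1.00) == 510.0   # state 700
assert display_luminance(40000, 1.00) == 2010.0  # state 705 (at the threshold)
assert display_luminance(40000, 0.50) == 1010.0  # after reducing transparency
assert display_luminance(1000, 0.50) == 35.0     # low-illuminance example
```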
  • the wearable device may control the display and adjust the luminance based on data indicating the relationship between the intensity of the first light (e.g., ambient light) obtained using a sensor, transparency, and luminance (e.g., the data included in graph 600). For example, the wearable device can identify the first light attenuated by the lens by controlling the lens based on transparency. The wearable device may obtain the luminance using the intensity of the attenuated first light and a reference luminance related to the second light included in the luminance. By using the reference luminance, the wearable device can provide an augmented reality service based on constant visibility to the user, independently of the surrounding environment.
  • FIGS. 7A to 7B illustrate example states in which a wearable device adjusts luminance, according to an embodiment.
  • the wearable device of FIGS. 7A to 7B may be an example of the electronic device 101 of FIG. 1 and/or the wearable device 101 of FIGS. 2A to 5 .
  • States 700, 705, 710, 715, and 720 of FIGS. 7A-7B may be included in state 500 of FIG. 5A.
  • States 700, 705, 710, 715, and 720 may refer to states in which a portion of the lens or display of the wearable device 101 is enlarged.
  • in FIGS. 7A to 7B, the lens 701 of the wearable device 101 and the display area 703 of the display are shown separately within the states 700, 705, 710, 715, and 720; however, embodiments are not limited to these examples.
  • the wearable device 101 may display a screen 702 within a display area 703 of a display (e.g., display 450 of FIG. 4).
  • Screen 702 may include visual objects 520 and 530.
  • the wearable device 101 may identify, using a sensor (e.g., sensor 470 in FIG. 4), a first intensity (e.g., about 10000 lux) of the first light (e.g., ambient light) indicating the illuminance of the surrounding environment of the wearable device.
  • the wearable device 101 may acquire a first luminance (e.g., about 510 nits) using Equation 1 in FIG. 6, based on the first intensity of the first light and a first transparency (e.g., about 100%).
  • the wearable device 101 may control the display based on the first luminance and display the screen 702 within the display area 703.
  • wearable device 101 may use a sensor to identify a second intensity that is greater than or equal to the first intensity of the first light (e.g., approximately 40000 lux).
  • the wearable device 101 may acquire a second luminance (eg, about 2010 nits) based on the second intensity of the first light and the first transparency (eg, about 100%).
  • the wearable device 101 may control the display based on the second luminance to display the screen 702 within the display area 703.
  • a change from state 700 to state 705 may mean that the wearable device 101 has identified that the surrounding environment of the wearable device 101 changed from low illuminance to relatively high illuminance.
  • in response to an input indicating an adjustment of transparency, the wearable device 101 may control the lens 701 based on a second transparency (e.g., about 50%) different from the first transparency.
  • the wearable device 101 may obtain a third luminance (e.g., about 1010 nits) using Equation 1 in FIG. 6, based on the second intensity of the first light (e.g., about 40000 lux) and the second transparency (e.g., about 50%).
  • the wearable device 101 may display at least one screen on the display based on the third luminance.
  • the wearable device 101 may use a sensor to identify that the surrounding environment of the wearable device 101 changes from high illuminance to low illuminance. For example, the wearable device 101 may identify the first intensity (e.g., about 10000 lux) of the first light during a specified time using a sensor. In response to an input for setting transparency using the visual object 535, the wearable device 101 may control the lens 701 based on the first transparency (e.g., about 50%). The wearable device 101 may acquire a first luminance (e.g., about 260 nits) based on the first intensity (e.g., about 10000 lux) and the first transparency (e.g., about 50%).
  • while displaying the screen 702 in the display area 703 based on the first luminance (e.g., about 260 nits), the wearable device 101 may identify a second intensity (e.g., about 1000 lux) less than the first intensity of the first light.
  • the wearable device 101 may identify a second luminance (e.g., about 35 nits) using Equation 1 in FIG. 6, based on the second intensity and/or the first transparency (e.g., about 50%).
  • when the wearable device 101 identifies, using a sensor, that the intensity of the first light in the surrounding environment of the wearable device 101 variably changes, the wearable device 101 may display the screen through the display based on a luminance obtained from the changed intensity of the first light.
  • while displaying the screen 702 in the display area 703 based on the second luminance (e.g., about 35 nits), the wearable device 101 may identify a second transparency (e.g., about 100%) in response to an input indicating a change in transparency from the user.
  • the wearable device 101 may obtain a third luminance (e.g., about 60 nits) using Equation 1 in FIG. 6, based on the second intensity (e.g., about 1000 lux) and/or the second transparency (e.g., about 100%).
  • the wearable device 101 may display at least one screen using a display based on the third luminance.
  • when the wearable device 101 does not identify a change in the surrounding environment of the wearable device 101, it may identify an input indicating transparency adjustment using the visual object 535.
  • in state 715, the wearable device 101 may set the luminance of the display to a first luminance (e.g., about 510 nits) based on the intensity of the first light (e.g., ambient light) included in the sensor's data and/or the first transparency (e.g., about 100%).
  • the wearable device 101 may control the display and display at least one screen based on the set first luminance.
  • in state 720, while displaying the at least one screen, the wearable device 101 may identify a second transparency (e.g., about 50%) in response to an input indicating a change in the first transparency using the visual object 535.
  • the wearable device may obtain a second luminance (eg, about 260 nits) using Equation 1 in FIG. 6 based on the intensity of the first light and/or the second transparency.
  • the wearable device 101 may control the display and display at least one screen based on the acquired second luminance.
  • the wearable device 101 may identify the intensity of ambient light of the wearable device 101 using a sensor (e.g., sensor 470 in FIG. 4).
  • the wearable device 101 may identify the first luminance of the display based on the identified intensity. For example, the wearable device 101 may receive an input indicating to adjust the transparency of the lens while displaying a screen on the display based on the first luminance.
  • the wearable device 101 may change the transparency of the lens with respect to the intensity of the ambient light by changing the state of the transparent member of the lens in response to the received input.
  • the wearable device 101 may determine whether the intensity of the ambient light changes using a sensor based on the change in transparency of the lens.
  • the wearable device 101 may identify a second luminance that is different from the first luminance based on the intensity of the ambient light and/or transparency of the lens.
  • the wearable device 101 may control the display and display at least one screen based on the second luminance.
  • the wearable device 101 may control the luminance of the display based on identifying the surrounding environment of the wearable device 101 and/or an interaction with the user (e.g., an input indicating a change in transparency). For example, while the wearable device 101 controls the display based on a specified luminance corresponding to a low-illuminance environment, the wearable device 101 may identify that it is moving from the low-illuminance environment to a high-illuminance environment. For example, when the intensity of ambient light in the high-illuminance environment is relatively greater than the specified luminance, the user may perceive glare caused by the luminance of the display. To remove the glare, the user may reduce transparency using the visual object 535.
  • the wearable device 101 may change the designated luminance to a luminance corresponding to the high-brightness environment based on the reduced transparency and the intensity of the ambient light.
  • the wearable device 101 may provide an augmented reality service with reduced glare to a user of the wearable device (eg, user 505) by variably controlling the brightness of the display.
  • FIG. 8 is an example flowchart showing the operation of a wearable device, according to an embodiment. At least one of the operations in FIG. 8 may be performed by the processor (e.g., the processor 120 in FIG. 4) of the electronic device 101 in FIG. 1 and/or the wearable device 101 in FIGS. 2A to 5. In the following embodiments, each operation may be performed sequentially, but is not necessarily performed sequentially; for example, the order of the operations may be changed, and at least two operations may be performed in parallel.
  • referring to FIG. 8, in operation 810, the wearable device may change the transparency of the lens for the first light by changing the state of the transparent member included in the lens, based on the intensity of the first light included in the data of the sensor.
  • the intensity of the first light included in the data of the sensor may mean the intensity of illumination obtained by the wearable device using the sensor.
  • the wearable device may use a lens driving circuit (eg, the lens driving circuit 440 of FIG. 4) of the wearable device to change the state of the transparent member.
  • the wearable device can adjust the transparency of the lens based on changes in the physical and/or chemical state of the transparent member.
  • the wearable device can control the lens based on the adjusted transparency.
  • in operation 820, the wearable device may obtain the luminance of the display based on the intensity of the first light attenuated by the lens and the reference luminance related to the second light.
  • the first light may transmit from the first side of the lens to the second side of the lens.
  • the intensity of the first light transmitted from the second surface may be attenuated based on the transparency of the lens.
  • the wearable device may output second light through a display area of a display formed on the second surface of the lens.
  • the wearable device may identify the intensity of the attenuated first light and the reference luminance related to the second light using Equation 1 in FIG. 6 .
  • the reference luminance may mean the difference between the intensity of the attenuated first light and the intensity of the second light.
  • the reference luminance may refer to the difference between ambient light passing through the lens of the wearable device and light output from the display.
  • the wearable device can provide an augmented reality service based on substantially the same visibility to the user.
  • the screen displayed based on the luminance may further include the second light and color information.
  • in operation 830, the wearable device may control the display based on the obtained luminance to display a screen within the display area.
  • the display area may be referenced to display area 703 in FIG. 7A.
  • Wearable devices can improve the user experience for augmented reality by obtaining luminance based on ambient light and/or transparency of the lens.
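  • Pulling the three operations together, a minimal sketch of this FIG. 8-style flow follows; choose_transparency and luminance_for stand in for the hypothetical auto_transparency() and display_luminance() helpers sketched earlier, and read_lux(), set_transparency(), and render() are assumed device hooks:

```python
# Illustrative sketch only: one iteration of the FIG. 8-style flow, with all
# dependencies injected so the function is self-contained. The helper names
# are hypothetical stand-ins, not APIs named in the patent.

def refresh_frame(read_lux, set_transparency, render,
                  choose_transparency, luminance_for):
    lux = read_lux()                  # first light measured by the sensor
    t = choose_transparency(lux)      # operation 810: change transparency
    set_transparency(t)
    nits = luminance_for(lux, t)      # operation 820: obtain luminance
    render(nits)                      # operation 830: display the screen
    return t, nits
```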
  • referring to FIGS. 9A and 9B, an operation in which the wearable device controls the transparency of the lens using the user's profile information to maintain the luminance of the display will be described later.
  • FIGS. 9A to 9B illustrate example states in which a wearable device adjusts luminance according to user preference, according to an embodiment.
  • the wearable device 101 of FIGS. 9A and 9B may be an example of the electronic device 101 of FIG. 1 and/or the wearable device 101 of FIGS. 2A to 7B.
  • States 900, 905, 910, and 920 of FIGS. 9A-9B may be included in state 510 of FIG. 5B.
  • in state 900, the wearable device 101 may identify the first intensity of the first light using a sensor (e.g., sensor 470 of FIG. 4).
  • the wearable device 101 may adjust the transparency of the lens based on the first intensity.
  • the wearable device may map the transparency of the lens corresponding to the intensity of the first light (e.g., illuminance) using at least one piece of data stored in a memory (e.g., memory 130 of FIG. 4).
  • the wearable device 101 may identify a second intensity that is higher than the first intensity using a sensor. Based on identifying the second intensity, wearable device 101 may reduce the transparency of the lens.
  • the wearable device 101 may adjust the brightness of the display using the first light based on the second intensity attenuated by the lens, based on the reduced transparency of the lens.
  • when the wearable device 101 identifies, using a sensor, that a user wearing the wearable device 101 (e.g., user 505 in FIG. 5A) moves from a low-illuminance environment to a high-illuminance environment, it can prevent glare to the user by adjusting the transparency of the lens.
  • the wearable device 101 may use a sensor to identify first light based on a third intensity that is less than the first intensity.
  • the wearable device 101 may increase the transparency of the lens 701 by identifying the first light based on the third intensity.
  • the wearable device 101 may adjust the brightness of the display using the increased transparency and the first light based on the third intensity.
  • the wearable device 101 may secure the user's visibility by increasing the transparency of the lens based on identifying, using a sensor, that the user (e.g., user 505 in FIG. 5A) moves from a high-illuminance environment to a low-illuminance environment.
  • in state 900, the wearable device 101 may obtain, using the user's profile information stored in the memory, the relationship between the intensity of the first light and the transparency set by the user (e.g., the user 505 in FIG. 5A) and/or the wearable device 101. The relationship may be expressed as a value corresponding to a parameter of Equation 1 in FIG. 6.
  • the profile information may include information about the relationship based on user preferences.
  • the wearable device 101 may identify a first intensity of the first light (e.g., about 10000 lux) and obtain a first luminance (e.g., about 510 nits) based on a first transparency (e.g., about 100%).
  • the wearable device 101 may obtain the relationship between the first intensity and the first transparency based on the user's profile information.
  • the wearable device 101 may store the relationship in memory.
  • the wearable device 101 may store the relationship in memory based on obtaining a value corresponding to the relationship a specified number of times or more. However, it is not limited to this.
  • the wearable device may generate the user's profile information by accumulating (or adding) values corresponding to the relationship.
  • in state 905, while displaying at least one screen through the display based on the first luminance (e.g., about 510 nits), the wearable device 101 may identify a second intensity (e.g., about 40000 lux) of the first light, greater than or equal to the first intensity (e.g., about 10000 lux).
  • the wearable device 101 may obtain a second transparency (e.g., about 25%) based on identifying the second intensity and using the user's profile information. For example, the relationship between the first intensity (e.g., about 10000 lux) and the first transparency (e.g., about 100%) and the relationship between the second intensity (e.g., about 40000 lux) and the second transparency (e.g., about 25%) may be substantially the same.
  • the wearable device 101 may maintain the first luminance (eg, about 510 nits) obtained in state 900.
  • the wearable device 101 may provide a substantially similar virtual reality environment to the user by adjusting transparency based on profile information independently of changes in the states 900 and 905 of the wearable device 101.
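  • A minimal sketch of this profile-based behavior follows, assuming the profile reduces to a single preferred attenuated-light target (one possible reading of states 900 and 905; the patent's profile format is not specified):

```python
# Illustrative sketch only: maintaining the attenuated first light (and hence
# the display luminance) by rescaling transparency when illuminance changes,
# as in states 900/905 (10000 lux at 100% -> 40000 lux at 25%). Representing
# the profile as one attenuated-light target is an assumption.

class TransparencyProfile:
    def __init__(self, t_min=0.10, t_max=1.00):
        self.t_min, self.t_max = t_min, t_max
        self.target_attenuated_lux = None  # user-preferred E * T

    def record_preference(self, lux: float, transparency: float) -> None:
        """Store the attenuated intensity implied by a manual adjustment."""
        self.target_attenuated_lux = lux * transparency

    def transparency_for(self, lux: float) -> float:
        """Transparency that keeps the attenuated intensity at the target."""
        t = self.target_attenuated_lux / lux
        return min(max(t, self.t_min), self.t_max)

profile = TransparencyProfile()
profile.record_preference(lux=10000, transparency=1.00)  # state 900
assert profile.transparency_for(40000) == 0.25           # state 905
```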
  • the wearable device 101 may control the lens 701 based on a first transparency (e.g., about 50%) corresponding to the first intensity of the first light (e.g., about 5000 lux) obtained based on sensor data.
  • the wearable device 101 may obtain a first luminance (eg, about 135 nits) of the display based on the first intensity and/or the transparency.
  • the wearable device 101 may identify changes in the surrounding environment of the wearable device 101 while displaying a screen through the display based on the first luminance. For example, the wearable device 101 may identify a second intensity (e.g., an intensity less than about 2500 lux) that is less than the first intensity (e.g., about 5000 lux). The wearable device 101 may identify a second transparency (e.g., about 100%) corresponding to the second intensity. The wearable device 101 may acquire a second luminance (e.g., about 60 nits) based on the second intensity and/or the second transparency. The wearable device 101 may display a screen through the display based on the second luminance. When the wearable device 101 identifies that the surrounding environment of the wearable device changes from a bright environment to a dark environment, it can secure the user's visibility by increasing transparency.
  • in state 910, the wearable device 101 may obtain a first luminance (e.g., about 510 nits) based on a first intensity (e.g., about 10000 lux) of the first light (e.g., illuminance) obtained using a sensor (e.g., sensor 470 of FIG. 4) and a first transparency (e.g., about 100%).
  • the wearable device 101 may control a display (eg, the display 450 of FIG. 4) based on the first luminance to display at least one screen.
  • while displaying the at least one screen, the wearable device 101 may obtain a second transparency (e.g., about 50%) in response to an input for adjusting the transparency of the lens 701 using the visual object 535.
  • based on obtaining the second transparency (e.g., about 50%), the wearable device 101 may obtain the user's profile information representing the relationship between the first intensity of the first light (e.g., about 10000 lux) and the second transparency (e.g., about 50%).
  • the wearable device 101 may acquire a second luminance (eg, approximately 250 nits) based on a first intensity (eg, approximately 10000 lux) and a second transparency (eg, approximately 50%).
  • the wearable device 101 may infer the preference of the user wearing the wearable device 101 (e.g., user 505 in FIG. 5A) for identifying the intensity of the first light attenuated by the lens based on the second transparency (e.g., about 50%).
  • the wearable device 101 may identify changes in the surrounding environment of the wearable device 101 using a sensor while the profile information has been acquired.
  • in state 920, the wearable device 101 may use a sensor to identify a second intensity of the first light (e.g., about 40000 lux) that is greater than or equal to the first intensity of the first light (e.g., about 10000 lux).
  • the wearable device 101 may obtain a second transparency (eg, about 12.5%) using the profile information based on identifying the second intensity of the first light.
  • the wearable device 101 may maintain the second luminance (eg, 250 nits) by obtaining the second transparency based on the second intensity.
  • the wearable device 101 may identify an input indicating a reduction of a first transparency (e.g., about 50%) corresponding to a first intensity of the first light (e.g., about 5000 lux), using the visual object 535.
  • the wearable device 101 may obtain a second transparency (eg, about 10%) in response to the identified input.
  • the wearable device 101 may obtain profile information indicating the relationship between the intensity of the first light and the transparency of the lens.
  • the wearable device 101 may acquire a first luminance (eg, about 35 nits) based on the first intensity and/or the second transparency.
  • the wearable device 101 may use a sensor to identify that a user wearing the wearable device moves from a bright environment to a relatively dark environment. As an example, within the dark environment, the wearable device 101 may identify a second intensity (e.g., about 1000 lux) of the first light. The wearable device 101 may control the lens based on a third transparency (e.g., about 50%) corresponding to the second intensity (e.g., about 1000 lux) of the first light using the profile information, thereby maintaining display of the screen based on the first luminance.
  • the wearable device 101 may obtain profile information indicating the relationship between the illuminance of the surrounding environment of the wearable device and the transparency of the lens.
  • the wearable device 101 can adjust the transparency of the lens according to the illuminance of the surrounding environment using the profile information.
  • the wearable device 101 may provide a luminance suitable for the user, preventing reduced visibility and/or glare on the display, by adjusting the transparency of the lens using the profile information.
  • FIG. 10 is an example flowchart showing the operation of a wearable device, according to an embodiment. At least one of the operations of FIG. 10 may be performed by a processor (e.g., processor 120 of FIG. 4) of the electronic device 101 of FIG. 1 and/or the wearable device 101 of FIGS. 2A to 9B. In the following embodiments, each operation may be performed sequentially, but is not necessarily performed sequentially; for example, the order of the operations may be changed, and at least two operations may be performed in parallel.
  • while the wearable device is worn by the user, the wearable device may change the transparency of the lens for the first light based on the intensity of the first light included in the data of the sensor.
  • This state may be referenced to state 510 in FIG. 5B.
  • the wearable device may change the transparency of the lens (e.g., lens 460 in FIG. 4) based on the intensity of the first light (e.g., illuminance) based on the transparency automatic adjustment mode.
  • the wearable device may acquire the luminance of the display based on the intensity of the first light attenuated by the lens. For example, the wearable device may obtain the luminance of the display using Equation 1 in FIG. 6 based on the attenuated intensity of the first light. For example, the wearable device may obtain the luminance using a reference luminance representing the difference between the second light output from the display and the first light. For example, the reference luminance may be set by a user and/or a wearable device independently of the intensity of the first light and/or the intensity of the second light.
  • while displaying the screen in the display area based on the obtained luminance, the wearable device may identify a change in the intensity of the first light included in the data of the sensor.
  • identifying a change in the intensity of the first light may mean that the wearable device identifies a change in the surrounding environment of the wearable device.
  • a change in the surrounding environment may mean a change in the intensity of illumination that the wearable device identifies using a sensor.
  • a wearable device may use a sensor to identify that a user moves from a dark environment to a bright environment.
  • a wearable device can use sensors to identify that a user moves from a bright environment to a dark environment.
  • the wearable device adjusts transparency using the user's profile information related to the attenuated intensity of the first light, based on identifying a change in the intensity of the first light.
  • the profile information may include information about transparency corresponding to the intensity of the first light to maintain the attenuated intensity of the first light.
  • by doing so, the wearable device can maintain the luminance.
  • by using the profile information to maintain the intensity of the first light attenuated by the lens controlled based on transparency, independently of the intensity of the first light obtained by the sensor, the wearable device 101 can maintain the luminance of the display.
  • a wearable device can provide a screen based on customized luminance to the user by maintaining luminance using the user's profile information, independently of the surrounding environment of the wearable device.
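  • As a minimal sketch of this FIG. 10-style maintenance loop, reusing the hypothetical TransparencyProfile sketched earlier and assuming a simple change-detection threshold (which the text does not specify):

```python
# Illustrative sketch only: when a change in illuminance is identified,
# transparency is re-derived from the user's profile so the attenuated first
# light, and thus the display luminance, stays fixed. read_lux() and
# set_transparency() are assumed device hooks; the 100-lux change threshold
# is an arbitrary illustrative value.

def maintain_luminance(profile, read_lux, set_transparency,
                       last_lux, change_threshold_lux=100.0):
    """Re-derive transparency from the profile when illuminance changes."""
    lux = read_lux()
    if last_lux is None or abs(lux - last_lux) >= change_threshold_lux:
        # Keeping E * T at the profiled target keeps the attenuated first
        # light, and therefore the display luminance, unchanged.
        set_transparency(profile.transparency_for(lux))
    return lux  # caller stores this as last_lux for the next poll
```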
  • FIG. 11 is an example flowchart showing the operation of a wearable device, according to an embodiment. At least one of the operations of FIG. 11 may be performed by a processor (e.g., processor 120 of FIG. 4) of the electronic device 101 of FIG. 1 and/or the wearable device 101 of FIGS. 2A to 9B.
  • each operation may be performed sequentially, but is not necessarily performed sequentially. For example, the order of each operation may be changed, and at least two operations may be performed in parallel.
  • the wearable device may change the intensity of the first light output from one side of the lens by changing the state of the lens based on the intensity of the first light included in the data of the sensor.
  • the intensity of the first light may be obtained using a sensor and/or a camera.
  • At least one screen may be an example of a screen acquired through a camera based on the FoV 507 of FIG. 5A.
  • the first light may be emitted from at least one external object (or subject) included in a screen obtained through a camera.
  • the wearable device may obtain the luminance of the display based on the changed intensity of the first light and the reference luminance related to the second light output from the display.
  • the second light may be generated by at least one screen displayed from the display.
  • the at least one screen may include the second light and at least one color information.
  • the reference luminance may mean the difference between the changed intensity of the first light and the intensity of the second light.
  • the wearable device may control the display to display a screen with the obtained luminance within a display area formed on one side of the lens.
  • the displayed screen may include information about the changed intensity of the first light and the second light.
  • a wearable device can reduce the fatigue of a user using an augmented reality service by displaying a screen based on the obtained luminance.
  • a wearable device may adjust the brightness of a display using transparency based on the intensity of ambient light of the wearable device.
  • a method is needed for a wearable device to provide a screen based on luminance suitable for a user wearing the wearable device.
  • the wearable device 101 may include a lens 460 (701), a display 450, a sensor 470, and a processor 120.
  • the processor may change the intensity of the first light output from one side of the lens by changing the state of the lens based on the intensity of the first light included in the data of the sensor.
  • the processor may obtain the luminance of the display based on the changed intensity of the first light and a reference luminance associated with the second light output from the display.
  • the processor may be configured to control the display to display a screen 702 having the obtained luminance within a display area 703 formed on the one surface of the lens.
  • the processor may change the transparency of the lens for the first light by changing the state of a transparent member included in the lens, based on the intensity of the first light included in the data of the sensor.
  • the processor may change the intensity of the first light based on the changed transparency of the lens.
  • the processor may identify a change in the intensity of the first light included in the data from the sensor. Based on identifying the change, the processor may be configured to maintain the luminance by controlling the transparency of the lens using profile information of a user 505 wearing the wearable device, associated with the changed intensity of the first light.
  • the processor may obtain the luminance in proportion to at least one of the intensity of the first light or the transparency of the lens.
  • the wearable device 101 may include a lens 460 (701) that transmits the first light directed to the first side to the second side, a display 450 for outputting second light through a display area 703 formed on the second side of the lens, a sensor 470, and a processor 120.
  • the processor may change the transparency of the lens with respect to the first light by changing the state of a transparent member included in the lens, based on the intensity of the first light included in the data of the sensor.
  • the processor may obtain the luminance of the display based on the intensity of the first light attenuated by the lens and a reference luminance associated with the second light.
  • the processor may be configured to control the display based on the obtained luminance to display a screen 702 within the display area.
  • the processor may identify a change in the intensity of the first light included in the data from the sensor.
  • based on identifying a change in the intensity of the first light, the processor may be configured to maintain the luminance by controlling the transparency using profile information of a user 505 wearing the wearable device, associated with the attenuated intensity of the first light.
  • the profile information may be set to maintain the attenuated intensity of the first light.
  • the processor may be configured to display, on the display, a first visual object based on identifying, using the sensor, an increase in the intensity of the first light within a state of displaying the screen based on the luminance corresponding to a specified threshold.
  • the processor may be configured to, within the state of displaying the screen, adjust the transparency based on the intensity of the first light, independently of an input indicating increasing the transparency.
  • the processor may be configured to obtain the luminance in proportion to the intensity of the first light.
  • the processor may be configured to obtain the luminance in proportion to the transparency.
  • the processor may be configured to change the transparency within a specified range based on the intensity of the first light.
  • the processor may be configured to display a second visual object 530 for adjusting the transparency.
  • the processor may be configured to control the transparency using a second visual object in response to an input indicating adjustment of the transparency.
  • the method may include an operation 810 of changing the transparency of the lens for the first light by changing, based on the intensity of the first light included in the data of the sensor 470, the state of a transparent member included in the lens 460 (701) that transmits the first light directed to the first surface through the second surface.
  • the method may include an operation 820 of acquiring, based on the intensity of the first light attenuated by the lens and a reference luminance associated with the second light, the luminance of the display 450 that outputs the second light through the display area 703 formed on the second side of the lens.
  • the method may include an operation 830 of controlling the display based on the obtained luminance to display a screen 702 within the display area.
  • the method may include identifying a change in the intensity of the first light included in the data from the sensor.
  • the method may include an operation of maintaining the luminance by controlling the transparency using profile information of a user 505 wearing the wearable device, related to the attenuated intensity of the first light, based on identifying a change in the intensity of the first light.
  • the method may include an operation of maintaining the luminance by controlling the transparency using the set profile information to maintain the attenuated intensity of the first light.
  • the method may include an operation of displaying a first visual object on the display, based on identifying, using the sensor, an increase in the intensity of the first light within a state of displaying the screen based on the luminance corresponding to a specified threshold.
  • the method may include, within the state of displaying the screen, adjusting the transparency based on the intensity of the first light, independently of an input indicating increasing the transparency.
  • the method may include obtaining the luminance in proportion to the intensity of the first light.
  • the method may include changing the transparency within a specified range based on the intensity of the first light.
  • the method may include displaying a second visual object 530 for adjusting the transparency.
  • the method may include controlling the transparency using a second visual object in response to an input indicating that the transparency is to be adjusted.
  • the wearable device 101 may include a memory 130, a lens 460 that transmits the first light directed to the first side to the second side, a display 450 for outputting second light through a display area 703 formed on the second side of the lens, a sensor 470, and a processor 120.
  • the processor may change the transparency of the lens for the first light based on the intensity of the first light included in the data of the sensor, while the wearable device is worn by the user 505.
  • the processor may obtain the luminance of the display based on the intensity of the first light attenuated by the lens.
  • the processor may identify a change in the intensity of the first light included in the data from the sensor while displaying the screen 702 in the display area, based on the obtained luminance.
  • the processor may maintain the brightness by controlling the transparency using the user's profile information related to the attenuated intensity of the first light, based on identifying a change in the intensity of the first light.
  • the processor may obtain the luminance of the display based on the intensity of the first light attenuated by the lens and a reference luminance associated with the second light.
  • the profile information may be set to maintain the attenuated intensity of the first light.
  • the processor may obtain the luminance in proportion to the intensity of the first light.
  • the processor may display, on the display, a first visual object based on identifying, using the sensor, an increase in the intensity of the first light within a state of displaying the screen based on the luminance corresponding to a specified threshold.
  • the processor may display a second visual object 530 for adjusting the transparency.
  • in a computer-readable storage medium storing one or more programs according to an embodiment as described above, the one or more programs, when executed by an electronic device, may be configured to cause the electronic device to change the intensity of the first light output from one side of the lens by changing the state of the lens, based on the intensity of the first light included in the data of the sensor.
  • when executed by the electronic device, the one or more programs may be configured to cause the electronic device to obtain the luminance of the display based on the changed intensity of the first light and a reference luminance associated with the second light output from the display.
  • when executed by the electronic device, the one or more programs may be configured to cause the electronic device to control the display to display a screen 702 having the obtained luminance within the display area 703 formed on the one side of the lens.
  • Electronic devices may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • Electronic devices according to embodiments of this document are not limited to the above-described devices.
  • terms such as 'first', 'second', or '1st' and '2nd' may be used simply to distinguish one element from another, and do not limit the elements in other respects (e.g., importance or order).
  • when one (e.g., first) component is referred to as 'coupled' or 'connected' to another (e.g., second) component, with or without the terms 'functionally' or 'communicatively', it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • the term 'module' used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of the part or a portion thereof, that performs one or more functions. For example, according to one embodiment, the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., program 140) including one or more instructions stored in a storage medium (e.g., built-in memory 136 or external memory 138) readable by a machine (e.g., electronic device 101). For example, a processor (e.g., processor 120) of the machine may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); the term does not distinguish between cases where data is stored semi-permanently in the storage medium and cases where it is stored temporarily.
  • a computer program product may be traded as a commodity between a seller and a buyer.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • in the case of online distribution, at least a portion of the computer program product may be at least temporarily stored, or temporarily created, in a machine-readable storage medium such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • each of the above-described components (e.g., a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • multiple components (e.g., modules or programs) may be integrated into a single component. In this case, the integrated component may perform one or more functions of each of the multiple components identically or similarly to how the corresponding component performed them prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • the device described above may be implemented with hardware components, software components, and/or a combination of hardware components and software components.
  • the devices and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system. Additionally, the processing device may access, store, manipulate, process, and generate data in response to the execution of software.
  • for convenience of understanding, a single processing device is sometimes described as being used; however, those skilled in the art will appreciate that a processing device may include multiple processing elements and/or multiple types of processing elements.
  • a processing device may include a plurality of processors or one processor and one controller. Additionally, other processing configurations, such as parallel processors, are possible.
  • software may include a computer program, code, instructions, or a combination of one or more of these, and may configure a processing device to operate as desired, or may command the processing device independently or collectively.
  • software and/or data may be embodied in any type of machine, component, physical device, computer storage medium, or device in order to be interpreted by the processing device or to provide instructions or data to the processing device.
  • Software may be distributed over networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored on one or more computer-readable recording media.
  • the method according to one embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium.
  • the medium may continuously store a computer-executable program, or temporarily store it for execution or download.
  • the medium may be a single piece of hardware or a combination of several pieces of hardware serving as recording or storage means; it is not limited to a medium directly connected to a computer system and may be distributed over a network. Examples of media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and devices configured to store program instructions, including ROM, RAM, and flash memory. Examples of other media include recording or storage media managed by app stores that distribute applications, or by sites or servers that supply or distribute various other software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable device according to an embodiment may comprise a lens, a display, a sensor, and a processor. The processor may change the intensity of the first light output from one surface of the lens by changing the state of the lens on the basis of the intensity of the first light included in data from the sensor. The processor may obtain the luminance of the display on the basis of the changed intensity of the first light and a reference luminance associated with a second light output from the display. The processor may be configured to control the display such that a screen having the obtained luminance is displayed within a display area formed on said surface of the lens.
PCT/KR2023/013284 2022-10-06 2023-09-05 Wearable device and method for adjusting display brightness based on lens transparency WO2024076007A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20220128217 2022-10-06
KR10-2022-0128217 2022-10-06
KR10-2022-0136904 2022-10-21
KR1020220136904A KR20240048430A (ko) 2022-10-06 2022-10-21 Wearable device and method for adjusting display luminance based on lens transparency

Publications (1)

Publication Number Publication Date
WO2024076007A1 true WO2024076007A1 (fr) 2024-04-11

Family

ID=90608612

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/013284 WO2024076007A1 (fr) 2022-10-06 2023-09-05 Wearable device and method for adjusting display brightness based on lens transparency

Country Status (1)

Country Link
WO (1) WO2024076007A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101991133B1 * 2012-11-20 2019-06-19 Microsoft Technology Licensing, LLC Head-mounted display and control method thereof
KR20190002795A * 2017-06-29 2019-01-09 Samsung Display Co., Ltd. Head-mounted display device and driving method thereof
KR20200010952A * 2018-07-23 2020-01-31 Samsung Electronics Co., Ltd. Wearable electronic device for controlling transmittance of a transparent member and output luminance of a projector based on remaining battery level, and operating method thereof
KR20200095985A * 2019-02-01 2020-08-11 P&C Solution Co., Ltd. Head-mounted display device using photochromic lenses and an illuminance sensor
KR102260393B1 * 2019-11-27 2021-06-03 P&C Solution Co., Ltd. Head-mounted display device whose screen illuminance is automatically adjusted according to the user

Similar Documents

Publication Publication Date Title
WO2022186454A1 Electronic device comprising a flexible printed circuit board
WO2024076007A1 Wearable device and method for adjusting display brightness based on lens transparency
WO2024076009A1 Wearable device, method, and computer-readable storage medium for adapting a user's gaze information
WO2024063253A1 Electronic device and method for controlling the resolution of each of a plurality of areas included in an image obtained from a camera
WO2024122999A1 Electronic device and method for identifying user input in a virtual space
WO2024049178A1 Electronic device and method for controlling the display of at least one external object among one or more external objects
WO2024025076A1 Electronic device for adjusting volume using a sound signal emitted by an external object, and method therefor
WO2024063463A1 Electronic device for adjusting an audio signal associated with an object shown through a display device, and method therefor
WO2024063353A1 Electronic device and method for changing an audio signal based on information related to a visual object
WO2024090844A1 Wearable device for changing the state of a screen, and method therefor
WO2024080579A1 Wearable device for guiding a user's posture, and method therefor
WO2024029718A1 Electronic device for selecting at least one external electronic device based on at least one external object, and method therefor
WO2024080770A1 Wearable device for detecting iris information, and control method therefor
WO2024144158A1 Wearable device for controlling at least one virtual object according to attributes of the at least one virtual object, and control method therefor
WO2024101676A1 Wearable device for providing, based on the type of an external object, information about an item included in the external object, and method therefor
WO2024048912A1 Electronic device for controlling a wearable device based on input to the electronic device, and method therefor
WO2024043438A1 Wearable electronic device for controlling a camera module, and operating method thereof
WO2024029858A1 Method for controlling a display module, and electronic device performing the method
WO2023153607A1 Method for displaying augmented reality (AR) content based on ambient illuminance, and electronic device
WO2024043546A1 Electronic device and method for tracking a user's motion
WO2023027276A1 Electronic device for executing a plurality of functions using a stylus pen, and operating method thereof
WO2024080583A1 Method and electronic device for providing AR information using an image
WO2024096460A1 Electronic device for acquiring distance information, and operating method thereof
WO2022050638A1 Method for changing display settings, and electronic device
WO2024128843A1 Electronic device, method, and computer-readable storage medium for displaying a visual object representing an application by using an area formed based on a user's physical information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23875076

Country of ref document: EP

Kind code of ref document: A1