US10971053B2 - Electronic device for changing characteristics of display according to external light and method therefor
- Publication number: US10971053B2 (application US16/733,768)
- Authority: United States (US)
- Prior art keywords
- electronic device
- frame data
- color
- display
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G09G5/02 — Control arrangements or circuits for visual indicators, characterised by the way in which colour is displayed
- G09G3/2003 — Display of colours
- G02B27/0172 — Head-up displays; head mounted, characterised by optical features
- G02B27/0179 — Head-up displays; display position adjusting means not related to the information to be displayed
- G09G5/10 — Intensity circuits
- A61B3/06 — Apparatus for testing the eyes; subjective types, for testing light sensitivity, e.g. adaptation, or for testing colour vision
- G02B2027/0112 — Head-up displays comprising a device for generating colour display
- G02B2027/014 — Head-up displays comprising information/image processing systems
- G02B2027/0181 — Head-up displays; adaptation to the pilot/driver
- G09G2320/0242 — Compensation of deficiencies in the appearance of colours
- G09G2320/0666 — Adjustment of display parameters for control of colour parameters, e.g. colour temperature
- G09G2340/12 — Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
- G09G2354/00 — Aspects of interface with display user
- G09G2360/144 — Detecting light within display terminals, the light being ambient light
- G09G2360/16 — Calculation or use of calculated indices related to luminance levels in display data
- G09G3/003 — Control arrangements or circuits for visual indicators other than cathode-ray tubes, to produce spatial visual effects
Definitions
- the disclosure relates to an electronic device for controlling a display based on the luminance of external light, and a method therefor.
- the electronic device, which can perform wireless voice calling and information exchange, has become a daily necessity.
- initially, the electronic device was recognized simply as a portable apparatus capable of wireless voice calling.
- the electronic device has since developed into a multimedia apparatus performing functions such as scheduling, games, remote control, image photography, Internet searching, and social network services (SNS) in addition to wireless voice calls, thereby satisfying the needs of users.
- the augmented reality service refers to a service that shows users real-world images overlaid with virtual images carrying additional information, and can provide users with virtual images including content associated with external objects distinguished from the real-world images.
- an aspect of the disclosure is to provide an apparatus and a method for controlling a display based on the luminance of external light.
- the brightness of the external light may determine whether or not the user can see the content. For example, when relatively bright external light reaches the user, the visibility of content displayed to the user may be reduced.
- an electronic device includes a display, a sensor, a memory, and at least one processor operably connected to the display, the sensor, and the memory.
- the at least one processor is configured to: identify first information regarding external light directed to the electronic device by using the sensor, in response to identifying wearing of the electronic device on a user; acquire first frame data based on the identified first information and second information regarding the user; display the first frame data on the display in response to acquisition of the first frame data; identify second frame data distinguished from the first frame data from an application stored in the memory, while the first frame data is outputted on the display; adjust the color of at least one of multiple pixels included in the second frame data at least partially based on the first frame data, in response to identification of the second frame data; and control the display based on at least one of the first frame data or the adjusted second frame data.
- in accordance with another aspect of the disclosure, an electronic device includes a display, a sensor, a memory, and at least one processor operably connected to the display, the sensor, and the memory.
- the at least one processor is configured to: identify first external light directed to the electronic device by using the sensor; based on identification of the first external light, display content having a first color associated with the first external light on the display that transmits the first external light, the content being acquired from an application executed by the at least one processor; identify second external light distinguished from the first external light by using the sensor, while content based on the first color is outputted; and change the color of the content outputted on the display from the first color to a second color associated with the second external light, in response to identification of the second external light.
- in accordance with another aspect of the disclosure, a method of an electronic device includes identifying whether or not a user of the electronic device wears the electronic device, identifying first information indicating the luminance of external light directed to the electronic device by using a sensor of the electronic device in response to identifying wearing of the electronic device on the user, acquiring first frame data based on the identified first information and second information regarding the user, outputting the first frame data on a display of the electronic device in response to acquisition of the first frame data, identifying second frame data distinguished from the first frame data from an application stored in a memory of the electronic device while the first frame data is outputted on the display, adjusting the color of at least one of multiple pixels included in the second frame data at least partially based on the first frame data in response to identification of the second frame data, and controlling the display based on at least one of the first frame data or the adjusted second frame data.
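Taken together, the claimed flow amounts to a sense-compose-display loop. The following Python sketch is a hedged illustration of that loop only; every identifier and constant below (is_worn, read_luminance, build_first_frame, adjust_colors, and the numeric values) is an assumption made for illustration and does not come from the patent.

```python
# A minimal, hedged sketch of the claimed flow; names and constants invented.

def is_worn() -> bool:
    return True                      # stand-in for a strap switch / proximity sensor

def read_luminance() -> float:
    return 5000.0                    # stand-in for the luminance sensor, in lux

def build_first_frame(lux: float, user_color=(255, 255, 0)):
    # Color from user-related second information (e.g., a correction tint),
    # transparency from measured luminance; the mapping is invented.
    alpha = min(lux / 10000.0, 1.0)
    return {"color": user_color, "alpha": alpha}

def render_app_frame():
    return {"content_color": (200, 40, 40)}   # "second frame data" from an application

def adjust_colors(second, first):
    # Shift the content color relative to the first frame's color; a real
    # device might pick a complementary color, as discussed with FIG. 9.
    r, g, b = first["color"]
    second["content_color"] = (255 - r, 255 - g, 255 - b)
    return second

if is_worn():                        # roughly operations 310-370 in FIG. 3
    first = build_first_frame(read_luminance())
    second = adjust_colors(render_app_frame(), first)
    # ...the display would then be controlled with both frames (operation 370)
```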
- FIG. 1 is a block diagram of an electronic device inside a network environment according to an embodiment of the disclosure
- FIG. 2 is a block diagram of an electronic device and an external electronic device according to an embodiment of the disclosure
- FIG. 3 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure
- FIG. 4A is a diagram illustrating an electronic device that can be worn on a user's head, according to an embodiment of the disclosure;
- FIG. 4B is a diagram illustrating an electronic device that can be worn on a user's head, according to an embodiment of the disclosure;
- FIG. 5 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure acquiring first information regarding first frame data based on external light;
- FIG. 6 is a diagram illustrating operations of an electronic device according to an embodiment of the disclosure acquiring first information by using a luminance sensor
- FIG. 7A is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data based on first information regarding the external environment and second information regarding the user;
- FIG. 7B is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data based on first information regarding the external environment and second information regarding the user;
- FIG. 8 is a diagram illustrating multiple pixels included in first frame data generated by an electronic device according to an embodiment of the disclosure.
- FIG. 9 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure adjusting second frame data based on first frame data
- FIG. 10A is a diagram illustrating operations of an electronic device according to an embodiment of the disclosure controlling a display based on first frame data and second frame data;
- FIG. 10B is a diagram illustrating operations of an electronic device according to an embodiment of the disclosure controlling a display based on first frame data and second frame data;
- FIG. 11 is a flowchart illustrating an order according to which an electronic device according to an embodiment of the disclosure generates first frame data and second frame data;
- FIG. 12 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure identifying second information regarding first frame data based on information inputted from the user;
- FIG. 13A is a diagram illustrating an example of a user interface (UI) that an electronic device according to an embodiment of the disclosure provides to the user in order to identify the second information in FIG. 12 ;
- FIG. 13B is a diagram illustrating an example of a UI that an electronic device according to an embodiment of the disclosure provides to the user in order to identify the second information in FIG. 12 ;
- FIG. 13C is a diagram illustrating an example of a UI that an electronic device according to an embodiment of the disclosure provides to the user in order to identify the second information in FIG. 12 ;
- FIG. 13D is a diagram illustrating an example of a UI that an electronic device according to an embodiment of the disclosure provides to the user in order to identify the second information in FIG. 12 ;
- FIG. 14 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure identifying second information regarding first frame data by using an external electronic device;
- FIG. 15 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data based on a mode and/or a state enabled by the user;
- FIG. 16 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure changing the color of content provided to the user according to a change in external light;
- FIG. 17 is a flowchart illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data and second frame data based on an image acquired from an image sensor;
- FIG. 18 is a diagram illustrating a situation in which an electronic device according to an embodiment of the disclosure provides the user with an augmented reality service.
- FIG. 1 is a block diagram illustrating an electronic device 101 in a network environment 100 according to an embodiment of the disclosure.
- the electronic device 101 in the network environment 100 may communicate with an electronic device 102 via a first network 198 (e.g., a short-range wireless communication network), or an electronic device 104 or a server 108 via a second network 199 (e.g., a long-range wireless communication network).
- the electronic device 101 may communicate with the electronic device 104 via the server 108 .
- the electronic device 101 may include a processor 120 , memory 130 , an input device 150 , a sound output device 155 , a display device 160 , an audio module 170 , a sensor module 176 , an interface 177 , a haptic module 179 , a camera module 180 , a power management module 188 , a battery 189 , a communication module 190 , a subscriber identification module (SIM) 196 , or an antenna module 197 .
- at least one (e.g., the display device 160 or the camera module 180 ) of the components may be omitted from the electronic device 101 , or one or more other components may be added in the electronic device 101 .
- some of the components may be implemented as a single integrated circuit. For example, the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented as embedded in the display device 160 (e.g., a display).
- the processor 120 may execute, for example, software (e.g., a program 140 ) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 coupled with the processor 120 , and may perform various data processing or computation. According to one embodiment, as at least part of the data processing or computation, the processor 120 may load a command or data received from another component (e.g., the sensor module 176 or the communication module 190 ) in volatile memory 132 , process the command or the data stored in the volatile memory 132 , and store resulting data in non-volatile memory 134 .
- the processor 120 may include a main processor 121 (e.g., a central processing unit (CPU) or an application processor (AP)), and an auxiliary processor 123 (e.g., a graphics processing unit (GPU), an image signal processor (ISP), a sensor hub processor, or a communication processor (CP)) that is operable independently from, or in conjunction with, the main processor 121 .
- auxiliary processor 123 may be adapted to consume less power than the main processor 121 , or to be specific to a specified function.
- the auxiliary processor 123 may be implemented as separate from, or as part of the main processor 121 .
- the auxiliary processor 123 may control at least some of functions or states related to at least one component (e.g., the display device 160 , the sensor module 176 , or the communication module 190 ) among the components of the electronic device 101 , instead of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active state (e.g., executing an application).
- according to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., the camera module 180 or the communication module 190 ) functionally related to the auxiliary processor 123 .
- the memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176 ) of the electronic device 101 .
- the various data may include, for example, software (e.g., the program 140 ) and input data or output data for a command related thereto.
- the memory 130 may include the volatile memory 132 or the non-volatile memory 134 .
- the program 140 may be stored in the memory 130 as software, and may include, for example, an operating system (OS) 142 , middleware 144 , or an application 146 .
- the input device 150 may receive a command or data to be used by another component (e.g., the processor 120 ) of the electronic device 101 , from the outside (e.g., a user) of the electronic device 101 .
- the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (e.g., a stylus pen).
- the sound output device 155 may output sound signals to the outside of the electronic device 101 .
- the sound output device 155 may include, for example, a speaker or a receiver.
- the speaker may be used for general purposes, such as playing multimedia or playing record, and the receiver may be used for incoming calls. According to an embodiment, the receiver may be implemented as separate from, or as part of the speaker.
- the display device 160 may visually provide information to the outside (e.g., a user) of the electronic device 101 .
- the display device 160 may include, for example, a display, a hologram device, or a projector and control circuitry to control a corresponding one of the display, hologram device, and projector.
- the display device 160 may include touch circuitry adapted to detect a touch, or sensor circuitry (e.g., a pressure sensor) adapted to measure the intensity of force incurred by the touch.
- the audio module 170 may convert a sound into an electrical signal and vice versa. According to an embodiment, the audio module 170 may obtain the sound via the input device 150 , or output the sound via the sound output device 155 or a headphone of an external electronic device (e.g., an electronic device 102 ) directly (e.g., wiredly) or wirelessly coupled with the electronic device 101 .
- the sensor module 176 may detect an operational state (e.g., power or temperature) of the electronic device 101 or an environmental state (e.g., a state of a user) external to the electronic device 101 , and then generate an electrical signal or data value corresponding to the detected state.
- the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the interface 177 may support one or more specified protocols to be used for the electronic device 101 to be coupled with the external electronic device (e.g., the electronic device 102 ) directly (e.g., wiredly) or wirelessly.
- the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface, or an audio interface.
- a connecting terminal 178 may include a connector via which the electronic device 101 may be physically connected with the external electronic device (e.g., the electronic device 102 ).
- the connecting terminal 178 may include, for example, a HDMI connector, a USB connector, a SD card connector, or an audio connector (e.g., a headphone connector).
- the haptic module 179 may convert an electrical signal into a mechanical stimulus (e.g., a vibration or a movement) or electrical stimulus which may be recognized by a user via his tactile sensation or kinesthetic sensation.
- the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electric stimulator.
- the camera module 180 may capture a still image or moving images.
- the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module 188 may manage power supplied to the electronic device 101 .
- the power management module 188 may be implemented as at least part of, for example, a power management integrated circuit (PMIC).
- the battery 189 may supply power to at least one component of the electronic device 101 .
- the battery 189 may include, for example, a primary cell which is not rechargeable, a secondary cell which is rechargeable, or a fuel cell.
- the communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and the external electronic device (e.g., the electronic device 102 , the electronic device 104 , or the server 108 ) and performing communication via the established communication channel.
- the communication module 190 may include one or more communication processors that are operable independently from the processor 120 (e.g., the application processor (AP)) and support a direct (e.g., wired) communication or a wireless communication.
- the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication (PLC) module).
- a corresponding one of these communication modules may communicate with the external electronic device via the first network 198 (e.g., a short-range communication network, such as Bluetooth™, wireless-fidelity (Wi-Fi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network, such as a cellular network, the Internet, or a computer network (e.g., LAN or wide area network (WAN))).
- These various types of communication modules may be implemented as a single component (e.g., a single chip), or may be implemented as multiple components (e.g., multiple chips) separate from each other.
- the wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199 , using subscriber information (e.g., international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196 .
- the signal or the power may then be transmitted or received between the communication module 190 and the external electronic device via the selected at least one antenna.
- according to an embodiment, another component (e.g., a radio frequency integrated circuit (RFIC)) other than a radiating element may be additionally formed as part of the antenna module 197 .
- At least some of the above-described components may be coupled mutually and communicate signals (e.g., commands or data) therebetween via an inter-peripheral communication scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
- commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 via the server 108 coupled with the second network 199 .
- Each of the electronic devices 102 and 104 may be a device of the same type as, or a different type from, the electronic device 101 .
- all or some of operations to be executed at the electronic device 101 may be executed at one or more of the external electronic devices 102 , 104 , or 108 .
- the electronic device may be one of various types of electronic devices.
- the electronic devices may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance. According to an embodiment of the disclosure, the electronic devices are not limited to those described above.
- each of such phrases as “A or B,” “at least one of A and B,” “at least one of A or B,” “A, B, or C,” “at least one of A, B, and C,” and “at least one of A, B, or C,” may include any one of, or all possible combinations of the items enumerated together in a corresponding one of the phrases.
- such terms as “1st” and “2nd,” or “first” and “second,” may be used to simply distinguish a corresponding component from another, and do not limit the components in other aspects (e.g., importance or order).
- the one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the machine-readable storage medium may be provided in the form of a non-transitory storage medium.
- the term “non-transitory” simply means that the storage medium is a tangible device, and does not include a signal (e.g., an electromagnetic wave), but this term does not differentiate between where data is semi-permanently stored in the storage medium and where the data is temporarily stored in the storage medium.
- a method may be included and provided in a computer program product.
- the computer program product may be traded as a product between a seller and a buyer.
- the computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or be distributed (e.g., downloaded or uploaded) online via an application store (e.g., PlayStore™), or between two user devices (e.g., smartphones) directly. If distributed online, at least part of the computer program product may be temporarily generated or at least temporarily stored in the machine-readable storage medium, such as memory of the manufacturer's server, a server of the application store, or a relay server.
- operations performed by the module, the program, or another component may be carried out sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
- FIG. 2 is a block diagram of an electronic device 101 and an external electronic device 230 according to an embodiment of the disclosure.
- the electronic device 101 may correspond to a wearable device including at least one of an accessory-type device (for example, a watch, a ring, a bracelet, an anklet, a necklace, glasses, contact lenses, or a head-mounted device (HMD)), a fabric or garment-integrated device (for example, an electronic garment), a body-attached device (for example, a skin pad or a tattoo), or a bio-implantable device (for example, an implantable circuit).
- the processor 120 may execute at least one instruction stored in the memory 130 .
- the processor 120 may include a circuit for processing data, for example, at least one of an integrated circuit (IC), an arithmetic logic unit (ALU), a field programmable gate array (FPGA), and a large-scale integration (LSI).
- Data processed by the processor 120 may include, for example, at least one of the luminance measured by the luminance sensor 210 or the image data acquired by the camera module 180 .
- the memory 130 may store an instruction regarding an application and an instruction regarding an operating system (OS).
- the OS may be included in system software executed by the processor 120 .
- the processor 120 may manage hardware components included in the electronic device 101 .
- the OS may provide an application programming interface (API) to an application, that is, software other than the system software.
- At least one application, which may be a set of multiple applications, may be installed in the memory 130 .
- an application being installed in the memory 130 means that the application is stored in a format that can be executed by the processor 120 connected to the memory 130 .
- the electronic device 101 may identify at least one external object from image data acquired from the camera module 180 .
- the display 220 may include a panel configured to transmit at least a part of external light reaching the first surface thereof to a second surface opposite to the first surface.
- the display 220 may correspond to an HMD attachable to the user's head, or a head-up display (HUD) disposed toward one side of the user's head.
- the second surface of the panel may face the user's eyes, and external light reaching the first surface may pass through the panel such that the same is delivered to the user's eyes.
- the extent to which the panel transmits external light (for example, transparency) may be adjusted based on a control signal from the processor 120 .
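As a concrete illustration of such a control signal, the luminance-to-transparency mapping might look like the sketch below; the linear ramp and all constants are assumptions, since the patent only states that transparency is adjusted based on a control signal from the processor 120.

```python
def panel_transparency(lux: float, max_lux: float = 10000.0,
                       max_dim: float = 0.8) -> float:
    """Map ambient luminance (lux) to a dimming factor in [0, max_dim].

    Hedged illustration: brighter external light -> stronger dimming
    (i.e., lower transparency). All constants are invented for this sketch.
    """
    ratio = min(max(lux / max_lux, 0.0), 1.0)   # clamp to [0, 1]
    return ratio * max_dim
```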
- An embodiment of the electronic device 101 attached to the user's head and an embodiment of the structure of the display 220 will be described later with reference to FIGS. 4A to 4B .
- the processor 120 may display a UI on the panel such that the UI is superimposed on an external object viewed by the user.
- the UI may be generated based on an application for providing an augmented reality service.
- the processor 120 may change the color and/or transparency of the UI displayed on the panel of the display 220 , based on luminance measured by the luminance sensor 210 .
- the processor 120 may change the color and/or transparency of the entire panel based on luminance measured by the luminance sensor 210 .
- the processor 120 may determine the color and/or transparency of the UI displayed on the entire panel or on a part of the panel, based on luminance measured by the luminance sensor 210 , information regarding the user, or a combination thereof.
- the information regarding the user may include, for example, information regarding at least one of the user's color weakness and/or color blindness.
- the communication module 190 may connect the electronic device 101 to the external electronic device 230 based on a wireless network, such as Bluetooth, wireless fidelity (Wi-Fi), near field communication (NFC), or long term evolution (LTE), or a wired network such as a local area network (LAN) or Ethernet.
- the communication module 190 may include at least one of a communication circuit supporting a wireless network or a wired network, a communication processor (CP), and a communication interface.
- the external electronic device 230 may include at least one of a communication module 240 , a processor 250 , or a memory 260 .
- the communication module 240 , the processor 250 , and the memory 260 may include hardware components similar to those of the communication module 190 , the processor 120 , and the memory 130 included in the electronic device 101 , respectively.
- the communication module 240 , the processor 250 , or the memory 260 may be electrically or operably connected to each other through a communication bus (not illustrated), for example.
- the electronic device 101 and the external electronic device 230 may be connected to each other through a wireless or wired network based on the communication modules 190 and 240 .
- the electronic device 101 may transmit, to the external electronic device 230 , information regarding the user, which is used to determine the color and/or transparency of the UI displayed on the entire panel or on a part of the panel. Operations performed by the electronic device 101 and the external electronic device 230 based on the information will be described later with reference to FIG. 14 .
- FIG. 3 is a flowchart 300 illustrating operations of an electronic device according to an embodiment of the disclosure.
- the electronic device in FIG. 3 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the operations in FIG. 3 may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2 , for example.
- the electronic device may identify first information regarding external light directed to the electronic device.
- operation 310 may be performed in response to execution of an application associated with an augmented reality service by the electronic device.
- operation 310 may be performed based on whether or not the user of the electronic device wears the electronic device.
- the external light directed to the electronic device may be identified based on at least one sensor included in the electronic device.
- the external light may be light directed to the luminance sensor that is exposed through at least a part of the housing of the electronic device.
- the external light may be light directed to the camera module that is exposed through at least a part of the housing of the electronic device.
- the first information identified based on the external light may include a parameter measured by at least one sensor included in the electronic device.
- the first information may include at least one parameter obtained from the luminance sensor (for example, the luminance sensor 210 in FIG. 2 ) included in the electronic device, the camera module (for example, the camera module 180 in FIGS. 1 to 2 ), or a combination thereof.
- the first information may include the luminance of the external light.
- the first information may include an image that is two-dimensionally received on an image sensor of the camera module.
- the electronic device may identify second information regarding the user.
- the second information may include one or more parameters that have been stored in the memory of the electronic device.
- the second information may include one or more parameters associated with at least one of the user's color weakness and/or color blindness.
- the second information may include at least one of the type, the frequency, and the identifier of one or more colors that the user cannot distinguish, or a combination thereof.
- the second information may include at least one of the type, the frequency, and the identifier of a designated color for correcting color weakness and/or color blindness of the user, or a combination thereof.
- the second information is not limited to the above-mentioned examples, and an embodiment of the operation of the electronic device identifying second information regarding the user will be described later with reference to FIG. 15 .
- the electronic device may acquire first frame data based on the first information and the second information.
- the size and/or resolution of the first frame data may correspond to the size and/or resolution of the entire area of the display included in the electronic device.
- the first frame data may include multiple pixels. The color and transparency of the multiple pixels included in the first frame data may be determined based on the first information and the second information.
- the color of the multiple pixels of the first frame data may be determined based on the second information regarding the user (for example, color designated based on the user's preferences, color designated to correct the user's color weakness and/or color blindness).
- the transparency of the multiple pixels of the first frame data may be determined based on the luminance of external light included in the first information. Embodiments performed by the electronic device in connection with operation 330 will be described later with reference to FIGS. 7A to 7B .
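A hedged sketch of how such a full-display frame might be filled, assuming an RGBA pixel layout: the RGB channels come from the user-specific color (second information), and the alpha channel follows the measured luminance (first information). Both the pixel format and the luminance mapping are assumptions, not given by the patent.

```python
from typing import List, Tuple

Pixel = Tuple[int, int, int, int]   # assumed RGBA channel layout

def build_first_frame(width: int, height: int,
                      user_color: Tuple[int, int, int],
                      lux: float, max_lux: float = 10000.0) -> List[List[Pixel]]:
    # RGB from the user-related second information (e.g., a color chosen
    # to correct color weakness); alpha from the measured luminance.
    # The constants are illustrative only.
    alpha = int(255 * min(max(lux / max_lux, 0.0), 1.0) * 0.8)
    r, g, b = user_color
    return [[(r, g, b, alpha)] * width for _ in range(height)]

# e.g., a dim yellow tint across a 1280x720 display under 6000 lux:
frame = build_first_frame(1280, 720, (255, 240, 0), 6000.0)
```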
- the electronic device may display the first frame data on the display in operation 340 .
- the display may be disposed adjacent to the user's eyes while the user wears the electronic device.
- the display may transmit at least a part of external light to the eyes. Since the electronic device displays the first frame data on the display that transmits at least a part of the external light to the eyes, the user can see the first frame data together with the external light.
- when the size or resolution of the first frame data corresponds to the size or resolution of the entire area of the display, the user may see external light that is shifted based on the color or transparency of the first frame data.
- the electronic device may identify second frame data from an application.
- the application may correspond to an application for providing content and/or UI associated with augmented reality.
- the electronic device may generate second frame data including content associated with an external object identified based on the external light, based on the application.
- the location of the content inside the second frame data may correspond to a location on the display through which light from the external object passes.
- the size of the content and/or the UI included in the second frame data may correspond to the size of at least a part of the entire area of the display.
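One way to realize the correspondence described above is to map the external object's bounding box from camera-image coordinates to display coordinates. The purely linear scaling in the sketch below is an assumption; a real device would need calibrated camera-to-display geometry.

```python
from typing import Tuple

def content_rect(obj_bbox: Tuple[int, int, int, int],
                 cam_size: Tuple[int, int],
                 disp_size: Tuple[int, int]) -> Tuple[int, int, int, int]:
    """Map an object's bounding box (x, y, w, h) from camera coordinates
    to display coordinates, so content lands over the spot where light
    from the object passes through the display. Linear mapping assumed."""
    x, y, w, h = obj_bbox
    sx = disp_size[0] / cam_size[0]
    sy = disp_size[1] / cam_size[1]
    return (int(x * sx), int(y * sy), int(w * sx), int(h * sy))

# e.g., an object detected at (400, 300, 200, 120) in a 1920x1080 camera
# image maps onto a 1280x720 display:
print(content_rect((400, 300, 200, 120), (1920, 1080), (1280, 720)))
```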
- the first frame data may correspond to the background of content and/or UI provided by the application.
- the first frame data may be associated with a background image of the UI provided to the user by the electronic device, background frame data, background color, or background object.
- the second frame data may be associated with a foreground image provided to the user by the electronic device, foreground frame data, or foreground object.
- the electronic device may maintain display of the first frame data based on operation 340 .
- the electronic device may change the second frame data based on the first frame data in operation 360 .
- the electronic device may adjust the color of at least one of multiple pixels included in the second frame data at least partially based on the first frame data, in response to identification of the second frame data.
- the electronic device may change the color of at least one of multiple pixels included in the second frame data, based on the color of at least one of multiple pixels included in the first frame data.
- the electronic device may change the color of content included in the second frame data from the second color to a third color, based on the first color of multiple pixels included in the first frame data.
- the third color may be a complementary color of the first color.
- the first color and the third color, which are complementary to each other, may be the opposite colors in a color spectrum.
- red and green colors are complementary colors.
- the third color may correspond to a color determined such that, when content included in the second frame data is overlaid with the first frame data, the user can recognize the content in spite of the first color.
- when the color of content is changed from the second color to a third color that is complementary to the first color, the content overlaid with multiple pixels of the first frame data having the first color can be recognized by the user more clearly due to the complementary contrast.
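The patent does not specify how the complementary (third) color is computed; one common convention, shown in the hedged sketch below, is to rotate the hue by 180 degrees in HSV space.

```python
import colorsys

def complementary(rgb):
    """Return a complementary color by rotating hue 180 degrees in HSV.
    This particular computation is an assumption, not taken from the patent."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    r2, g2, b2 = colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)
    return tuple(int(round(c * 255)) for c in (r2, g2, b2))

print(complementary((255, 0, 0)))   # red -> cyan, i.e. (0, 255, 255)
```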
- the electronic device may display the first frame data and the second frame data on the display.
- the second frame data displayed on the display may correspond to the second frame data changed based on operation 360 .
- the electronic device may control the display based on at least one of the first frame data and/or the second frame data adjusted based on operation 360 .
- the color and transparency of each of the multiple pixels included in the display may have the color and/or transparency of the corresponding pixel included in the first frame data and/or the second frame data.
- the electronic device may merge multiple pixels of the first frame data and multiple pixels of the second frame data based on an alpha blending technique.
- the electronic device may output the merged multiple pixels on the display.
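A per-pixel sketch of that merge, using standard "source over" alpha compositing as one plausible reading of the alpha blending technique mentioned above; the RGBA channel layout is an assumption.

```python
def source_over(src, dst):
    """Composite one RGBA pixel (src, e.g. second frame data) over another
    (dst, e.g. first frame data) with standard "source over" blending."""
    sr, sg, sb, sa = (c / 255.0 for c in src)
    dr, dg, db, da = (c / 255.0 for c in dst)
    a = sa + da * (1.0 - sa)                      # resulting alpha
    if a == 0.0:
        return (0, 0, 0, 0)
    def ch(s, d):                                 # resulting color channel
        return (s * sa + d * da * (1.0 - sa)) / a
    return tuple(int(round(v * 255))
                 for v in (ch(sr, dr), ch(sg, dg), ch(sb, db), a))

# e.g., half-transparent red content over an opaque yellow background tint:
print(source_over((255, 0, 0, 128), (255, 240, 0, 255)))
```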
- the electronic device may control the display such that at least one pixel included in the second frame data is displayed while being overlaid with the multiple pixels included in the first frame data. Operations of the electronic device controlling the display based on the first frame data and the second frame data will be described later with reference to FIGS. 10A to 10B .
- FIG. 4A is a diagram illustrating an electronic device 101 that can be worn on the head of the user 410 in an embodiment of the disclosure.
- FIG. 4B is a diagram illustrating an electronic device 101 that can be worn on the head of the user 410 in an embodiment of the disclosure.
- the electronic device 101 in FIGS. 4A to 4B may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the electronic device 101 in FIGS. 4A to 4B may correspond to a wearable device.
- the electronic device 101 may include one or more straps that may be fastened on the head of the user 410 .
- the user may connect multiple straps included in the electronic device 101 such that the electronic device 101 is worn on the head of the user 410 .
- the electronic device 101 may include a visor 480 that is disposed in front of both of the user's eyes while the user 410 wears the electronic device 101 on his/her head.
- the type of the visor 480 may be similar to the type of at least one of sunglasses, goggles, or glasses.
- Inside the visor 480 , at least one display included in the electronic device 101 may be disposed in front of both of the user's eyes.
- At least one of the visor 480 and/or the display arranged in front of the user's eyes may include a material that transmits at least a part of external light directed to the user's eyes. At least a part of external light associated with an external object 420 may pass through the visor 480 and the display and thus reach the user's eyes.
- the camera module 180 included in the electronic device 101 may be disposed on one side of the electronic device 101 , on which the visor is disposed, so as to acquire an image of the scenery in front of the user's eyes. For example, when the user 410 wearing the electronic device 101 gazes at an external object 420 , the electronic device 101 may acquire an image including the external object 420 .
- the luminance sensor 210 included in the electronic device 101 may also be disposed on the above-mentioned side so as to measure the luminance of external light directed to the user's eyes.
- the camera module 180 and/or the luminance sensor 210 of the electronic device 101 may measure images and/or luminance based on a prism and/or a reflective mirror.
- the electronic device 101 may change the color and/or transparency of at least one piece of frame data displayed on the display, based on at least one of or a combination of an image acquired from the camera module 180 , luminance measured by the luminance sensor 210 , and information inputted from the user.
- the transparency of first frame data, which is displayed on the display and has a size corresponding to that of the entire area of the display, may be determined based on luminance measured by the luminance sensor 210 .
- the color of the first frame data may be determined based on information inputted from the user.
- the color of second frame data, which is displayed so as to overlap a portion of the display 220 through which the external light associated with the external object 420 passes, and which includes content carrying information regarding the external object 420 , may be determined based on the color of the first frame data.
- the electronic device 101 may acquire second frame data including content having one or more colors based on an application for recognizing an external object 420 .
- the electronic device 101 may change one or more colors of the content based on the color associated with the first frame data.
- the changed one or more colors of the content may be included in a color area that is centered on a color associated with the first frame data (for example, complementary color of the color of the first frame data).
- the electronic device 101 may adjust the first frame data and/or the second frame data based on a change in external light of the user 410 .
- the electronic device 101 may adjust the color and/or transparency of the first frame data and/or the second frame data based on the changed luminance.
- operations performed by the electronic device 101 based on the changed luminance will be described later with reference to FIG. 16 .
- the electronic device 101 may control multiple pixels included in the display based on the first frame data and/or the second frame data.
- the number of displays included in the electronic device 101 may be one or more.
- the electronic device 101 may include two displays 220 - 1 and 220 - 2 corresponding to both eyes 410 - 1 and 410 - 2 of the user, respectively.
- the two displays 220 - 1 and 220 - 2 may include pixel arrays 430 - 1 and 430 - 2 including multiple pixels, respectively.
- the pixel arrays 430 - 1 and 430 - 2 may include multiple pixels based on an LCD and/or an OLED.
- the electronic device 101 may control pixels included in the pixel arrays 430 - 1 and 430 - 2 , respectively, based on the first frame data and/or the second frame data.
- light emitted from the pixels included in each of the pixel arrays 430 - 1 and 430 - 2 may pass through lenses 440 - 1 and 440 - 2 disposed on the front surfaces of the pixel arrays 430 - 1 and 430 - 2 , respectively. After passing through each of the lenses 440 - 1 and 440 - 2 , the light may pass through each of waveguide plates 450 - 1 and 450 - 2 and may reach holographic optical elements (HOE) 460 - 1 and 460 - 2 .
- the light may propagate along the waveguide plates 450 - 1 and 450 - 2 and may reach each of holographic optical elements 470 - 1 and 470 - 2 disposed in front of both eyes 410 - 1 and 410 - 2 of the user.
- the light may be reflected toward both eyes 410 - 1 and 410 - 2 of the user together with external light passing through each of the holographic optical elements 470 - 1 and 470 - 2 .
- the manner of the electronic device 101 providing light which is superimposed on external light, and which is based on the first frame data and/or the second frame data, toward both eyes 410 - 1 and 410 - 2 of the user is not limited to the embodiment illustrated in FIG. 4B .
- the electronic device 101 may output light which is superimposed on external light, and which is based on the first frame data and/or the second frame data, to the user by using a panel which is disposed in front of both eyes 410 - 1 and 410 - 2 of the user, and which includes liquid crystals.
- the electronic device 101 may output light which is superimposed on external light, and which is based on the first frame data and/or the second frame data, to the user by simultaneously using the pixel arrays 430 - 1 and 430 - 2 and/or the panel including liquid crystals.
- the electronic device 101 may change the color and/or transparency of the first frame data and/or the second frame data in order to enhance the visibility of the external object 420 and/or the content that the electronic device 101 has generated by recognizing the external object 420 .
- the color and/or transparency of the first frame data and/or the second frame data may be changed based on information regarding the external environment of the electronic device 101 and/or both eyes 410 - 1 and 410 - 2 of the user.
- the external environment of the electronic device 101 may be associated with at least one of the intensity of external light, the luminance, the atmospheric environment, and the weather, or a combination thereof.
- Information regarding both eyes 410 - 1 and 410 - 2 of the user may be associated with at least one of color weakness, color blindness, eyesight, astigmatism, myopia, and hypermetropia, or a combination thereof.
- the electronic device 101 may change the color and/or transparency of first frame data having a size corresponding to the size of the entire area of the display, based on the passage of time from night to day or the movement of the user 410 from an indoor environment to an outdoor environment.
- the display of the electronic device 101 may support a function of dynamically changing the color according to the external environment, based on the first frame data. For example, by determining the color of the first frame data such that multiple colors that the user 410 with color weakness cannot distinguish are shifted, the display of the electronic device 101 may support a function similar to a corrective lens for the user 410 with color weakness.
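- As a rough sketch of such a shift, the example below applies a 3×3 matrix to an RGB color so that confusable hues become more separable. The matrix coefficients, the function name, and the use of NumPy are illustrative assumptions; the disclosure does not specify how a corrective shift would be computed.

```python
import numpy as np

# Hypothetical shift matrix for a user with red-green color weakness.
# Coefficients are placeholders, not values from the disclosure.
SHIFT_MATRIX = np.array([
    [0.80, 0.20, 0.00],   # new R: mostly old R, plus a little old G
    [0.25, 0.75, 0.00],   # new G: mixes some old R into the green channel
    [0.00, 0.30, 0.70],   # new B: mixes some old G into the blue channel
])

def shift_color(rgb):
    """Shift an [R, G, B] color (components in 0..1) so that hues the
    user cannot distinguish are pushed apart."""
    return np.clip(SHIFT_MATRIX @ np.asarray(rgb, dtype=float), 0.0, 1.0)
```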
- FIG. 5 is a flowchart 500 illustrating operations of an electronic device according to an embodiment of the disclosure acquiring first information regarding first frame data based on external light.
- the electronic device in FIG. 5 may correspond to the electronic device 101 in FIGS. 1 to 2 and FIGS. 4A to 4B .
- the operations in FIG. 5 may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2 .
- At least one of the operations illustrated in FIG. 5 may be associated with operation 310 in FIG. 3 .
- the electronic device may determine whether or not the user wears the electronic device. For example, when the user connects multiple straps of the electronic device, the electronic device may determine, based on a switch included in at least one of the multiple straps, that the user wears the electronic device. For example, based on a proximity sensor exposed through one side of the electronic device, the electronic device may identify that the electronic device is worn by the user.
- when the user does not wear the electronic device, the electronic device may not identify external light.
- the electronic device may detect whether or not the user wears the electronic device for a designated time.
- when the user wears the electronic device, the electronic device may identify external light directed to the electronic device in operation 520.
- the electronic device may identify the external light based on the luminance sensor.
- the electronic device may identify the external light based on an image sensor included in the camera module.
- the electronic device may acquire first information including the luminance of the identified external light.
- the first information may include a luminance value measured by the luminance sensor included in the electronic device.
- the value and/or data included in the first information is not limited to the above-mentioned luminance value.
- the first information may include image data acquired from the image sensor, temperature measured by the temperature sensor, the current time, the geographical location, or a combination thereof.
- the acquired first information may be processed based on system software and/or application (for example, application for providing an augmented reality service) executed by the electronic device.
- the acquired first information may be used to identify at least one external object included in the user's field of view.
- the acquired first information may be used to determine the color and/or transparency of a display covering at least a part of the field of view (FOV) of the user.
- the acquired first information may be used to acquire the first frame data in FIG. 3 .
- FIG. 6 is a diagram illustrating operations of an electronic device 101 according to an embodiment of the disclosure acquiring first information by using a luminance sensor 210 .
- the electronic device 101 in FIG. 6 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the electronic device 101 in FIG. 6 may acquire first information based on at least one of the operations in FIG. 5, for example.
- the processor 120 of the electronic device 101 may enable the luminance sensor 210 based on whether or not the state of the electronic device 101 satisfies a designated condition.
- the designated condition may be associated with whether or not the user of the electronic device 101 wears the electronic device 101 .
- the designated condition may be associated with whether the electronic device 101 is in an awake state that is distinguished from a sleep state.
- the processor 120 may enable the luminance sensor 210 in response to identifying wearing of the electronic device 101 on the user.
- the enabled luminance sensor 210 may receive external light directed to the luminance sensor 210 .
- the luminance sensor 210 may measure the luminance of the external light.
- An electric signal outputted by the luminance sensor 210 may be an analog electric signal having a voltage, a current, and/or a frequency corresponding to the measured luminance.
- the electronic device 101 may include an analog-digital converter (ADC) 610 connected to the luminance sensor 210 so as to change the analog electric signal outputted from the luminance sensor 210 to a digital electric signal.
- the ADC 610 may be disposed between the luminance sensor 210 and the processor 120 .
- the ADC 610 may output a digital electric signal corresponding to the measured luminance.
- the digital electric signal outputted from the ADC 610 may be transmitted to the processor 120 of the electronic device 101 .
- the processor 120 may access a table 620 stored in the memory 130.
- the table 620 may include information indicating the relation between the luminance and the transparency.
- the table 620 may include information regarding the mapping between each of multiple levels of luminance and one of multiple levels of transparency.
- the transparency may be associated with the transparency of the first frame data and/or the alpha value thereof.
- the table 620 may be tuned by the user of the electronic device 101 or may be changed heuristically.
- the first information acquired by the electronic device 101 based on the luminance sensor 210 may include various parameters associated with the intensity of luminance of external light.
- the first information may include at least one of an analog electric signal outputted from the luminance sensor 210 , a digital electric signal outputted from the ADC 610 , the transparency of first frame data acquired by accessing the table 620 , and/or the alpha value thereof.
- the first information may include the transparency of the first frame data and/or the alpha value thereof, or may include various parameters used to acquire the transparency of the first frame data and/or the alpha value thereof.
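- A minimal sketch of this table-based conversion is given below. The luminance breakpoints, alpha levels, ADC resolution, and the assumption that brighter light maps to a more opaque (less transparent) overlay are all illustrative; the disclosure states only that the table 620 maps levels of luminance to levels of transparency and may be tuned by the user.

```python
# Assumed stand-in for table 620: (upper luminance bound in lux, alpha).
LUMINANCE_TO_ALPHA = [
    (50.0, 0.25),           # dim environment -> mostly transparent overlay
    (500.0, 0.50),
    (5000.0, 0.75),
    (float("inf"), 1.00),   # very bright -> strongest attenuation
]

def alpha_from_adc(adc_value: int, adc_max: int = 4095,
                   lux_max: float = 10000.0) -> float:
    """Convert the digital output of the ADC 610 to lux (assuming a
    linear sensor response), then look up the matching alpha value."""
    lux = adc_value / adc_max * lux_max
    for upper_bound, alpha in LUMINANCE_TO_ALPHA:
        if lux <= upper_bound:
            return alpha
    return 1.0
```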
- FIG. 7A is a flowchart 700 - 1 illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data based on first information regarding the external environment and second information regarding the user.
- FIG. 7B is a flowchart 700 - 2 illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data based on first information regarding the external environment and second information regarding the user.
- the electronic device in FIGS. 7A to 7B may correspond to the electronic device 101 in FIGS. 1 to 2 and FIGS. 4A to 4B .
- the operations in FIGS. 7A to 7B may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2 .
- At least one of the operations illustrated in FIGS. 7A to 7B may be associated with operation 330 in FIG. 3 .
- the electronic device may identify the transparency of multiple pixels included in first frame data based on first information.
- the first information refers to information acquired based on at least one of the operations in FIG. 5 , and may be associated with the external environment of the electronic device.
- the electronic device may identify the transparency of the multiple pixels included in the first frame data, based on the luminance of external light included in the first information.
- the transparency of the multiple pixels may be determined, based on the table 620 in FIG. 6 , as a transparency and/or an alpha value corresponding to the luminance.
- the electronic device may identify the color of the multiple pixels included in the first frame data based on second information.
- the second information refers to information regarding the user, and may be associated with the user's eyes, for example.
- the color of the multiple pixels included in the first frame data may be configured such that the color of external light passing through the display is shifted based on information regarding the eyes of the user of the electronic device (for example, one or more parameters associated with at least one of color weakness and/or color blindness). Operations of the electronic device identifying the second information from the user will be described later with reference to FIG. 12 .
- the electronic device may generate first frame data having an identified color and an identified transparency.
- Each of multiple pixels included in the first frame data may include values corresponding to red light, blue light, and green light, respectively.
- Each of the multiple pixels included in the first frame data may further include an alpha value indicating the transparency.
- the electronic device may determine the intensity of each of red light, blue light, and green light of all of the multiple pixels of the first frame data.
- the electronic device may determine the alpha value of all of the multiple pixels of the first frame data.
- the intensity of red light, the intensity of blue light, the intensity of green light, and the alpha value, included in the multiple pixels of the first frame data may be identical to each other.
- the electronic device may store the generated first frame data.
- the first frame data may be stored in a designated area (for example, first frame buffer) corresponding to the first frame data inside the memory of the electronic device.
- the electronic device may control the multiple pixels included in the display based on the first frame data stored in the first frame buffer. After the electronic device has controlled the multiple pixels included in the display based on the first frame data, the intensity of external light may be attenuated according to the transparency identified in operation 710 while the same passes through the display. Similarly, the color of the external light may be shifted according to the color identified in operation 720 while the same passes through the display.
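- A compact sketch of operations 710 through 740 follows. The function name, the resolution, and the RGBA buffer layout are assumptions; the disclosure requires only that every pixel of the first frame data share one color and one alpha value, and that the result be stored in the first frame buffer.

```python
import numpy as np

def make_first_frame_data(width: int, height: int,
                          color: tuple, alpha: float) -> np.ndarray:
    """Build first frame data in which every pixel carries the same
    [R, G, B, A] values (operations 710 to 730)."""
    frame = np.empty((height, width, 4), dtype=np.float32)
    frame[..., :3] = color   # color identified from the second information
    frame[..., 3] = alpha    # transparency identified from the first information
    return frame

# Operation 740: store the result in the designated area of memory
# (the "first frame buffer"). Resolution and values are placeholders.
first_frame_buffer = make_first_frame_data(1280, 720, (0.9, 0.8, 0.2), 0.5)
```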
- the electronic device may change the color and/or transparency of some of the multiple pixels included in the first frame data.
- Some of the multiple pixels included in the first frame data may correspond to pixels disposed on a part of the display, through which external light brighter or darker than in the case of other pixels passes.
- FIG. 7B is a flowchart 700-2 illustrating operations of an electronic device according to an embodiment changing the color and/or transparency of some of multiple pixels included in first frame data. Descriptions of operations performed similarly or identically to those in FIG. 7A, among the operations in FIG. 7B, will be omitted herein. For example, operations 710, 720, and 730 in FIG. 7B may be performed similarly as described with reference to FIG. 7A.
- the electronic device may identify first frame data including multiple pixels based on operations 710 , 720 , and 730 , the intensity of red light, the intensity of blue light, the intensity of green light, and the alpha value of the multiple pixels being identical to each other.
- the electronic device may identify an external object overlapping the first frame data on the display.
- the electronic device may acquire image data regarding the external object captured by the image sensor.
- the external object may be included in the field of view of the user wearing the electronic device.
- the electronic device may identify the intensity of external light emitted from the external object.
- the electronic device may adjust the color and/or transparency of at least a part of the first frame data based on the identified external object.
- the electronic device may adjust the color and/or transparency of at least one pixel corresponding to the external object, among the multiple pixels included in the first frame data.
- the at least one pixel corresponding to the external object may be included in a part of the first frame data overlapping the external object, while the first frame data is outputted on the display.
- the electronic device may store the adjusted first frame data in operation 740 .
- Operation 740 may be performed similarly as described with reference to FIG. 7A .
- FIG. 8 is a diagram illustrating multiple pixels included in first frame data generated by an electronic device according to an embodiment of the disclosure.
- the electronic device in FIG. 8 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the electronic device in FIG. 8 may acquire first frame data based on the operations in FIG. 7A or 7B .
- referring to FIG. 8, first frame data stored in a designated area (for example, first frame buffer 810) of the memory is illustrated.
- the first frame data may include multiple pixels.
- the multiple pixels included in the first frame data may correspond to multiple pixels included in the display of the electronic device.
- the width and height of the first frame data may correspond to the width and height of the display, respectively.
- the width and height of the first frame data may correspond to the sum of widths of the two displays and the sum of heights thereof, respectively.
- the (N+1)×(M+1) pixels included in the first frame data are illustrated.
- the first frame data may correspond to a display including (N+1) pixels horizontally and (M+1) pixels vertically.
- the intensity of red light of the pixel disposed at coordinate (a, b) inside the first frame data may be referred to as Rab, the intensity of green light thereof as Gab, the intensity of blue light thereof as Bab, and the transparency and/or alpha value thereof as Aab.
- the transparency and/or alpha value of the first frame data may be determined based on the intensity and/or luminance of external light passing through the display.
- the electronic device may acquire the transparency and/or alpha value of the first frame data according to first information regarding the external environment of the electronic device based on operation 310 in FIG. 3 , the operations in FIG. 5 , or operation 710 in FIG. 7 .
- the electronic device may input the acquired transparency and/or alpha value to the transparency and/or alpha value of the (N+1)×(M+1) pixels in FIG. 8.
- the color of the first frame data may be determined based on second information regarding the user wearing the electronic device.
- the second information refers to information pre-inputted by the user, and may include a color preferred by the user and multiple colors mapped to multiple situations, respectively, by the user.
- the second information may include a color designated to correct at least one of the user's color blindness and/or color weakness.
- the electronic device may acquire the color of the first frame data according to the second information based on operation 320 in FIG. 3 or operation 720 in FIG. 7 .
- the electronic device may input the intensity of red light, the intensity of green light, and the intensity of blue light, corresponding to the acquired color, to the intensity of red light, the intensity of green light, and the intensity of blue light of the (N+1)×(M+1) pixels in FIG. 8, respectively.
- the electronic device may change the color and/or transparency of the multiple pixels included in the first frame data based on the luminance and/or color of the changed external light. For example, when the user moves while wearing the electronic device, the electronic device may identify a change in the luminance resulting from a change in the external light, based on the luminance sensor. In response to identifying a change in the luminance, the electronic device may change the transparency of the multiple pixels included in the first frame data.
- FIG. 9 is a flowchart 900 illustrating operations of an electronic device according to an embodiment of the disclosure adjusting second frame data based on first frame data.
- the electronic device in FIG. 9 may correspond to the electronic device 101 in FIGS. 1 to 2 and FIGS. 4A to 4B .
- the operations in FIG. 9 may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2 .
- At least one of the operations illustrated in FIG. 9 may be associated with operation 360 in FIG. 3 .
- the electronic device may identify the first color of at least one of multiple pixels included in first frame data.
- the first frame data may be acquired based on operation 330 in FIG. 3 or at least one of the operations described with reference to FIGS. 7A to 7B .
- the electronic device may identify the first color from first frame data stored in a designated area of the memory as in FIG. 8 (for example, first frame buffer 810 in FIG. 8 ).
- the first color may be indicated as data including the intensity of red light, the intensity of green light, and the intensity of blue light, inputted to at least one of the multiple pixels included in the first frame data.
- the electronic device may change the color of at least one of the pixels included in the second frame data based on the identified first color.
- Content acquired from an application currently executed by the electronic device may be included in a part of the second frame data.
- the electronic device may adjust the color of pixels included in the part, among the pixels included in the second frame data.
- the electronic device may change the second color of at least one of the multiple pixels included in the second frame data to a third color corresponding to the complementary color of the first color.
- the electronic device may control the display by simultaneously using the second frame data and the first frame data, based on operation 370 in FIG. 3 .
- the electronic device may adjust the color and/or transparency of the entire area of the display based on the first frame data and then may display content included in the second frame data. Since the color of the entire area of the display adjusted based on the first frame data and the color of the content displayed based on the second frame data are complementary to each other, the user of the electronic device may clearly recognize the content.
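- In the simplest RGB reading of 'complementary color', these operations reduce to a per-channel complement, as sketched below; the disclosure does not fix a color model, so this is an illustrative simplification rather than the claimed method.

```python
def complement(rgb):
    """Return the per-channel complement of an 8-bit RGB color."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

# Operation 910: identify the first color from the first frame buffer.
first_color = (40, 60, 200)              # placeholder value
# Operation 920: move content pixels of the second frame data to the
# complement so they stand out against the tinted background.
content_color = complement(first_color)  # -> (215, 195, 55)
```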
- operations of an electronic device according to an embodiment controlling the display by simultaneously using second frame data and first frame data will be described.
- FIG. 10A is a diagram illustrating operations of an electronic device according to an embodiment of the disclosure controlling a display 220 based on first frame data 1030 and second frame data 1060 .
- FIG. 10B is a diagram illustrating operations of an electronic device according to an embodiment of the disclosure controlling a display 220 based on first frame data 1030 and second frame data 1060 .
- the electronic device in FIGS. 10A to 10B may correspond to the electronic device 101 in FIGS. 1 to 2 and FIGS. 4A to 4B .
- referring to FIG. 10A, an example of an image 1050 included in the FOV of a user wearing an electronic device according to various embodiments on his/her head is illustrated. Since the electronic device includes a display 220 disposed in front of both eyes of the user, the user can see the image 1050 by using external light passing through the display 220.
- the electronic device may generate first frame data 1030 based on first information 1010 and second information 1020 .
- the first information 1010 may include data associated with the external environment of the electronic device (for example, image 1050 included in the FOV of the user), for example, data regarding the luminance of external light directed to the electronic device.
- the electronic device may identify or acquire first information 1010 based on operation 310 in FIG. 3 or the operations in FIG. 5 .
- the second information 1020 refers to information regarding the user of the electronic device, and may include, for example, data indicating the color preferred by the user, data regarding both eyes of the user (data regarding at least one of color weakness and/or color blindness), or a combination thereof.
- the electronic device may identify or acquire the second information 1020 based on operation 320 in FIG. 3 .
- the first frame data 1030 in FIG. 10A may be identified or acquired based on operation 330 in FIG. 3 or the operations in FIG. 7A .
- the first frame data 1030 may have uniform color and transparency in the entire area of the display 220 of the electronic device.
- data of all pixels included in the first frame data 1030 may be identical to each other as [RB, GB, BB, AB].
- the color [RB, GB, BB] of all pixels included in the first frame data 1030 may be determined based on second information 1020 .
- the color of multiple pixels included in the first frame data may correspond to a color for shifting a color of external light penetrating the display 220 , based on one or more parameters associated with at least one of the user's color weakness and/or color blindness.
- the transparency and/or alpha value (AB) of all pixels included in the first frame data 1030 may be determined based on the first information 1010 .
- the electronic device may determine the alpha value (AB) corresponding to luminance measured by the luminance sensor, based on the table 620 in FIG. 6 , for example.
- the electronic device may acquire second frame data 1060 based on an application 1040 currently executed by the electronic device.
- the application 1040 may include instructions for providing an augmented reality service associated with an external object that the user is interested in.
- the electronic device may identify an external object (for example, a bridge) included in the image 1050 , based on the application 1040 .
- the electronic device may acquire at least one content to be provided to the user, based on the application 1040 .
- the content may include not only information regarding the identified external object, but also information regarding the user's external environment (for example, information regarding the current time, the current weather, the current temperature, and the current location).
- the second frame data 1060 may include a visual element corresponding to content acquired based on the application 1040 .
- the visual element is generated by the electronic device such that the content is visually expressed based on a text, an image, a video, a figure, or a combination thereof.
- the second frame data 1060 may include multiple visual elements 1062 , 1064 , and 1066 corresponding to multiple contents associated with the augmented reality service, respectively.
- the visual element 1062 may visually express the current time based on a digital watch and/or an analog watch type, as information regarding the user's external environment.
- the visual element 1064 may visually express the current temperature and/or the current weather based on a gauge type, as information regarding the user's external environment.
- the visual element 1066 may visually express the result of identifying an external object based on a text, an image, a figure, or a combination thereof.
- the color and/or transparency of the second frame data 1060 may be independent of the color and/or transparency of the first frame data 1030 .
- when the electronic device merges first frame data 1030 and second frame data 1060, which have been identified independently of each other, and then outputs the result inside the display 220, the visibility of multiple visual elements 1062, 1064, and 1066 included in the second frame data 1060 may be degraded by the color of the first frame data 1030.
- the electronic device may change the color and/or transparency of the second frame data 1060 identified independently of the first frame data 1030 , based on the color and/or transparency of the first frame data 1030 .
- the electronic device may change the color and/or transparency of the second frame data 1060 based on operation 360 in FIG. 3 or the operations in FIG. 9 .
- the color of multiple pixels included in multiple visual elements 1062, 1064, and 1066 may be changed to the complementary color of the color used to generate the first frame data 1030.
- the electronic device may change the color of multiple pixels included in the second frame data 1060 (for example, multiple pixels included in the multiple visual elements 1062, 1064, and 1066) to a dark color.
- the electronic device may change the color of multiple pixels included in the second frame data 1060 (for example, multiple pixels included in the multiple visual elements 1062, 1064, and 1066) to a bright color.
- the electronic device may adjust the color of multiple pixels included in the second frame data 1060 by applying a gamma curve to each of the intensity of red light, the intensity of green light, and the intensity of blue light, included in the multiple pixels of the second frame data 1060 .
- the electronic device may change the color of the visual element having red color among the multiple visual elements 1062 , 1064 , and 1066 .
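- A hedged sketch of the gamma-based adjustment described above follows; the exponent 2.2 and the per-channel application are assumptions, since the disclosure names only 'a gamma curve'.

```python
def apply_gamma(channel: int, gamma: float = 2.2) -> int:
    """Apply a gamma curve to one 8-bit channel value.
    gamma > 1 darkens mid-tones; gamma < 1 brightens them."""
    return round(((channel / 255.0) ** gamma) * 255.0)

# Darken a content color of the second frame data 1060 channel by channel:
r, g, b = (apply_gamma(c, 2.2) for c in (200, 150, 90))
```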
- second frame data 1070 that the electronic device has acquired by adjusting the color of the second frame data 1060 based on the color and/or transparency of the first frame data 1030 is illustrated.
- the electronic device may control multiple pixels included in the display 220 such that an image corresponding to the second frame data 1070 having the changed color is overlaid with an image corresponding to the first frame data 1030 inside the display 220 .
- Control of the display 220 may be performed based on an alpha blending technique and/or a display controller.
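- One common formulation of such compositing is source-over alpha blending, sketched below. The disclosure leaves the exact equation to the alpha blending technique and/or the display controller, so this formula is a standard assumption rather than the patented method.

```python
def blend_over(src, dst):
    """Source-over alpha blending of two [R, G, B, A] pixels (0..1),
    where src is a second frame data 1070 pixel and dst the first
    frame data 1030 pixel underneath it."""
    sa, da = src[3], dst[3]
    out_a = sa + da * (1.0 - sa)
    if out_a == 0.0:
        return [0.0, 0.0, 0.0, 0.0]
    out_rgb = [(s * sa + d * da * (1.0 - sa)) / out_a
               for s, d in zip(src[:3], dst[:3])]
    return out_rgb + [out_a]

# Opaque yellow content over a half-transparent blue-tinted background:
composited = blend_over([1.0, 1.0, 0.0, 1.0], [0.2, 0.2, 0.8, 0.5])
```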
- the result of controlling the display 220 , based on the first frame data 1030 and the second frame data 1070 , by the electronic device is illustrated.
- External light passing through the display 220 may be shifted by the color of the first frame data 1030 .
- External light passing through the display 220 may be attenuated according to the transparency and/or alpha value of the first frame data 1030 .
- Multiple visual elements 1062 , 1064 , and 1066 generated by recognizing an image 1050 which is displayed inside the display 220 , and which is included in the user's FOV, may have a different color that is complementary to the color of the first frame data 1030 .
- the electronic device may change the color and/or transparency of a part of the first frame data 1030 , based on a relatively bright part and/or a dark part inside the user's FOV.
- referring to FIG. 10B, another example of an image 1050-1 included in the user's FOV is illustrated.
- a relatively bright external object 1055 (for example, the sun) may be included in the user's FOV.
- the electronic device may identify a relatively bright part and/or a dark part inside the user's FOV, based on luminance and/or image data included in the first information 1010 - 1 .
- FIG. 11 is a flowchart 1100 illustrating the order in which an electronic device according to an embodiment of the disclosure generates first frame data and second frame data.
- the electronic device in FIG. 11 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the operations in FIG. 11 may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2 .
- the electronic device may acquire first frame data and second frame data based on multiple processes or threads distinguished from each other, respectively. At least some of the operations in FIG. 11 may be performed similarly to the operations in FIG. 3 .
- the electronic device may identify first information regarding external light directed to the electronic device.
- the electronic device may perform operation 1110 similarly to operation 310 in FIG. 3 .
- the electronic device may determine the transparency based on the identified first information in operation 1140 .
- the electronic device may identify second frame data from an application.
- the electronic device may perform operation 1130 similarly to operation 350 in FIG. 3 .
- the electronic device may identify multiple pieces of second frame data from the multiple applications that are currently executed.
- in response to identification of the first frame data, the electronic device may adjust at least one of multiple pixels included in the second frame data, based on the first frame data, in operation 1170.
- the electronic device may adjust at least one of multiple pixels included in the second frame data based on the operation in FIG. 9 .
- when the electronic device has identified multiple pieces of second frame data from multiple applications currently executed, the electronic device may perform operation 1170 with regard to each of the multiple pieces of identified second frame data.
- the electronic device may acquire third frame data based on the first frame data and the second frame data in operation 1180 after operations 1160 and 1170 .
- the electronic device may acquire third frame data by merging the first frame data and the second frame data based on alpha blending and/or overlay.
- the electronic device may merge the first frame data and the multiple pieces of second frame data based on alpha blending and/or overlay.
- in response to acquisition of the third frame data, the electronic device may output the acquired third frame data on the display in operation 1190.
- the display of the electronic device may transmit external light directed to the first surface thereof to the second surface thereof, which is opposite to the first surface, and which faces the user's both eyes. Multiple pixels included in the display may output light to the second surface.
- as the electronic device controls the multiple pixels included in the display based on the third frame data, the user can simultaneously receive external light passing from the first surface to the second surface and light corresponding to the multiple pixels included in the third frame data.
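- The net effect at one display pixel can be pictured with the rough model below. Treating the alpha value as a simple attenuation factor and mixing additively are simplifying assumptions for illustration, not optics taken from the disclosure.

```python
def perceived_light(external_rgb, pixel_rgba):
    """Rough model of the light reaching the eye through one display
    pixel driven by the third frame data: external light attenuated by
    the pixel's opacity, plus the light the pixel itself emits.
    All components are in 0..1."""
    r, g, b, a = pixel_rgba          # a = opacity of the overlay
    emitted = (r * a, g * a, b * a)
    transmitted = (c * (1.0 - a) for c in external_rgb)
    return [min(1.0, t + e) for t, e in zip(transmitted, emitted)]
```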
- FIG. 12 is a flowchart 1200 illustrating operations of an electronic device according to an embodiment of the disclosure identifying second information regarding first frame data based on information inputted from the user.
- the electronic device in FIG. 12 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the operations in FIG. 12 may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2 . At least some of the operations in FIG. 12 may be associated with operation 320 in FIG. 3 .
- the electronic device may determine whether or not a request associated with a change of the background has been received from the user.
- the user wearing the electronic device may perform a designated gesture or input a voice command, thereby inputting a command to change the background to the electronic device.
- the designated gesture may include at least one of or a combination of a gesture of touching a designated part of the housing of the electronic device (for example, a part to which a touch panel has been applied), a gesture of tapping the electronic device, a gesture of pressing or clicking a designated button of the electronic device, and a gesture performed inside the FOV of the image sensor of the electronic device.
- the command to change the background may correspond to a command to change the color and/or transparency of the display.
- the command to change the background may correspond to a command to change the color and/or transparency of the first frame data.
- when the request has not been received, the electronic device may not acquire information regarding both eyes of the user of the electronic device.
- when the request has been received, the electronic device may acquire information regarding the user's eyes in operation 1220.
- the electronic device may acquire information personalized for the user, including information regarding the eyes, from the user.
- the electronic device may acquire information such as the user's gender and age.
- the electronic device may acquire at least one of the eyesight of each of the user's both eyes, whether or not the user has astigmatism, and whether or not the user has myopia.
- the electronic device may acquire information regarding at least one of the user's color weakness and/or color blindness.
- the electronic device may display, on the display, a UI for acquiring the information from the user.
- the electronic device may acquire an image from the image sensor included in the electronic device in operation 1230 .
- the acquired image may include external light directed to the user's both eyes or passing through the display. Operations 1220 and 1230 may be performed independently, and may not be limited to the order illustrated in FIG. 12 .
- the electronic device may identify a user input of selecting one from the multiple pieces of frame data. Operations of the electronic device displaying the list of multiple pieces of frame data based on operations 1250 and 1260 and identifying a user input will be described later with reference to FIGS. 13A to 13D .
- the electronic device may determine second information regarding the user of the electronic device, based on the selected frame data, in operation 1270 .
- the second information may include information regarding the color of the selected frame data.
- the second information may include information regarding the user's eyes, acquired in operation 1220 .
- the second information may include information regarding the color of a different piece of frame data, which is distinguished from the frame data selected from the list of multiple pieces of frame data.
- FIG. 13B is a diagram illustrating an example of a UI that an electronic device according to an embodiment of the disclosure provides to the user in order to identify the second information in FIG. 12 .
- FIG. 13D is a diagram illustrating an example of a UI that an electronic device according to an embodiment of the disclosure provides to the user in order to identify the second information in FIG. 12 .
- the electronic device in FIGS. 13A to 13D may correspond to the electronic device 101 in FIGS. 1 to 2 and FIGS. 4A to 4B .
- the UIs illustrated in FIGS. 13A to 13D may be associated with operations 1250 and 1260 in FIG. 12 .
- referring to FIG. 13A, an example of an image inside a display 220 disposed toward both eyes of the user, while the user wears the electronic device on his/her head, according to various embodiments is illustrated.
- the image inside the display 220 may be a combination of external light passing through the display and light emitted from multiple pixels of the display.
- the electronic device may display a visual element 1310 (for example, “Background change” menu) for changing the background of the display 220 and/or the first frame data, inside the display 220 .
- the user may select the visual element 1310 based on a designated gesture and/or voice command.
- the electronic device may display a UI associated with the selected frame data on the display.
- the color and/or transparency of the display may be associated with the color and/or transparency of the selected frame data.
- the selected frame data may be determined as the first frame data 1030 in FIG. 10A , or may be stored in the first frame buffer 810 in FIG. 8 .
- the UI outputted together with the selected frame data may include a visual element 1330 for recommending the selected frame data to other users (for example, a menu including the text “Would you recommend the background to other users?”).
- the electronic device may transmit information regarding the selected frame data to an external electronic device.
- FIG. 14 is a flowchart 1400 illustrating operations of an electronic device 101 according to an embodiment of the disclosure identifying second information regarding first frame data by using an external electronic device 230 .
- the electronic device 101 in FIG. 14 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the external electronic device 230 in FIG. 14 may correspond to the external electronic device 230 in FIG. 2 .
- the operations in FIG. 14 may be performed by the electronic device 101 and the external electronic device 230 in FIG. 2 or by the processors 120 and 250 in FIG. 2 . At least some of the operations in FIG. 14 may be associated with the operations in FIG. 12 .
- the external electronic device 230 may identify multiple pieces of candidate frame data to be transmitted to the electronic device 101 , based on information and image data collected from multiple electronic devices. For example, the external electronic device 230 may identify first frame data used by another user having information similar to that of the user of the electronic device 101 and/or first frame data used by another user viewing an image similar to that viewed by the user of the electronic device 101 . The external electronic device 230 may transmit information regarding the identified first frame data to the electronic device 101 as candidate frame data. The external electronic device 230 may transmit an expert analysis corresponding to each of the multiple pieces of candidate frame data, a user review, a preview, or a combination thereof to the electronic device 101 .
- the electronic device 101 may display a list of multiple pieces of candidate frame data on the display in operation 1420 .
- the list of multiple pieces of candidate frame data may be provided to the user based on the UI in FIGS. 13B to 13C , for example.
- the electronic device 101 may identify a user input of selecting one from the multiple pieces of candidate frame data.
- the user may select one from the multiple pieces of candidate frame data by using the visual element 1320 in FIGS. 13B to 13C .
- the user may select candidate frame data that enables the user to view the currently viewed image most comfortably.
- the user may select candidate frame data that enables relatively better correction of color weakness and/or color blindness from the multiple pieces of candidate frame data.
- the electronic device 101 may determine second information regarding the user of the electronic device 101 based on the selected candidate frame data in operation 1440 .
- the second information may include data regarding the color and/or transparency of the selected candidate frame data.
- the second information may include information regarding the user's eyes, acquired in operation 1220 .
- the second information may include image data acquired in operation 1230 .
- the second information may include data regarding a piece of candidate frame data other than the piece of candidate frame data selected from the multiple pieces of candidate frame data.
- FIG. 15 is a flowchart 1500 illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data based on a mode and/or a state enabled by the user.
- the electronic device in FIG. 15 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the operations in FIG. 15 may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2.
- the operations in FIG. 15 may be associated with operation 330 in FIG. 3 .
- the electronic device may adjust the color of first frame data based not only on information regarding the user (for example, second information), but also on data measured by the infrared sensor and/or the ultraviolet sensor.
- the electronic device may identify the transparency of multiple pixels included in first frame data based on first information.
- the transparency may be determined based on the luminance of external light received by the luminance sensor of the electronic device.
- the transparency may be determined within a designated range (for example, 25% to 100%) based on the luminance.
- the electronic device may identify the color of multiple pixels included in the first frame data based on second information. The color may be determined based on information inputted from the user (for example, data regarding at least one of color weakness and/or color blindness).
- Operations 1510 and 1515 may be performed similarly to operations 310 and 320 in FIG. 3 .
- the electronic device may determine whether or not the infrared blocking mode is enabled.
- the infrared blocking mode may be enabled based on the user's designated gesture and/or voice input.
- the electronic device may determine whether or not the infrared blocking mode is enabled, based on a parameter and/or a flag associated with the infrared blocking mode.
- when the infrared blocking mode is enabled, the electronic device may acquire the intensity of infrared light from the infrared sensor in operation 1525 and may adjust the color identified in operation 1515 based on the acquired intensity of infrared light, in operation 1530.
- the electronic device may adjust the color identified in operation 1515 based on mapping data as given in Table 1 below:
- the color identified in operation 1515 may be changed based on the infrared blocking ratios in Table 1. For example, when the color identified in operation 1515 is yellow, the electronic device may compare the infrared blocking ratio (10%) of yellow color with the threshold (for example, 40%) of the infrared blocking ratio corresponding to the infrared blocking mode. Since the infrared blocking ratio (10%) of yellow color is below the threshold, the electronic device may change the identified color to a color having a relatively high infrared blocking ratio. For example, the electronic device may change the identified color to one of gray color, brown color, and/or green color having infrared blocking ratios equal to or higher than the threshold.
- the color identified in operation 1515 may be changed based on the infrared blocking ratios in Table 1 and the intensity of infrared light acquired in operation 1525 .
- the electronic device may identify the extent to which the intensity of infrared light acquired in operation 1525 is attenuated by the identified color.
- when the attenuated intensity of light is equal to or higher than a designated threshold corresponding to the infrared blocking mode, the electronic device may change the identified color to a different color having a relatively high infrared blocking ratio.
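- Operations 1520 through 1530 can be sketched as follows. The dictionary stands in for the blocking ratios of Table 1; only the yellow ratio (10%) and the 40% threshold come from the worked example above, and the remaining entries, names, and the selection policy are assumptions.

```python
# Illustrative infrared blocking ratios per candidate display tint.
IR_BLOCKING_RATIO = {
    "yellow": 0.10,   # from the worked example above
    "pink": 0.15,     # assumed below the threshold
    "gray": 0.40,
    "brown": 0.45,    # gray/brown/green assumed at or above the threshold
    "green": 0.50,
}
IR_THRESHOLD = 0.40   # threshold of the infrared blocking mode

def adjust_color_for_ir(identified_color: str) -> str:
    """Keep the identified color if it blocks enough infrared light;
    otherwise switch to a color whose ratio meets the threshold."""
    if IR_BLOCKING_RATIO.get(identified_color, 0.0) >= IR_THRESHOLD:
        return identified_color
    candidates = [c for c, ratio in IR_BLOCKING_RATIO.items()
                  if ratio >= IR_THRESHOLD]
    return candidates[0]   # selection among candidates is unspecified
```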
- the electronic device may determine in operation 1535 whether or not the ultraviolet blocking mode is enabled.
- the ultraviolet blocking mode may also be enabled based on the user's designated gesture and/or voice input.
- the electronic device may acquire the intensity of ultraviolet light from the ultraviolet sensor in operation 1540 .
- the electronic device may include not only a luminance sensor, but also an ultraviolet sensor.
- the electronic device may measure the intensity of the ultraviolet wavelength band of external light directed to the electronic device, based on data measured by the ultraviolet sensor.
- the electronic device may adjust the identified color based on the acquired intensity of ultraviolet light in operation 1545 .
- the color identified in operation 1515 may be changed based on ultraviolet blocking ratios given in Table 1. For example, when the color identified in operation 1515 is pink, the electronic device may compare the ultraviolet blocking ratio (20%) of pink color with the threshold (for example, 40%) of the ultraviolet blocking ratio corresponding to the ultraviolet blocking mode. Since the ultraviolet blocking ratio of pink color is below the threshold, the electronic device may change the identified color to a different color having a relatively high ultraviolet blocking ratio. For example, the electronic device may change the identified color to one of gray color, brown color, and/or green color having ultraviolet blocking ratios equal to or higher than the threshold.
- the color identified in operation 1515 may be changed based on the ultraviolet blocking ratios in Table 1 and the intensity of ultraviolet light acquired in operation 1540 .
- the electronic device may apply an ultraviolet blocking ratio corresponding to the identified color to the intensity of ultraviolet light acquired in operation 1540 , thereby calculating the intensity of ultraviolet light after passing through the display having the identified color.
- the electronic device may change the color identified in operation 1515 to a different color having a relatively high ultraviolet blocking ratio.
- the electronic device may generate first frame data having the adjusted color and the identified transparency.
- the transparency of the first frame data may correspond to the transparency identified in operation 1510 .
- the color of the first frame data may correspond to the color identified in operation 1515 .
- when the infrared blocking mode is enabled, the color of the first frame data may correspond to the color adjusted in operation 1530.
- when the ultraviolet blocking mode is enabled, the color of the first frame data may correspond to the color adjusted in operation 1545.
- the electronic device may store the generated first frame data in operation 1560 .
- the first frame data may be stored in a designated area (for example, the first frame buffer 810 in FIG. 8) inside the memory.
- the electronic device may control the multiple pixels of the display based on the generated first frame data.
- the color and/or transparency of the multiple pixels of the display may correspond to the color and/or transparency of the multiple pixels of the first frame data.
- FIG. 16 is a flowchart 1600 illustrating operations of an electronic device according to an embodiment of the disclosure changing the color of content provided to the user according to a change in external light.
- the electronic device in FIG. 16 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the operations in FIG. 16 may be performed by the electronic device 101 in FIGS. 1 to 2 or by the processor 120 in FIGS. 1 to 2.
- the operations in FIG. 16 may be associated with operations in FIG. 3 .
- the electronic device may identify first external light directed to the electronic device.
- the electronic device may identify the first external light based on the luminance sensor, the ultraviolet sensor, the infrared sensor, and/or the image sensor.
- the electronic device may acquire first information regarding the first external light.
- the first information may be acquired based on operation 310 in FIG. 3 , for example.
- the first information may include data regarding the luminance of the first external light.
- the electronic device may display content having a first color associated with the first external light on the display in operation 1620.
- the electronic device may acquire frame data based on a designated color associated with the user and the first external light.
- the electronic device may identify the first color at least partially based on the color of the frame data acquired based on the luminance of the first external light and the designated color.
- the first color may correspond to the complementary color of the designated color.
- the designated color may be associated with one or more parameters associated with the user's color weakness and/or color blindness.
- the display may be disposed in front of both eyes of the user of the electronic device so as to transmit first external light to both eyes of the user.
- the electronic device may display the content on the display such that the content is superimposed on the acquired frame data.
- the content may be acquired from an application executed by the processor, and may be associated with an augmented reality service provided to the user of the electronic device.
- the acquired frame data may correspond to the first frame data in FIG. 3 , and the content may be included in the second frame data in FIG. 3 .
- the electronic device may determine in operation 1630 whether or not second external light distinguished from the first external light has been identified.
- the electronic device may identify second external light distinguished from the first external light.
- the second external light may also be identified based on the luminance sensor, the ultraviolet sensor, the infrared sensor, and/or the image sensor, as in the case of the first external light.
- the color and/or luminance of the second external light may differ from the color and/or luminance of the first external light.
- the electronic device may change the color of the content from the first color to a second color associated with the second external light in operation 1640 .
- the electronic device may not change the color of the content to the second color, but may maintain the first color.
- in response to identification of the second external light, the electronic device may change the color and/or transparency of the frame data based on the designated color and the luminance of the second external light.
- the electronic device may identify the second color based on the color and/or transparency of the changed frame data.
- the second color may correspond to the complementary color of the color of the changed frame data.
- in response to a change in the external light on the periphery of the user (for example, a change from the first external light to the second external light), the electronic device may change the color of content associated with the augmented reality service outputted on the display accordingly. Thus, in spite of the change in the external light passing through the display, the user may clearly identify the content provided by the electronic device.
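- Put together, the reaction to the second external light might resemble the sketch below, in which lux_to_alpha and adjust_color are hypothetical stand-ins for the device's internal policies (for example, the table-620-style lookup and the designated-color adjustment).

```python
def on_external_light_change(new_lux: float, designated_color: tuple,
                             lux_to_alpha, adjust_color):
    """Handle a change from first to second external light: re-derive
    the frame data's alpha from the new luminance, let the device
    adjust the frame color for the new conditions, and move the
    content to the complement of that color so it stays legible."""
    new_alpha = lux_to_alpha(new_lux)
    frame_color = adjust_color(designated_color, new_lux)
    content_color = tuple(255 - c for c in frame_color)
    return frame_color, new_alpha, content_color
```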
- FIG. 17 is a flowchart 1700 illustrating operations of an electronic device according to an embodiment of the disclosure generating first frame data and second frame data based on an image acquired from an image sensor.
- when the electronic device corresponds to a smartphone and/or a smart pad, external light directed to the first surface of the electronic device may not be transmitted to the second surface, which includes a display, which is viewed by the user, and which is opposite to the first surface, but may instead be blocked by the first surface of the electronic device.
- the electronic device may display not only the above-mentioned first frame data and/or second frame data, but also image data acquired from the image sensor, on the display, in order to provide an augmented reality service to the user.
- the electronic device may acquire an image from the image sensor.
- the image sensor may be disposed on the first surface so as to acquire external light directed to the first surface.
- the acquired image may be processed by an application which is currently executed by the electronic device, and which is for the purpose of providing an augmented reality service.
- the electronic device may identify first frame data based on the image acquired in operation 1710 , luminance, user information, or a combination thereof.
- the luminance may be identified from the image acquired from the image sensor and/or data measured by the luminance sensor included in the electronic device.
- the user information refers to information regarding the user of the electronic device, and may include, for example, the user's eyesight, whether or not the user has astigmatism, whether or not the user has myopia, whether or not the user has hypermetropia, color weakness, color blindness, or a combination thereof.
- the first frame data may be identified based on operation 330 in FIG. 3 or the operations in FIGS. 7A to 7B .
- the electronic device may display the image acquired in operation 1710 and the first frame data identified in operation 1720 , on the display.
- the electronic device may display the image and the first frame data on the display so as to overlap.
- the electronic device may display the image and the first frame data such that the first frame data is superimposed on the image.
- the electronic device may determine in operation 1740 whether or not second frame data has been identified from an application.
- the application may correspond to an application for providing an augmented reality service.
- the second frame data may include the result of identifying an external object captured on an image and/or content acquired by accessing a web service (for example, search engine) based on the identified external object.
- the electronic device may change the second frame data based on the first frame data in operation 1750 .
- when no second frame data has been identified from the application, the electronic device may maintain display of the first frame data without the second frame data.
- the electronic device may change the color and/or transparency of at least one of multiple pixels included in the second frame data based on operation 360 in FIG. 3 or the operations in FIG. 9 .
- the color of at least one of multiple pixels included in the second frame data may be changed to the complementary color of the color of multiple pixels included in the first frame data.
- the electronic device may display the changed second frame data by overlaying it on the displayed image and first frame data.
- content included in the second frame data may be displayed on the display to be superimposed on the image and the first frame data.
- the color of the content included in the second frame data corresponds to the complementary color of the color of multiple pixels included in the first frame data, based on operation 1750 , such that the user can clearly recognize the content.
- FIG. 18 is a diagram illustrating a situation in which an electronic device 101 - 1 according to an embodiment of the disclosure provides a user 410 with an augmented reality service.
- the electronic device 101 - 1 in FIG. 18 may correspond to the electronic device 101 in FIGS. 1 to 2 .
- the electronic device 101 - 1 in FIG. 18 may correspond to a smartphone and/or a smart pad.
- the electronic device 101 - 1 in FIG. 18 may provide the user 410 with an augmented reality service based on the operations in FIG. 17 .
- the user 410 of the electronic device 101 - 1 may move the electronic device 101 - 1 such that the camera module and/or the image sensor of the electronic device 101 - 1 face an external object.
- the electronic device 101 - 1 may control multiple pixels included in the display 220 based on the operations in FIG. 17 .
- the electronic device 101 - 1 may display the first frame data and/or the second frame data in FIG. 17 to be superimposed on the image acquired from the image sensor, inside the display 220 .
- the size of the first frame data may correspond to the size of the entire area of the display 220 .
- the color of the image may be shifted by the color of multiple pixels included in the first frame data, and may be attenuated by the transparency and/or alpha value of multiple pixels included in the first frame data.
- the second frame data includes content associated with an external object such that, as the electronic device 101 - 1 displays second frame data on the display to be superimposed on the image and the first frame data, the user 410 can view the content together with the image having a color shifted based on the first frame data.
- the electronic device 101 - 1 may adjust the color and/or transparency of the display based on various types of outdoor external light.
- the electronic device 101 - 1 may adjust the color and/or transparency of the display based on information regarding both eyes of the user 410 .
- the electronic device 101 - 1 may determine the color of content associated with the augmented reality to be provided to the user 410, based on the adjusted color and/or transparency of the display. As multiple pixels of the display are adjusted based on information regarding the user 410 and the external environment, the user 410 can view the external object and the content displayed inside the display 220 more clearly.
- the electronic device 101 - 1 may transmit information regarding the adjusted color and/or transparency of the display to an external electronic device (for example, external electronic device 230 in FIG. 2 ) or may share the information therewith.
- information regarding the adjusted color and/or transparency of the display may be shared between the multiple electronic devices.
- a computer-readable storage medium for storing one or more programs (software modules) may be provided.
- the one or more programs stored in the computer-readable storage medium may be configured for execution by one or more processors within the electronic device.
- the at least one program may include instructions that cause the electronic device to perform the methods according to various embodiments of the disclosure as defined by the appended claims and/or disclosed herein.
- the programs may be stored in a memory including a random access memory, a non-volatile memory such as a flash memory, a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a magnetic disc storage device, a Compact Disc-ROM (CD-ROM), Digital Versatile Discs (DVDs) or other types of optical storage devices, or a magnetic cassette.
- any combination of some or all of the above may form a memory in which the program is stored. Further, a plurality of such memories may be included in the electronic device.
- the programs may be stored in an attachable storage device which may access the electronic device through communication networks such as the Internet, an intranet, a Local Area Network (LAN), a Wireless LAN (WLAN), and a Storage Area Network (SAN), or a combination thereof.
- a storage device may access the electronic device via an external port.
- a separate storage device on the communication network may access a portable electronic device.
- a component included in the disclosure is expressed in the singular or the plural according to the presented embodiment.
- the singular or plural form is selected for convenience of description suitable for the presented situation, and various embodiments of the disclosure are not limited to a single element or multiple elements thereof. Further, multiple elements expressed in the description may be configured into a single element, or a single element in the description may be configured into multiple elements.
- An electronic device may change the color and/or transparency of a display based on external light such that the user can clearly recognize content associated with augmented reality.
- An electronic device may change the color and/or transparency of a display based on information regarding both of the user's eyes (for example, information regarding color weakness or color blindness) such that the user can clearly recognize not only content associated with augmented reality but also external light.
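As a purely hypothetical illustration of the second point, a device could map a stored color-vision record to a uniform display tint. Neither the deficiency categories nor the tint values below come from the disclosure; they are invented for explanation.

```python
# Hypothetical mapping from stored color-vision information to a uniform
# RGBA tint for the display's pixels; categories and values are invented
# for illustration and are not taken from the patent.

TINT_BY_DEFICIENCY = {
    "red_weak":   (0, 200, 255, 64),   # lean toward cyan
    "green_weak": (255, 0, 200, 64),   # lean toward magenta
    "blue_weak":  (255, 220, 0, 64),   # lean toward yellow
}

def display_tint(color_vision_info):
    """Pick a uniform color and transparency for the display's pixels from
    the user's color-vision record; fully transparent if none is recorded."""
    return TINT_BY_DEFICIENCY.get(color_vision_info, (0, 0, 0, 0))

print(display_tint("green_weak"))  # -> (255, 0, 200, 64)
```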
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Ophthalmology & Optometry (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Controls And Circuits For Display Device (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020190000636A KR102629149B1 (ko) | 2019-01-03 | 2019-01-03 | Electronic device and method for changing characteristics of display according to external light |
KR10-2019-000636 | 2019-01-03 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200219431A1 US20200219431A1 (en) | 2020-07-09 |
US10971053B2 true US10971053B2 (en) | 2021-04-06 |
Family
ID=71404482
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/733,768 Active US10971053B2 (en) | 2019-01-03 | 2020-01-03 | Electronic device for changing characteristics of display according to external light and method therefor |
Country Status (5)
Country | Link |
---|---|
- US (1) | US10971053B2 (en) |
- EP (1) | EP3881311A4 (en) |
- KR (1) | KR102629149B1 (ko) |
- CN (1) | CN113272888B (zh) |
- WO (1) | WO2020141945A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- KR20220016713A (ko) | 2020-08-03 | 2022-02-10 | Samsung Electronics Co., Ltd. | Method and apparatus for measuring illuminance |
- TWI784563B (zh) * | 2021-06-09 | 2022-11-21 | Acer Inc. | Display color calibration method and electronic device |
US11790816B1 (en) * | 2022-09-07 | 2023-10-17 | Qualcomm Incorporated | Sensor integrated circuit (IC) with opposite facing ambient light sensor and proximity sensor, and related electronic devices and fabrication methods |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8976085B2 (en) | 2012-01-19 | 2015-03-10 | Google Inc. | Wearable device with input and output structures |
US20160270656A1 (en) * | 2015-03-16 | 2016-09-22 | Magic Leap, Inc. | Methods and systems for diagnosing and treating health ailments |
US9529197B2 (en) | 2012-03-21 | 2016-12-27 | Google Inc. | Wearable device with input and output structures |
- JP2017523481A (ja) | 2014-05-28 | 2017-08-17 | Inoptec Limited, Zweigniederlassung Deutschland | Electronic glasses |
- KR20170115367A (ko) | 2016-04-07 | 2017-10-17 | LG Electronics Inc. | Smart glass device |
US20170323482A1 (en) * | 2016-05-05 | 2017-11-09 | Universal City Studios Llc | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
- KR101885473B1 (ko) | 2017-05-10 | 2018-08-03 | Dongguk University Industry-Academic Cooperation Foundation | Smart glasses for assisting visually impaired persons |
- KR20180105401A (ko) | 2017-03-15 | 2018-09-28 | Samsung Electronics Co., Ltd. | Transparent display apparatus and display method thereof |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP3370225B2 (ja) * | 1995-12-20 | 2003-01-27 | Sharp Corp. | Information processing device |
US20050223343A1 (en) * | 2004-03-31 | 2005-10-06 | Travis Amy D | Cursor controlled shared display area |
- JP4556523B2 (ja) * | 2004-07-16 | 2010-10-06 | Sony Corp. | Video signal processing device and video signal processing method |
WO2012011893A1 (en) | 2010-07-20 | 2012-01-26 | Empire Technology Development Llc | Augmented reality proximity sensing |
- KR102256992B1 (ko) | 2013-04-25 | 2021-05-27 | Essilor International | Method for controlling a head-mounted electro-optical device adapted to a wearer |
- IL236243A (en) * | 2014-12-14 | 2016-08-31 | Elbit Systems Ltd | Visual enhancement of displayed color icons |
- JP6433850B2 (ja) * | 2015-05-13 | 2018-12-05 | Sony Interactive Entertainment Inc. | Head-mounted display, information processing device, information processing system, and content data output method |
- KR102565847B1 (ko) * | 2015-07-06 | 2023-08-10 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling display in electronic device |
WO2018119276A1 (en) | 2016-12-22 | 2018-06-28 | Magic Leap, Inc. | Systems and methods for manipulating light from ambient light sources |
2019
- 2019-01-03 KR KR1020190000636A patent/KR102629149B1/ko active IP Right Grant
2020
- 2020-01-03 WO PCT/KR2020/000141 patent/WO2020141945A1/en unknown
- 2020-01-03 CN CN202080007896.XA patent/CN113272888B/zh active Active
- 2020-01-03 EP EP20736173.4A patent/EP3881311A4/en active Pending
- 2020-01-03 US US16/733,768 patent/US10971053B2/en active Active
Non-Patent Citations (1)
Title |
---|
International Search Report dated Apr. 28, 2020, issued in an International Application No. PCT/KR2020/000141. |
Also Published As
Publication number | Publication date |
---|---|
EP3881311A1 (en) | 2021-09-22 |
KR102629149B1 (ko) | 2024-01-26 |
EP3881311A4 (en) | 2022-02-09 |
CN113272888A (zh) | 2021-08-17 |
CN113272888B (zh) | 2024-06-14 |
WO2020141945A1 (en) | 2020-07-09 |
KR20200084574A (ko) | 2020-07-13 |
US20200219431A1 (en) | 2020-07-09 |
Similar Documents
Publication | Title |
---|---|
- US10672333B2 (en) | Wearable electronic device |
- EP2891966B1 (en) | Electronic glasses and method for correcting color blindness |
- US20180077409A1 (en) | Method, storage medium, and electronic device for displaying images |
- US20200168177A1 (en) | Electronic device, augmented reality device for providing augmented reality service, and method of operating same |
- US10971053B2 (en) | Electronic device for changing characteristics of display according to external light and method therefor |
- US11204645B2 (en) | Method for providing haptic feedback, and electronic device for performing same |
- KR20160059276A (ko) | Wearable electronic device |
- WO2022134632A1 (zh) | Work processing method and apparatus |
- US12063346B2 (en) | Electronic device for displaying content and method of operation thereof |
- KR102536146B1 (ko) | Display device, electronic device including same, and method for operating same |
- US11294452B2 (en) | Electronic device and method for providing content based on the motion of the user |
- EP4235363A1 (en) | Electronic device comprising flexible display, operating method therefor, and storage medium |
- CN108604367B (zh) | Display method and handheld electronic device |
- US12033382B2 (en) | Electronic device and method for representing contents based on gaze dwell time |
- EP4398593A1 (en) | Method and device for obtaining image of object |
- EP4339684A1 (en) | Method and device for controlling brightness of AR image |
- EP3834404B1 (en) | A server for providing multiple services respectively corresponding to multiple external objects included in image |
- KR102535918B1 (ko) | Method and wearable device for adjusting overdriving information of a display based on user motion information |
- US20240071021A1 (en) | Head-mounted electronic device for converting a screen of an electronic device into extended reality and electronic device connected thereto |
- US11335240B2 (en) | Electronic device including display driving circuit for displaying corrected time information on basis of temperature information |
- KR20220122328A (ko) | Electronic device and control method therefor |
- KR20230044833A (ko) | Electronic device and method for displaying content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, SUNGJUN;MUN, BYEONGJUN;AHN, JINYOUNG;AND OTHERS;SIGNING DATES FROM 20191226 TO 20191227;REEL/FRAME:051411/0227 |
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |