WO2021194119A1 - Electronic device and method for sensing illuminance of an electronic device - Google Patents

Electronic device and method for sensing illuminance of an electronic device

Info

Publication number
WO2021194119A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
illuminance
sensing
electronic device
brightness
Prior art date
Application number
PCT/KR2021/002656
Other languages
English (en)
Korean (ko)
Inventor
김일영
허재영
김상섭
김연수
김영재
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2021194119A1

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10Intensity circuits
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/14Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light

Definitions

  • This document relates to an electronic device, and more particularly, to an electronic device capable of sensing external illuminance using an illuminance sensor, and an illuminance sensing method thereof.
  • the illuminance sensor used in the electronic device may be used to adjust the brightness of the display of the electronic device by sensing ambient brightness.
  • The illuminance sensor may be used by being mounted in an inactive area of the display.
  • the electronic device may mount an illuminance sensor in an active area of the display to detect external illuminance based on a display off section and color on pixel ratio (COPR) information.
  • The frame rate of the display is increased to satisfy such demands.
  • For example, the scan rate is increasing from about 60 Hz to about 120 Hz.
  • Accordingly, the duration of the display off period is reduced, and the sensing accuracy of an illuminance sensor mounted in the active region of the display may be degraded.
  • An electronic device according to an embodiment may include a display, an illuminance sensor disposed under the display and configured to sense ambient illuminance of the electronic device according to a predetermined period, and a processor operatively connected to the display and the illuminance sensor. The processor may be set to identify a sensing area including a position of the display corresponding to the position of the illuminance sensor, generate image data to be output through the display, identify a frame, among a plurality of frames of the image data, that is output through the display at a time point overlapping at least a part of the sensing timing of the illuminance sensor, and adjust, through the image data, the brightness of pixels corresponding to the sensing area in the identified frame.
  • An illuminance sensing method of an electronic device according to an embodiment may include identifying a sensing region including a position of a display corresponding to a position of an illuminance sensor, generating image data to be output through the display, identifying a frame, among a plurality of frames of the image data, that is output through the display at a time point overlapping at least a part of the sensing timing of the illuminance sensor, and adjusting, through the image data, the brightness of pixels corresponding to the sensing region in the identified frame.
  • According to various embodiments, the illuminance sensing time can be secured by reducing the brightness of the sensing region under the control of the image data.
  • FIG. 1 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present invention.
  • FIG. 2A is a perspective view of a front surface of an electronic device according to an embodiment of the present invention.
  • FIG. 2B is a perspective view of a rear surface of an electronic device according to an embodiment of the present invention.
  • FIG. 2C is a view showing a state of an illuminance sensor mounted under a display panel according to an embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a main configuration of an electronic device according to an embodiment of the present invention.
  • FIG. 4 is a view for explaining an operating condition of an illuminance sensor according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating operating frequencies of a display and an illuminance sensor according to an embodiment of the present invention.
  • FIG. 6A is a diagram illustrating a state in which image data is applied to a sensing area according to an embodiment of the present invention.
  • FIG. 6B is a diagram illustrating a state in which image data is applied to a predetermined area including a sensing area according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating operating frequencies of a display and an illuminance sensor by periodic illuminance sensing according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating an illuminance sensing method according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a detailed illuminance sensing method according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments.
  • Referring to FIG. 1, in the network environment 100, the electronic device 101 may communicate with an electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with an electronic device 104 or a server 108 through a second network 199 (e.g., a long-distance wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197. In some embodiments, at least one of these components (e.g., the display device 160 or the camera module 180) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be implemented as one integrated circuit. For example, the sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while being embedded in the display device 160 (e.g., a display).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load commands or data received from other components (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • The processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can be operated independently of, or in conjunction with, the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be configured to use less power than the main processor 121 or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from, or as a part of, the main processor 121.
  • The auxiliary processor 123 may control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190), for example, on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176 ) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input device 150 may receive a command or data to be used by a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • the sound output device 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display device 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • The display device 160 may include a touch circuitry configured to sense a touch, or a sensor circuit (e.g., a pressure sensor) configured to measure the intensity of a force generated by a touch.
  • The audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input device 150, or may output a sound through the sound output device 155 or an external electronic device 102 (e.g., a speaker or headphones) connected directly or wirelessly with the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more specified protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • The power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • The communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, a corresponding communication module may communicate with an external electronic device via the first network 198 (e.g., a short-range communication network such as Bluetooth, Wi-Fi Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • These various types of communication modules may be integrated into one component (eg, a single chip) or may be implemented as a plurality of components (eg, multiple chips) separate from each other.
  • The wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module may include one antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • The antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, components other than the radiator (e.g., an RFIC) may be additionally formed as a part of the antenna module 197.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the electronic devices 102 and 104 may be a device of the same type as, or a different type from, the electronic device 101.
  • all or part of the operations performed by the electronic device 101 may be executed by one or more of the external electronic devices 102 , 104 , or 108 .
  • According to an embodiment, when the electronic device 101 needs to perform a certain function or service, instead of executing the function or service itself, or in addition to doing so, the electronic device 101 may request one or more external electronic devices to perform at least a part of the function or service.
  • the one or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • The electronic device 101 may provide the result, as it is or after additional processing, as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • the electronic device may have various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • Terms such as "first" and "second" may simply be used to distinguish a component from other components in question, and do not limit the components in other aspects (e.g., importance or order). When one (e.g., a first) component is referred to as being "coupled" or "connected" to another (e.g., a second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • A module may be an integrally formed part, or a minimum unit or a part thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software including one or more instructions stored in a storage medium (e.g., the internal memory 136 or the external memory 138) readable by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine (e.g., the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case in which data is semi-permanently stored in the storage medium and a case in which data is temporarily stored therein.
  • The method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • The computer program product may be distributed in the form of a device-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • In the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of an application store, or a relay server.
  • each component eg, a module or a program of the above-described components may include a singular or a plurality of entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into one component.
  • In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
  • According to various embodiments, operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 2A is a perspective view of a front side of an electronic device 200 according to various embodiments of the present disclosure.
  • FIG. 2B is a perspective view of a rear surface of the electronic device 200 of FIG. 2A according to various embodiments of the present disclosure.
  • Referring to FIGS. 2A and 2B, the electronic device 200 may include a housing 210 including a first surface (or front surface) 210A, a second surface (or rear surface) 210B, and a side surface 210C surrounding the space between the first surface 210A and the second surface 210B.
  • In another embodiment, the housing may refer to a structure that forms a part of the first surface 210A, the second surface 210B, and the side surface 210C of FIG. 2A.
  • the first surface 210A may be formed by a front plate 202 (eg, a glass plate including various coating layers, or a polymer plate) that is at least partially transparent.
  • the second surface 210B may be formed by a substantially opaque back plate 211 .
  • the back plate 211 is formed by, for example, coated or colored glass, ceramic, polymer, metal (eg, aluminum, stainless steel (STS), or magnesium), or a combination of at least two of the above materials.
  • the side surface 210C is coupled to the front plate 202 and the rear plate 211 and may be formed by a side bezel structure 218 (or “side member”) including a metal and/or a polymer.
  • the back plate 211 and the side bezel structure 218 are integrally formed and may include the same material (eg, a metal material such as aluminum).
  • According to an embodiment, the front plate 202 may include, at both ends of its long edge, a first area 210D that extends seamlessly by bending from the first surface 210A toward the rear plate.
  • Similarly, the rear plate 211 may include, at both ends of its long edge, a second area 210E that extends seamlessly by bending from the second surface 210B toward the front plate.
  • the front plate 202 or the back plate 211 may include only one of the first region 210D or the second region 210E.
  • the front plate 202 may not include the first region and the second region, but may include only a flat plane disposed parallel to the second surface 210B.
  • When viewed from the side of the electronic device, the side bezel structure 218 may have a first thickness (or width) at the side surface that does not include the first area 210D or the second area 210E, and may have a second thickness, thinner than the first thickness, at the side surface that includes the first area 210D or the second area 210E.
  • According to an embodiment, the electronic device 200 may include at least one of a display 201, an input device 203, sound output devices 207 and 214, sensor modules 204 and 219, camera devices 205, 212, and 213, a key input device 217, an indicator (not shown), and connectors 208 and 209.
  • the electronic device 200 may omit at least one of the components (eg, the key input device 217 or an indicator) or additionally include other components.
  • the display 201 can be seen through, for example, a top portion of the front plate 202 . In some embodiments, at least a portion of the display 201 may be viewed through the front plate 202 forming the first area 210D of the first surface 210A and the side surface 210C.
  • the display 201 may be coupled to or disposed adjacent to a touch sensing circuit, a pressure sensor capable of measuring the intensity (pressure) of a touch, and/or a digitizer that detects a magnetic field type stylus pen.
  • In some embodiments, at least a portion of the sensor modules 204 and 219 and/or at least a portion of the key input device 217 may be disposed in the first area 210D and/or the second area 210E.
  • the input device 203 may include a microphone 203 .
  • the input device 203 may include a plurality of microphones 203 arranged to sense the direction of the sound.
  • the sound output devices 207 and 214 may include speakers 207 and 214 .
  • the speakers 207 and 214 may include an external speaker 207 and a receiver 214 for a call.
  • The microphone 203, the speakers 207 and 214, and the connectors 208 and 209 may be disposed in the internal space of the electronic device 200 and exposed to the external environment through at least one hole formed in the housing 210.
  • the hole formed in the housing 210 may be used in common for the microphone 203 and the speakers 207 and 214 .
  • the sound output devices 207 and 214 may include a speaker (eg, a piezo speaker) that operates while excluding a hole formed in the housing 210 .
  • the sensor modules 204 and 219 may generate electrical signals or data values corresponding to an internal operating state of the electronic device 200 or an external environmental state.
  • The sensor modules 204 and 219 may include, for example, a first sensor module 204 (e.g., a proximity sensor) and/or a second sensor module (not shown) (e.g., a fingerprint sensor) disposed on the first surface 210A of the housing 210, and/or a third sensor module 219 (e.g., an HRM sensor) disposed on the second surface 210B of the housing 210.
  • the fingerprint sensor may be disposed on the first surface 210A (eg, the home key button 215 ) of the housing 210 , a partial region of the second surface 210B, or under the display 201 .
  • The electronic device 200 may further include at least one sensor module (not shown), for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor 204.
  • The camera devices 205, 212, and 213 may include a first camera device 205 disposed on the first surface 210A of the electronic device 200, a second camera device 212 disposed on the second surface 210B, and/or a flash 213.
  • the camera devices 205 and 212 may include one or more lenses, an image sensor, and/or an image signal processor.
  • the flash 213 may include, for example, a light emitting diode or a xenon lamp. In some embodiments, two or more lenses (a wide-angle lens, an ultra-wide-angle lens, or a telephoto lens) and image sensors may be disposed on one side of the electronic device 200 .
  • the key input device 217 may be disposed on the side surface 210C of the housing 210 .
  • In some embodiments, the electronic device 200 may not include some or all of the above-mentioned key input devices 217, and a key input device 217 that is not included may be implemented in another form, such as a soft key displayed on the display 201.
  • the key input device 217 may be implemented using a pressure sensor included in the display 201 .
  • the indicator may be disposed, for example, on the first surface 210A of the housing 210 .
  • the indicator may provide, for example, state information of the electronic device 200 in the form of light.
  • the light emitting element may provide, for example, a light source that is linked to the operation of the camera device 205 .
  • Indicators may include, for example, LEDs, IR LEDs and xenon lamps.
  • The connector holes 208 and 209 may include a first connector hole 208 capable of receiving a connector (e.g., a USB connector) for transmitting and receiving power and/or data to and from an external electronic device, and/or a second connector hole (or earphone jack) 209 capable of receiving a connector for transmitting and receiving an audio signal to and from an external electronic device.
  • Some camera devices 205 of the camera devices 205 and 212 , some sensor modules 204 of the sensor modules 204 and 219 , or an indicator may be arranged to be visible through the display 201 .
  • Some sensor modules 204 may be arranged to perform their functions without being visually exposed through the front plate 202 in the internal space of the electronic device.
  • FIG. 2C is a diagram illustrating an illuminance sensor mounted under a panel of a display and a sensing area corresponding thereto. According to an embodiment, FIG. 2C may be a cross-sectional view taken along line A-A' of FIG. 2A .
  • the electronic device may include a display and an illuminance sensor.
  • the illuminance sensor 20 may be of an under panel type, that is, the illuminance sensor 20 may be disposed under (or on the back side) of the display 201 .
  • The illuminance sensor 20 may be located below the display 201, which includes the front cover 22 of the display, the display panel 24, and the auxiliary material layer 26.
  • An area 240 on the display corresponding to the area in which the illuminance sensor 20 is located exists, and the area on the display required for illuminance sensing may form an area 250 that is wider than the corresponding area 240 by a certain range.
  • the illuminance sensor 20 may be disposed in or below an area in which at least a portion of the auxiliary material layer 26 is removed.
  • The auxiliary material layer 26 may be made of an opaque material through which light cannot pass, so that external light may enter the illuminance sensor 20 through the area from which the auxiliary material layer is removed.
  • The auxiliary material layer 26 may include a light blocking layer (not shown) for blocking light generated by the display panel 24 or light incident from the outside, and the illuminance sensor 20 may be disposed in an area that does not include the light blocking layer.
  • According to an embodiment, the sensing area 220 may be about 51 pixels × 51 pixels, but is not necessarily limited thereto, and any area 250 including the area 240 on the display corresponding to the area where the illuminance sensor 20 is located may be applicable.
  • The illuminance sensor 20 may be mounted in an active area of the display 201 and may detect external illuminance based on an off section of the display and color on pixel ratio (COPR) information. As described above, the sensing area 250 may correspond to the area 250 including the area 240 on the display corresponding to the area where the illuminance sensor 20 is located.
  • FIG. 3 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • the electronic device may include a display 310 , an illuminance sensor 20 , a processor 330 , and a memory 320 , and even if at least some of the illustrated components are omitted or replaced, various embodiments can be implemented.
  • the electronic device may further include at least some of the configuration and/or functions of the electronic device 101 of FIG. 1 .
  • the display 310 may display an image based on image data input from the processor 330 .
  • The display 310 may be implemented as a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a micro electro mechanical systems (MEMS) display, or an electronic paper display, but is not limited thereto.
  • the display 310 has a structure in which an illuminance sensor can be mounted under a panel of the display.
  • The display 310 includes a display panel 24 and an auxiliary material layer 26; at least a portion of the auxiliary material layer 26, which is made of an opaque material, may be removed, and the illuminance sensor may be disposed below the corresponding area.
  • The auxiliary material layer 26 may include a light blocking layer (not shown) for blocking light generated by the display panel 24 or light incident from the outside, and the illuminance sensor 20 may be disposed in an area that does not include the light blocking layer.
  • the display 310 may control the operating frequency according to the type of the application being displayed or whether illuminance is sensed.
  • the illuminance sensor 20 is a sensor capable of measuring illuminance when mounted under the display panel.
  • the illuminance sensor 20 is composed of a photodiode, and can sense ambient brightness by sensing a change in current according to the amount of light entering through the photodiode.
  • the illuminance sensor 20 may sense external illuminance according to a predetermined period.
  • the memory 320 may temporarily or permanently store various data, including volatile memory and non-volatile memory.
  • the memory 320 may store various instructions that may be executed by the processor 330 . Such instructions may include control commands such as arithmetic and logical operations, data movement, and input/output that can be recognized by the processor 330 .
  • the memory 320 may store the program 140 of FIG. 1 .
  • the memory 320 may store information of the illuminance sensor 20 and/or information of the display 310 .
  • the information of the illuminance sensor may include information such as a minimum operating time of the illuminance sensor for sensing illuminance and a minimum time for the illuminance sensor not to be affected by the brightness of the display when sensing the illuminance.
  • the information of the display may include information such as the brightness of the sensing area, the type of application being displayed, and the operation type.
  • The memory 320 may store information on the position, on the display, of the sensing region 250, that is, the region 250 including the region 240 on the display corresponding to the region where the illuminance sensor 20 is located.
  • In addition, a table for controlling the pixel operation of the display (e.g., determination criteria such as thresholds, control condition data, etc.), data enabling the processor 330 to recognize an illuminance sensing value, and the like may be stored.
  • the processor 330 is a configuration capable of performing an operation or data processing related to control and/or communication of each component of the electronic device, and may include one or more processors.
  • the processor 330 may be electrically, functionally, and/or operatively connected to each component of the electronic device, such as the display 310 , the illuminance sensor 20 , and the memory 320 .
  • the processor 330 may include at least some of the configuration and/or functions of the processor 120 of FIG. 1 .
  • the processor 330 may perform a function of controlling and recognizing the on/off time of the display 310 and adjusting the brightness of the display 310 according to the ambient brightness detected by the sensor.
  • The processor 330 may also control the illuminance measurement timing of the illuminance sensor 20 and the operating condition of the display 310, and may perform a function of recognizing the measured value of the illuminance sensor 20.
  • the on/off time of the display 201 may be related to a screen refresh rate (frame rate).
  • The screen refresh rate, also referred to as a scan rate or a screen refresh frequency, refers to the number of times the display 201 displays a frame on the screen during one second.
  • Here, turning on the display 201 may correspond to a state in which a frame is displayed on the screen, and turning off the display 201 may correspond to a state in which a frame is not displayed on the screen.
  • According to various embodiments, the processor 330 may perform a function of identifying the sensing region 250 including the position 240 on the display corresponding to the position of the illuminance sensor 20, a function of generating image data to be output through the display 310, a function of identifying a frame, among a plurality of frames of the image data, that is output through the display 310 at a time point overlapping at least a part of the sensing timing of the illuminance sensor 20, and a function of adjusting, through the image data, the brightness of pixels corresponding to the sensing region 250 in the identified frame.
  • The generation of the image data to be output through the display 310 may be performed when the display 310 operates at a specified refresh rate or higher; after the sensing area is controlled, the illuminance around the electronic device may be measured, and the brightness of the display may be adjusted based on the measured illuminance.
  • the processor 330 may identify the sensing region 250 including the position 240 on the display corresponding to the position of the illuminance sensor 20 .
  • The illuminance sensor 20 may be located below the panel portion consisting of the front cover 22 of the display, the display panel 24, and the auxiliary material layer 26.
  • The auxiliary material layer 26 may be made of an opaque material through which light cannot pass, so that external light may enter the illuminance sensor 20 through the area from which the auxiliary material layer is removed.
  • An area 240 on the display corresponding to the area in which the illuminance sensor is located exists, and the area on the display required for illuminance sensing may require an area 250 that is larger than the corresponding area 240.
  • the sensing area 250 may be about 51 pixels x 51 pixels, but is not limited thereto.
  • the processor 330 may generate image data to be output through the display 310 .
  • the processor 330 may apply the image data generated by the processor 330 to pixels of a certain area on the display 310 , and when the pixel value of the image data is reduced, the color of the pixel becomes dark and the brightness of the frame can be reduced.
  • the pixel value of the image data may be 0, and the color of the corresponding pixel may be black.
  • the processor 330 may identify a frame output through the display 310 at a time point overlapping with at least a part of the sensing timing of the illuminance sensor 20 among a plurality of frames of image data.
  • the processor may check the frame to which the image data is to be applied and apply the generated image data to the pixels of the sensing area.
  • the processor 330 may adjust the brightness of a pixel corresponding to the sensing region 250 in the checked frame through image data.
  • the brightness of the pixel adjusted by the processor 330 may be 0, and the scan rate of the sensing region 250 of the display 310 may change.
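  • The four processor functions above can be pictured as a small control helper. The following Kotlin sketch is purely illustrative: the names (SensingRegion, IlluminanceController, frameOverlapsSensing, dimSensingRegion) and the flat IntArray frame representation are assumptions made for the example, not part of the disclosed device.

```kotlin
// Hypothetical sketch of the processor functions described above; all names are illustrative.
data class SensingRegion(val x: Int, val y: Int, val width: Int, val height: Int)

class IlluminanceController(private val region: SensingRegion) {

    /** Checks whether a frame's display interval overlaps the sensor's next sensing start time. */
    fun frameOverlapsSensing(frameStartMs: Long, frameDurationMs: Long, nextSensingStartMs: Long): Boolean {
        val frameEnd = frameStartMs + frameDurationMs
        return nextSensingStartMs in (frameStartMs until frameEnd)
    }

    /** Lowers (or zeroes) the pixel values of the sensing region in one frame of image data. */
    fun dimSensingRegion(frame: IntArray, frameWidth: Int, level: Int = 0) {
        for (dy in 0 until region.height) {
            for (dx in 0 until region.width) {
                frame[(region.y + dy) * frameWidth + (region.x + dx)] = level
            }
        }
    }
}
```

  • For example, the roughly 51 × 51 pixel region mentioned above would be passed in as the SensingRegion, and dimSensingRegion would be called only for frames for which frameOverlapsSensing returns true.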
  • FIG. 4 is a view for explaining an operating condition of an illuminance sensor.
  • The display may turn each pixel constituting the display on and off according to the refresh rate. For example, when the scan rate is about 60 Hz, current flows during a predetermined portion of each 1/60-second period to turn on each pixel, and the pixel is turned off for the rest of the time.
  • the illuminance sensor may perform a sensing operation in a section except for the area 410 in which the display is turned on. In the display-on section 410 , it is difficult to accurately measure the ambient illuminance of the electronic device due to the influence of the display brightness.
  • a predetermined time 420 must be secured for the operation of the illuminance sensor, and a predetermined time 430 must be secured so that the sensor is not affected by the brightness of the display when sensing the illuminance.
  • As described above, the minimum operating time of the illuminance sensor for sensing illuminance and the minimum time during which the illuminance sensor must not be affected by the brightness of the display are included in the information of the illuminance sensor 20 and stored in the memory 320, and the processor 330 may use them to control the sensing region 250.
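  • As a rough illustration of this operating condition, the display-off window available in each frame can be compared against the two stored times. The sketch below is an assumption-laden example: the function name and the per-frame formulation of the condition are mine, and the actual stored values are device-specific.

```kotlin
// Hypothetical check of the illuminance-sensor operating condition; values are illustrative only.
fun sensingConditionSatisfied(
    refreshRateHz: Double,   // e.g. 60.0 or 120.0
    onTimeMs: Double,        // time the sensing-region pixels are lit within one frame period
    minSensingMs: Double,    // minimum operating time of the sensor (420 in FIG. 4)
    guardMs: Double          // minimum time free of display influence (430 in FIG. 4)
): Boolean {
    val framePeriodMs = 1000.0 / refreshRateHz   // ~16.7 ms at 60 Hz, ~8.3 ms at 120 Hz
    val offWindowMs = framePeriodMs - onTimeMs
    return offWindowMs >= minSensingMs + guardMs
}
```

  • Because the frame period at about 120 Hz is roughly half of that at about 60 Hz, settings that satisfy the condition at 60 Hz can fail at 120 Hz, which motivates the sensing-region control described with reference to FIG. 5.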
  • FIG. 5 is a diagram of the operating frequencies of a display and an illuminance sensor according to an embodiment, for explaining why the sensing region is controlled for the illuminance sensor operation.
  • In FIG. 5, the upper part 510 shows the case in which the display operates at about 60 Hz, the middle part 520 shows the operating frequency of the illuminance sensor, and the lower part 530 shows the case in which the display operates at about 120 Hz.
  • the refresh rate of the display and the operating frequency of the illuminance sensor described below are not limited to specific values.
  • When the display operates at about 60 Hz (510), the minimum operating time of the illuminance sensor and the minimum time during which the illuminance sensor is not affected by the brightness of the display are secured; however, when the display operates at about 120 Hz (530), the operating condition of the illuminance sensor is not satisfied.
  • Even when the entire display operates at about 120 Hz, if the sensing region is controlled to operate at about 60 Hz, the operating condition of the illuminance sensor is satisfied, and the ambient brightness of the electronic device can be measured as accurately as in the case where the scan rate is low.
  • FIG. 6A is a diagram illustrating a state in which the scan rate of a sensing region is controlled through image data according to an embodiment of the present invention.
  • The processor may identify a frame, among a plurality of frames of image data, that is output through the display at a time point overlapping at least a part of the sensing timing of the illuminance sensor, and may adjust, through the image data, the brightness of pixels corresponding to the sensing region in the identified frame.
  • The image data displayed on the display consists of a plurality of frames, and each frame may be classified as an odd frame (50a, 50b, 50c, 50d) or an even frame (52a, 52b, 52c, 52d).
  • For example, when the display operates at a refresh rate of about 120 Hz, 60 alternating frames among the 120 frames displayed per second may be defined as odd frames, and the remaining 60 frames may be defined as even frames.
  • If the odd frames are output as they are for a specific region (e.g., the sensing region) while the brightness of the even frames 52a, 52b, 52c, and 52d is output as 0 (e.g., the pixel value of the sensing region is (0, 0, 0)), the pixels of the sensing area are turned on only in the odd frames, so the display may be turned on at a cycle of 1/60 second, as in the case where the display operates at about 60 Hz.
  • the processor may adjust the brightness of a pixel corresponding to the sensing region in an even frame.
  • When the pixel value of the image data is decreased, the color of the pixel may be darkened and the brightness of the frame may be decreased.
  • the processor may set the pixel value of the sensing region to 0 in the even-numbered frames of the image data. Accordingly, the color of the pixel of the sensing region may be black.
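  • A sketch of this even-frame control, reusing the hypothetical IlluminanceController from the earlier example, might look as follows; treating even frame indices as the blanked frames is an assumption made for illustration.

```kotlin
// Hypothetical even-frame blanking of the sensing region, building on dimSensingRegion() above.
fun applyFramePattern(controller: IlluminanceController, frame: IntArray, frameWidth: Int, frameIndex: Long) {
    // Odd frames are output as-is; even frames get pixel value 0 in the sensing region,
    // so the region is lit only every other frame (effectively ~60 Hz on a ~120 Hz panel).
    if (frameIndex % 2L == 0L) {
        controller.dimSensingRegion(frame, frameWidth, level = 0)
    }
}
```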
  • In other words, the sensing region can be controlled by distinguishing the display operation 610, in which the image data is not applied (odd frames), from the display operation 620, in which the image data is applied (even frames).
  • In this way, the brightness of the pixels corresponding to the sensing region may be adjusted through the image data.
  • the brightness of the adjusted pixel may be 0, and in this case, an effect may be obtained such that the display operates at a low refresh rate (eg, about 60 Hz).
  • the brightness of the adjusted pixel may be lowered only enough to satisfy the operating condition of the illuminance sensor.
  • For example, the adjustment may be made by decreasing the pixel value of the image data, and the value may be close to 0 rather than exactly 0.
  • According to an embodiment, the processor may apply, to the odd frames 50a, 50b, 50c, and 50d, a brighter pixel value within the color series of the screen originally displayed in the sensing area.
  • That is, pixel values may also be applied to the odd frames 50a, 50b, 50c, and 50d, and the pixel values of the image data to be applied may be increased.
  • The description above, referring to the operating frequencies of the display and the illuminance sensor in FIG. 5, is given for convenience of explanation, and a method of securing the sensing timing of the illuminance sensor through control divided into odd and even frames has been described; however, various combinations of frame control may be used to secure the sensing timing.
  • For example, the odd/even frame control described above corresponds to control of an on-off-on-off pattern; according to an embodiment, as frame control for satisfying the operating condition of the illuminance sensor, control of an on-on-off-off pattern or an on-off-off-on pattern may also be performed.
  • FIG. 6B is a diagram illustrating a state in which image data is applied to a predetermined area including a sensing area according to an embodiment of the present invention.
  • image data may be applied to a certain region 640 including the sensing region 630 , and application of such image data may function as a user interface (UI) including design elements.
  • For example, the sensing region may be included in an icon region of the active area of the display in which the remaining battery level, Wi-Fi status, time, and the like are displayed, and by reducing the pixel values of the image data applied to the icon region, it is possible to satisfy the operating condition of the illuminance sensor while the region still functions as a UI including dark-series colors and design elements.
  • FIG. 7 is a diagram illustrating periodic illuminance sensing.
  • According to an embodiment, the brightness adjustment of the pixels corresponding to the sensing region may be performed in only one frame within a set period, among the frames output through the display at time points overlapping at least a portion of the sensing timing of the illuminance sensor among the plurality of frames of the image data.
  • Referring to FIG. 7, the operation of the display pixels is divided into odd frames 70a, 70b, 70c, and 70d and even frames 72a, 72b, 72c, and 72d, and control may be performed on only one even frame 72a within a predetermined period T.
  • The above description is exemplary, and in controlling frames through the image data, the control is not limited to a specific type of frame.
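  • The periodic variant can be expressed as a simple gate placed in front of the even-frame blanking. A minimal sketch, assuming the period T is counted in frames and that the hypothetical helper below decides when to blank:

```kotlin
// Hypothetical periodic gating: blank the sensing region in only one even frame per period T.
class PeriodicSensingGate(private val periodFrames: Long) {   // period T expressed in frames
    private var lastBlankedPeriod = -1L

    fun shouldBlank(frameIndex: Long): Boolean {
        val period = frameIndex / periodFrames
        val isEvenFrame = frameIndex % 2L == 0L
        if (isEvenFrame && period != lastBlankedPeriod) {
            lastBlankedPeriod = period
            return true   // first even frame within this period (e.g. 72a in FIG. 7)
        }
        return false      // all other frames are output unchanged
    }
}
```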
  • FIG. 8 is a flowchart illustrating a method of controlling a sensing region for illuminance sensing.
  • The processor identifies a sensing region, and the sensing region may be a region including a position on the display corresponding to the position of the illuminance sensor.
  • the processor may generate image data to be output through the display.
  • The processor may identify a frame output through the display at a time point overlapping at least a part of the sensing timing of the illuminance sensor among the plurality of frames of the image data. According to an embodiment, as described above, this may correspond to an even frame of the display operating frequency.
  • the processor may adjust the brightness of a pixel corresponding to the sensing region in the checked frame through image data.
  • the brightness of the pixel adjusted by the processor may be 0, and in this case, as described above, the display may operate at a low refresh rate. It is also possible to lower the brightness of the adjusted pixel only enough to satisfy the operating condition of the illuminance sensor.
  • FIG. 9 is a flowchart of a method of controlling a sensing region for illuminance sensing according to an embodiment of the present invention.
  • In operation 910, it may be determined whether the refresh rate of the display is higher than a specified refresh rate (e.g., whether it is about 120 Hz rather than about 60 Hz). If the scan rate is about 120 Hz, the display is treated as having a high scan rate, and operation 920 may be performed. If the refresh rate is not about 120 Hz, or is lower than the specified refresh rate (e.g., less than about 120 Hz), operation 940 may be performed. In operation 920, the minimum value of the AOR (AMOLED Off Ratio) is checked; if it is less than about 30%, the illuminance sensor operating condition for low-light sensing needs to be satisfied, so the flow proceeds to operation 930, and if it is about 30% or more, operation 940 may be performed.
  • In operation 930, the even frames of the sensing region may be turned off (black image data is applied), so that additional AOR may be secured.
  • In operation 940, the illuminance may be measured with the illuminance sensor, and in operation 950, the display brightness may be set based on the measured illuminance.
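  • The decision flow of FIG. 9 can be sketched as follows. The 120 Hz and 30% thresholds come from the description above, while the helper function names and signatures are hypothetical placeholders rather than a real device API.

```kotlin
// Hypothetical sketch of the FIG. 9 flow; the lambdas stand in for device-specific operations.
fun senseAndSetBrightness(
    refreshRateHz: Double,
    minAorPercent: Double,                  // minimum AMOLED Off Ratio of the current drive
    blankEvenFrames: () -> Unit,            // operation 930: apply black image data to even frames
    measureIlluminance: () -> Float,        // operation 940: read the under-panel illuminance sensor
    setDisplayBrightness: (Float) -> Unit   // operation 950: map the measured value to a brightness
) {
    val highRefreshRate = refreshRateHz >= 120.0   // operation 910
    val lowAor = minAorPercent < 30.0              // operation 920
    if (highRefreshRate && lowAor) {
        blankEvenFrames()                          // operation 930: secure additional off time (AOR)
    }
    val measured = measureIlluminance()            // operation 940
    setDisplayBrightness(measured)                 // operation 950
}
```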
  • As described above, the present disclosure can secure the illuminance sensing time, even under a high-scan-rate display condition with the illuminance sensor mounted under the panel, by controlling and reducing the brightness of certain frames in the display-on state with image data. Accordingly, by sensing the brightness around the electronic device and providing a display brightness corresponding to the ambient illuminance, it is possible to reduce power consumption and the user's eye fatigue.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a method and an apparatus for sensing the illuminance of an electronic device and, more specifically, to a method and an apparatus for sensing the illuminance of an electronic device which control and reduce, by means of image data, the brightness of a predetermined frame in a display-on state even under conditions in which the display has a high frame rate and the illuminance sensor is mounted under the panel, thereby ensuring that there is sufficient time to sense the illuminance.
PCT/KR2021/002656 2020-03-23 2021-03-04 Electronic device and method for sensing illuminance of an electronic device WO2021194119A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200035248A KR20210118678A (ko) 2020-03-23 2020-03-23 전자 장치 및 전자 장치의 조도 센싱 방법
KR10-2020-0035248 2020-03-23

Publications (1)

Publication Number Publication Date
WO2021194119A1 true WO2021194119A1 (fr) 2021-09-30

Family

ID=77892355

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/002656 WO2021194119A1 (fr) 2020-03-23 2021-03-04 Electronic device and method for sensing illuminance of an electronic device

Country Status (2)

Country Link
KR (1) KR20210118678A (fr)
WO (1) WO2021194119A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160147173A * 2015-06-12 2016-12-22 LG Display Co., Ltd. Timing controller for controlling divided driving of a display panel, and organic light-emitting display device including the same
KR20170090951A * 2016-01-29 2017-08-08 Samsung Electronics Co., Ltd. Electronic device and control method therefor
KR20180024299A * 2016-08-29 2018-03-08 Samsung Electronics Co., Ltd. Method for measuring illuminance and electronic device therefor
KR20180044129A * 2016-10-21 2018-05-02 Samsung Electronics Co., Ltd. Electronic device and method for acquiring fingerprint information thereof
US20180348049A1 (en) * 2017-06-01 2018-12-06 Samsung Electronics Co., Ltd Electronic device and method for controlling ambient light sensor


Also Published As

Publication number Publication date
KR20210118678A (ko) 2021-10-01


Legal Events

Date Code Title Description

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21775786; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 EP: PCT application non-entry in European phase (Ref document number: 21775786; Country of ref document: EP; Kind code of ref document: A1)