WO2019192044A1 - Ambient light detection method and terminal - Google Patents
Ambient light detection method and terminal
- Publication number
- WO2019192044A1 (PCT/CN2018/085107)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- brightness value
- terminal
- brightness
- value
- pixels
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/10—Intensity circuits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0264—Details of the structure or mounting of specific components for a camera module assembly
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0266—Details of the structure or mounting of specific components for a display module assembly
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/22—Illumination; Arrangements for improving the visibility of characters on dials
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/57—Control of contrast or brightness
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0626—Adjustment of display parameters for control of overall brightness
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2201/00—Electronic components, circuits, software, systems or apparatus used in telephone systems
- H04M2201/38—Displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Definitions
- the present application relates to the field of electronic technologies, and in particular, to an ambient light detecting method and terminal.
- in some mobile phones, an ambient light sensor is installed under the display screen; the ambient light sensor detects the ambient light, and the mobile phone then adjusts the brightness of the display screen according to the detection result to enhance the user's visual experience.
- since the ambient light sensor needs to receive ambient light through the display screen, the portion of the display screen that covers the ambient light sensor cannot be used for display. Therefore, the ambient light sensor is generally installed at the head of the mobile phone, and the portion of the display screen covering it is not used for displaying the picture, which reduces the screen-to-body ratio of the mobile phone (the ratio of the screen area to the area of the whole device) and degrades the user experience.
- the ambient light detecting method and the terminal provided by the embodiments of the present invention can save the head space of the mobile phone, improve the screen ratio of the mobile phone, and improve the user experience.
- the method provided by the present application is applied to a terminal including a camera and a display screen.
- the camera includes at least two areas, each of which includes at least one first pixel.
- the method specifically includes:
- the terminal acquires a first brightness value of each first pixel, where each first pixel corresponds to one first brightness value; the terminal acquires a second brightness value of each area according to the first brightness values of all the first pixels included in that area, where each area corresponds to one second brightness value; the terminal acquires the brightness value of the current ambient light according to all the second brightness values; and the terminal adjusts the brightness of the display screen according to the brightness value of the ambient light.
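The claimed flow can be sketched in Python. This is an illustrative sketch, not the patent's implementation: the function names `region_brightness` and `ambient_brightness` are hypothetical, the per-region mean and the overall maximum follow two of the embodiments described in this application, and real pixel data would come from the camera sensor rather than from lists.

```python
# Illustrative sketch of the claimed pipeline, assuming the camera's
# pixel array has been split into regions and each region exposes the
# first brightness values of its designated first pixels.

def region_brightness(first_values):
    """Second brightness value of one region: here, the average of the
    first brightness values of the region's first pixels (one of the
    embodiments; another uses a weighted per-channel combination)."""
    return sum(first_values) / len(first_values)

def ambient_brightness(regions):
    """Brightness value of the current ambient light: the maximum of
    all second brightness values, per one disclosed embodiment."""
    return max(region_brightness(r) for r in regions)

# Example: three regions, each with a few first-pixel brightness values.
regions = [[10, 12, 14], [80, 90, 100], [40, 42, 44]]
print(ambient_brightness(regions))  # 90.0 -- the brightest region wins
```

Taking the maximum (rather than the mean) of the regional values makes the result robust to partial occlusion of the camera, e.g. by a finger shading some regions.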
- because of the light distribution of the lens, different areas of the camera attenuate light differently. Therefore, the brightness of the light collected by the camera is detected area by area, and the detected brightness values of the areas are compensated to different degrees, which helps improve the accuracy of the detected brightness value.
- the terminal receives an operation for starting the camera; in response to the operation, the terminal activates at least one second pixel.
- the terminal includes a first pixel and a second pixel.
- the second pixel is a pixel that is not designated for ambient light detection.
- the terminal uses all the pixels in the camera, including the first pixel and the second pixel, for implementing the camera function. Therefore, when the terminal turns on the camera function, the terminal controls all the first pixels and all the second pixels to be in an active state, and can work.
- if the camera is not activated, each first pixel is in an active state and each second pixel is in an inactive state; if the camera is activated, each first pixel and each second pixel are in an active state.
- the terminal acquiring the current brightness value of the ambient light according to all the second brightness values includes: the terminal uses the maximum value of all the second brightness values as the brightness value of the current ambient light.
- the terminal acquires a third brightness value of the red light according to the K1 first brightness values of the K1 red light pixels included in the first area, the K1 red light pixels corresponding to one third brightness value of the red light; the terminal acquires a third brightness value of the green light according to the first brightness values of the K2 green light pixels included in the first area, the K2 green light pixels corresponding to one third brightness value of the green light; the terminal acquires a third brightness value of the blue light according to the first brightness values of the K3 blue light pixels included in the first area, the K3 blue light pixels corresponding to one third brightness value of the blue light; and the terminal determines the second brightness value of the first area according to the third brightness value of the red light, the third brightness value of the green light, and the third brightness value of the blue light.
- K1, K2 and K3 may be the same or different.
- the terminal obtaining the third brightness value of the red light according to the K1 first brightness values of the K1 red light pixels included in the first area includes: the terminal determining the average value of the K1 first brightness values, or the maximum value of the K1 first brightness values, as the third brightness value of the red light;
- the terminal obtaining the third brightness value of the green light according to the K2 first brightness values of the K2 green pixels included in the first area includes: the terminal determining the average value of the K2 first brightness values, or the maximum value of the K2 first brightness values, as the third brightness value of the green light;
- the terminal obtaining the third brightness value of the blue light according to the K3 first brightness values of the K3 blue pixels included in the first area includes: the terminal determining the average value of the K3 first brightness values, or the maximum value of the K3 first brightness values, as the third brightness value of the blue light.
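The per-channel step can be sketched as follows; the helper name `channel_third_value` is hypothetical, and the claim allows either the average or the maximum per channel:

```python
def channel_third_value(values, mode="mean"):
    # Third brightness value of one color channel within a region:
    # either the average or the maximum of that channel's first
    # brightness values, as the claim permits both.
    return max(values) if mode == "max" else sum(values) / len(values)

# K1 = 3 red pixels averaged; K2 = 2 green pixels, maximum taken.
red_third = channel_third_value([30, 34, 38])
green_third = channel_third_value([50, 54], mode="max")
print(red_third, green_third)  # 34.0 54
```

K1, K2, and K3 are independent, so each channel can aggregate a different number of pixels.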
- the same brightness value presents a different visual experience to the user at different color temperatures. Therefore, different weights can be assigned to the brightness values of the R light, the G light, and the B light according to the color temperature; the brightness values of the R light, the G light, and the B light are then combined according to these weights to obtain the brightness value of the region, and finally the brightness value of the current ambient light.
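The color-temperature weighting can be illustrated as a weighted average of the three per-channel third brightness values; the weight values below are hypothetical, chosen only to demonstrate the calculation:

```python
def second_value_weighted(r3, g3, b3, weights):
    """Second brightness value of a region as the weighted average of
    the per-channel third brightness values; in practice the weights
    would be chosen according to the detected color temperature."""
    wr, wg, wb = weights
    return (wr * r3 + wg * g3 + wb * b3) / (wr + wg + wb)

# Hypothetical weights favouring green, roughly as human vision does:
print(second_value_weighted(40.0, 60.0, 20.0, (3, 6, 1)))  # 50.0
```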
- different light sources include different components of R light, G light, and B light
- in the process of acquiring the second brightness value of each area according to the first brightness values of all the first pixels included in that area, the terminal performs the following operations on each area:
- the terminal uses the first average value as the second brightness value of the first area, where the first average value is the average of the first brightness values of all the first pixels in the first area, and the first area is any one of the at least two areas.
- in a second aspect, a terminal includes a camera, a processor, and a display screen.
- the camera includes at least two regions, each region includes at least one first pixel, and the camera is configured to acquire a first brightness value of each first pixel.
- Each of the first pixels corresponds to a first brightness value
- the processor is configured to obtain a second brightness value of each region according to the first brightness value of all the first pixels included in each region, where each region corresponds to one a second brightness value;
- the processor is further configured to obtain a brightness value of the current ambient light according to all the second brightness values; and the processor is further configured to adjust the brightness of the display screen according to the brightness value of the ambient light.
- before the terminal acquires the first brightness value of each first pixel, the processor is further configured to enable the function of automatically adjusting the display brightness of the display screen; in response to this enabling, the camera is further configured to control each first pixel of the terminal to be in an active state.
- when acquiring the brightness value of the current ambient light according to all the second brightness values, the processor is specifically configured to use the maximum value of all the second brightness values as the brightness value of the current ambient light.
- when determining the second brightness value of the first region according to the third brightness value of the red light, the third brightness value of the green light, and the third brightness value of the blue light, the processor is specifically configured to acquire a weighted average of the third brightness value of the red light, the third brightness value of the green light, and the third brightness value of the blue light; the processor is further configured to use the weighted average as the second brightness value of the first area.
- when determining the second brightness value of the first region according to the third brightness value of the red light, the third brightness value of the green light, and the third brightness value of the blue light, the processor is specifically configured to acquire a weighted average of the third brightness value of the red light, the third brightness value of the green light, and the third brightness value of the blue light, and to compensate the weighted average according to the position of the first area to obtain the second brightness value of the first area.
- the processor performs the following operations for each area:
- if the first area is closer to the center of the camera, the compensation for the weighted average or the first average is smaller; if the first area is farther from the center of the camera, the compensation for the weighted average or the first average is greater.
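A minimal sketch of the position-dependent compensation, assuming a simple linear gain with distance from the optical center; the gain `k`, the linear model, and the function name `compensated` are illustrative assumptions, not taken from this application (a real lens-shading correction would be calibrated per lens):

```python
import math

def compensated(value, region_center, cam_center, k=0.5):
    """Boost a region's brightness value in proportion to its distance
    from the optical center: the farther from the center, the larger
    the compensation, matching the qualitative rule in the claim."""
    d = math.dist(region_center, cam_center)
    return value * (1.0 + k * d)

# A region at the center is unchanged; a distant one is boosted more.
print(compensated(100.0, (0.0, 0.0), (0.0, 0.0)))  # 100.0
print(compensated(100.0, (1.0, 0.0), (0.0, 0.0)))  # 150.0
```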
- the processor is any one of a sensor hub and an application processor.
- in a third aspect, a terminal comprises a processor, a memory, and a touch screen, the memory and the touch screen being coupled to the processor, the memory storing computer program code comprising computer instructions; the processor reads the computer instructions from the memory to perform the method described in any possible design of the first aspect.
- in a fourth aspect, a computer storage medium comprises computer instructions that, when executed on a terminal, cause the terminal to perform the method described in any possible design of the first aspect.
- in a fifth aspect, a computer program product, when run on a computer, causes the computer to perform the method described in any possible design of the first aspect.
- FIG. 1 is a schematic structural diagram 1 of a terminal according to an embodiment of the present disclosure
- FIG. 2 is a schematic structural diagram of a camera of a terminal according to an embodiment of the present disclosure
- FIG. 3a is a schematic diagram 1 of a divided area of a camera sensor of a terminal according to an embodiment of the present disclosure
- FIG. 3b is a schematic diagram 2 of a divided area of a camera sensor of a terminal according to an embodiment of the present disclosure
- FIG. 3c is a schematic diagram 3 of a divided area of a camera sensor of a terminal according to an embodiment of the present disclosure
- FIG. 4 is a schematic structural diagram 2 of a terminal according to an embodiment of the present disclosure.
- FIG. 5 is a schematic diagram of a method for compensating different regions in a camera according to an embodiment of the present application.
- FIG. 6 is a schematic flowchart of a method for detecting ambient light according to an embodiment of the present application.
- FIG. 7 is a schematic structural diagram 3 of a terminal according to an embodiment of the present disclosure.
- FIG. 8 is a schematic structural diagram 4 of a terminal according to an embodiment of the present disclosure.
- the ambient light sensor occupies space at the head of the mobile phone, which reduces the screen-to-body ratio of the mobile phone.
- the embodiment of the present application proposes a method for performing ambient light detection by using a camera. In this way, the mobile phone no longer needs an ambient light sensor, which frees the space of one device at the head of the mobile phone, helps increase the screen-to-body ratio, and improves the user experience.
- the method provided by the embodiment of the present application can be applied to a terminal having a display screen and a camera.
- the terminal can use the camera to perform ambient light detection, and adjust the brightness of the display screen according to the result of ambient light detection.
- the brightness of the display screen located on the same side of the camera can be adjusted according to the result of the ambient light detected by the camera. For example, use the results of the front camera detection to adjust the brightness of the display on the front of the phone. In some scenarios, such as when the difference in ambient light level between the front of the phone and the back of the phone is not large, you can also adjust the brightness of the display on the front of the phone by using the result of the rear camera detection of the phone. Or, if there is a display on the back of the phone, you can adjust the brightness of the display on the back of the phone based on the detection results of the rear camera. This embodiment of the present application does not limit this.
- the terminal in the present application may be any device that can install an application and display an application icon, such as a mobile phone (for example, the mobile phone 100 shown in FIG. 1), a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a smart watch, a netbook, a wearable electronic device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, etc.
- the application does not impose any special restrictions on the specific form of the terminal.
- the mobile phone 100 is used as an example of the terminal.
- the mobile phone 100 may specifically include: a processor 101, a radio frequency (RF) circuit 102, a memory 103, a touch screen 104, a Bluetooth device 105, one or more sensors 106, a Wireless Fidelity (WI-FI) device 107, a positioning device 108, an audio circuit 109, a peripheral interface 110, and a power supply device 111. These components can communicate over one or more communication buses or signal lines (not shown in FIG. 1). It will be understood by those skilled in the art that the hardware structure shown in FIG. 1 does not constitute a limitation on the mobile phone, and the mobile phone 100 may include more or fewer components than illustrated, combine some components, or use a different arrangement of components.
- the processor 101 is the control center of the mobile phone 100; it connects various parts of the mobile phone 100 through various interfaces and lines, and performs the various functions of the mobile phone 100 by running or executing applications stored in the memory 103 and calling data stored in the memory 103.
- processor 101 can include one or more processing units, for example, processor 101 can include a baseband processor and an application processor.
- the radio frequency circuit 102 can be used to receive and transmit wireless signals during transmission or reception of information or calls.
- the radio frequency circuit 102 can receive downlink data from the base station and forward it to the processor 101 for processing, and can transmit uplink data to the base station.
- radio frequency circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
- the radio frequency circuit 102 can also communicate with other devices through wireless communication.
- the wireless communication can use any communication standard or protocol, including but not limited to global mobile communication systems, general packet radio services, code division multiple access, wideband code division multiple access, long term evolution, email, short message service, and the like.
- the memory 103 is used to store applications and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running applications and data stored in the memory 103.
- the memory 103 mainly includes a storage program area and a storage data area, where the storage program area can store an operating system and an application required for at least one function (such as a sound playing function or an image playing function), and the storage data area can store data created when the mobile phone 100 is used (such as audio data and a phone book).
- the memory 103 may include a high speed random access memory (RAM), and may also include a nonvolatile memory such as a magnetic disk storage device, a flash memory device, or another nonvolatile solid-state storage device.
- the memory 103 can store various operating systems, for example, an operating system developed by Apple Inc. or an operating system developed by Google Inc.
- the above memory 103 may be independent and connected to the processor 101 via the above communication bus; the memory 103 may also be integrated with the processor 101.
- the memory 103 includes a storage device 207.
- the touch screen 104 may specifically include a touch panel 104-1 and a display 104-2.
- the touch panel 104-1 can collect touch events performed by the user on or near it (for example, an operation performed by the user on or near the touch panel 104-1 using a finger, a stylus, or any other suitable object), and send the collected touch information to another device (for example, the processor 101).
- a touch event performed by the user near the touch panel 104-1 may be referred to as a hovering touch; a hovering touch means that the user does not need to directly touch the touch panel in order to select, move, or drag a target (for example, an icon), but only needs to be located near the device to perform the desired function.
- the touch panel 104-1 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
- the display (also referred to as a display screen) 104-2 can be used to display information entered by the user or provided to the user, as well as various menus of the mobile phone 100.
- the display 104-2 can be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
- the touchpad 104-1 can be overlaid on the display 104-2; when the touchpad 104-1 detects a touch event on or near it, the event is transmitted to the processor 101 to determine its type, and the processor 101 then provides a corresponding visual output on the display 104-2 according to the type of the touch event.
- although in FIG. 1 the touchpad 104-1 and the display 104-2 are shown as two separate components implementing the input and output functions of the mobile phone 100, in some embodiments the touchpad 104-1 and the display screen 104-2 are integrated to implement the input and output functions of the mobile phone 100.
- the touch screen 104 is formed by stacking a plurality of layers of materials, which will not be described in detail in the embodiment of the present application.
- the touch panel 104-1 may be disposed on the front surface of the mobile phone 100 in the form of a full panel, and the display screen 104-2 may also be disposed on the front surface of the mobile phone 100 in the form of a full panel, so that a borderless structure, such as a full-screen phone, can be implemented on the front of the mobile phone.
- the mobile phone 100 can also have a fingerprint recognition function.
- the fingerprint collection device 112 can be configured on the back of the handset 100 (e.g., below the rear camera) or on the front side of the handset 100 (e.g., below the touch screen 104).
- the fingerprint collection device 112 can be configured in the touch screen 104 to implement the fingerprint recognition function, that is, the fingerprint collection device 112 can be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 100.
- the fingerprint capture device 112 is disposed in the touch screen 104 and may be part of the touch screen 104 or may be otherwise disposed in the touch screen 104.
- the main component of the fingerprint collection device 112 in the embodiment of the present application is a fingerprint sensor, which can employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric or ultrasonic sensing technologies.
- the mobile phone 100 may also include a Bluetooth device 105 for enabling data exchange between the handset 100 and other short-range devices (eg, mobile phones, smart watches, etc.).
- the Bluetooth device in the embodiment of the present application may be an integrated circuit or a Bluetooth chip or the like.
- the handset 100 can also include at least one type of sensor 106, such as a light sensor, motion sensor, and other sensors.
- the light sensor can include a proximity sensor, wherein the proximity sensor can turn off the power of the display when the handset 100 is moved to the ear.
- the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes); when stationary, it can detect the magnitude and direction of gravity, and it can be used for applications that recognize the posture of the mobile phone (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection).
- the mobile phone 100 can also be configured with a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and other sensors, which are not described here.
- the WI-FI device 107 is configured to provide the mobile phone 100 with network access complying with the WI-FI related standard protocols; the mobile phone 100 can access a WI-FI access point through the WI-FI device 107, thereby helping the user to send and receive emails, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access.
- the WI-FI device 107 can also function as a WI-FI wireless access point, and can provide WI-FI network access for other devices.
- the positioning device 108 is configured to provide a geographic location for the mobile phone 100. It can be understood that the positioning device 108 can be specifically a receiver of a positioning system such as a Global Positioning System (GPS) or a Beidou satellite navigation system or a Russian GLONASS. After receiving the geographical location transmitted by the positioning system, the positioning device 108 sends the information to the processor 101 for processing, or sends it to the memory 103 for storage. In some other embodiments, the positioning device 108 can also be a receiver of an Assisted Global Positioning System (AGPS), which assists the positioning device 108 in performing ranging and positioning services by acting as an auxiliary server.
- the secondary location server provides location assistance over a wireless communication network in communication with a location device 108 (i.e., a GPS receiver) of the device, such as handset 100.
- the positioning device 108 can also use WI-FI access point based positioning technology. Since each WI-FI access point has a globally unique Media Access Control (MAC) address, the device can scan and collect the broadcast signals of surrounding WI-FI access points when WI-FI is turned on, and can therefore obtain the MAC addresses broadcast by the WI-FI access points; the device sends data capable of identifying the WI-FI access points (such as the MAC addresses) to the location server through the wireless communication network; the location server retrieves the geographic location of each WI-FI access point, calculates the geographic location of the device in combination with the strength of the WI-FI broadcast signals, and sends it to the positioning device 108 of the device.
- the audio circuit 109, the speaker 113, and the microphone 114 can provide an audio interface between the user and the handset 100.
- on one hand, the audio circuit 109 can convert received audio data into an electrical signal and transmit it to the speaker 113, which converts it into a sound signal for output; on the other hand, the microphone 114 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data; the audio data is then output to the RF circuit 102 to be sent to, for example, another mobile phone, or output to the memory 103 for further processing.
- the peripheral interface 110 is used to provide various interfaces for external input/output devices (such as a keyboard, a mouse, an external display, an external memory, or a subscriber identity module card). For example, it is connected to a mouse through a Universal Serial Bus (USB) interface, and is connected to a Subscriber Identification Module (SIM) card provided by the service provider through metal contacts in the card slot for the subscriber identity module. The peripheral interface 110 can be used to couple the external input/output peripherals described above to the processor 101 and the memory 103.
- the mobile phone 100 may further include a power supply device 111 (such as a battery and a power management chip) that supplies power to the various components.
- the battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power-consumption management are implemented through the power supply device 111.
- the handset 100 can also include one or more cameras 115.
- the mobile phone may include one or more front cameras, may include one or more rear cameras, or may include both one or more front cameras and one or more rear cameras.
- the specific structure of the camera 115 can be referred to the following description of FIG. 2, and details are not described herein.
- the mobile phone 100 may further include a flash, a micro projection device, a Near Field Communication (NFC) device, and the like, and details are not described herein.
- FIG. 2 is a schematic structural diagram of a camera 115 in the mobile phone 100.
- the camera mainly includes a lens 201, an infrared cut filter (IR cut) 202, a sensor integrated circuit (Sensor IC) 203, and a digital signal processing (DSP) chip 204.
- the lens 201 is a lens structure, generally composed of one or more lens elements, which determines the rate at which light reaches the sensor.
- the lens comprises a plastic lens (PLASTIC) or a glass lens (GLASS).
- Common lens structures are: 1P, 2P, 1G1P, 1G3P, 2G2P, 4G, etc.
- the color separation filter 202 is used to separate the light entering through the lens 201.
- the color separation filter has two color separation methods: the RGB (Red Green Blue) primary color separation method (i.e., the three-primary-color separation method) and the CMYK (Cyan Magenta Yellow Key-Plate) complementary color separation method. Since any color of natural light can be mixed from the three colors R, G, and B in different proportions, in the embodiment of the present application the color separation filter 202 can be used to decompose natural light into three kinds of monochromatic light: R, G, and B.
- the sensor integrated circuit 203 includes a sensor, which may also be referred to as an image sensor, a camera sensor, or a photo sensor.
- the sensor is a semiconductor chip whose surface contains a plurality of photosensitive elements such as photodiodes.
- the photosensitive element receives the monochromatic light filtered by the filter and generates a corresponding electric charge. That is to say, the sensor converts the light conducted from the lens 201 into an electrical signal, and then converts the electrical signal into a digital signal through internal analog-to-digital (AD) conversion; at this point, the digital signal data is called raw data.
- each photosensitive element is sensitive to only one kind of monochromatic light, such as R light, B light, or G light.
- each pixel may comprise one of said photosensitive elements.
- each of the pixels includes a photosensitive element and a filter corresponding to the photosensitive element.
- each of the pixels may include other related components in addition to a photosensitive element and a filter corresponding to the photosensitive element.
- the digital signal processing chip 204 is configured to acquire raw data from the sensor, and after a series of processing, send the processed data to a processor in the video output device, and finally display the image through the video output device.
- the digital signal processing chip 204 is further configured to refresh the sensor after transmitting the data to the processor, so that the sensor acquires the next set of raw data.
- the digital signal processing chip 204 includes an image signal processor (ISP), which is specifically used to convert the obtained raw data into a format supported by the display, such as a YUV format or an RGB format.
- the digital signal processing chip 204 further includes a camera interface (CAMIF), which is specifically configured to send the ISP-processed data to the mobile phone processor.
- the working principle of the camera may be: after the light outside the mobile phone passes through the lens 201, it is filtered by the color separation filter 202 and then irradiated onto the sensor surface.
- the sensor converts the received light into an electrical signal, and then converts it into a digital signal through internal AD conversion to obtain raw data. If the sensor does not integrate a DSP, the raw data is transferred to the processor in the phone in RAW DATA format. If the sensor integrates a DSP, the raw data is processed through a series of steps and output as data in YUV or RGB format. Finally, the processor in the mobile phone sends the data to the video output device (for example, a framebuffer) for image display.
- the following description takes, as an example, adjusting the brightness of the display screen according to the brightness value of the ambient light detected by the front camera.
- a plurality of pixels included in the sensor of the front camera in the terminal are divided into N regions, and N is an integer greater than 1.
- the brightness values of the respective regions are respectively obtained, and the brightness values of the ambient light are determined according to the brightness values of the respective regions. Then adjust the brightness of the display according to the determined brightness value of the ambient light.
- the method for dividing the N regions may be equally divided, that is, the area of each region is the same, or may be divided according to the optical characteristics of the lens.
- the embodiment of the present application does not limit the partitioning principle or the number of pixels in each region after partitioning.
- the pixels in the sensor are equally divided into nine regions, and the nine regions have the same size, wherein each region contains the same number of pixels.
- the pixels in the sensor are divided into 12 regions, and the sizes of the 12 regions are different, and the number of pixels included in each region is not equal.
- the divided area herein does not physically separate pixels of different areas, but is equivalent to grouping a plurality of pixels.
- the principle of grouping is to group adjacent pixels into one group.
- the pixels in one group are equivalent to being in one area.
- the divided area or grouping here is preset.
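As an illustration of the preset grouping described above, the following sketch (all names are hypothetical; a 3×3 equal division is assumed, as in the nine-region example) assigns each pixel coordinate a region index:

```python
def region_of(x, y, width, height, rows=3, cols=3):
    """Map a pixel coordinate to one of rows*cols equally sized regions.

    Pixels of different regions are not physically separated; each pixel
    is simply assigned a preset group (region) index.
    """
    row = min(y * rows // height, rows - 1)
    col = min(x * cols // width, cols - 1)
    return row * cols + col

# Group the pixels of a hypothetical 12x12 sensor into 9 equal regions.
groups = {}
for y in range(12):
    for x in range(12):
        groups.setdefault(region_of(x, y, 12, 12), []).append((x, y))
```

An unequal division, such as the twelve-region example, would only change how coordinates map to group indices; the grouping itself remains a preset lookup.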
- the light transmission of the lens is not uniform; in other words, different areas of the camera attenuate light to different degrees. Therefore, detecting the brightness of the light collected by the camera region by region, and compensating the detected brightness values of the regions to different degrees, helps improve the accuracy of the detected brightness value.
- the brightness values of the respective regions may be compensated according to a preset compensation rule, and the brightness value of the current ambient light is determined from the brightness values of the compensated regions.
- the preset compensation rule may be: the brightness value detected in the area located at the center of the camera is compensated by zero or by a small amount, and the farther an area is from the center position, the greater the compensation applied to its detected brightness value.
- the distance of a certain area from the center position may be the distance from the center of the area to the center position.
- a circular area (area 1) centered on point A and having a radius r1 can be regarded as the center area; it may not be compensated, or its compensation value may be small (less than the value P1 below).
- the compensation value for the region (area 2) whose distance from point A is greater than r1 and less than r2 is P1.
- the compensation value for the region (area 3) whose distance from point A is greater than r2 and less than r3 is P2.
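A minimal sketch of this ring-based rule (function and parameter names are illustrative; r1 < r2 and the compensation values p1, p2 correspond to P1 and P2 in the example above):

```python
import math

def ring_compensation(px, py, ax, ay, r1, r2, p1, p2):
    """Compensation for an area whose center is (px, py), given the camera
    center point A at (ax, ay): area 1 (d <= r1) gets 0, area 2
    (r1 < d <= r2) gets p1, and area 3 (d > r2) gets p2."""
    d = math.hypot(px - ax, py - ay)
    if d <= r1:
        return 0.0
    if d <= r2:
        return p1
    return p2
```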
- the brightness values at the camera center position and at the edge of the field of view (FOV) may be measured in advance to obtain the difference or ratio of the two brightness values, and this difference or ratio is used as the basis for compensation.
- the brightness value read at the camera center position is Q1 lux (lux is a unit of illuminance)
- the brightness value read at the edge of the camera FOV is Q2 lux.
- the difference between Q1 and Q2 (ΔQ) or the ratio of Q1 to Q2 can be chosen as the basis for compensation according to factors such as the optical characteristics of the camera or the characteristics of the light source. Here it is assumed that the difference (ΔQ) between Q1 and Q2 is used as the basis for compensation.
- the actually measured center position has a brightness value of Q3 lux
- the actually measured FOV edge position has a brightness value of Q4 lux.
- the actual measured Q3 may not be compensated, that is, the current brightness value of the camera center position is determined to be Q3 lux.
- the actual measured Q4 is compensated, for example, it can be (Q4 + ⁇ Q) lux, that is, the current brightness value of the edge position of the FOV of the camera is determined to be (Q4 + ⁇ Q) lux.
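The calibration-and-compensation steps above can be sketched as follows (the lux readings are hypothetical; ΔQ = Q1 − Q2 is taken as the compensation basis, per the assumption in the text):

```python
def calibrate_delta_q(q1_center, q2_edge):
    """Pre-test: difference between the center and FOV-edge readings."""
    return q1_center - q2_edge

def compensate_edge(q4_measured_edge, delta_q):
    """An actually measured FOV-edge value becomes (Q4 + delta_q) lux;
    a center measurement (Q3) is left uncompensated."""
    return q4_measured_edge + delta_q

delta_q = calibrate_delta_q(500.0, 380.0)  # hypothetical Q1, Q2 in lux
q3_center = 450.0                          # measured center value, used as-is
q4_edge = compensate_edge(330.0, delta_q)  # (Q4 + delta_q) lux
```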
- the light source may be identified according to the optical signals received by the respective areas, and different compensation rules or display brightness adjustment strategies may then be adopted. For example, if only one region of the sensor, or only a group of adjacent regions, has a large brightness value, it can be determined that the light source is a point light source, and higher compensation can be applied to the brightness values measured at the edges of the camera's FOV range. If several non-adjacent areas among all the areas of the sensor have large brightness values, it can be determined that the light source is a multi-point light source.
- the brightness value measured on the sensor can be compensated according to the distribution of multiple light sources, and the specific compensation method is not limited. If the brightness value of each area in the sensor area changes greatly and has a certain regularity, it can be determined that the terminal is in the motion state, and the brightness of the display screen may not be adjusted.
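One way to sketch the point-source versus multi-point decision described above (the 3×3 region grid and all names are assumptions; the idea is simply to count connected clusters of bright regions):

```python
def grid_adjacency(rows, cols):
    """4-neighbour adjacency between region indices laid out row-major."""
    adj = {}
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            adj[i] = set()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    adj[i].add(rr * cols + cc)
    return adj

def classify_light_source(bright_regions, adjacency):
    """One connected cluster of bright regions -> point source;
    several non-adjacent clusters -> multi-point source."""
    bright, seen, clusters = set(bright_regions), set(), 0
    for start in bright:
        if start in seen:
            continue
        clusters += 1
        stack = [start]
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            stack.extend(adjacency[cur] & bright)
    if clusters == 0:
        return "none"
    return "point" if clusters == 1 else "multi-point"
```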
- in some cases, the brightness value of the ambient light detected by the camera is a relatively stable value; in other cases, for example when the terminal moves, the brightness value of the ambient light detected by the camera may drop sharply.
- the FOV range of the camera can be increased to improve the accuracy of the ambient light brightness value determined by the terminal.
- the brightness value of a region may be obtained by using the brightness values of M pixels in the region: for example, the brightness values of the M pixels in the region are read, and the brightness value of the region is then determined according to the brightness values of the M pixels. For example, the brightness values of the M pixels can be weighted to obtain the brightness value of the region. The number of pixels included in the region is greater than or equal to M.
- M may be an integer greater than or equal to 3
- the luminance values of the read M pixels may include a value of at least one R light, a value of at least one G light, and a value of at least one B light.
- different weights can be applied to the luminance values of the R light, the G light, and the B light as appropriate, and the weighted values are then summed to obtain the luminance value of the region.
- each of the regions in the terminal includes a plurality of first pixels and a plurality of second pixels.
- the first pixel is a pixel for the terminal to detect ambient light
- the M pixels in each region are read as M first pixels.
- the second pixel is a pixel that is not designated for ambient light detection.
- the terminal uses all of the pixels, including the first pixel and the second pixel, for the camera function.
- when the terminal turns on the function of automatically adjusting the display brightness of the display screen, the terminal needs to perform ambient light detection; all the first pixels in the terminal are then in an active state, that is, a state in which they can work. When the terminal turns on the camera function, all of the first pixels and all of the second pixels are activated. When the terminal turns on the function of automatically adjusting the display brightness of the display screen but the camera function is not turned on, the first pixels are in an active state and the second pixels are in an inactive state.
- if the terminal has turned on the function of automatically adjusting the display brightness of the display screen, then when the camera function is turned on, the terminal activates all the second pixels (the first pixels were already activated when the function of automatically adjusting the display brightness of the display screen was turned on).
- each area is pre-designated with two R light pixels, one G light pixel, and one B light pixel.
- the embodiment of the present application does not limit the number of first pixels in each area, nor the number of specific R light pixels, G light pixels, and B light pixels.
- the number of first pixels in any two regions may also be different.
- the values of M may be different in different regions.
- the brightness value of M1 pixels may be taken in the first area for calculation, and the brightness value of M2 pixels may be taken in the second area for calculation; wherein M1 and M2 may be equal or unequal.
- the number of luminance values of the read R light pixels, the number of luminance values of the G light pixels, and the number of luminance values of the B light pixels may be the same or different.
- the luminance values of K1 R-light pixels, the luminance values of K2 G-light pixels, and the luminance values of K3 B-light pixels are read, wherein K1, K2, and K3 may be the same, Can be different.
- the first pixel specified in advance has a total of three pixels, namely: one R light pixel, one G light pixel and one B light pixel.
- a total of four pre-designated first pixels are respectively: two R light pixels, one G light pixel and one B light pixel.
- a total of five pre-designated first pixels are respectively: two R light pixels, two G light pixels and one B light pixel.
- the first pixel specified in other areas is shown in the drawing and will not be described again.
- the same brightness value presents a different visual experience to the user at different color temperatures. Therefore, different weights can be applied to the luminance values of the R light, the G light, and the B light according to the color temperature, and the luminance values of the R light, the G light, and the B light are then summed according to these weights to obtain the luminance value of the region, and finally the brightness value of the current ambient light. For example, in the case of a high color temperature, such as outdoors in strong sunlight, the image on the display may appear reddish. Therefore, when calculating the luminance value, the weight of the luminance value of the R light can be increased. In this way, the calculated brightness value will be larger.
- the brightness can be dimmed to improve the user experience.
- in the case of a low color temperature, the image on the display may appear bluish. Therefore, when calculating the luminance value, the weight of the luminance value of the B light can be increased. In this way, the calculated brightness value will be larger.
- when the terminal adjusts the brightness of the display screen according to the calculated brightness value, the brightness can be dimmed to improve the user experience.
- the weight corresponding to the R light is H1
- the weight corresponding to the G light is J1
- the weight corresponding to the B light is K1.
- H1+J1+K1=1.
- the current brightness value of a certain area = the brightness value of the R light * H1 + the brightness value of the G light * J1 + the brightness value of the B light * K1.
- the brightness value of the R light may be the maximum value (or average value) of the brightness values of all the R lights preset in the area.
- the luminance value of the G light may be the maximum value (or average value) of the luminance values of all the G lights preset in the region.
- the luminance value of the B light may be the maximum value (or average value) of the luminance values of all the B lights preset in the region.
- the weight corresponding to the R light is H2
- the weight corresponding to the G light is J2
- the weight corresponding to the B light is K2.
- H2+J2+K2=1.
- H1 is different from H2
- J1 is different from J2
- K1 and K2 are different.
- the current brightness value of a certain area = the brightness value of the R light * H2 + the brightness value of the G light * J2 + the brightness value of the B light * K2.
- the brightness value of the R light may be the maximum value (or average value) of the brightness values of all the R lights preset in the area.
- the luminance value of the G light may be the maximum value (or average value) of the luminance values of all the G lights preset in the region.
- the luminance value of the B light may be the maximum value (or average value) of the luminance values of all the B lights preset in the region.
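The two weighting examples above can be condensed into one sketch (function names, weights, and readings are illustrative; h + j + k = 1, and the per-color value is the maximum or the average of that color's preset pixels, as described above):

```python
def color_value(readings, use_max=True):
    """Max (or average) of the brightness values of one color's preset pixels."""
    return max(readings) if use_max else sum(readings) / len(readings)

def area_brightness(r_vals, g_vals, b_vals, h, j, k, use_max=True):
    """Current brightness value of an area:
        R * h + G * j + B * k, with h + j + k == 1."""
    return (color_value(r_vals, use_max) * h
            + color_value(g_vals, use_max) * j
            + color_value(b_vals, use_max) * k)
```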
- different light sources include different components of R light, G light, and B light
- the weight of the R light can be increased when calculating the brightness value of the area, so that the calculated brightness value is larger.
- the terminal increases the weight of the R light, and the calculated brightness value is larger.
- when the brightness of the display screen is adjusted according to the calculated brightness value, the brightness can be dimmed to improve the user experience.
- the weight of the R light can be increased when calculating the brightness value of the area.
- the sensor controls all pixels (including the first pixels and the second pixels) to be in an active state; all the pixels output the luminance values of the entire image to the registers, and the terminal reads the registers to obtain the brightness values of the image and displays the image according to these brightness values.
- the sensor controls all the pixels to be in an inactive state, and the illumination on the sensor does not need to be sensed or converted accordingly.
- a flowchart of a method for detecting ambient light includes:
- the terminal controls the first pixels in the camera to be in an active state, and acquires the first brightness value detected by each of the first pixels.
- each pixel of the camera is divided into at least two regions, each region includes a plurality of pixels, and at least three pixels of the plurality of pixels of each region are used as the first pixel.
- each first pixel is in an active state.
- the brightness value of the illumination detected by the first pixel in the active state is the first brightness value.
- each first pixel corresponds to a first brightness value.
- the sensor can control the state of each pixel included in the sensor to be in an active state or an inactive state.
- the pixel in the active state can receive illumination and perform photoelectric conversion to convert the optical signal into an electrical signal; then perform analog-to-digital conversion on the electrical signal to convert it into a digital signal; the digital signal includes the brightness value of the pixel.
- the brightness value of the pixel is stored in a register, and the sensor hub can acquire the brightness value by reading the value in the register through, for example, an Inter-Integrated Circuit (IIC).
- the terminal can preset some pixels (ie, the first pixel) in each area for detecting ambient light.
- the sensor controls the preset pixels to be in an active state, is used to detect the brightness value of the current illumination, and stores the first brightness value detected by each pixel into the register.
- each first pixel can correspond to one register.
- the terminal obtains a second brightness value of each area according to the first brightness value detected by each of the first pixels.
- each area includes a plurality of first pixels.
- the luminance value of the region, that is, the second luminance value, can be obtained according to the first luminance values detected by all the first pixels in one region. It can be seen that one region corresponds to one second brightness value.
- the second luminance value obtained from the plurality of first luminance values of a certain region is used to represent the luminance value of the region.
- the terminal may not actually perform a partitioning action. Rather, all the first pixels may be grouped in advance in the terminal, each group including a plurality of first pixels and corresponding to one region; the second brightness value corresponding to a group, that is, the second brightness value of the region corresponding to that group, is then obtained according to the first brightness values of all the first pixels of the group.
- the sensor hub of the terminal can read the first brightness value in each register and report the first brightness values to a processor (e.g., an application processor) of the terminal, and the processor of the terminal performs the weighting calculation to obtain the brightness value of the area (i.e., the second brightness value).
- the sensor in the terminal camera can read the first brightness value in each register, and then weight the first brightness values to obtain the brightness value of the area (ie, the second brightness value). Then, the second brightness value is read by the sensor hub and reported to the processor of the terminal.
- the communication protocol used may be IIC or other communication protocol such as Serial Peripheral Interface (SPI).
- the terminal obtains a current ambient light brightness value according to the second brightness value of each of the areas.
- the terminal adjusts the brightness of the display screen according to the determined brightness value of the ambient light.
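The steps of the flow above can be sketched end to end (the per-region averaging and the take-the-maximum strategy follow options described in this application; the backlight mapping and all names are hypothetical placeholders):

```python
def detect_ambient_light(first_values_by_region):
    """Second brightness value per region (simple average of the first
    brightness values here), then the maximum across regions as the
    current ambient light brightness value."""
    second = {region: sum(vals) / len(vals)
              for region, vals in first_values_by_region.items()}
    return max(second.values())

def backlight_level(ambient_lux, full_scale_lux=1000.0):
    """Map the ambient brightness to a display backlight level in [0, 1]."""
    return max(0.0, min(1.0, ambient_lux / full_scale_lux))

ambient = detect_ambient_light({0: [100.0, 200.0], 1: [50.0, 70.0]})
level = backlight_level(ambient)
```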
- the camera function of the camera and the function of detecting the ambient light by the camera are two independent functions, and do not affect each other.
- the ambient light detection can be performed through the camera all the time, and it is not affected by whether the camera takes a picture.
- the camera is capable of performing photographing during ambient light detection.
- the ambient light intensity can also be detected by the camera.
- the corresponding functions of the camera's camera function and the ambient light detection function can be set separately, and the IIC corresponding to these two functions can also be set separately, thereby achieving independent implementation of these two functions.
- when the terminal performs photographing, all the pixels in the camera are photosensitive, and the generated RGB data may be stored in the first registers.
- the image signal processor 404 configures the photographing parameters of the camera sensor 401 through the IIC1.
- the image signal processor 404 reads data of RGB in the first register by, for example, a Mobile Industry Processor Interface (MIPI) 1, and performs image processing based on the data.
- the first pixels in the camera are photosensitive, and the acquired brightness data is stored in the second registers, where the second registers are different from the first registers; there may be a plurality of first registers and a plurality of second registers, and the number of registers is not limited in this application.
- the sensor hub 403 reads the luminance values in the respective second registers of the camera sensor 401 by, for example, IIC2.
- the process of taking pictures by the terminal and the process of detecting the ambient light by the terminal are two independent processes without mutual interference.
- the decoupling design of the camera during photographing and ambient light detection is realized, so that when the camera is in the sleep state, the terminal can also perform ambient light detection.
- the embodiment of the present application may divide the function module by using the above-mentioned method example.
- each function module may be divided according to each function, or two or more functions may be integrated into one processing module.
- the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the module in the embodiment of the present invention is schematic, and is only a logical function division, and the actual implementation may have another division manner.
- FIG. 7 shows a possible structural diagram of the terminal involved in the above embodiment.
- the terminal 700 includes a camera 701 and a processor 702.
- the camera 701 is used to support the terminal to perform step S101 in FIG. 6, and/or other processes for the techniques described herein.
- the processor 702 is configured to support the terminal in performing steps S102-S104 of FIG. 6, and/or other processes for the techniques described herein.
- the terminal 700 may further include a communication unit for the terminal to interact with other devices.
- the terminal 700 may further include a storage unit for storing program codes and data of the terminal.
- the specific functions that can be implemented by the foregoing functional units include, but are not limited to, the functions corresponding to the method steps described in the foregoing examples.
- the camera described above may be a camera module of the terminal, and the processor 702 may be a processing module of the terminal.
- the communication unit described above may be a communication module of the terminal, such as an RF circuit, a WiFi module, or a Bluetooth module.
- the above storage unit may be a storage module of the terminal.
- FIG. 8 is a schematic diagram showing a possible structure of a terminal involved in the above embodiment.
- the terminal 1100 includes a processing module 1101, a storage module 1102, and a communication module 1103.
- the processing module 1101 is configured to control and manage the actions of the terminal.
- the storage module 1102 is configured to save program codes and data of the terminal.
- the communication module 1103 is for communicating with other terminals.
- the processing module 1101 may be a processor or a controller, for example, a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), or an application-specific integrated circuit (ASIC).
- the processor may also be a combination of computing functions, for example, including one or more microprocessor combinations, a combination of a DSP and a microprocessor, and the like.
- the communication module 1103 may be a transceiver, a transceiver circuit, a communication interface, or the like.
- the storage module 1102 can be a memory.
- the processing module 1101 is a processor (such as the processor 101 shown in FIG. 1)
- the communication module 1103 is an RF transceiver circuit (such as the RF circuit 102 shown in FIG. 1)
- the storage module 1102 is a memory (such as the memory 103 shown in FIG. 1).
- the terminal provided by the embodiment of the present application may be the terminal 100 shown in FIG. 1.
- the communication module 1103 may include not only an RF circuit but also a WiFi module and a Bluetooth module. Communication modules such as RF circuits, WiFi modules, and Bluetooth modules can be collectively referred to as communication interfaces. Wherein, the above processor, communication interface and memory can be coupled together by a bus.
- the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
- each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
- the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
- the computer readable storage medium includes a number of instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the various embodiments of the present application.
- the foregoing storage medium includes: a flash memory, a mobile hard disk, a read only memory, a random access memory, a magnetic disk, or an optical disk, and the like, which can store program codes.
Claims (30)
- 一种环境光检测的方法,应用于包含摄像头和显示屏的终端中,其特征在于,所述摄像头包含至少两个区域,每个区域包含至少一个第一像素,所述方法包括:所述终端获取每个第一像素的第一亮度值,其中,每个所述第一像素对应一个第一亮度值;所述终端根据每个所述区域中包含的所有第一像素的第一亮度值获取每个所述区域的第二亮度值,每个所述区域对应一个第二亮度值;所述终端根据所有的所述第二亮度值获取当前的环境光的亮度值;所述终端根据所述环境光的亮度值调节所述显示屏的亮度。
- 根据权利要求1所述的方法,其特征在于,所述摄像头还包括至少一个第二像素;所述方法还包括:所述终端接收到用于启动相机的操作;响应所述操作,所述终端激活所述至少一个第二像素。
- 根据权利要求1或2所述的方法,其特征在于,在所述终端获取每个第一像素的第一亮度值之前,所述方法还包括:所述终端控制所述终端中每个第一像素处于激活状态。
- 根据权利要求1或2所述的方法,其特征在于,在所述终端获取每个第一像素的第一亮度值之前,所述方法还包括:所述终端开启自动调节所述显示屏的显示亮度的功能;响应所述开启,所述终端控制所述终端中每个第一像素处于激活状态。
- 根据权利要求2-4任一项所述的方法,其特征在于:若所述自动调节所述显示屏的显示亮度的功能被开启,且所述相机未被启动,则每个所述第一像素处于激活状态,每个所述第二像素处于未激活状态;若所述相机被启动,则每个所述第一像素和每个所述第二像素均处于激活状态。
- 根据权利要求1-5任一项所述的方法,其特征在于,所述终端根据所有的所述第二亮度值获取当前的环境光的亮度值包括:所述终端将所有的所述第二亮度值中的最大值作为所述当前的环境光的亮度值。
- 根据权利要求1-6任一项所述的方法,其特征在于,第一区域内的所有第一像素包括K1个红光像素、K2个绿光像素和K3个蓝光像素;所述第一区域为所述至少两个区域中的任意一个;在所述终端根据每个所述区域中包含的所有第一像素的第一亮度值获取每个所述区域的第二亮度值的过程中,所述终端对每个所述区域执行以下操作:所述终端根据所述第一区域中包含的K1个红光像素的K1个第一亮度值获取红光的第三亮度值,所述K1个红光像素对应1个所述红光的第三亮度值;所述终端根据所述第一区域中包含的K2个绿光像素的第一亮度值获取绿光的第三亮度值,所述K2个绿光像素对应1个所述绿光的第三亮度值;所述终端根据所述第一区域中包含的K3个蓝光像素的第一亮度值获取蓝光的第三亮度值,所述K3个蓝光像素对应1个所述蓝光的第三亮度值;所述终端根据所述红光的第三亮度值、绿光的第三亮度值和蓝光的第三亮度值确 定所述第一区域的第二亮度值。
- 根据权利要求7所述的方法,其特征在于:所述终端根据所述第一区域中包含的K1个红光像素的K1个第一亮度值获取红光的第三亮度值包括:所述终端将所述K1个第一亮度值的平均值或所述K1个第一亮度值中的最大值确定为所述红光的第三亮度值;所述终端根据所述第一区域中包含的K2个绿光像素的K2个第一亮度值获取绿光的第三亮度值包括:所述终端将所述K2个第一亮度值的平均值或所述K2个第一亮度值中的最大值确定为所述绿光的第三亮度值;所述终端根据所述第一区域中包含的K3个蓝光像素的K3个第一亮度值获取蓝光的第三亮度值包括:所述终端将所述K3个第一亮度值的平均值或所述K3个第一亮度值中的最大值确定为所述蓝光的第三亮度值。
- 根据权利要求7或8所述的方法,其特征在于,所述终端根据所述红光的第三亮度值、绿光的第三亮度值和蓝光的第三亮度值确定所述第一区域的第二亮度值包括:所述终端获取所述红光的第三亮度值、所述绿光的第三亮度值和所述蓝光的第三亮度值的加权平均值;所述终端将所述加权平均值作为所述第一区域的第二亮度值。
- 根据权利要求7或8所述的方法,其特征在于,所述终端根据所述红光的第三亮度值、绿光的第三亮度值和蓝的第三亮度值确定所述第一区域的第二亮度值包括:所述终端获取所述红光的第三亮度值、所述绿光的第三亮度值和所述蓝光的第三亮度值的加权平均值;所述终端根据所述第一区域的位置,对所述加权平均值进行补偿得到所述第一区域的第二亮度值。
- The method according to any one of claims 1 to 6, wherein in the process of obtaining, by the terminal, the second brightness value of each region based on the first brightness values of all first pixels comprised in the region, the terminal performs the following operation on each region: using, by the terminal, a first average as the second brightness value of the first region, wherein the first average is the average of the first brightness values of all first pixels in the first region, and the first region is any one of the at least two regions.
- The method according to any one of claims 1 to 6, wherein in the process of obtaining, by the terminal, the second brightness value of each region based on the first brightness values of all first pixels comprised in the region, the terminal performs the following operation on each region: compensating, by the terminal, a first average based on the position of the first region to obtain the second brightness value of the first region, wherein the first average is the average of the first brightness values of all first pixels in the first region, and the first region is any one of the at least two regions.
- The method according to claim 10 or 12, wherein: the closer the first region is to the center of the camera, the smaller the compensation applied to the weighted average or the first average; and the farther the first region is from the center of the camera, the larger the compensation applied to the weighted average or the first average.
- A terminal, comprising a camera, a processor, and a display screen, wherein the camera comprises at least two regions and each region comprises at least one first pixel; the camera is configured to obtain a first brightness value of each first pixel, wherein each first pixel corresponds to one first brightness value; the processor is configured to obtain a second brightness value of each region based on the first brightness values of all first pixels comprised in the region, wherein each region corresponds to one second brightness value; the processor is further configured to obtain a brightness value of the current ambient light based on all the second brightness values; and the processor is further configured to adjust the brightness of the display screen based on the brightness value of the ambient light.
- The terminal according to claim 14, wherein the camera further comprises at least one second pixel; the processor is further configured to receive an operation used to start the camera; and the camera is further configured to activate the at least one second pixel in response to the operation.
- The terminal according to claim 14 or 15, wherein the camera is further configured to: before the terminal obtains the first brightness value of each first pixel, control each first pixel in the terminal to be in an active state.
- The terminal according to claim 14 or 15, wherein the processor is further configured to: before the terminal obtains the first brightness value of each first pixel, enable a function of automatically adjusting the display brightness of the display screen; and the camera is further configured to control, in response to the enabling, each first pixel in the terminal to be in an active state.
- The terminal according to any one of claims 15 to 17, wherein: if the function of automatically adjusting the display brightness of the display screen is enabled and the camera is not started, each first pixel is in an active state and each second pixel is in an inactive state; and if the camera is started, each first pixel and each second pixel are in an active state.
- The terminal according to any one of claims 14 to 18, wherein, when obtaining the brightness value of the current ambient light based on all the second brightness values, the processor is specifically configured to use the maximum value among all the second brightness values as the brightness value of the current ambient light.
- The terminal according to any one of claims 14 to 19, wherein all first pixels in a first region comprise K1 red pixels, K2 green pixels, and K3 blue pixels, and the first region is any one of the at least two regions; and in the process of obtaining the second brightness value of each region based on the first brightness values of all first pixels comprised in the region, the processor performs the following operations on each region: obtaining a third brightness value of red light based on the K1 first brightness values of the K1 red pixels comprised in the first region, wherein the K1 red pixels correspond to one third brightness value of red light; obtaining a third brightness value of green light based on the first brightness values of the K2 green pixels comprised in the first region, wherein the K2 green pixels correspond to one third brightness value of green light; obtaining a third brightness value of blue light based on the first brightness values of the K3 blue pixels comprised in the first region, wherein the K3 blue pixels correspond to one third brightness value of blue light; and determining the second brightness value of the first region based on the third brightness value of red light, the third brightness value of green light, and the third brightness value of blue light.
- The terminal according to claim 20, wherein: when obtaining the third brightness value of red light based on the K1 first brightness values of the K1 red pixels comprised in the first region, the processor is specifically configured to determine the average of the K1 first brightness values or the maximum value among the K1 first brightness values as the third brightness value of red light; when obtaining the third brightness value of green light based on the first brightness values of the K2 green pixels comprised in the first region, the processor is specifically configured to determine the average of the K2 first brightness values or the maximum value among the K2 first brightness values as the third brightness value of green light; and when obtaining the third brightness value of blue light based on the first brightness values of the K3 blue pixels comprised in the first region, the processor is specifically configured to determine the average of the K3 first brightness values or the maximum value among the K3 first brightness values as the third brightness value of blue light.
- The terminal according to claim 20 or 21, wherein, when determining the second brightness value of the first region based on the third brightness values of red, green, and blue light, the processor is specifically configured to: obtain a weighted average of the third brightness value of red light, the third brightness value of green light, and the third brightness value of blue light; and use the weighted average as the second brightness value of the first region.
- The terminal according to claim 20 or 21, wherein, when determining the second brightness value of the first region based on the third brightness values of red, green, and blue light, the processor is specifically configured to: obtain a weighted average of the third brightness value of red light, the third brightness value of green light, and the third brightness value of blue light; and compensate the weighted average based on the position of the first region to obtain the second brightness value of the first region.
- The terminal according to any one of claims 14 to 19, wherein in the process of obtaining the second brightness value of each region based on the first brightness values of all first pixels comprised in the region, the processor performs the following operation on each region: using a first average as the second brightness value of the first region, wherein the first average is the average of the first brightness values of all first pixels in the first region, and the first region is any one of the at least two regions.
- The terminal according to any one of claims 14 to 19, wherein in the process of obtaining the second brightness value of each region based on the first brightness values of all first pixels comprised in the region, the processor performs the following operation on each region: compensating a first average based on the position of the first region to obtain the second brightness value of the first region, wherein the first average is the average of the first brightness values of all first pixels in the first region, and the first region is any one of the at least two regions.
- The terminal according to claim 23 or 25, wherein: the closer the first region is to the center of the camera, the smaller the compensation applied to the weighted average or the first average; and the farther the first region is from the center of the camera, the larger the compensation applied to the weighted average or the first average.
- The terminal according to any one of claims 14 to 26, wherein the processor is either a sensor hub or an application processor.
- A terminal, comprising a processor, a memory, and a touchscreen, wherein the memory and the touchscreen are coupled to the processor, the memory is configured to store computer program code, and the computer program code comprises computer instructions; and when the processor reads the computer instructions from the memory, the terminal performs the method according to any one of claims 1 to 13.
- A computer storage medium, comprising computer instructions, wherein when the computer instructions run on a terminal, the terminal is enabled to perform the method according to any one of claims 1 to 13.
- A computer program product, wherein when the computer program product runs on a computer, the computer is enabled to perform the method according to any one of claims 1 to 13.
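The computation recited in the claims above can be sketched as follows. This is a minimal illustration only, not the patented implementation: the region layout, the R/G/B weights, and the compensation factors are assumed example values, not taken from the patent.

```python
# Sketch of the claimed ambient-light estimation (cf. claims 1, 6-10, 13).
# Weights and compensation factors below are illustrative assumptions.

def third_brightness(values, mode="avg"):
    """One third brightness value per colour channel (cf. claim 8):
    the average or the maximum of that channel's first brightness values."""
    return sum(values) / len(values) if mode == "avg" else max(values)

def second_brightness(region, weights=(0.299, 0.587, 0.114), compensation=1.0):
    """Second brightness value of one region (cf. claims 9-10): a weighted
    average of the R/G/B third brightness values, optionally compensated
    according to the region's position on the sensor."""
    r = third_brightness(region["R"])
    g = third_brightness(region["G"])
    b = third_brightness(region["B"])
    wr, wg, wb = weights
    return (wr * r + wg * g + wb * b) * compensation

def ambient_brightness(regions):
    """Ambient light value (cf. claim 6): the maximum second brightness
    value over all regions of the camera sensor."""
    return max(second_brightness(reg, compensation=reg.get("comp", 1.0))
               for reg in regions)

# Example: two regions; the edge region gets a larger compensation factor,
# consistent with claim 13 (farther from the lens center => larger compensation).
regions = [
    {"R": [120, 130], "G": [200, 210], "B": [80, 90], "comp": 1.0},   # near center
    {"R": [100, 110], "G": [180, 190], "B": [70, 75], "comp": 1.2},   # near edge
]
print(ambient_brightness(regions))
```

The resulting value would then drive the display-brightness adjustment of claim 1, e.g. through a lookup table mapping ambient-light levels to backlight levels.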
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201880072800.0A CN111345019B (zh) | 2018-04-04 | 2018-04-28 | Ambient light detection method and terminal |
US17/043,201 US11250813B2 (en) | 2018-04-04 | 2018-04-28 | Ambient light detection method and terminal |
JP2020554133A JP7142715B2 (ja) | 2018-04-04 | 2018-04-28 | Ambient light detection method and terminal |
KR1020207030583A KR102434930B1 (ko) | 2018-04-04 | 2018-04-28 | Ambient light detection method and terminal |
CN202110991142.9A CN113923422B (zh) | 2018-04-04 | 2018-04-28 | Ambient light detection method and terminal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810302048 | 2018-04-04 | ||
CN201810302048.6 | 2018-04-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019192044A1 (zh) | 2019-10-10 |
Family
ID=68100026
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/085107 WO2019192044A1 (zh) | 2018-04-04 | 2018-04-28 | Ambient light detection method and terminal |
Country Status (5)
Country | Link |
---|---|
US (1) | US11250813B2 (zh) |
JP (1) | JP7142715B2 (zh) |
KR (1) | KR102434930B1 (zh) |
CN (2) | CN111345019B (zh) |
WO (1) | WO2019192044A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114485934B (zh) * | 2020-11-13 | 2024-01-30 | 北京小米移动软件有限公司 | Light detection assembly, screen assembly, and electronic terminal |
CN117938996A (zh) * | 2023-12-28 | 2024-04-26 | 荣耀终端有限公司 | Method for detecting ambient light and electronic device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120098976A1 (en) * | 2008-06-06 | 2012-04-26 | Microsoft Corporation | Radiometric calibration using temporal irradiance mixtures |
CN102693698A (zh) * | 2012-06-25 | 2012-09-26 | 济南大学 | Method and system for automatically adjusting brightness of an outdoor LED display based on ambient light changes |
CN107645606A (zh) * | 2017-09-29 | 2018-01-30 | 努比亚技术有限公司 | Screen brightness adjustment method, mobile terminal, and readable storage medium |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001223792A (ja) * | 2000-02-09 | 2001-08-17 | Mitsubishi Electric Corp | Mobile phone |
KR101148791B1 (ko) * | 2004-06-30 | 2012-05-24 | 엘지디스플레이 주식회사 | Tiled display device |
IL165852A (en) * | 2004-12-19 | 2010-12-30 | Rafael Advanced Defense Sys | System and method for image display enhancement |
KR100792986B1 (ko) * | 2005-09-28 | 2008-01-08 | 엘지전자 주식회사 | Apparatus and method for compensating for lens distortion in a mobile terminal |
JP4682181B2 (ja) * | 2007-11-19 | 2011-05-11 | シャープ株式会社 | Imaging device and electronic information apparatus |
JP2010020072A (ja) * | 2008-07-10 | 2010-01-28 | Canon Inc | Display device |
KR101564076B1 (ko) * | 2011-04-29 | 2015-10-27 | 후아웨이 디바이스 컴퍼니 리미티드 | Method and apparatus for controlling a light-emitting element in a terminal device, and terminal device |
CN103152523A (zh) * | 2013-02-27 | 2013-06-12 | 华为终端有限公司 | Photographing method and apparatus for an electronic device, and electronic device |
CN104113617A (zh) | 2013-04-16 | 2014-10-22 | 深圳富泰宏精密工业有限公司 | Backlight brightness adjustment system and method |
TWI482145B (zh) * | 2013-06-20 | 2015-04-21 | Novatek Microelectronics Corp | Image display device and backlight adjustment method thereof |
US9570002B2 (en) | 2014-06-17 | 2017-02-14 | Apple Inc. | Interactive display panel with IR diodes |
CN105592270B (zh) * | 2015-12-18 | 2018-02-06 | 广东欧珀移动通信有限公司 | Image brightness compensation method and apparatus, and terminal device |
CN106488203B (zh) | 2016-11-29 | 2018-03-30 | 广东欧珀移动通信有限公司 | Image processing method, image processing apparatus, imaging apparatus, and electronic apparatus |
CN107222664B (zh) | 2017-05-03 | 2020-03-06 | Oppo广东移动通信有限公司 | Camera module and electronic apparatus |
CN107566695B (zh) * | 2017-08-14 | 2019-07-02 | 厦门美图之家科技有限公司 | Fill-light method and mobile terminal |
2018
- 2018-04-28 JP JP2020554133A patent/JP7142715B2/ja active Active
- 2018-04-28 WO PCT/CN2018/085107 patent/WO2019192044A1/zh active Application Filing
- 2018-04-28 US US17/043,201 patent/US11250813B2/en active Active
- 2018-04-28 KR KR1020207030583A patent/KR102434930B1/ko active IP Right Grant
- 2018-04-28 CN CN201880072800.0A patent/CN111345019B/zh active Active
- 2018-04-28 CN CN202110991142.9A patent/CN113923422B/zh active Active
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113240757A (zh) * | 2021-05-12 | 2021-08-10 | 深圳市光科全息技术有限公司 | Blue light parameter detection method, apparatus, device, and medium |
CN113240757B (zh) * | 2021-05-12 | 2023-07-14 | 深圳市光科全息技术有限公司 | Blue light parameter detection method, apparatus, device, and medium |
WO2024045829A1 (zh) * | 2022-08-29 | 2024-03-07 | 深圳市Tcl云创科技有限公司 | Screen brightness adjustment method and apparatus, storage medium, and electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN111345019A (zh) | 2020-06-26 |
CN111345019B (zh) | 2021-08-31 |
JP7142715B2 (ja) | 2022-09-27 |
US11250813B2 (en) | 2022-02-15 |
KR20200133383A (ko) | 2020-11-27 |
CN113923422B (zh) | 2023-06-30 |
CN113923422A (zh) | 2022-01-11 |
US20210027746A1 (en) | 2021-01-28 |
JP2021518722A (ja) | 2021-08-02 |
KR102434930B1 (ko) | 2022-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109891874B (zh) | Panoramic photographing method and apparatus | |
US10510136B2 (en) | Image blurring method, electronic device and computer device | |
WO2021052279A1 (zh) | Foldable-screen display method and electronic device | |
US20220005439A1 (en) | Method for display-brightness adjustment and related products | |
WO2019192044A1 (zh) | Ambient light detection method and terminal | |
US20180013955A1 (en) | Electronic device including dual camera and method for controlling dual camera | |
US11546531B2 (en) | Camera assembly, image acquisition method, and mobile terminal | |
KR20150100394A (ko) | Image display method and apparatus | |
US20210168279A1 (en) | Document image correction method and apparatus | |
US20220367550A1 (en) | Mobile terminal and image photographing method | |
CN107705247B (zh) | Image saturation adjustment method, terminal, and storage medium | |
JP6862564B2 (ja) | Method, apparatus, and non-volatile computer-readable medium for image composition | |
US20170048458A1 (en) | Method for providing images and electronic device supporting the same | |
US10694130B2 (en) | Imaging element and imaging device | |
CN109462732B (zh) | Image processing method, device, and computer-readable storage medium | |
CN108848321B (zh) | Exposure optimization method, apparatus, and computer-readable storage medium | |
EP3249999B1 (en) | Intelligent matching method for filter and terminal | |
WO2023207445A1 (zh) | Image denoising method, apparatus and system, and electronic device | |
CN109729280A (zh) | Image processing method and mobile terminal | |
CN109729264B (zh) | Image acquisition method and mobile terminal | |
CN114143588B (zh) | Playback control method and electronic device | |
US11153018B2 (en) | Electronic device and method for controlling electronic device | |
WO2023108442A1 (zh) | Image processing method, intelligent terminal, and storage medium | |
CN115225789A (zh) | Photosensitive module, image acquisition device, image acquisition method, and intelligent terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 18913443; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2020554133; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 20207030583; Country of ref document: KR; Kind code of ref document: A |
122 | Ep: PCT application non-entry in European phase | Ref document number: 18913443; Country of ref document: EP; Kind code of ref document: A1 |