CN118522243A - Ambient light detection method, device, terminal and storage medium


Info

Publication number
CN118522243A
Authority
CN
China
Prior art keywords
information
photosensitive
compensation
target area
obtaining
Prior art date
Legal status
Pending
Application number
CN202310132879.4A
Other languages
Chinese (zh)
Inventor
张逸帆
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202310132879.4A
Publication of CN118522243A


Landscapes

  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)

Abstract

The disclosure relates to an ambient light detection method, an ambient light detection device, a terminal and a storage medium. The ambient light detection method is applied to a terminal and includes: obtaining first brightness information of a plurality of target areas of the display screen and second brightness information of a preset area; obtaining a first gray value of each target area based on the first brightness information; obtaining first compensation information of each target area according to the first gray value and the photosensitive distance of each target area; obtaining second compensation information of the preset area based on the second brightness information; obtaining a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information; and obtaining the ambient light information according to the photosensitive information detected by the photosensitive element and the photosensitive compensation value. With the method provided by the disclosure, the photosensitive compensation value of the photosensitive element can be obtained accurately even when the brightness of the photosensitive area of the display screen is increased because different areas of the display screen display different content, so that the ambient light information can be obtained accurately.

Description

Ambient light detection method, device, terminal and storage medium
Technical Field
The disclosure relates to the technical field of intelligent terminals, and in particular relates to an ambient light detection method, an ambient light detection device, a terminal and a storage medium.
Background
With the continuous development of intelligent terminal technology, under-screen light sensing is used more and more widely in order to preserve the integrity of the terminal display screen; however, under-screen light sensing must address the interference that light leakage from the display screen causes on the under-screen photosensitive element.
An active-matrix organic light-emitting diode (AMOLED) screen, with its fast response and high contrast, is widely used as the display screen of terminals. However, because its pixels are lit individually, when different areas of the display screen show different content, for example when the area where the photosensitive element is located displays white while other areas display black, blue or other colors, the brightness of the area over the photosensitive element rises above the brightness it would have if the whole screen displayed white. That is, when different areas of the display screen show different content, the brightness of a local display area exceeds the normal brightness used as a reference. This local increase in brightness affects the detection parameters of the photosensitive element, so errors arise when the brightness of the display screen is adjusted based on the ambient light information collected by the photosensitive element, degrading the user's visual experience.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides an ambient light detection method, apparatus, terminal, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an ambient light detection method, applied to a terminal, the ambient light detection method including:
acquiring first brightness information of a plurality of target areas of a display screen of the terminal, wherein a part of or all areas of the display screen are divided into the plurality of target areas;
Acquiring second brightness information of a preset area of the display screen, wherein the preset area is determined based on the position of a photosensitive element of the terminal;
Obtaining a first gray value of each target area based on the first brightness information of each target area;
Obtaining first compensation information of each target area according to the first gray value and the photosensitive distance of each target area; wherein the photosensitive distance is a distance between each of the target areas and the photosensitive element; the first compensation information is used for representing the light leakage amount of each target area received by the photosensitive element;
Obtaining second compensation information of the preset area based on the second brightness information, wherein the second compensation information is used for representing the light leakage quantity of the preset area received by the photosensitive element;
Obtaining a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information;
and obtaining the ambient light information according to the photosensitive information detected by the photosensitive element and the photosensitive compensation value.
In an exemplary embodiment, the obtaining the first gray value of each target area based on the first brightness information of each target area includes:
Obtaining a second gray value of each pixel point in each target area according to the third brightness information of each pixel point in each target area;
and obtaining a first gray value of each target area based on the second gray value of each pixel point in the target area.
In an exemplary embodiment, the third luminance information includes a plurality of pixel parameters, and the obtaining the second gray level value of each pixel in each target area according to the third luminance information of each pixel in each target area includes:
Acquiring a parameter value of each pixel parameter of each pixel point in the target area;
Based on a pre-stored parameter model, obtaining a second gray value of each pixel point in the target area according to parameter values of a plurality of pixel parameters of each pixel point in the target area;
wherein the pre-stored parameter model is determined based on hardware parameters of the display screen.
In an exemplary embodiment, the determining to obtain the first compensation information of each target area according to the first gray value and the photosensitive distance of each target area includes:
based on a prestored neural network model, fusing the first gray value of each target area and the photosensitive distance to obtain first fused information;
And carrying out feature extraction processing on the first fusion information to obtain first compensation information of each target area.
In an exemplary embodiment, the second luminance information includes fourth luminance information and position information of each pixel point in the preset area, and the obtaining second compensation information of the preset area based on the second luminance information includes:
Fusing the fourth brightness information and the position information of each pixel point in the preset area based on a pre-stored neural network model to obtain second fused information;
and carrying out feature extraction processing on the second fusion information to obtain second compensation information of the preset area.
In an exemplary embodiment, the second brightness information includes a heat map of the display screen, and the obtaining second compensation information of the preset area based on the second brightness information includes:
fourth brightness information and position information of each pixel point of the preset area are obtained based on the heat map;
acquiring weight information of each pixel point in the preset area based on the position information of each pixel point in the preset area;
and obtaining a second light leakage amount of the preset area based on the fourth brightness information and the weight information of each pixel point in the preset area, and taking the second light leakage amount as second compensation information.
In an exemplary embodiment, the obtaining the photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information includes:
acquiring a first light leakage amount of the target area based on the first compensation information;
and the sum of the first light leakage amount and the second light leakage amount is used as a photosensitive compensation value of the photosensitive element.
In an exemplary embodiment, the obtaining ambient light information according to the photosensitive information detected by the photosensitive element and the photosensitive compensation value includes:
and taking the difference value between the photosensitive information detected by the photosensitive element and the photosensitive compensation value as the ambient light information.
According to a second aspect of embodiments of the present disclosure, there is provided an ambient light detection device applied to a terminal, the ambient light detection device including:
a first acquisition module configured to acquire first luminance information of a plurality of target areas of a display screen of the terminal, wherein a partial area or a whole area of the display screen is divided into the plurality of target areas;
a second acquisition module configured to acquire second brightness information of a preset area of the display screen, wherein the preset area is determined based on a position of a photosensitive element of the terminal;
A determining module configured to obtain a first gray value of each of the target areas based on the first luminance information of each of the target areas;
The first processing module is configured to obtain first compensation information of each target area according to the first gray value and the photosensitive distance of each target area; wherein the photosensitive distance is a distance between each of the target areas and the photosensitive element; the first compensation information is used for representing the light leakage amount of each target area received by the photosensitive element;
the second processing module is configured to obtain second compensation information of the preset area based on the second brightness information, wherein the second compensation information is used for representing the light leakage amount of the preset area received by the photosensitive element;
a third processing module configured to obtain a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information;
and the calculating module is configured to obtain the ambient light information according to the photosensitive information detected by the photosensitive element and the photosensitive compensation value.
In an exemplary embodiment, the determination module is further configured to:
Obtaining a second gray value of each pixel point in each target area according to the third brightness information of each pixel point in each target area;
and obtaining a first gray value of each target area based on the second gray value of each pixel point in the target area.
In an exemplary embodiment, the third luminance information includes a plurality of pixel parameters, and the determining module is further configured to:
Acquiring a parameter value of each pixel parameter of each pixel point in the target area;
Based on a pre-stored parameter model, obtaining a second gray value of each pixel point in the target area according to parameter values of a plurality of pixel parameters of each pixel point in the target area;
wherein the pre-stored parameter model is determined based on hardware parameters of the display screen.
In an exemplary embodiment, the first processing module is further configured to:
based on a prestored neural network model, fusing the first gray value of each target area and the photosensitive distance to obtain first fused information;
And carrying out feature extraction processing on the first fusion information to obtain first compensation information of each target area.
In an exemplary embodiment, the second luminance information includes fourth luminance information and position information of each pixel point in the preset area, and the second processing module is further configured to:
Fusing the fourth brightness information and the position information of each pixel point in the preset area based on a pre-stored neural network model to obtain second fused information;
and carrying out feature extraction processing on the second fusion information to obtain second compensation information of the preset area.
In an exemplary embodiment, the second luminance information includes a heat map of the display screen, and the second processing module is further configured to:
fourth brightness information and position information of each pixel point of the preset area are obtained based on the heat map;
acquiring weight information of each pixel point in the preset area based on the position information of each pixel point in the preset area;
and obtaining a second light leakage amount of the preset area based on the fourth brightness information and the weight information of each pixel point in the preset area, and taking the second light leakage amount as second compensation information.
In an exemplary embodiment, the third processing module is further configured to:
acquiring a first light leakage amount of the target area based on the first compensation information;
and the sum of the first light leakage amount and the second light leakage amount is used as a photosensitive compensation value of the photosensitive element.
In an exemplary embodiment, the computing module is further configured to:
and taking the difference value between the photosensitive information detected by the photosensitive element and the photosensitive compensation value as the ambient light information.
According to a third aspect of embodiments of the present disclosure, there is provided a terminal comprising:
A processor;
A memory for storing processor-executable instructions;
Wherein the processor is configured to perform the ambient light detection method as set forth in any one of the first aspects of the embodiments of the present disclosure.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium storing instructions which, when executed by a processor of a terminal, cause the terminal to perform the ambient light detection method according to any one of the first aspects of the embodiments of the present disclosure.
The method has the following beneficial effects: by adopting the method, the photosensitive compensation value of the photosensitive element can be obtained accurately even when the brightness of the photosensitive area of the display screen is increased because different areas of the display screen display different content, so that the ambient light information can be obtained accurately. In addition, by dividing the screen into target areas and introducing gray values to obtain the light leakage amount of each target area, the amount of computation in the whole method is greatly reduced, so that the ambient light information can still be obtained quickly and accurately at a high display refresh rate without significantly increasing the terminal power consumption, improving the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart illustrating an ambient light detection method according to an exemplary embodiment;
FIG. 2 is a flowchart illustrating a method of determining a first gray value, according to an exemplary embodiment;
FIG. 3 is a flow chart illustrating an ambient light detection method according to an exemplary embodiment;
FIG. 4 is a flowchart illustrating an ambient light detection method according to an exemplary embodiment;
FIG. 5 is a schematic diagram of a heat map of a display screen shown according to an example embodiment;
FIG. 6 is a block diagram of an ambient light detection device shown according to an exemplary embodiment;
Fig. 7 is a block diagram of a terminal shown according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the invention. Rather, they are merely examples of apparatus and methods consistent with aspects of the invention as detailed in the accompanying claims.
In the related art, a characteristic of the AMOLED display screen is that its pixels are lit individually, that is, different light-emitting points are activated for different display content, so only part of the pixels of the display screen may be lit; and because light conduction between the pixels of the display screen is poor, the brightness of those pixels is higher when only part of the pixels are lit than when all of the pixels are lit. Therefore, even if the content displayed in the areas around the photosensitive element stays the same, the light leakage amount of the display screen can be completely different, so the estimated light leakage amount is inconsistent with the actual light leakage amount and the compensation fails badly; this affects the ambient light information detected by the photosensitive element, errors arise when the brightness of the display screen is adjusted based on the ambient light information collected by the photosensitive element, and the user's visual experience suffers.
If the pixel points of the whole display screen are used as the input for calculating the light leakage of the display screen, then for a display screen with a resolution of 3200×1440 or above, the huge amount of data makes the calculation slow, so the display screen cannot meet a 120 Hz refresh-rate requirement. Even if only the pixel points of part of the screen are used in the calculation, for example by reducing the displayed content to 640×640 pixels, the amount of computation is still large and each calculation takes a long time; the interval between two screen refreshes cannot cover the time required by the calculation, and the picture is severely distorted.
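As a rough back-of-envelope illustration of these figures (the resolutions and the 120 Hz refresh rate come from the description above; the assumption of three sub-pixel values per pixel is ours):

```python
# Rough back-of-envelope for the data volumes discussed above.
# Assumes 3 sub-pixel values (RGB) per pixel; actual panels may differ.

full_res_pixels = 3200 * 1440          # 4,608,000 pixels for a full-screen input
downscaled_pixels = 640 * 640          # 409,600 pixels even after downscaling
frame_interval_ms = 1000 / 120         # ~8.33 ms per frame at a 120 Hz refresh rate

print(f"full-screen values per frame: {full_res_pixels * 3:,}")    # ~13.8 million
print(f"downscaled values per frame:  {downscaled_pixels * 3:,}")  # ~1.2 million
print(f"time budget per frame:        {frame_interval_ms:.2f} ms")
```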
In order to overcome the problems in the related art, the present disclosure provides an ambient light detection method applied to a terminal; the terminal may be a product with a display screen, such as a mobile phone, a tablet computer, a notebook computer or a smart wearable device. By dividing the screen into target areas and introducing gray values to obtain the light leakage amount of each target area, the method greatly reduces the amount of computation in the whole procedure, so that the ambient light information can still be obtained quickly and accurately at a high display refresh rate without significantly increasing the terminal power consumption, improving the user experience.
In an exemplary embodiment of the present disclosure, an ambient light detection method is provided and applied to a terminal. Fig. 1 is a flowchart illustrating an ambient light detection method according to an exemplary embodiment, as shown in fig. 1, including the steps of:
step S101, acquiring first brightness information of a plurality of target areas of a display screen of a terminal, wherein a part of or all areas of the display screen are divided into the plurality of target areas;
step S102, obtaining second brightness information of a preset area of the display screen, wherein the preset area is determined based on the position of a photosensitive element of the terminal;
step S103, obtaining a first gray value of each target area based on the first brightness information of each target area;
Step S104, obtaining first compensation information of each target area according to the first gray value and the photosensitive distance of each target area; wherein, the photosensitive distance is the distance between each target area and the photosensitive element; the first compensation information is used for representing the light leakage quantity of each target area received by the photosensitive element;
Step S105, based on the second brightness information, obtaining second compensation information of the preset area, wherein the second compensation information is used for representing the light leakage quantity of the preset area received by the photosensitive element;
step S106, obtaining a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information;
Step S107, obtaining the ambient light information according to the photosensitive information and the photosensitive compensation value detected by the photosensitive element.
In step S101, the terminal includes electronic devices such as a smart phone, a tablet and a smart wearable device, and the display screen of the terminal is an AMOLED display screen or another display screen with similar characteristics, that is, a screen where, because the display content is not uniform, the brightness of the display screen when only part of its pixels are lit is higher than the brightness when the whole screen displays the same color, so that adjusting the screen brightness according to the light leakage amount of the display screen received by the photosensitive element becomes inaccurate.
In this step, a part or all of the area of the display screen of the terminal is divided into a plurality of target areas, where the partial area of the display screen includes the display screen area covering the position of the photosensitive element of the terminal. For example, the whole display screen may be divided into an upper, a middle and a lower part of equal area; when the photosensitive element is located in the upper part, the upper part can be divided into a plurality of target areas for the subsequent steps. For another example, the whole display screen may be divided into an upper part and a lower part whose areas may or may not be equal; when the photosensitive element is located in the lower part, the lower part is divided into a plurality of target areas for the subsequent steps. To ensure the accuracy of the calculation of the photosensitive compensation value, the entire area of the terminal display screen, that is, the whole screen area, may be divided into a plurality of target areas.
The size of the target area is an empirical value determined according to actual requirements. When the selected area of the display screen is divided into target areas, it is generally divided into an array of rows and columns. The first brightness information of a target area is the brightness parameter of each pixel point in the target area, and the brightness parameter is determined according to the hardware structure of the light-emitting diodes of the display screen. In one example, the display content of the display screen is produced by three kinds of light-emitting diodes (RGB), and the first luminance information is the luminance corresponding to the RGB value of each pixel point, where R (red) represents the luminance value of red, G (green) represents the luminance value of green, and B (blue) represents the luminance value of blue. In another example, the display content of the display screen is produced by four kinds of light-emitting diodes (RGBC); the first luminance information is the luminance corresponding to the RGBC value of each pixel point, where RGB has the same meaning as in the previous example and C may represent a luminance value of white light used to enhance the brightness of the pixel. It should also be noted that the first luminance information is related to the arrangement of the RGB light-emitting diodes in each pixel; for example, a pixel of some display screens contains one R, one G and one B, while a pixel of other display screens contains two R, one G and one B, so the arrangement of the diodes needs to be considered when calculating the first luminance information to improve its accuracy. When the first brightness information is acquired, the terminal may directly request, through the controller connected to the display screen, the first brightness information of the content to be displayed in the target area, or it may acquire a heat map of the display screen and obtain the first brightness information by analyzing and identifying the heat map.
In step S102, the preset area is a screen area corresponding to the position where the terminal photosensitive element is located, the size of the preset area is an empirical value, and it is determined according to practical requirements, it is understood that the larger the preset area is, the larger the calculation amount of the photosensitive compensation value is, the slower the calculation speed is, so the preset area may be determined to be a regular area, such as a rectangular area or an elliptical area, covering the entire photosensitive element, for example, an area of 300×200 pixels on the display screen, which is centered on the photosensitive element, as the preset area.
The second luminance information of the preset area is a luminance parameter of each pixel point in the preset area, that is, a luminance value corresponding to a pixel value determined according to a distribution condition of light emitting diodes of the display screen, for example, a display content of the display screen is formed by displaying three kinds of light emitting diodes of RGB, and the RGB values of the pixel points are (241, 255, 111), so that the second luminance information is a matrix formed by RGB of each pixel point, and an integrated luminance value can be finally calculated through the RGB value of each pixel point, and the second luminance information is an array of integrated luminance values of each pixel point in the preset area.
In step S103, a first gray value of each target area is obtained according to the first brightness information of each target area, the calculation mode of the first gray value is determined by the hardware structure of the terminal display screen, the calculation mode of the gray value is stored in the terminal memory in advance, and the first gray value of each target area is calculated according to the first brightness information of each target area by calling the pre-stored calculation mode. The first gray value of each target area may be a statistical value of gray values of all pixels of each target area, for example, an average value or a median value of gray values of each pixel, or may be a statistical value of gray values of partial pixels of each target area, for example, a statistical value of gray values of pixels at preset positions in each target area.
In step S104, in addition to the first brightness information of the target area affecting the amount of light leakage from the target area to the terminal photosensitive element, the distance between the target area and the terminal photosensitive element also affects the amount of light leakage from the target area to the photosensitive element, and in the case where the first brightness information is the same, the amount of light leakage from the target area closer to the photosensitive element is greater than the amount of light leakage from the target area farther to the photosensitive element, so that the distance between each target area and the photosensitive element, that is, the photosensitive distance between each target area, needs to be obtained.
The photosensitive distance of the target area is the distance from the center point of the target area to the center point of the photosensitive element. Based on two factors of each target area affecting the light leakage amount of the photosensitive element, namely a first gray value and a photosensitive distance obtained by the first brightness information, first compensation information capable of representing the light leakage amount of each target area received by the photosensitive element can be obtained. The relation model between the first gray value and the photosensitive distance and the first compensation information can be constructed through a neural network model, the relation model between the first gray value and the photosensitive distance and the first compensation information can be constructed through a machine learning algorithm, the relation model which is constructed in advance is stored in a memory of the terminal, and after the first gray value and the first brightness information are obtained, the first compensation information is obtained through calling a pre-stored relation model to calculate.
In step S105, since the preset area is a screen area near the photosensitive element, the second brightness information of the preset area is a main factor affecting the light leakage amount of the photosensitive element by the preset area, so that the second compensation information capable of characterizing the light leakage amount of the preset area received by the photosensitive element can be obtained based on the second brightness information of the preset area. The relation model between the second brightness information and the second compensation information can be built through a neural network model, the relation model between the second brightness information and the second compensation information can be built through a machine learning algorithm, the relation model which is built in advance is stored in a memory of the terminal, and after the second brightness information is obtained, the second compensation information is obtained through calling a pre-stored relation model to calculate. The light leakage amount of the preset area can be estimated through the pixel value of each pixel point of the preset area, and the light leakage amount of the preset area is used as second compensation information.
In step S106, the first compensation information characterizes the light leakage amount of each target area received by the photosensitive element, and the second compensation information characterizes the light leakage amount of the preset area received by the photosensitive element, so that the integrated light leakage amount of the target area and the preset area received by the photosensitive element, that is, the actual light leakage amount of the display screen, can be obtained according to the first compensation information and the second compensation information, and the photosensitive compensation value of the photosensitive element can be obtained according to the actual light leakage amount of the display screen.
When a relation model between the first gray value and the photosensitive distance and the first compensation information and a relation model between the second brightness information and the second compensation information are constructed through a neural network model, the first gray value and the photosensitive distance are used as input of the neural network model, the obtained first compensation information is used as characteristic information for representing the light leakage quantity of each target area received by the photosensitive element, the second brightness information is used as input of the neural network model, and the obtained second compensation information is used as characteristic information for representing the light leakage quantity of the preset area received by the photosensitive element. The neural network model superimposes (concat) the first compensation information and the second compensation information, and then outputs the photosensitive compensation value of the photosensitive element after passing through 3 full-connection layers (Fully connected layers, FC). When the second compensation information is the light leakage amount of the preset area, after the first compensation information is obtained, the light leakage amount of the target area is obtained according to the first compensation information, and then the photosensitive compensation value of the photosensitive element can be calculated based on the light leakage amount of the preset area and the light leakage amount of the target area.
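A minimal PyTorch-style sketch of the fusion head described above (concatenating the two compensation features and passing them through three fully connected layers); the feature dimensions and layer widths are illustrative assumptions, not values fixed by the disclosure:

```python
import torch
import torch.nn as nn

class CompensationHead(nn.Module):
    """Concatenates the two compensation feature vectors and regresses
    the photosensitive compensation value through three FC layers."""

    def __init__(self, first_dim: int = 64, second_dim: int = 64, hidden: int = 32):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(first_dim + second_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),          # scalar photosensitive compensation value
        )

    def forward(self, first_comp: torch.Tensor, second_comp: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([first_comp, second_comp], dim=-1)  # the "concat" step
        return self.fc(fused)

# Example: batch of 1, assumed 64-dimensional features from each branch.
head = CompensationHead()
compensation = head(torch.randn(1, 64), torch.randn(1, 64))
ambient = torch.tensor([[123.0]]) - compensation  # ambient = detected light - compensation
```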
In step S107, the photosensitive information detected by the photosensitive element includes the ambient light information and the light leakage amount of the display screen, and the ambient light information is obtained by eliminating the influence of the light leakage amount of the display screen by the photosensitive compensation value, so that the difference between the photosensitive information detected by the photosensitive element and the photosensitive compensation value is used as the ambient light information.
In an exemplary embodiment of the present disclosure, a partial area or the entire area of the display screen is divided into a plurality of target areas; first luminance information of the plurality of target areas of the display screen of the terminal and second luminance information of a preset area of the display screen of the terminal are respectively obtained, where the preset area is determined based on the position of the photosensitive element of the terminal; a first gray value of each target area is obtained based on the first luminance information of each target area; first compensation information of each target area, used to characterize the light leakage amount of each target area received by the photosensitive element, is obtained based on the first gray value and the photosensitive distance of each target area; second compensation information of the preset area, used to characterize the light leakage amount of the preset area received by the photosensitive element, is obtained based on the second luminance information; a photosensitive compensation value of the photosensitive element is obtained based on the first compensation information and the second compensation information; and the ambient light information is obtained based on the photosensitive information detected by the photosensitive element and the photosensitive compensation value. With the method provided by the disclosure, the photosensitive compensation value of the photosensitive element can be obtained accurately even when the brightness of the photosensitive area of the display screen is increased because different areas of the display screen display different content, so that the ambient light information can be obtained accurately. In addition, by dividing the screen into target areas and introducing gray values to obtain the light leakage amount of each target area, the method greatly reduces the amount of computation, so that the ambient light information can be obtained quickly and accurately at a high display refresh rate without significantly increasing the terminal power consumption, improving the user experience.
In an exemplary embodiment, fig. 2 is a flowchart of a method for obtaining a first gray value of each target region based on first luminance information of each target region in step S103 shown in an exemplary embodiment, including the steps of:
step S201, obtaining parameter values of each pixel parameter of each pixel point in a target area;
Step S202, obtaining a second gray value of each pixel point in the target area according to parameter values of a plurality of pixel parameters of each pixel point in the target area based on a pre-stored parameter model;
step S203, obtaining a first gray value of each target area based on the second gray value of each pixel point in the target area.
In step S201, the first luminance information of the target area includes the third luminance information of each pixel in the target area, where the third luminance information of each pixel includes a plurality of pixel parameters, where the pixel parameters are determined by hardware parameters of a display screen of the terminal, may include three RGB pixel parameters, may include four RGBC pixel parameters, and respectively obtain parameter values of each pixel parameter of each pixel in the target area, and when the three RGB pixel parameters are included, respectively obtain R value, G value, and B value of each pixel. In this embodiment, a method for obtaining the first gray value will be described by taking an example in which each pixel includes three pixel parameters of RGB.
In step S202, the pre-stored parameter model is determined based on the hardware parameters of the display screen of the terminal, and when the number and arrangement of the light emitting elements of each pixel point of the display screen are different, the parameter model for calculating the second gray value of each pixel point is also different. Before each terminal leaves the factory, a parameter model which is matched with the hardware parameters of the display screen of the terminal and is used for calculating the second gray value of each pixel point is pre-stored in a memory of the terminal according to the hardware parameters of the display screen of the terminal. Based on a pre-stored parameter model, according to parameter values of a plurality of pixel parameters of each pixel point in the target area, a second gray value of each pixel point in the target area can be obtained. When the third luminance information of each pixel includes RGB three pixel parameters, the parameter model for calculating the second gray value of each pixel includes the following five formulas:
Gray = (Red + Green + Blue) / 3 ①
Gray = (Red * 0.3 + Green * 0.59 + Blue * 0.11) ②
Gray = (Red * 0.2126 + Green * 0.7152 + Blue * 0.0722) ③
Gray = (Red * 0.299 + Green * 0.587 + Blue * 0.114) ④
Gray = (max(Red, Green, Blue) + min(Red,Green,Blue))/2 ⑤
Where Gray represents the second gray value, Red represents the parameter value of the pixel parameter R, Green represents the parameter value of the pixel parameter G, and Blue represents the parameter value of the pixel parameter B. According to the hardware parameters of the display screen of the terminal, one of the five formulas is selected and pre-stored in the terminal. It can be understood that these five formulas only illustrate ways of calculating the second gray value of each pixel; when a display screen with a new arrangement of pixel parameters appears, a new parameter model can be stored in the terminal accordingly to calculate the gray value of each pixel.
In step S203, since the average of the second gray values of all pixel points in a target area is the value that best reflects the gray-scale information of the target area, when the first gray value of each target area is obtained based on the second gray values of the pixel points in the target area, the average of the second gray values of all pixel points in the target area is taken as the first gray value of the target area.
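A minimal numpy sketch of this step, using formula ④ above as the pre-stored parameter model (the choice of formula and the array shapes are illustrative assumptions):

```python
import numpy as np

def pixel_gray(rgb: np.ndarray) -> np.ndarray:
    """Second gray value of each pixel from its R, G, B parameter values (formula 4)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return r * 0.299 + g * 0.587 + b * 0.114

def target_area_gray(rgb_area: np.ndarray) -> float:
    """First gray value of a target area: the mean of its pixels' second gray values."""
    return float(pixel_gray(rgb_area).mean())

# Example: a target area of 8 x 8 pixels with random RGB parameter values in [0, 255].
area = np.random.randint(0, 256, size=(8, 8, 3)).astype(np.float32)
print(target_area_gray(area))
```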
In an exemplary embodiment, the specific implementation of step S104 in the above embodiment is as follows:
based on a pre-stored neural network model, fusing the first gray value and the photosensitive distance of each target area to obtain first fused information;
And carrying out feature extraction processing on the first fusion information to obtain first compensation information of each target area.
The method comprises the steps of constructing a relation model between a first gray value and a photosensitive distance of a target area and first compensation information of each target area through a neural network model, storing the constructed neural network model in a memory of a terminal in advance, and inputting the first gray value and the photosensitive distance of each target area into a pre-stored neural network model after obtaining the first gray value and the photosensitive distance of each target area. The method comprises the steps of marking a target area as a multiplied by b pixels, wherein a represents the number of pixel points in the length direction of the target area, b represents the number of pixel points in the width direction of the target area, the first gray value and the photosensitive distance of each target area are a multiplied by b multiplied by 2 matrix, the first gray value and the photosensitive distance of each target area are a multiplied by b multiplied by 1 matrix, performing convolution operation in a neural network model through a convolution kernel of 1 multiplied by 1 to fuse the first gray value and the photosensitive distance of each target area, namely, performing dimension reduction on the multiplied by a multiplied by 2 matrix to obtain first fusion information, performing feature extraction processing through a convolution kernel of 3 layers of 3 multiplied by 1, namely, performing convolution operation on the multiplied by 3 layers of 3 multiplied by 3 matrix to obtain feature information, namely, first compensation information, which is used for representing the light leakage quantity of each target area received by a photosensitive element.
It should be noted that, the specific structure of the neural network model may be set according to actual requirements, for example, when the feature extraction processing is performed on the first fusion information, a convolution operation may also be performed by using a convolution kernel of 3 layers and 5×5.
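A minimal PyTorch-style sketch of this branch (a 1×1 convolution fusing the two channels, followed by three 3×3 convolution layers); the channel widths, padding and activations are illustrative assumptions, not values fixed by the disclosure:

```python
import torch
import torch.nn as nn

class FirstCompensationBranch(nn.Module):
    """Fuses gray values and photosensitive distances (a x b x 2) with a 1x1 conv,
    then extracts features with three 3x3 conv layers."""

    def __init__(self):
        super().__init__()
        self.fuse = nn.Conv2d(2, 1, kernel_size=1)  # a x b x 2 -> a x b x 1 fusion
        self.extract = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 1, kernel_size=3, padding=1),
        )

    def forward(self, gray: torch.Tensor, distance: torch.Tensor) -> torch.Tensor:
        x = torch.stack([gray, distance], dim=1)  # shape: (batch, 2, a, b)
        fused = self.fuse(x)                      # first fusion information
        return self.extract(fused)                # first compensation features

branch = FirstCompensationBranch()
features = branch(torch.rand(1, 16, 16), torch.rand(1, 16, 16))  # assumed a = b = 16
```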
In an exemplary embodiment of the present disclosure, an ambient light detection method is provided and applied to a terminal. Fig. 3 is a flowchart illustrating an ambient light detection method according to an exemplary embodiment, as shown in fig. 3, including the steps of:
step S301, a first gray value and a photosensitive distance of each target area are obtained;
step S302, fusing the first gray value and the photosensitive distance of each target area based on a pre-stored neural network model to obtain first fused information;
step S303, carrying out feature extraction processing on the first fusion information to obtain first compensation information of each target area;
step S304, fourth brightness information and position information of each pixel point in a preset area are obtained;
Step S305, fusing fourth brightness information and position information of each pixel point in a preset area based on a pre-stored neural network model to obtain second fused information;
Step S306, performing feature extraction processing on the second fusion information to obtain second compensation information of a preset area;
Step S307, obtaining the photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information;
step S308, obtaining the ambient light information according to the photosensitive information and the photosensitive compensation value detected by the photosensitive element.
The specific embodiments of steps S301 to S303 are the same as those described in the above specific embodiment of step S104, and the contents of steps S307 to S308 and steps S106 to S107 are the same, and are not described here again.
In step S304, the fourth luminance information of each pixel point in the preset area is the luminance value corresponding to the pixel value of each pixel point, for example, the display content of the display screen is formed by displaying three kinds of RGB light emitting diodes, and the RGB values of the pixel points are (211, 255, 100), so that the fourth luminance information is a matrix formed by RGB of each pixel point, and an integrated luminance value can also be finally calculated by RGB values of each pixel point, and the fourth luminance information is an array of integrated luminance values of each pixel point in the preset area. Since the light leakage amount of the pixels with the same fourth brightness information to the photosensitive element is different at different positions, the light leakage amount of the pixels closer to the photosensitive element is larger than the light leakage amount of the pixels farther to the photosensitive element, so that the fourth brightness information and the position information of each pixel in the preset area need to be acquired simultaneously. The position information of each pixel point is the distance from each pixel point to the center point of the photosensitive element.
In steps S305-S306, a relation model between the fourth brightness information and position information of each pixel point of the preset area and the second compensation information of the preset area is constructed through a neural network model, and the constructed neural network model is stored in the memory of the terminal in advance; after the fourth brightness information and position information of each pixel point of the preset area are obtained, they are input into the pre-stored neural network model. Denote the preset area as c×d pixels, where c represents the number of pixel points in the length direction of the preset area and d represents the number of pixel points in the width direction. The fourth brightness information and position information of the pixel points in the preset area then form a c×d×2 matrix, in which the fourth brightness information and the position information each form a c×d×1 matrix. A convolution with a 1×1 kernel is performed in the neural network model to fuse the fourth brightness information and position information of the pixel points in the preset area, that is, the c×d×2 matrix is reduced in dimension to obtain the second fusion information, which is a c×d×1 matrix. Feature extraction is then performed on the second fusion information through three layers of 3×3 convolution kernels, that is, the c×d×1 matrix is convolved through three 3×3 convolution layers, to obtain the feature information used to characterize the light leakage amount of the preset area received by the photosensitive element, namely the second compensation information.
It should be noted that the specific structure of the neural network model may be set according to actual requirements, for example, when the feature extraction processing is performed on the second fusion information, convolution operation may also be performed by using a convolution kernel of 3 layers and 5×5.
In an exemplary embodiment of the present disclosure, an ambient light detection method is provided and applied to a terminal. Fig. 4 is a flowchart illustrating an ambient light detection method according to an exemplary embodiment, as shown in fig. 4, including the steps of:
step S401, obtaining a first gray value and a photosensitive distance of each target area;
Step S402, fusing the first gray value and the photosensitive distance of each target area based on a pre-stored neural network model to obtain first fused information;
Step S403, performing feature extraction processing on the first fusion information to obtain first compensation information of each target area;
step S404, fourth brightness information and position information of each pixel point of a preset area are obtained based on the heat map;
Step S405, obtaining weight information of each pixel point in a preset area based on the position information of each pixel point in the preset area;
step S406, obtaining a second light leakage amount of the preset area based on the fourth brightness information and the weight information of each pixel point in the preset area, and taking the second light leakage amount as second compensation information;
Step S407, obtaining a first light leakage amount of the target area based on the first compensation information;
in step S408, the sum of the first light leakage amount and the second light leakage amount is used as the photosensitive compensation value of the photosensitive element.
The contents of steps S401 to S403 are the same as those of steps S301 to S303, and will not be described here again.
In steps S404-S406, fig. 5 is a schematic diagram of a heat map of a display screen, where, as shown in fig. 5, a white area a is a heat map area of the display screen, and the heat map of the display screen represents a screen area with a larger influence on a photosensitive element, and the heat map area is identified based on a screen capture picture by capturing the display content of the display screen. And taking the acquired heat map as a preset area, and acquiring fourth brightness information and position information of each pixel point of the preset area. The fourth luminance information is the luminance corresponding to the pixel value of each pixel, for example, the display content of the display screen is formed by displaying three kinds of RGB light emitting diodes, and the RGB values of the pixels are (241, 255, 111), so that the fourth luminance information is a matrix formed by RGB of each pixel, and an integrated luminance value can be finally calculated through the RGB value of each pixel, and the fourth luminance information is an array of integrated luminance values of each pixel in a preset area. The position information of each pixel point in the preset area is the distance from each pixel point to the center point of the photosensitive element.
Weight information of each pixel point in the preset area is obtained based on the position information of each pixel point in the preset area, where a pixel point closer to the photosensitive element has a larger weight than a pixel point farther from it. According to the weight information of each pixel point, the fourth brightness information of all pixel points in the preset area is weighted, summed and then averaged to obtain the brightness information of the preset area, and this brightness information of the preset area is the second light leakage amount of the preset area. When the fourth brightness information of all pixel points in the preset area is weighted and summed, each pixel parameter is weighted, summed and averaged separately. For example, if the display content of the display screen is produced by RGB light-emitting diodes and includes the three pixel parameters R, G and B, the parameter values of the pixel parameter R of all pixel points in the preset area are weighted, summed and averaged to obtain the parameter value of the pixel parameter R of the preset area; the parameter values of the pixel parameter G are processed in the same way to obtain the parameter value of the pixel parameter G of the preset area; and the parameter values of the pixel parameter B are processed in the same way to obtain the parameter value of the pixel parameter B of the preset area. The brightness information of the preset area obtained from the three RGB parameter values is the second light leakage amount, and the second light leakage amount is taken as the second compensation information of the preset area.
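A minimal numpy sketch of this weighted-average computation (the distance-based weighting function and the array shapes are illustrative assumptions; the disclosure only requires that closer pixels receive larger weights):

```python
import numpy as np

def second_light_leakage(rgb: np.ndarray, distance: np.ndarray) -> np.ndarray:
    """Weighted average of the preset area's RGB parameter values, weighting
    pixels by their distance to the photosensitive element (closer -> larger weight).

    rgb:      (h, w, 3) fourth brightness information of the preset area
    distance: (h, w)    distance of each pixel to the photosensitive element center
    """
    weights = 1.0 / (1.0 + distance)      # assumed weighting: closer pixels weigh more
    weights = weights / weights.sum()     # normalize so the weighted sum is an average
    # One weighted-average parameter value per channel (R, G, B) for the preset area.
    return np.tensordot(weights, rgb, axes=([0, 1], [0, 1]))

# Example: an assumed 200 x 300 pixel preset area centered on the sensor.
rgb = np.random.randint(0, 256, size=(200, 300, 3)).astype(np.float32)
yy, xx = np.mgrid[0:200, 0:300]
distance = np.hypot(yy - 100, xx - 150)   # distance to the assumed sensor center
leak = second_light_leakage(rgb, distance)  # second compensation information (per channel)
```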
In steps S407-S408, the first compensation information is used to characterize the light leakage amount of the target areas of the display screen received by the photosensitive element, and is the feature vector information obtained by feature extraction from the first gray value and the photosensitive distance of each target area. Based on the first compensation information, a dimension-reducing calculation is performed on it through a fully connected layer of the preset neural network to obtain the first light leakage amount of the target areas. The first light leakage amount and the second light leakage amount are added to obtain the combined light leakage amount of the target areas and the preset area, that is, the actual light leakage amount of the display screen, and this combined light leakage amount is taken as the photosensitive compensation value of the photosensitive element.
After the photosensitive compensation value of the photosensitive element is obtained in step S408, the ambient light information is obtained from the photosensitive information detected by the photosensitive element and the obtained photosensitive compensation value, in the same way as in step S308 of the previous embodiment, so that the display screen adjusts its brightness according to the ambient light information.
In an exemplary embodiment of the present disclosure, an ambient light detection device is provided and applied to a terminal. Fig. 6 is a block diagram of an ambient light detection device, as shown in fig. 6, according to an exemplary embodiment, including:
A first obtaining module 601 configured to obtain first luminance information of a plurality of target areas of a display screen of the terminal, wherein a partial area or a whole area of the display screen is divided into the plurality of target areas;
A second obtaining module 602, configured to obtain second brightness information of a preset area of the display screen, where the preset area is determined based on a position of a photosensitive element of the terminal;
A determining module 603 configured to obtain a first gray value of each of the target areas based on the first luminance information of each of the target areas;
A first processing module 604 configured to obtain first compensation information of each target area according to the first gray value and the photosensitive distance of each target area; wherein the photosensitive distance is a distance between each of the target areas and the photosensitive element; the first compensation information is used for representing the light leakage amount of each target area received by the photosensitive element;
A second processing module 605 configured to obtain second compensation information of the preset area based on the second brightness information, where the second compensation information is used to characterize the light leakage amount of the preset area received by the photosensitive element;
A third processing module 606 configured to obtain a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information;
A calculating module 607 configured to obtain ambient light information based on the photosensitive information detected by the photosensitive element and the photosensitive compensation value.
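Read together, the modules listed above form a single pipeline from screen content to ambient light information. The sketch below wires the steps in plain Python; the class, method and attribute names are hypothetical stand-ins for modules 601-607, and the helper models (the pre-stored parameter model and neural network model) are assumed to be supplied externally.

```python
class AmbientLightDetector:
    """Illustrative wiring of modules 601-607; all names here are hypothetical."""

    def __init__(self, gray_model, comp_model, sensor_xy):
        self.gray_model = gray_model   # stands in for the pre-stored parameter model (module 603)
        self.comp_model = comp_model   # stands in for the pre-stored neural network model (604-606)
        self.sensor_xy = sensor_xy     # position of the photosensitive element

    def detect(self, target_areas, preset_area, sensed_value):
        # Modules 601/602: first and second brightness information are assumed
        # to arrive already attached to target_areas and preset_area.
        # Module 603: first gray value of each target area.
        grays = [self.gray_model.gray_value(area.pixels) for area in target_areas]
        # Module 604: first compensation information from gray value and photosensitive distance.
        first_comp = [
            self.comp_model.first_compensation(g, area.distance_to(self.sensor_xy))
            for g, area in zip(grays, target_areas)
        ]
        # Module 605: second compensation information of the preset area.
        second_comp = self.comp_model.second_compensation(preset_area)
        # Module 606: photosensitive compensation value from both compensations.
        compensation = self.comp_model.compensation_value(first_comp, second_comp)
        # Module 607: ambient light information = sensed value minus compensation.
        return sensed_value - compensation
```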
In an exemplary embodiment, the determining module 603 is further configured to:
Obtaining a second gray value of each pixel point in each target area according to the third brightness information of each pixel point in each target area;
and obtaining a first gray value of each target area based on the second gray value of each pixel point in the target area.
In an exemplary embodiment, the third luminance information includes a plurality of pixel parameters, and the determining module 603 is further configured to:
Acquiring a parameter value of each pixel parameter of each pixel point in the target area;
Based on a pre-stored parameter model, obtaining a second gray value of each pixel point in the target area according to parameter values of a plurality of pixel parameters of each pixel point in the target area;
wherein the pre-stored parameter model is determined based on hardware parameters of the display screen.
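A minimal sketch of this mapping from pixel parameters to gray values is given below. The actual pre-stored parameter model depends on the display's hardware parameters, so the Rec. 709 luma coefficients and the per-area averaging used here are placeholders rather than the model the disclosure relies on.

```python
def second_gray_value(r, g, b, coeffs=(0.2126, 0.7152, 0.0722)):
    """Second gray value of one pixel from its RGB pixel parameters (sketch).

    The real pre-stored parameter model is determined by the display's hardware
    parameters; the Rec. 709 luma coefficients used here are placeholders only.
    """
    cr, cg, cb = coeffs
    return cr * r + cg * g + cb * b


def first_gray_value(pixel_params):
    """First gray value of a target area, taken here as the mean of the
    per-pixel second gray values (the averaging step is an assumption)."""
    grays = [second_gray_value(r, g, b) for (r, g, b) in pixel_params]
    return sum(grays) / len(grays)
```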
In an exemplary embodiment, the first processing module 604 is further configured to:
based on a prestored neural network model, fusing the first gray value of each target area and the photosensitive distance to obtain first fused information;
And carrying out feature extraction processing on the first fusion information to obtain first compensation information of each target area.
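As an illustration of the fusion and feature extraction, the following PyTorch sketch concatenates the first gray value and the photosensitive distance and passes them through a small multilayer perceptron. The layer sizes, the MLP structure and the final reduction layer are assumptions, since the disclosure does not specify the architecture of the pre-stored neural network model.

```python
import torch
import torch.nn as nn

class FirstCompensationNet(nn.Module):
    """Sketch of the pre-stored neural network model; the architecture is an assumption."""

    def __init__(self, feature_dim=16):
        super().__init__()
        # Feature extraction over the fused (gray value, photosensitive distance) input.
        self.extract = nn.Sequential(
            nn.Linear(2, 32), nn.ReLU(),
            nn.Linear(32, feature_dim), nn.ReLU(),
        )
        # Fully connected layer reducing the feature vector to a scalar
        # first light leakage amount, matching steps S407-S408 above.
        self.reduce = nn.Linear(feature_dim, 1)

    def forward(self, gray_value, photosensitive_distance):
        # Inputs are tensors of shape (N,): one entry per target area.
        # Fusion: concatenate the first gray value and the photosensitive distance.
        fused = torch.stack([gray_value, photosensitive_distance], dim=-1)
        first_compensation = self.extract(fused)        # first compensation information
        first_leak = self.reduce(first_compensation)    # first light leakage amount
        return first_compensation, first_leak
```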
In an exemplary embodiment, the second luminance information includes fourth luminance information and position information of each pixel point in the preset area, and the second processing module 605 is further configured to:
Fusing the fourth brightness information and the position information of each pixel point in the preset area based on a pre-stored neural network model to obtain second fused information;
and carrying out feature extraction processing on the second fusion information to obtain second compensation information of the preset area.
In an exemplary embodiment, the second brightness information includes a heat map of the display screen, and the second processing module 605 is further configured to:
obtaining fourth brightness information and position information of each pixel point of the preset area based on the heat map;
acquiring weight information of each pixel point in the preset area based on the position information of each pixel point in the preset area;
and obtaining a second light leakage amount of the preset area based on the fourth brightness information and the weight information of each pixel point in the preset area, and taking the second light leakage amount as second compensation information.
In an exemplary embodiment, the third processing module 606 is further configured to:
acquiring a first light leakage amount of the target area based on the first compensation information;
and the sum of the first light leakage amount and the second light leakage amount is used as a photosensitive compensation value of the photosensitive element.
In an exemplary embodiment, the computing module 607 is further configured to:
and taking the difference value between the photosensitive information detected by the photosensitive element and the photosensitive compensation value as the ambient light information.
The specific manner in which the various modules perform their operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method, and will not be repeated here.
Fig. 7 is a block diagram of a terminal 700 shown according to an exemplary embodiment.
Referring to fig. 7, a terminal 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the terminal 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 can include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operation at the terminal 700. Examples of such data include instructions for any application or method operating on the terminal 700, contact data, phonebook data, messages, pictures, videos, and the like. The memory 704 may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power supply component 706 provides power to the various components of the terminal 700. Power supply components 706 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for terminal 700.
The multimedia component 708 includes a screen that provides an output interface between the terminal 700 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 708 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the terminal 700 is in an operation mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a Microphone (MIC) configured to receive external audio signals when the terminal 700 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects of the terminal 700. For example, the sensor assembly 714 may detect an on/off state of the terminal 700, a relative positioning of the components, such as a display and keypad of the terminal 700, a change in position of the terminal 700 or a component of the terminal 700, the presence or absence of user contact with the terminal 700, an orientation or acceleration/deceleration of the terminal 700, and a change in temperature of the terminal 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the terminal 700 and other devices. The terminal 700 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 716 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a near field communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the terminal 700 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as the memory 704, including instructions executable by the processor 720 of the terminal 700 to perform the above-described method. For example, the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer readable storage medium is provided, in which instructions, when executed by a processor of a terminal, enable the terminal to perform the ambient light detection method.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the invention is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (11)

1. An ambient light detection method applied to a terminal is characterized by comprising the following steps:
acquiring first brightness information of a plurality of target areas of a display screen of the terminal, wherein a part of or all areas of the display screen are divided into the plurality of target areas;
Acquiring second brightness information of a preset area of the display screen, wherein the preset area is determined based on the position of a photosensitive element of the terminal;
Obtaining a first gray value of each target area based on the first brightness information of each target area;
Obtaining first compensation information of each target area according to the first gray value and the photosensitive distance of each target area; wherein the photosensitive distance is a distance between each of the target areas and the photosensitive element; the first compensation information is used for representing the light leakage amount of each target area received by the photosensitive element;
Obtaining second compensation information of the preset area based on the second brightness information, wherein the second compensation information is used for representing the light leakage quantity of the preset area received by the photosensitive element;
Obtaining a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information;
and obtaining the ambient light information according to the photosensitive information detected by the photosensitive element and the photosensitive compensation value.
2. The ambient light detection method according to claim 1, wherein the obtaining a first gray value of each of the target areas based on the first luminance information of each of the target areas includes:
Obtaining a second gray value of each pixel point in each target area according to the third brightness information of each pixel point in each target area;
and obtaining a first gray value of each target area based on the second gray value of each pixel point in the target area.
3. The ambient light detection method according to claim 2, wherein the third luminance information includes a plurality of pixel parameters, and the obtaining the second gray level value of each pixel in each of the target areas based on the third luminance information of each pixel in each of the target areas includes:
Acquiring a parameter value of each pixel parameter of each pixel point in the target area;
Based on a pre-stored parameter model, obtaining a second gray value of each pixel point in the target area according to parameter values of a plurality of pixel parameters of each pixel point in the target area;
wherein the pre-stored parameter model is determined based on hardware parameters of the display screen.
4. The ambient light detection method according to claim 1, wherein the obtaining first compensation information of each of the target areas according to the first gray value and the photosensitive distance of each of the target areas includes:
based on a prestored neural network model, fusing the first gray value of each target area and the photosensitive distance to obtain first fused information;
And carrying out feature extraction processing on the first fusion information to obtain first compensation information of each target area.
5. The ambient light detection method according to claim 1, wherein the second luminance information includes fourth luminance information and position information of each pixel point in the preset area, the obtaining second compensation information of the preset area based on the second luminance information includes:
Fusing the fourth brightness information and the position information of each pixel point in the preset area based on a pre-stored neural network model to obtain second fused information;
and carrying out feature extraction processing on the second fusion information to obtain second compensation information of the preset area.
6. The ambient light detection method according to claim 1, wherein the second luminance information includes a heat map of the display screen, and the obtaining second compensation information of the preset area based on the second luminance information includes:
obtaining fourth brightness information and position information of each pixel point of the preset area based on the heat map;
acquiring weight information of each pixel point in the preset area based on the position information of each pixel point in the preset area;
and obtaining a second light leakage amount of the preset area based on the fourth brightness information and the weight information of each pixel point in the preset area, and taking the second light leakage amount as second compensation information.
7. The ambient light detection method according to claim 6, wherein the obtaining a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information comprises:
acquiring a first light leakage amount of the target area based on the first compensation information;
and the sum of the first light leakage amount and the second light leakage amount is used as a photosensitive compensation value of the photosensitive element.
8. The ambient light detection method according to claim 1, wherein the obtaining ambient light information based on the photosensitive information detected by the photosensitive element and the photosensitive compensation value includes:
and taking the difference value between the photosensitive information detected by the photosensitive element and the photosensitive compensation value as the ambient light information.
9. An ambient light detection device for a terminal, the ambient light detection device comprising:
a first acquisition module configured to acquire first luminance information of a plurality of target areas of a display screen of the terminal, wherein a partial area or a whole area of the display screen is divided into the plurality of target areas;
a second acquisition module configured to acquire second brightness information of a preset area of the display screen, wherein the preset area is determined based on a position of a photosensitive element of the terminal;
A determining module configured to obtain a first gray value of each of the target areas based on the first luminance information of each of the target areas;
The first processing module is configured to obtain first compensation information of each target area according to the first gray value and the photosensitive distance of each target area; wherein the photosensitive distance is a distance between each of the target areas and the photosensitive element; the first compensation information is used for representing the light leakage amount of each target area received by the photosensitive element;
the second processing module is configured to obtain second compensation information of the preset area based on the second brightness information, wherein the second compensation information is used for representing the light leakage amount of the preset area received by the photosensitive element;
a third processing module configured to obtain a photosensitive compensation value of the photosensitive element according to the first compensation information and the second compensation information;
and the calculating module is configured to obtain the ambient light information according to the photosensitive information detected by the photosensitive element and the photosensitive compensation value.
10. A terminal, comprising:
A processor;
A memory for storing processor-executable instructions;
wherein the processor is configured to perform the ambient light detection method of any one of claims 1-8.
11. A non-transitory computer readable storage medium, characterized in that instructions in the storage medium, when executed by a processor of a terminal, enable the terminal to perform the ambient light detection method according to any one of claims 1-8.
CN202310132879.4A 2023-02-17 2023-02-17 Ambient light detection method, device, terminal and storage medium Pending CN118522243A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310132879.4A CN118522243A (en) 2023-02-17 2023-02-17 Ambient light detection method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN118522243A true CN118522243A (en) 2024-08-20

Family

ID=92280161

Country Status (1)

Country Link
CN (1) CN118522243A (en)

Similar Documents

Publication Publication Date Title
US11015973B2 (en) Method and apparatus for acquiring ambient light brightness based on the luminance value of the screen of a terminal device
US9858885B2 (en) Method and device for reducing display brightness
CN111128066B (en) Terminal screen, screen structure, control method and device thereof and terminal
US11226718B2 (en) Method and mobile terminal for utilizing area near openings in display screen to display application icon
US10650502B2 (en) Image processing method and apparatus, and storage medium
CN109215578B (en) Screen display method and device
CN111380610B (en) Ambient light detection method and apparatus, electronic device, and storage medium
US11094291B2 (en) Method and device for detecting ambient light and terminal
CN113654459B (en) Method, device and storage medium for determining position of under-screen photosensitive sensor
CN112905135A (en) Screen brightness processing method, electronic device and storage medium
CN114067733B (en) Display screen correction method, display screen correction device and display chip
CN112331158B (en) Terminal display adjusting method, device, equipment and storage medium
CN112033527A (en) Ambient brightness detection method, device, equipment and storage medium
CN115706750B (en) Color temperature calibration method, color temperature calibration device and storage medium
CN118522243A (en) Ambient light detection method, device, terminal and storage medium
CN114120898B (en) Brightness adjusting method, brightness adjusting device and computer readable storage medium
CN113066452B (en) Display control method and device, electronic equipment and computer readable storage medium
CN111243506B (en) Screen brightness adjusting method, device, equipment and storage medium
CN111383583B (en) Display control method and apparatus, electronic device, and computer-readable storage medium
CN113709275A (en) Ambient light determination method and device, terminal equipment and storage medium
CN114416226A (en) Display adjusting method, device, terminal and storage medium
CN110956938A (en) Screen brightness adjusting method and device, electronic equipment and readable storage medium
CN108206012A (en) Gamma correction method and device
CN118016002A (en) Ambient light brightness calibration method, device, electronic equipment and storage medium
CN115482757A (en) Optical signal compensation method and device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination