CN110458826B - Ambient brightness detection method and device - Google Patents


Info

Publication number
CN110458826B
CN110458826B (application CN201910736308.5A)
Authority
CN
China
Prior art keywords
information
environment image
brightness
image
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910736308.5A
Other languages
Chinese (zh)
Other versions
CN110458826A (en)
Inventor
陈轶博
张峰
陈果果
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Shanghai Xiaodu Technology Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Shanghai Xiaodu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd, Shanghai Xiaodu Technology Co Ltd filed Critical Baidu Online Network Technology Beijing Co Ltd
Priority to CN201910736308.5A
Publication of CN110458826A
Application granted
Publication of CN110458826B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in communication networks in wireless communication networks

Abstract

The application discloses an ambient brightness detection method and device, and relates to the technical field of image recognition. The specific implementation scheme is as follows: performing color space conversion on each pixel point of an acquired first environment image; acquiring initial brightness information of the first environment image according to the converted pixel points; acquiring contrast information of the first environment image; and acquiring an ambient brightness value according to the initial brightness information and the contrast information of the first environment image. With this method and device, the real brightness value of the environment in which the image was captured can be accurately calculated using the acquired initial brightness information and contrast information of the first environment image.

Description

Ambient brightness detection method and device
Technical Field
The present application relates to the field of image processing technology, and in particular, to the field of image recognition technology.
Background
When a mobile terminal adjusts its screen brightness, a light sensor on the screen detects the ambient brightness, and the screen brightness is adjusted according to the detection result. The light sensor is therefore an indispensable hardware component of the mobile terminal: without it, the automatic screen-brightness adjustment function cannot be realized.
However, measuring the ambient brightness with a light sensor has certain disadvantages. The light sensor essentially detects light reflected by objects, but dark objects reflect little light, and in that case it is difficult for the sensor to detect the brightness of the environment.
Disclosure of Invention
The embodiment of the application provides an ambient brightness detection method and an ambient brightness detection device, which are used for solving one or more technical problems in the prior art.
In a first aspect, an embodiment of the present application provides an ambient brightness detection method, including:
performing color space conversion on each pixel point of the acquired first environment image;
acquiring initial brightness information of the first environment image according to the converted pixel points;
acquiring contrast information of the first environment image;
and acquiring an environment brightness value according to the initial brightness information and the contrast information of the first environment image.
The present embodiment can accurately calculate the real brightness value of the environment shown in the first environment image by using the acquired initial brightness information and contrast information related to the brightness.
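The four steps above can be sketched in a few lines (an illustrative sketch only: the patent does not fix the color space, the brightness statistic, or the combination rule, so the BT.601 luminance weights and the 0.8/0.2 weighting below are assumptions made here):

```python
import numpy as np

def ambient_brightness(image_rgb):
    """Illustrative sketch of the four claimed steps (weights are assumptions)."""
    # Step 1: color space conversion; keep only a luminance component (BT.601 weights)
    r, g, b = image_rgb[..., 0], image_rgb[..., 1], image_rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    # Step 2: initial brightness information, here the mean luminance
    initial = float(y.mean())
    # Step 3: contrast information, here the standard deviation of luminance
    contrast = float(y.std())
    # Step 4: combine both into one ambient brightness value (illustrative weighting)
    return 0.8 * initial + 0.2 * contrast

# A uniform mid-gray frame: mean luminance 128, zero contrast
print(ambient_brightness(np.full((4, 4, 3), 128.0)))  # ≈ 102.4
```

A flat frame contributes only through its mean, while a textured frame raises the estimate through the contrast term.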
In one embodiment, acquiring the ambient brightness value according to the initial brightness information and the contrast information of the first ambient image includes:
obtaining reflection information in a first environment image;
and acquiring an environment brightness value according to the initial brightness information, the contrast information and the reflection information of the first environment image.
In this embodiment, the acquired reflection information assists the detection of the ambient brightness, and using data of several different dimensions makes the calculated ambient brightness value more accurate.
In one embodiment, acquiring reflection information in a first environment image comprises:
carrying out layer separation on the first environment image;
screening out the layers containing reflections from the separated layers;
and acquiring reflection information from the layer with the reflection.
By separating and screening the layers of the first environment image, the reflection information can be accurately acquired.
In an embodiment, before performing color space conversion on each pixel point of the acquired environment image, the method further includes:
and preprocessing the second environment image acquired by the image acquisition device according to the hardware parameter and/or the automatic exposure compensation parameter of the image acquisition device to obtain a first environment image.
The embodiment utilizes the hardware parameters and/or the automatic exposure compensation parameters of the image acquisition device to carry out image preprocessing, can restore the most real environment image, and reduces the influence of different hardware parameters and/or automatic exposure compensation parameters of the image acquisition device on the acquired environment image.
In one embodiment, performing color space conversion on each pixel point of the acquired first environment image includes:
and converting each pixel point of the first environment image from a primary color space to a color and brightness separation color space.
According to the embodiment, each pixel point is converted from the primary color space to the color-brightness separation color space, so that the data of each pixel point can be effectively utilized to obtain the initial brightness information of the first environment image.
In one embodiment, performing color space conversion on each pixel point of the acquired first environment image includes:
identifying a target region in a first environment image;
and carrying out color space conversion on each pixel point in the target area.
By identifying the target region in the first environment image, this embodiment obtains a brightness detection region that is more valuable for determining the ambient brightness.
In one embodiment, acquiring initial brightness information of the first environment image according to the converted pixel points includes:
acquiring brightness component information of each converted pixel point;
and acquiring initial brightness information of the first environment image according to the converted brightness component information of each pixel point.
According to the embodiment, the initial brightness information of the first environment image can be more accurately obtained by utilizing the brightness component information of each pixel point.
In one embodiment, acquiring contrast information of a first environment image comprises:
acquiring a gray value of each pixel point in the first environment image;
calculating the gray difference between adjacent pixel points according to the gray value of each pixel point in the first environment image;
acquiring the distribution probability of adjacent pixel points with the same gray value difference according to the gray value difference between the adjacent pixel points;
and calculating the contrast information of the environment image according to the gray difference between the adjacent pixel points and the distribution probability of the adjacent pixel points with the same gray value difference.
In this embodiment, by using the gray value of each pixel point, the gray difference between adjacent pixel points, and the distribution probability of adjacent pixel point pairs with the same gray difference, the brightness at each position of the environment image can be accurately obtained, from which the contrast information of the environment image is then calculated.
In one embodiment, the method further comprises:
and adjusting the screen brightness of the mobile terminal according to the environment brightness value.
In this embodiment, the screen brightness of the mobile terminal can be adjusted using the ambient brightness value calculated from the environment image, so no light sensor is needed, which saves manufacturing cost and leaves more space for the screen of the mobile terminal.
In a second aspect, an embodiment of the present application provides an ambient brightness detection apparatus, including:
the conversion module is used for performing color space conversion on each pixel point of the acquired first environment image;
the first acquisition module is used for acquiring initial brightness information of the first environment image according to the converted pixel points;
the second acquisition module is used for acquiring contrast information of the first environment image;
and the third acquisition module is used for acquiring an environment brightness value according to the initial brightness information and the contrast information of the first environment image.
In one embodiment, the third obtaining module includes:
the first obtaining sub-module is used for obtaining reflection information in the first environment image;
and the second obtaining submodule is used for obtaining the environment brightness value according to the initial brightness information, the contrast information and the reflection information of the first environment image.
In one embodiment, the method further comprises:
and the preprocessing module is used for preprocessing the second environment image acquired by the image acquisition device according to the hardware parameter and/or the automatic exposure compensation parameter of the image acquisition device to obtain a first environment image.
In one embodiment, the conversion module comprises:
and the first conversion submodule is used for converting each pixel point of the first environment image from a primary color space to a color and brightness separation color space.
In one embodiment, the conversion module comprises:
the identification submodule is used for identifying a target area in the first environment image;
and the second conversion submodule is used for performing color space conversion on each pixel point in the target area.
In one embodiment, the first obtaining module includes:
the third obtaining submodule is used for obtaining the brightness component information of each converted pixel point;
and the fourth obtaining submodule is used for obtaining the initial brightness information of the first environment image according to the converted brightness component information of each pixel point.
In one embodiment, the second obtaining module includes:
the fifth obtaining submodule is used for obtaining the gray value of each pixel point in the first environment image;
the first calculation submodule is used for calculating the gray difference between adjacent pixel points according to the gray value of each pixel point in the first environment image;
the sixth obtaining submodule is used for obtaining the distribution probability of adjacent pixel points with the same gray value difference according to the gray value difference between the adjacent pixel points;
and the second calculation submodule is used for calculating the contrast information of the environment image according to the gray difference between adjacent pixel points and the distribution probability of the adjacent pixel points with the same gray value difference.
In one embodiment, the method further comprises:
and the adjusting module is used for adjusting the screen brightness of the mobile terminal according to the environment brightness value.
In a third aspect, an embodiment of the present application provides an electronic device, where functions of the electronic device may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above-described functions.
In one possible design, the electronic device includes a processor and a memory, the memory is used for storing a program for supporting the electronic device to execute the above-mentioned ambient brightness detection method, and the processor is configured to execute the program stored in the memory. The electronic device may also include a communication interface for communicating with other devices or a communication network.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions for the electronic device, including a program for executing the ambient brightness detection method.
One embodiment in the above application has the following advantages or benefits: the real brightness value of the environment in which the image was captured can be accurately calculated using the acquired initial brightness information and contrast information of the first environment image. Because the ambient brightness value is calculated from the initial brightness information and contrast information of the captured image, the ambient brightness can be obtained from the captured image alone, without a light sensor.
Other effects of the above-described alternative will be described below with reference to specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a schematic flow chart of a method according to a first embodiment of the present application;
FIG. 2 is a schematic flow chart of another method according to the first embodiment of the present application;
FIG. 3 is a schematic flow chart of method step S410 according to the first embodiment of the present application;
FIG. 4 is a schematic flow chart of another method according to the first embodiment of the present application;
FIG. 5 is a schematic flow chart of another method according to the first embodiment of the present application;
FIG. 6 is a schematic flow chart of another method according to the first embodiment of the present application;
FIG. 7 is a schematic flow chart of another method according to the first embodiment of the present application;
FIG. 8 is a schematic flow chart of method step S300 according to the first embodiment of the present application;
FIG. 9 is a schematic flow chart of another method according to the first embodiment of the present application;
FIG. 10 is a schematic flow chart of another method according to the first embodiment of the present application;
FIG. 11 is a schematic view of an apparatus according to a second embodiment of the present application;
FIG. 12 is a schematic diagram of a third acquisition module according to a second embodiment of the present application;
FIG. 13 is a schematic view of another apparatus according to a second embodiment of the present application;
FIG. 14 is a schematic diagram of a conversion module according to a second embodiment of the present application;
FIG. 15 is a schematic view of another conversion module according to a second embodiment of the present application;
FIG. 16 is a schematic diagram of a first acquisition module according to a second embodiment of the present application;
FIG. 17 is a schematic diagram of a second acquisition module according to a second embodiment of the present application;
FIG. 18 is a schematic view of another conversion module according to a second embodiment of the present application;
fig. 19 is a block diagram of an electronic device for implementing the ambient brightness detection method according to the embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
According to a first embodiment of the present application, there is provided an ambient brightness detection method, as shown in fig. 1, the method including:
s100: and performing color space conversion on each pixel point of the acquired first environment image.
The first environment image may include an image directly acquired by the image acquisition device, or may include an image obtained by preprocessing the acquired image. Performing color space conversion on each pixel point of the first environment image, wherein the color space conversion can be performed on all pixel points in the first environment image; or performing color space conversion on part of the pixel points in the first environment image. The conversion form of the color space can be selected as required, so long as each pixel point can obtain information associated with brightness after color space conversion.
S200: and acquiring initial brightness information of the first environment image according to the converted pixel points. The initial luminance information indicates luminance data on the first environment image directly obtained from data of each pixel point.
S300: contrast information of the first environment image is acquired. The manner of obtaining the contrast information may be selected as desired. For example, the contrast information may be acquired from the data of each pixel point after color space conversion of the first environment image; it may be acquired from the directly captured first environment image; or the first environment image may be preprocessed in other ways and the contrast information obtained from the preprocessed image.
S400: and acquiring an environment brightness value according to the initial brightness information and the contrast information of the first environment image. The ambient brightness value may represent a true brightness value of the environment presented in the first ambient image. The present embodiment can accurately calculate the real brightness value of the environment shown in the first environment image by using the acquired initial brightness information and contrast information related to the brightness.
In an example, the ambient brightness detection method in the embodiments of the present application may be applied to scenarios such as brightness detection and automatic screen-brightness adjustment on an intelligent terminal. For example, when the mobile terminal is a mobile phone, a camera and a light sensor are arranged on the phone screen: the phone takes pictures through the camera and adjusts the screen brightness through the light sensor. As mobile phones develop toward high screen-to-body ratios, the notch screen proposed in recent years has been the main way to achieve them. Even so, a completely full screen cannot be achieved, because components such as the camera and the light sensor are necessary for the phone's functions.
With the method of the present application, the ambient brightness value can be calculated directly using the image acquisition function of the front camera, so the ambient brightness can be obtained without relying on a light sensor. The phone therefore does not need a light sensor, which reduces the number of components on the screen and increases the screen-to-body ratio. In addition, the cost of hardware is an important consideration: every component removed saves considerable cost.
In one embodiment, as shown in fig. 2, acquiring an environment brightness value according to initial brightness information and contrast information of a first environment image includes:
s410: and acquiring reflection information in the first environment image. Light reflects when it strikes objects in the environment, presenting particular textures, such as shadows or reflections, on the objects.
S420: and acquiring an environment brightness value according to the initial brightness information, the contrast information and the reflection information of the first environment image.
In this embodiment, the acquired reflection information assists the detection of the ambient brightness, and using data of several different dimensions makes the calculated ambient brightness value more accurate.
The manner of combining the initial brightness information, the contrast information, and the reflection information of the first environment image may be selected as needed.
In one example, the three may be combined in a weighted manner to compute the ambient brightness value.
In another example, the environment brightness value can also be calculated by multiplying the three by a certain functional relationship. For example, the formula for the multiplicative relationship may be:
s = s1 * (1 - e^(-s2)) * (1 - e^(-s3))
where s denotes the ambient brightness value, and s1, s2 and s3 denote the initial brightness information, contrast information and reflection information of the first environment image, respectively.
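A minimal sketch of this multiplicative combination (the text does not specify how s2 and s3 are scaled before entering the formula, so the inputs below are illustrative):

```python
import math

def combine_brightness(s1, s2, s3):
    # s = s1 * (1 - e^(-s2)) * (1 - e^(-s3)): the contrast and reflection terms
    # act as saturating correction factors in (0, 1) on the initial brightness s1.
    return s1 * (1 - math.exp(-s2)) * (1 - math.exp(-s3))
```

Each correction factor approaches 1 as its input grows, so strong contrast or reflection evidence leaves the initial brightness nearly unchanged, while weak evidence pulls the estimate down.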
In one embodiment, as shown in fig. 3, acquiring reflection information in a first environment image includes:
s4110: and carrying out layer separation on the first environment image.
S4120: and screening out the layers containing reflections from the separated layers. The layers containing reflections may be screened according to gray-value intensity, contrast intensity, the weak exclusion principle, or similar criteria.
S4130: and acquiring reflection information from the layer with the reflection.
By separating and screening the layers of the first environment image, the reflection information can be accurately acquired.
In an embodiment, as shown in fig. 4, before performing color space conversion on each pixel point of the acquired environment image, the method further includes:
s500: and preprocessing the second environment image acquired by the image acquisition device according to the hardware parameter and/or the automatic exposure compensation parameter of the image acquisition device to obtain a first environment image.
In the same environment, different image acquisition devices (for example, cameras) produce different results: some images come out brighter than reality, and some darker. The reasons include, on the one hand, that each image acquisition device is configured with different hardware, and on the other hand, that the device performs automatic exposure compensation when shooting, i.e., it lengthens the exposure time in a dark environment so that the image is brightened rather than left dark. To avoid the influence of the image acquisition device on the captured environment image, parameters such as the camera's wide-angle parameter, sensitivity, shutter time and frame rate, as well as the compensation parameters applied during exposure compensation, need to be considered. The camera can be modeled with this parameter information to simulate the influence of different camera parameters under different lighting conditions, and images from different cameras can be corrected using the hardware parameters to eliminate their influence. Alternatively, the image can be directly inverse-processed using the acquired hardware parameters and automatic exposure compensation parameters of the image acquisition device, restoring it to the image before any in-device processing.
In this embodiment, given the influence that different image acquisition devices have on the captured environment image, preprocessing the image with the hardware parameters and/or automatic exposure compensation parameters of the device can restore the most realistic environment image and avoid the influence of those parameters on the captured image.
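As a sketch of the inverse-processing idea, assume a purely multiplicative exposure model (an assumption made here; real camera pipelines are more complex): the device brightened the frame by a known gain, so the preprocessing divides that gain back out.

```python
def undo_auto_exposure(pixel_value, exposure_gain):
    # Hypothetical reversal under a multiplicative model: the second environment
    # image was brightened by `exposure_gain`, so divide the gain back out and
    # clamp to the valid 8-bit range to recover a first-environment-image pixel.
    return max(0.0, min(255.0, pixel_value / exposure_gain))
```

With a gain above 1 (dark scene, longer exposure), the recovered pixel is darker than the captured one, matching the idea of restoring the unprocessed image.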
In one embodiment, as shown in fig. 5, performing color space conversion on each pixel point of the acquired first environment image includes:
s110: and converting each pixel point of the first environment image from a primary color space to a color and brightness separation color space. The primary color space may comprise image pixels represented in RBG (red green blue) format. The color-luminance separation color space may include image pixels represented in YUV and HSV formats. Wherein, the "y (brightness)" component in YUV and the "v (value)" component in HSV represent luminance component information. And conversion relation exists between RGB and each component of YUV and HSV, and each component value in YUV or HSV can be obtained through the RGB value of each pixel point.
For example, the formulas for mutual conversion between YUV and RGB are as follows (the value ranges of R, G and B are all 0-255):
Y = 0.299R + 0.587G + 0.114B
U = -0.147R - 0.289G + 0.436B
V = 0.615R - 0.515G - 0.100B
R = Y + 1.14V
G = Y - 0.39U - 0.58V
B = Y + 2.03U
By converting each pixel point from the primary color space to the color-brightness separation color space, this embodiment can effectively use the data of each pixel point to obtain the initial brightness information of the first environment image.
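The conversion formulas can be transcribed directly (a sketch using the common BT.601-derived coefficients; production code would typically also offset and clamp the chrominance values):

```python
def rgb_to_yuv(r, g, b):
    # Forward conversion: Y carries luminance, U and V carry chrominance
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return y, u, v

def yuv_to_rgb(y, u, v):
    # Inverse conversion with the rounded coefficients given in the text
    r = y + 1.14 * v
    g = y - 0.39 * u - 0.58 * v
    b = y + 2.03 * u
    return r, g, b
```

For a neutral color such as pure white, U and V vanish and Y equals the input level, which is why the Y component alone can serve as the luminance component information.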
In one embodiment, as shown in fig. 6, performing color space conversion on each pixel point of the acquired first environment image includes:
s120: a target region in a first environmental image is identified. The target area is a valuable area in the first ambient image that can be used for obtaining luminance information. For example, the ambient image has dark home and white walls. The white wall surface can be used as the target area. Dark-color homes are not highly sensitive to light, so even after color space conversion is carried out, the dark homes are also darker in luminance space components, and the reference value of luminance component information is not high.
S130: and carrying out color space conversion on each pixel point in the target area.
The present embodiment acquires a luminance detection region that is more valuable for determining ambient luminance by identifying a target region in the first ambient image.
In an embodiment, as shown in fig. 7, acquiring initial luminance information of the first environment image according to the converted pixel points includes:
s210: and acquiring the brightness component information of each pixel point after conversion. For example, when the first environment image in the RGB format is converted into the first environment image in the YUV format, the Y component represents luminance component information. When the first environment image in the RGB format is converted into the first environment image in the HSV format, the V component represents luminance component information.
S220: and acquiring initial brightness information of the first environment image according to the converted brightness component information of each pixel point.
In one example, the initial brightness information of the first environment image is obtained by calculating an average value and/or a variance value of the brightness component information of each pixel point.
According to the embodiment, the initial brightness information of the first environment image can be more accurately obtained by utilizing the brightness component information of each pixel point.
To obtain a more accurate ambient brightness value, the initial brightness information needs to be corrected with the contrast information, which is very important in brightness judgment. For example, even for dark furniture, the brightness contrast between it and the surrounding white wall is important information: the brighter the environment, the more pronounced the color contrast between the dark furniture and the white wall; the darker the environment, the weaker that contrast.
In one embodiment, as shown in fig. 8, acquiring contrast information of a first environment image includes:
s310: and acquiring the gray value of each pixel point in the first environment image. The gray value obtaining mode of each pixel point can be selected according to the requirement. For example, the gray value of the pixel point is obtained from the pixel point data of the image after color space conversion. The gray value of the pixel point can also be obtained through the data of each pixel point expressed in the RGB format in the first environment image.
A grayscale image differs from a black-and-white image: in the computer imaging field, a black-and-white image has only the two colors black and white, while a grayscale image has many levels of color depth between black and white. Any color is composed of the three primary colors red, green, and blue; if the original color of a point is (R, G, B), it can be converted to grayscale by any of the following methods:
1. Floating-point method: Gray = R*0.3 + G*0.59 + B*0.11
2. Integer method: Gray = (R*30 + G*59 + B*11) / 100
3. Shift method: Gray = (R*77 + G*151 + B*28) >> 8
4. Average method: Gray = (R + G + B) / 3
5. Green-only method: Gray = G
After Gray is obtained by any of the above methods, R, G, and B in the original (R, G, B) are all replaced with Gray to form the new color (Gray, Gray, Gray); replacing the original (R, G, B) with (Gray, Gray, Gray) yields the grayscale image.
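The five conversions above can be sketched directly (the function names are illustrative, not from the patent):

```python
def gray_float(r, g, b):
    # 1. Floating-point method: Gray = R*0.3 + G*0.59 + B*0.11
    return r * 0.3 + g * 0.59 + b * 0.11

def gray_int(r, g, b):
    # 2. Integer method: Gray = (R*30 + G*59 + B*11) / 100
    return (r * 30 + g * 59 + b * 11) // 100

def gray_shift(r, g, b):
    # 3. Shift method: Gray = (R*77 + G*151 + B*28) >> 8
    return (r * 77 + g * 151 + b * 28) >> 8

def gray_avg(r, g, b):
    # 4. Average method: Gray = (R + G + B) / 3
    return (r + g + b) // 3

def to_gray_image(pixels):
    # Replace each (R, G, B) with (Gray, Gray, Gray), as described above;
    # the integer method is used here, but any of the five would do.
    return [(gray_int(r, g, b),) * 3 for r, g, b in pixels]
```

The shift method is the integer method scaled to a power-of-two denominator (77 + 151 + 28 = 256), so the division becomes a right shift.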
S320: and calculating the gray difference between adjacent pixel points according to the gray value of each pixel point in the first environment image.
S330: according to the gray difference (color change information) between adjacent pixel points, the distribution probability (color statistical information) of the adjacent pixel points with the same gray value difference is obtained.
S340: and calculating the contrast information of the environment image according to the gray difference between the adjacent pixel points and the distribution probability of the adjacent pixel points with the same gray value difference.
In one example, calculating contrast information for an image of an environment may employ the following formula:
C = Σ_δ [δ(i,j)² · P_δ(i,j)]

δ(i,j) = |i - j|

where C denotes the contrast information of the environment image, δ(i,j) = |i - j| denotes the gray difference between adjacent pixel points, P_δ(i,j) denotes the distribution probability of adjacent pixel points whose gray difference is δ, and i and j denote the gray values of adjacent pixel points.
According to this embodiment, using the gray value of each pixel point, the gray difference between adjacent pixel points, and the distribution probability of adjacent pixel points with the same gray difference, the brightness at each position of the environment image can be accurately characterized, so that the contrast information of the environment image can be further calculated.
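Steps S310 to S340 can be sketched as follows; treating "adjacent pixel points" as 4-connected neighbours is an assumption, since the patent does not fix the neighbourhood:

```python
from collections import Counter

def contrast(gray, width, height):
    # gray: row-major list of per-pixel gray values (S310).
    # Collect the gray differences |i - j| between 4-connected
    # neighbours (S320), counting each right/down pair once.
    diffs = []
    for y in range(height):
        for x in range(width):
            g = gray[y * width + x]
            if x + 1 < width:
                diffs.append(abs(g - gray[y * width + x + 1]))
            if y + 1 < height:
                diffs.append(abs(g - gray[(y + 1) * width + x]))
    # Distribution probability of each gray difference delta (S330).
    counts = Counter(diffs)
    total = len(diffs)
    # C = sum over delta of delta^2 * P(delta) (S340).
    return sum(d * d * c / total for d, c in counts.items())
```

A checkerboard-like patch gives a high contrast value, while a uniform patch gives zero, matching the intuition behind the brightness correction.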
In one embodiment, as shown in fig. 9, the method further includes:
S600: adjusting the screen brightness of the mobile terminal according to the ambient brightness value. The mobile terminal is provided with an image acquisition device.
In one example, adjusting the screen brightness of the mobile terminal according to the ambient brightness value includes:
and acquiring the current screen brightness information of the mobile terminal.
And adjusting the screen brightness of the mobile terminal according to the adaptation degree of the environment brightness value and the current screen brightness information.
According to this embodiment, the screen brightness of the mobile terminal can be adjusted using the ambient brightness value calculated from the environment image, without a light sensor, which saves manufacturing cost of the mobile terminal and frees more space for its screen.
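The patent does not specify the mapping from ambient brightness value to screen level, nor a numeric definition of "adaptation degree"; the following sketch simply steps the current screen brightness toward a target proportional to the ambient value (the target curve and step size are assumptions):

```python
def adjust_screen_brightness(ambient, current, max_level=255, step=16):
    # Map the ambient brightness value (0..255) to a target screen
    # level; a linear mapping is used here purely for illustration.
    target = min(max_level, int(ambient))
    # If current brightness is already close to the target, snap to it;
    # otherwise move toward it by at most `step` per adjustment cycle.
    if abs(target - current) <= step:
        return target
    return current + step if target > current else current - step
```

Stepping rather than jumping avoids visible flicker when the estimated ambient value fluctuates between frames.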
In one embodiment, as shown in fig. 10, the ambient brightness detection method includes:
and preprocessing the second environment image acquired by the image acquisition device according to the hardware parameter and/or the automatic exposure compensation parameter of the image acquisition device to obtain a first environment image.
And performing color space conversion on each pixel point of the acquired first environment image.
And acquiring initial brightness information of the first environment image according to the converted brightness components of the pixel points.
Contrast information of the first environment image is acquired.
And acquiring reflection information in the first environment image.
And acquiring an environment brightness value according to the initial brightness information, the contrast information and the reflection information of the first environment image.
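The final step combines the initial brightness, contrast, and reflection information, either by weighting or by a preset functional relation; a minimal weighted-sum sketch of that step (the weight values are illustrative placeholders, not from the patent):

```python
def ambient_brightness(initial, contrast, reflection,
                       weights=(0.6, 0.3, 0.1)):
    # Weighted combination of the initial brightness information,
    # contrast information, and reflection information of the first
    # environment image; the weights here are assumptions.
    w1, w2, w3 = weights
    return w1 * initial + w2 * contrast + w3 * reflection
```

In practice the weights (or the preset functional relation) would be tuned so that the combined value tracks measured ambient brightness.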
According to a second embodiment of the present application, an ambient brightness detection apparatus 100 is provided, as shown in fig. 11. The apparatus includes:
the conversion module 10 is configured to perform color space conversion on each pixel point of the acquired first environment image.
The first obtaining module 20 is configured to obtain initial brightness information of the first environment image according to the luminance component information of the converted pixel points.
The second obtaining module 30 is configured to obtain contrast information of the first environment image.
And a third obtaining module 40, configured to obtain an environment brightness value according to the initial brightness information and the contrast information of the first environment image.
In one embodiment, as shown in fig. 12, the third obtaining module 40 includes:
the first obtaining sub-module 41 is configured to obtain reflection information in the first environment image.
And the second obtaining submodule 42 is configured to obtain an environment brightness value according to the initial brightness information, the contrast information, and the reflection information of the first environment image.
In one example, the first obtaining sub-module 41 includes:
and the separation unit is used for carrying out layer separation on the first environment image.
And the screening unit is used for screening out the layer with the reflection from the separated layers.
In one embodiment, as shown in fig. 13, the method further includes:
and the preprocessing module 50 is configured to preprocess the second environment image acquired by the image acquisition device according to the hardware parameter and/or the automatic exposure compensation parameter of the image acquisition device, so as to obtain the first environment image.
In one embodiment, as shown in fig. 14, the conversion module 10 includes:
the first conversion submodule 11 is configured to convert each pixel point of the first environment image from a primary color space to a color-to-luminance separation color space.
In one embodiment, as shown in fig. 15, the conversion module 10 includes:
the identification submodule 12 is configured to identify a target region in the first environment image.
And the second conversion submodule 13 is configured to perform color space conversion on each pixel point in the target area.
In one embodiment, as shown in fig. 16, the first obtaining module 20 includes:
the third obtaining submodule 21 is configured to obtain luminance component information of each converted pixel point.
The fourth obtaining submodule 22 is configured to obtain initial luminance information of the first environment image according to the converted luminance component information of each pixel point.
In one embodiment, as shown in fig. 17, the second obtaining module 30 includes:
the fifth obtaining submodule 31 is configured to obtain a gray value of each pixel in the first environment image.
The first calculating submodule 32 is configured to calculate a gray difference between adjacent pixels according to a gray value of each pixel in the first environment image.
And a sixth obtaining submodule 33, configured to obtain, according to the gray difference between adjacent pixel points, the distribution probability of adjacent pixel points with the same gray value difference.
And the second calculating submodule 34 is configured to calculate contrast information of the environment image according to the gray level difference between adjacent pixel points and the distribution probability of the adjacent pixel points with the same gray level difference.
In one embodiment, as shown in fig. 18, the method further includes:
and the adjusting module 60 is configured to adjust the screen brightness of the mobile terminal according to the environment brightness value, where the mobile terminal has an image acquisition device.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 19, it is a block diagram of an electronic device according to the ambient brightness detection method of the embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 19, the electronic apparatus includes: one or more processors 901, memory 902, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information for a Graphical User Interface (GUI) on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 19, one processor 901 is taken as an example.
Memory 902 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by at least one processor to cause the at least one processor to perform the ambient brightness detection method provided by the present application. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to perform the method of ambient brightness detection provided herein.
The memory 902, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules (e.g., the conversion module 10, the first acquisition module 20, the second acquisition module 30, and the third acquisition module 40 shown in fig. 11) corresponding to the method for detecting ambient brightness in the embodiments of the present application. The processor 901 executes various functional applications of the server and data processing, i.e., implements the method of ambient brightness detection in the above method embodiments, by running non-transitory software programs, instructions, and modules stored in the memory 902.
The memory 902 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device for ambient brightness detection, and the like. Further, the memory 902 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 902 may optionally include memory located remotely from the processor 901, which may be connected to the ambient brightness detection electronics over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the method of ambient brightness detection may further include: an input device 903 and an output device 904. The processor 901, the memory 902, the input device 903, and the output device 904 may be connected by a bus or other means, and fig. 19 illustrates an example of connection by a bus.
The input device 903 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus for ambient brightness detection, such as an input device like a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, etc. The output devices 904 may include a display device, auxiliary lighting devices (e.g., LEDs), tactile feedback devices (e.g., vibrating motors), and the like. The Display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) Display, and a plasma Display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be implemented in digital electronic circuitry, Integrated circuitry, Application Specific Integrated Circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (Cathode Ray Tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solution of the embodiments of the present application, the real ambient brightness value of the environment in which the image was captured can be accurately calculated using the acquired initial brightness information and contrast information of the first environment image. Because the technical means of calculating the real ambient brightness value from the initial brightness information and contrast information of the image is adopted, the technical problem that ambient brightness could only be determined by a light sensor is solved, achieving the technical effect that ambient brightness can be obtained from an acquired image without a light sensor.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and the present invention is not limited thereto as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (17)

1. An ambient brightness detection method, comprising:
performing color space conversion on each pixel point of the acquired first environment image to obtain brightness component information;
acquiring initial brightness information of the first environment image according to the converted brightness component information of each pixel point;
acquiring contrast information of the first environment image;
acquiring reflection information in the first environment image;
and combining the initial brightness information, the contrast information and the reflection information of the first environment image in a weighting mode or multiplying the three according to a preset functional relation to obtain an environment brightness value.
2. The method of claim 1, wherein obtaining reflection information in the first environmental image comprises:
carrying out layer separation on the first environment image;
screening out the layer with the reflection from the separated layers;
and acquiring reflection information from the layer with the reflection.
3. The method of claim 1, wherein before performing the color space conversion on each pixel point of the acquired environment image, the method further comprises:
and preprocessing the second environment image acquired by the image acquisition device according to hardware parameters and/or automatic exposure compensation parameters of the image acquisition device to obtain the first environment image.
4. The method according to claim 1, wherein performing color space conversion on each pixel point of the acquired first environment image to obtain luminance component information comprises:
and converting each pixel point of the first environment image from a primary color space to a color and brightness separation color space to obtain brightness component information of the color and brightness separation color space.
5. The method according to claim 1, wherein performing color space conversion on each pixel point of the acquired first environment image comprises:
identifying a target region in the first environmental image;
and carrying out color space conversion on each pixel point in the target area.
6. The method according to claim 1, wherein obtaining initial luminance information of the first environmental image according to the converted luminance component information of each pixel point comprises:
acquiring brightness component information of each converted pixel point;
and acquiring initial brightness information of the first environment image according to the converted brightness component information of each pixel point.
7. The method of claim 1, wherein obtaining contrast information for the first environmental image comprises:
acquiring the gray value of each pixel point in the first environment image;
calculating the gray difference between adjacent pixel points according to the gray value of each pixel point in the first environment image;
acquiring the distribution probability of adjacent pixel points with the same gray difference according to the gray difference between the adjacent pixel points;
and calculating the contrast information of the environment image according to the gray difference between the adjacent pixel points and the distribution probability of the adjacent pixel points with the same gray difference.
8. The method of claim 1, further comprising:
and adjusting the screen brightness of the mobile terminal according to the environment brightness value.
9. An ambient brightness detection apparatus, comprising:
the conversion module is used for performing color space conversion on each pixel point of the acquired first environment image to obtain brightness component information;
the first obtaining module is used for obtaining initial brightness information of the first environment image according to the converted brightness component information of each pixel point;
the second acquisition module is used for acquiring contrast information of the first environment image;
a third acquisition module comprising a first acquisition submodule and a second acquisition submodule, wherein,
the first obtaining submodule is used for obtaining reflection information in the first environment image;
and the second obtaining submodule is used for combining the initial brightness information, the contrast information and the reflection information of the first environment image in a weighting mode or multiplying the initial brightness information, the contrast information and the reflection information according to a preset functional relation to obtain an environment brightness value.
10. The apparatus of claim 9, further comprising:
and the preprocessing module is used for preprocessing the second environment image acquired by the image acquisition device according to hardware parameters and/or automatic exposure compensation parameters of the image acquisition device to obtain the first environment image.
11. The apparatus of claim 9, wherein the conversion module comprises:
and the first conversion submodule is used for converting each pixel point of the first environment image from a primary color space to a brightness separation color space to obtain brightness component information of the brightness separation color space.
12. The apparatus of claim 9, wherein the conversion module comprises:
the identification sub-module is used for identifying a target area in the first environment image;
and the second conversion submodule is used for performing color space conversion on each pixel point in the target area.
13. The apparatus of claim 9, wherein the first obtaining module comprises:
a third obtaining submodule, configured to obtain luminance component information of each converted pixel point;
and the fourth obtaining submodule is used for obtaining the initial brightness information of the first environment image according to the converted brightness component information of each pixel point.
14. The apparatus of claim 9, wherein the second obtaining module comprises:
a fifth obtaining submodule, configured to obtain a gray value of each pixel point in the first environment image;
the first calculation submodule is used for calculating the gray difference between adjacent pixel points according to the gray value of each pixel point in the first environment image;
the sixth obtaining submodule is used for obtaining the distribution probability of the adjacent pixel points with the same gray difference according to the gray difference between the adjacent pixel points;
and the second calculation submodule is used for calculating the contrast information of the environment image according to the gray difference between the adjacent pixel points and the distribution probability of the adjacent pixel points with the same gray difference.
15. The apparatus of claim 9, further comprising:
and the adjusting module is used for adjusting the screen brightness of the mobile terminal according to the environment brightness value.
16. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
17. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-8.
CN201910736308.5A 2019-08-09 2019-08-09 Ambient brightness detection method and device Active CN110458826B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910736308.5A CN110458826B (en) 2019-08-09 2019-08-09 Ambient brightness detection method and device


Publications (2)

Publication Number Publication Date
CN110458826A CN110458826A (en) 2019-11-15
CN110458826B true CN110458826B (en) 2022-06-03

Family

ID=68485748

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910736308.5A Active CN110458826B (en) 2019-08-09 2019-08-09 Ambient brightness detection method and device

Country Status (1)

Country Link
CN (1) CN110458826B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113055580B (en) * 2019-12-26 2023-10-03 中兴通讯股份有限公司 Environment recognition method, shooting mode switching method, terminal and storage medium
CN111626310B (en) * 2020-05-27 2023-08-29 百度在线网络技术(北京)有限公司 Image comparison method, device, equipment and storage medium
CN113379650B (en) * 2021-07-22 2023-03-17 浙江大华技术股份有限公司 Face image exposure method and device, electronic equipment and storage medium
CN113936221B (en) * 2021-12-17 2022-05-13 北京威摄智能科技有限公司 Method and system applied to highway environment monitoring in plateau area
CN116564239A (en) * 2023-05-18 2023-08-08 深圳市瀚达美电子有限公司 Vehicle-mounted miniLED backlight module and control method thereof

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101126661A (en) * 2007-09-20 2008-02-20 北京中星微电子有限公司 Method and apparatus for determining ambient light
CN101750848A (en) * 2008-12-11 2010-06-23 鸿富锦精密工业(深圳)有限公司 Pick-up device and light filling method
CN103856823A (en) * 2012-12-06 2014-06-11 腾讯科技(深圳)有限公司 Interface adjustment method, device and terminal
CN104715736A (en) * 2013-12-16 2015-06-17 炬芯(珠海)科技有限公司 Backlight automatic adjusting method and device for electronic device
CN105005390A (en) * 2015-08-11 2015-10-28 宇龙计算机通信科技(深圳)有限公司 Terminal adjusting method, terminal adjusting device, and terminal
CN108600649A (en) * 2018-04-26 2018-09-28 莆田市烛火信息技术有限公司 A kind of Intelligent House Light brightness acquisition and control method
CN108986199A (en) * 2018-06-14 2018-12-11 北京小米移动软件有限公司 Dummy model processing method, device, electronic equipment and storage medium
CN110021257A (en) * 2019-04-02 2019-07-16 Oppo广东移动通信有限公司 Display brightness method of adjustment and relevant device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6415022B2 (en) * 2013-05-08 2018-10-31 キヤノン株式会社 Image processing apparatus, image processing method, and program
US9154697B2 (en) * 2013-12-06 2015-10-06 Google Inc. Camera selection based on occlusion of field of view


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于彩色CCD相机的船舶夜航光环境亮度测量;黄成 等;《光电工程》;20180715;第1-6页 *

Also Published As

Publication number Publication date
CN110458826A (en) 2019-11-15

Similar Documents

Publication Publication Date Title
CN110458826B (en) Ambient brightness detection method and device
US11546567B2 (en) Multimodal foreground background segmentation
CN107810505B (en) Machine learning of real-time image capture parameters
US10165194B1 (en) Multi-sensor camera system
CN111986178A (en) Product defect detection method and device, electronic equipment and storage medium
US20150264278A1 (en) System and Method for Estimating an Ambient Light Condition Using an Image Sensor and Field-of-View Compensation
CN110728705B (en) Image processing method, image processing device, storage medium and electronic equipment
CN110519485B (en) Image processing method, image processing device, storage medium and electronic equipment
CN108881875B (en) Image white balance processing method and device, storage medium and terminal
US10812744B2 (en) Defective pixel compensation method and device
CN107909569B (en) Screen-patterned detection method, screen-patterned detection device and electronic equipment
CN112184837A (en) Image detection method and device, electronic equipment and storage medium
CN111028276A (en) Image alignment method and device, storage medium and electronic equipment
CN113132695A (en) Lens shadow correction method and device and electronic equipment
CN106954022B (en) Image processing method, device and terminal
CN109040729B (en) Image white balance correction method and device, storage medium and terminal
CN111401248B (en) Sky area identification method and device, electronic equipment and storage medium
CN111031256B (en) Image processing method, image processing device, storage medium and electronic equipment
KR20230041648A (en) Multi-frame depth-based multi-camera relighting of images
CN116668843A (en) Shooting state switching method and device, electronic equipment and storage medium
JP2011029710A (en) Image processor, image processing program, and imaging apparatus
JP4947105B2 (en) Image processing apparatus, image processing program, and imaging apparatus
JP2015233202A (en) Image processing apparatus, image processing method, and program
CN113688900A (en) Radar and visual data fusion processing method, road side equipment and intelligent traffic system
CN115714925A (en) Sensor, image generation method and device and camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210507

Address after: 100085 Baidu Building, 10 Shangdi Tenth Street, Haidian District, Beijing

Applicant after: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

Applicant after: Shanghai Xiaodu Technology Co.,Ltd.

Address before: 100085 Baidu Building, 10 Shangdi Tenth Street, Haidian District, Beijing

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant