CN113824894A - Exposure control method, device, equipment and storage medium - Google Patents

Exposure control method, device, equipment and storage medium

Info

Publication number
CN113824894A
CN113824894A (application number CN202111017862.1A)
Authority
CN
China
Prior art keywords: image, brightness, calculated, face, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111017862.1A
Other languages
Chinese (zh)
Inventor
王洋冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Intellifusion Technologies Co Ltd filed Critical Shenzhen Intellifusion Technologies Co Ltd
Priority to CN202111017862.1A
Publication of CN113824894A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/72: Combination of two or more compensation controls
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/61: Control of cameras or camera modules based on recognised objects
    • H04N 23/611: Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Abstract

The embodiment of the application provides an exposure control method, an exposure control device, exposure control equipment and a storage medium, wherein the method comprises the following steps: acquiring an image to be calculated, and determining a face region in the image to be calculated and the total brightness of the pixels in each square region of the image to be calculated that takes the same point as a common vertex; calculating the total brightness of the face region according to the total brightness of the pixels in the square regions; determining the average brightness of the face region of the image to be calculated according to the total brightness of the face region; and executing an exposure control operation according to the average brightness. With the method and the device, the brightness of the face region does not need to be computed by cropping out its image pixels; the exposure of the camera device can be efficiently compensated and controlled according to the average brightness determined from the square regions, a clearer face image is obtained, and the accuracy of face recognition is improved.

Description

Exposure control method, device, equipment and storage medium
Technical Field
The present application relates to the field of imaging, and in particular, to an exposure control method, apparatus, device, and storage medium.
Background
Face recognition technology detects and judges an input face image or video stream based on facial features. In the recognition process, it is first judged whether a human face exists. If a human face exists, the position and size of each face and the position information of each major facial organ are further given. Based on this information, the identity features implied in each face are extracted and compared with known faces, so that the identity of each face is recognized.
Before face recognition is performed, a face image needs to be acquired, and the brightness of the face image is affected by the ambient light. If the brightness of the face is too high or too low, the accuracy of face recognition may be affected, so the image is usually subjected to exposure compensation before the face image is acquired. When exposure compensation is carried out, the average brightness of the pixels in the area where the human face is located needs to be determined. This calculation process is complex, so the efficiency of exposure compensation is not high, which is not conducive to quickly obtaining a clear face image.
Disclosure of Invention
In view of this, embodiments of the present application provide an exposure control method, an apparatus, a device, and a storage medium, so as to solve the problem in the prior art that, when exposure compensation is performed, the average brightness of the pixels in the area where the human face is located needs to be determined through a complex calculation process, so that the efficiency of exposure compensation is not high and a clear face image cannot be obtained quickly.
A first aspect of an embodiment of the present application provides an exposure control method, including:
acquiring an image to be calculated, determining a face region in the image to be calculated and the total brightness of pixels in a square region which takes the same point as a common vertex and is included in the image to be calculated;
calculating the total brightness of the face area according to the total brightness of the pixels in the square area;
determining the average brightness of the face area of the image to be calculated according to the total brightness of the face area;
and executing exposure control operation according to the average brightness.
A second aspect of an embodiment of the present application provides an exposure control apparatus, including:
the image determining unit is used for acquiring an image to be calculated, determining a face area in the image to be calculated and the total brightness of pixels in a square area which takes the same point as a common vertex and is included in the image to be calculated;
the total brightness calculation unit is used for calculating the total brightness of the face area according to the total brightness of the pixels in the square area;
the average brightness determining unit is used for determining the average brightness of the face area of the image to be calculated according to the total brightness of the face area;
and the exposure control unit is used for executing exposure control operation according to the average brightness.
A third aspect of embodiments of the present application provides an exposure control apparatus, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, which, when executed by a processor, performs the steps of the method according to any one of the first aspect.
Compared with the prior art, the embodiments of the application have the following advantages: the total brightness of the pixels in each square region of the image to be calculated is determined first, and the total brightness of different face regions can then be calculated quickly from the determined totals of the square regions; the average brightness of a face region is calculated from its total brightness. The brightness of a face region therefore does not need to be computed by cropping out its image pixels, the exposure of the camera device can be efficiently compensated and controlled according to the average brightness determined from the square regions, a clearer face image is obtained, and the accuracy of face recognition is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart illustrating an implementation of an exposure control method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a square region of a gray scale image provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of the total brightness of an image to be calculated according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a calculation of total luminance of a face region according to an embodiment of the present application;
fig. 5 is a schematic diagram of an exposure control apparatus according to an embodiment of the present application;
fig. 6 is a schematic diagram of an exposure control apparatus provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
When a camera device is used to take a photo or shoot video, if a face is included in the captured image, the area where the face is located is usually the main subject of the image, and the image quality of this subject area needs to be ensured preferentially. For example, it is necessary to ensure that the exposure of the subject area of the image is reasonable, that the subject area is within the focus range, and so on. Based on this, the embodiment of the present application provides an exposure control method, which improves the efficiency of exposure control while ensuring effective exposure of the image subject.
Fig. 1 is a schematic flow chart illustrating an implementation of an exposure control method according to an embodiment of the present disclosure. As shown in fig. 1, the method includes:
in S101, an image to be calculated is obtained, a face region in the image to be calculated and a total luminance of pixels in a square region including the same point as a common vertex in the image to be calculated are determined.
The image to be calculated may be a preview image obtained when the camera device takes a picture. By performing portrait detection on the preview image, detecting the brightness of the face region of the detected portrait, and controlling the exposure of the camera device according to the detected brightness, the portrait in the captured photo can be made clearer.
The imaging device includes a video camera, a digital camera, a surveillance camera, an in-vehicle camera, and the like.
The image to be calculated may be an image acquired before image acquisition for face recognition. The average brightness of the face area is determined by carrying out face detection on the collected image, and the exposure control is carried out on the camera device according to the average brightness, so that the collected image for face recognition is clearer, and the accuracy of face recognition is improved.
Alternatively, the image to be calculated may also be an image in a video captured by an imaging device. The brightness of the human face area of the image shot by the camera device is detected, and the exposure time of the camera device is adjusted, so that a clearer video image is obtained.
The image to be calculated may include one or more than one portrait, and correspondingly, may include one or more than one face area.
When determining the face region and the brightness information in the image to be calculated, the image to be calculated can first be converted: into a color image in a specific format for face recognition, and into a grayscale image for determining the brightness information in the image to be calculated.
For example, the acquired image to be calculated may be in YUV format, a color encoding commonly used when capturing and encoding photos or videos. The YUV format allows the bandwidth of the chrominance components to be reduced, taking human visual perception into account.
When the image to be calculated is converted into a color image, it may be converted into an image in BGR format, but the conversion is not limited to this; other color image formats, such as CMYK, may also be used.
When recognizing whether the image to be calculated includes a face region from the color image, a face recognition algorithm such as face feature comparison may be adopted to determine the one or more face regions included in the image to be calculated. A face region may be identified by a square region (which may also be referred to as a rectangular region).
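As a purely illustrative sketch of this conversion and detection step (it is not part of the claimed method), the following Python code uses OpenCV; the NV21 sub-format of YUV and the Haar-cascade face detector are assumptions made for the example, and any equivalent conversion routine or face detector could be substituted:

```python
import cv2
import numpy as np

def split_color_and_gray(yuv_nv21: np.ndarray):
    """Convert a YUV (NV21) frame of shape (h * 3 // 2, w), dtype uint8,
    into a BGR color image for face detection and a grayscale image whose
    pixel values serve as the brightness used in the steps below."""
    bgr = cv2.cvtColor(yuv_nv21, cv2.COLOR_YUV2BGR_NV21)
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    return bgr, gray

def detect_face_boxes(bgr: np.ndarray):
    """Return face regions as (x, y, w, h) rectangles.
    A Haar cascade is used only for illustration; the application itself
    refers generally to face recognition such as face feature comparison."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    return detector.detectMultiScale(
        cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY),
        scaleFactor=1.1, minNeighbors=5)
```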
After the image to be calculated is converted into a grayscale image, the brightness of each pixel in the grayscale image can be determined. Assuming the gray values of the pixels in the grayscale image lie in [0, 255], the closer the gray value is to 0, the lower the brightness of the image, and the higher the gray value, the higher the brightness. The brightness of a pixel can be represented directly by its gray value, or the brightness corresponding to the gray value can be determined by another mapping.
To facilitate the subsequent brightness calculation of the face region, the embodiment of the application uses the same pixel point as the common vertex of the plurality of square regions generated for the converted grayscale image. The common vertex of the generated square regions can be any point in the image to be calculated or in the grayscale image, or a vertex of the image to be calculated or of the grayscale image. For example, the top-left, bottom-left, top-right, or bottom-right vertex of the image to be calculated may be used as the common vertex of the generated square regions. When the common vertex is chosen at a position other than a vertex of the image to be calculated or of the grayscale image, a larger number of square regions can be generated.
Fig. 2 is a schematic diagram of a square region of a grayscale image according to an embodiment of the present application. In the schematic diagram, the vertex at the upper-left corner is used as the common vertex of the generated square regions, and square regions of different sizes can be generated according to the different positions selected for the lower-right vertex. For example, the square region generated in fig. 2 is the intersection of the first 5 rows of pixels and the first 7 columns of pixels. The total brightness of the square region is the sum of the brightness of all the pixels in the region, 5 × 7 = 35 pixels in total.
After the total brightness of the square region shown in fig. 2 has been determined, if the total brightness of 5 × 8 pixels, that is, of the intersection of the first 5 rows and the first 8 columns of pixels, needs to be determined, the brightness of the first 5 pixels of the 8th column can be added to the total determined in fig. 2. In this stepwise manner, the total brightness of each generated square region can be determined quickly.
Similarly, the total brightness of the square region shown in fig. 2 can itself be obtained incrementally: the total brightness of 5 × 6 pixels, i.e., of the intersection of the first 5 rows and the first 6 columns, may be determined first, and the brightness of the first 5 pixels of the 7th column added on that basis.
Alternatively, the total brightness of the 5 × 8 region can be obtained by first determining the total brightness of 4 × 8 pixels, i.e., of the intersection of the first 4 rows and the first 8 columns, and then adding the brightness of the first 8 pixels of the 5th row on that basis.
Taking the image shown in fig. 2 as the image to be calculated, the square regions generated with the top-left corner as the common vertex may include regions such as 1 × 1, 2 × 2, 2 × 3, 2 × 4, 2 × 5, 2 × 6, 2 × 7, 2 × 8, 3 × 2, 3 × 3, 3 × 4, 3 × 5, 3 × 6, 3 × 7, 3 × 8, …, 11 × 7, 11 × 8, and the like. Following this small-to-large order, the total brightness of each square region can be calculated step by step on the basis of the totals of smaller square regions, which improves the efficiency of the brightness calculation.
After the total brightness of each square region is obtained, the totals of the square regions may be assembled into a brightness map. For example, in the schematic diagram of the total brightness of the image to be calculated shown in fig. 3, when the common vertex of the generated square regions is the top-left vertex, the position at which the total brightness of a region is recorded in the total brightness map (or total brightness diagram) may be determined according to the position of the lower-right corner of that square region (for example, the dark pixels in fig. 2). The total brightness map thus corresponds one-to-one with the pixels in the grayscale image, and the total brightness of a square region is recorded at the position corresponding to the region's lower-right corner. For example, the lower-right corner of the square region shown in fig. 2 is at the 5th row and the 7th column, so the total brightness of that square region is recorded at the 5th row and 7th column of the total brightness map.
Of course, this is not limiting: depending on the position of the common vertex, the position of a region's total brightness in the total brightness map may be determined from a vertex at a different position of the square region. For example, when the common vertex is the lower-right vertex, the position of the total brightness of a square region in the total brightness map may be determined according to the position of that region's upper-left vertex.
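For illustration only, a minimal sketch (assuming the top-left pixel is chosen as the common vertex) of how such a total brightness map, essentially a summed-area table, could be built from the grayscale image with cumulative sums; the function name is an assumption of the example:

```python
import numpy as np

def build_total_brightness_map(gray: np.ndarray) -> np.ndarray:
    """Entry (r, c) holds the total brightness of the square region covering
    rows 0..r and columns 0..c, i.e. the region whose lower-right corner is
    (r, c) and whose common vertex is the top-left pixel of the image."""
    # A wide integer type prevents overflow when many 8-bit pixel values are summed.
    return gray.astype(np.int64).cumsum(axis=0).cumsum(axis=1)

# Example: the total brightness of the region spanning the first 5 rows and
# first 7 columns (the region of fig. 2) is read at index [4, 6]:
#   total_map = build_total_brightness_map(gray)
#   region_total = total_map[4, 6]
```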
In S102, the total brightness of the face region is calculated according to the total brightness of the pixels in the square region.
Through face recognition calculation on the color image, the face region included in the image to be calculated can be determined, and the face region may be represented by a square region. In the schematic diagram of calculating the total brightness of the face region shown in fig. 4, the square region shown in gray represents the face region.
After the total brightness of each square area with a common vertex included in the image to be calculated is predetermined, the square area for calculating the total brightness of the face area can be selected according to the vertex of the face area, and the total brightness of the face area can be rapidly determined according to the selected square area.
For example, in the schematic diagram of calculating the total brightness of the face region shown in fig. 4, the face region has four vertices, A, B, C and D. According to the vertices of the face region and the common vertex of the predetermined square regions, four square regions are determined: a square region A′ is determined from the vertex A and the common vertex, a square region B′ from the vertex B and the common vertex, a square region C′ from the vertex C and the common vertex, and a square region D′ from the vertex D and the common vertex.
The total brightness of the face region is then calculated by adding and subtracting the totals of the square regions found from the vertices of the face region. For example, the face region P shown in fig. 4 can be expressed as: P = A′ - B′ - C′ + D′.
According to the correspondence between the totals recorded in the total brightness map generated from the square regions and the square regions themselves, the total brightness corresponding to the square regions A′, B′, C′ and D′ can be looked up quickly, so that the total brightness of the face region can be calculated quickly.
When the image to be calculated includes two or more face regions, the square regions used for calculating the total brightness of each face region can be determined from the positions of the vertices of that face region in the manner shown in fig. 4, and the corresponding totals looked up in the total brightness map, so that the total brightness of each face region can be obtained efficiently.
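A minimal sketch of this four-lookup calculation, assuming the total brightness map built above (entry (r, c) is the sum over rows 0..r and columns 0..c) and a face rectangle given by its top-left corner (top, left) and bottom-right corner (bottom, right); which lookup corresponds to which of the vertices A to D in fig. 4 depends on the labelling in the figure:

```python
import numpy as np

def face_region_total(total_map: np.ndarray,
                      top: int, left: int, bottom: int, right: int) -> int:
    """Total brightness of the face rectangle obtained by adding and
    subtracting four entries of the total brightness map, in the spirit of
    P = A' - B' - C' + D'."""
    a = total_map[bottom, right]                          # region up to the bottom-right vertex
    b = total_map[top - 1, right] if top > 0 else 0       # strip above the face region
    c = total_map[bottom, left - 1] if left > 0 else 0    # strip to the left of the face region
    d = total_map[top - 1, left - 1] if top > 0 and left > 0 else 0  # corner removed twice, added back
    return int(a - b - c + d)
```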
In S103, the average brightness of the face region of the image to be calculated is determined according to the total brightness of the face region.
After the total brightness of each face region in the image to be calculated is determined, the average brightness of the face regions of the image to be calculated can be determined by combining the calculated total brightness of each face region with a weight coefficient corresponding to each face. One way to determine the weight coefficients is to take, for each face region, the proportion of its area to the total area of all face regions.
For example, if the image to be calculated includes 3 face regions P1, P2 and P3, whose areas are S1, S2 and S3 and whose total brightness values are L1, L2 and L3, then when determining the average brightness of all face regions in the image to be calculated, the calculation may use the weight coefficients determined by the areas. The average brightness L can be expressed as:
L = L1 × S1/(S1+S2+S3) + L2 × S2/(S1+S2+S3) + L3 × S3/(S1+S2+S3).
Alternatively, in a possible implementation, the weight coefficients may be determined according to the actual scene. For example, during face recognition, when the image to be calculated includes two or more faces, the average brightness of each face region may be determined separately (in this case the weight coefficient of the face region whose average brightness is currently being calculated is 1, and the weight coefficients of the other face regions are 0). Two or more exposure compensation schemes are then determined from the two or more calculated average brightness values, and exposure compensation is performed sequentially according to these schemes, so that images exposure-compensated for the different face regions are obtained in turn, and face recognition can be performed on the multiple faces one after another using the sequentially obtained images.
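A short sketch of the weighted combination described above; it takes one brightness value L_i and one pixel area S_i per face region and applies the area-fraction weights of the three-face example, so the per-face scheme with 0/1 weights corresponds to passing a single region. The function name and argument names are assumptions of the example:

```python
def weighted_average_brightness(brightness_values, areas):
    """Combine per-face brightness values with area-fraction weights:
    L = sum_i L_i * (S_i / (S_1 + ... + S_n)), as in the formula above."""
    total_area = sum(areas)
    if total_area == 0:
        return 0.0
    return sum(value * (area / total_area)
               for value, area in zip(brightness_values, areas))
```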
In S104, an exposure control operation is performed according to the average brightness.
After the average brightness of all face regions in the image to be calculated, or the average brightness of each individual face region, has been determined, an exposure compensation scheme can be determined by comparing the determined average brightness with a preset brightness range.
For example, when the determined average brightness is smaller than the minimum value of the brightness range, the exposure duration may be increased, so that the average brightness of the face region in the adjusted exposure image falls within the preset brightness range and the definition of the face region is improved. If the determined average brightness is larger than the maximum value of the range, the exposure duration may be reduced, likewise bringing the average brightness of the face region in the adjusted exposure image into the preset brightness range and improving the definition of the face region.
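Purely as an illustration of this control step, a minimal sketch that nudges the exposure duration whenever the computed average brightness falls outside a preset range; the range bounds, the adjustment factor, and the microsecond unit of the exposure value are assumptions of the example, not values taken from the application:

```python
def adjust_exposure(exposure_time_us: float, average_brightness: float,
                    brightness_min: float = 100.0, brightness_max: float = 150.0,
                    factor: float = 1.2) -> float:
    """Lengthen the exposure when the face region is too dark, shorten it when
    the face region is too bright, and leave it unchanged when the average
    brightness already lies within the preset range."""
    if average_brightness < brightness_min:
        return exposure_time_us * factor   # below the range: increase exposure duration
    if average_brightness > brightness_max:
        return exposure_time_us / factor   # above the range: decrease exposure duration
    return exposure_time_us
```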
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 5 is a schematic diagram of an exposure control apparatus according to an embodiment of the present disclosure. As shown in fig. 5, the apparatus includes:
an image determining unit 501, configured to acquire an image to be calculated, determine a face region in the image to be calculated, and determine total brightness of pixels in a square region that uses the same point as a common vertex and is included in the image to be calculated;
a total brightness calculation unit 502, configured to calculate a total brightness of the face region according to a total brightness of pixels in the square region;
an average brightness determination unit 503, configured to determine an average brightness of a face region of the image to be calculated according to the total brightness of the face region;
an exposure control unit 504 for performing an exposure control operation according to the average brightness.
In one embodiment, the image determining unit 501 may include:
the image conversion subunit is used for reading an image to be calculated from the camera device and respectively converting the image to be calculated into a color image and a gray image;
and the image calculation subunit is used for determining a face area included in the image to be calculated according to the color image and determining the total brightness of pixels in a square area in the image to be calculated according to the gray image.
In one embodiment, the image calculation subunit may include:
a brightness determination module for determining brightness of pixels in the grayscale image;
the square area determining module is used for selecting any point of the image to be calculated as a vertex of the square area and determining the square area included in the image to be calculated;
and the total brightness determining module is used for determining the total brightness of the square area according to the determined brightness of the pixels in the square area.
In one embodiment, the total luminance calculating unit 502 may include:
the square area determining subunit is used for determining a plurality of square areas used for the face pixel calculation according to the vertexes of the face area;
and the total brightness operator unit is used for calculating the total brightness of the face region according to the determined total brightness of the plurality of square regions.
In one embodiment, the average brightness determination unit 503 may include:
the first acquisition subunit is used for acquiring the area of a face region included in the image to be calculated;
and the first average brightness determining subunit is used for determining the average brightness of the face region included in the image to be calculated according to the area of the face region and by combining the total brightness of the face region.
In one embodiment, when a plurality of faces are included in the image to be calculated, the average brightness determination unit 503 may include:
the second acquisition subunit is used for acquiring the total brightness of each face region in the image to be calculated and the weight coefficient of each face region;
and the second average brightness determining subunit is used for determining the average brightness of the face area of the image to be calculated according to the total brightness of each face area and the weight coefficient of each face area.
In one embodiment, the exposure control unit 504 may include:
the brightness comparison subunit is used for comparing the average brightness with a preset brightness range;
and the exposure control subunit is used for executing exposure control operation corresponding to the comparison result according to the comparison result of the average brightness and a preset brightness range.
The exposure control apparatus shown in fig. 5 corresponds to the exposure control method shown in fig. 1.
Fig. 6 is a schematic diagram of an exposure control apparatus according to an embodiment of the present application. As shown in fig. 6, the exposure control apparatus 6 of this embodiment includes: a processor 60, a memory 61 and a computer program 62, such as an exposure control program, stored in said memory 61 and executable on said processor 60. The processor 60 implements the steps in each of the above-described exposure control method embodiments when executing the computer program 62. Alternatively, the processor 60 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 62.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 62 in the exposure control apparatus 6.
The exposure control device may include, but is not limited to, a processor 60, a memory 61. It will be understood by those skilled in the art that fig. 6 is merely an example of the exposure control apparatus 6, and does not constitute a limitation of the exposure control apparatus 6, and may include more or less components than those shown, or combine some components, or different components, for example, the exposure control apparatus may further include an input-output device, a network access device, a bus, and the like.
The Processor 60 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the exposure control apparatus 6, such as a hard disk or a memory of the exposure control apparatus 6. The memory 61 may also be an external storage device of the exposure control apparatus 6, such as a plug-in hard disk provided on the exposure control apparatus 6, a Smart Media Card (SMC), a Secure Digital (SD) Card, a flash memory Card (FlashCard), and the like. Further, the memory 61 may also include both an internal storage unit and an external storage device of the exposure control device 6. The memory 61 is used to store the computer program and other programs and data required by the exposure control apparatus. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of the methods described above can be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain other components which may be suitably increased or decreased as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media which may not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An exposure control method, characterized in that the method comprises:
acquiring an image to be calculated, determining a face region in the image to be calculated and the total brightness of pixels in a square region which takes the same point as a common vertex and is included in the image to be calculated;
calculating the total brightness of the face area according to the total brightness of the pixels in the square area;
determining the average brightness of the face area of the image to be calculated according to the total brightness of the face area;
and executing exposure control operation according to the average brightness.
2. The method according to claim 1, wherein the steps of obtaining an image to be calculated, determining a face region in the image to be calculated, and determining the total brightness of pixels in a square region which is included in the image to be calculated and has a same point as a common vertex comprise:
reading an image to be calculated from a camera device, and respectively converting the image to be calculated into a color image and a gray image;
and determining a face region included in the image to be calculated according to the color image, and determining the total brightness of pixels in a square region in the image to be calculated according to the gray image.
3. The method of claim 2, wherein determining the total brightness of pixels in a square region in the image to be computed from the grayscale image comprises:
determining a brightness of a pixel in the grayscale image;
selecting any point of the image to be calculated as a vertex of the square area, and determining the square area included in the image to be calculated;
determining the total brightness of the square area according to the determined brightness of the pixels in the square area.
4. The method of claim 1, wherein calculating the total luminance of the face region from the total luminance of the pixels in the square region comprises:
determining a plurality of square areas for the face pixel calculation according to the vertexes of the face areas;
and calculating the total brightness of the face area according to the determined total brightness of the plurality of square areas.
5. The method of claim 1, wherein determining the average luminance of the face region of the image to be computed from the total luminance of the face region comprises:
acquiring the area of a face region included in the image to be calculated;
and determining the average brightness of the face region included in the image to be calculated according to the area of the face region and the total brightness of the face region.
6. The method according to claim 1, wherein when the image to be calculated includes a plurality of faces, determining an average brightness of the face region of the image to be calculated according to a total brightness of the face region comprises:
acquiring the total brightness of each face region in the image to be calculated and the weight coefficient of each face region;
and determining the average brightness of the face area of the image to be calculated according to the total brightness of each face area and the weight coefficient of each face area.
7. The method of claim 1, wherein performing an exposure control operation based on the average brightness comprises:
comparing the average brightness with a preset brightness range;
and according to the comparison result of the average brightness and a preset brightness range, executing exposure control operation corresponding to the comparison result.
8. An exposure control apparatus, characterized in that the apparatus comprises:
the image determining unit is used for acquiring an image to be calculated, determining a face area in the image to be calculated and the total brightness of pixels in a square area which takes the same point as a common vertex and is included in the image to be calculated;
the total brightness calculation unit is used for calculating the total brightness of the face area according to the total brightness of the pixels in the square area;
the average brightness determining unit is used for determining the average brightness of the face area of the image to be calculated according to the total brightness of the face area;
and the exposure control unit is used for executing exposure control operation according to the average brightness.
9. An exposure control apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202111017862.1A 2021-08-31 2021-08-31 Exposure control method, device, equipment and storage medium Pending CN113824894A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111017862.1A CN113824894A (en) 2021-08-31 2021-08-31 Exposure control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111017862.1A CN113824894A (en) 2021-08-31 2021-08-31 Exposure control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113824894A (en) 2021-12-21

Family

ID=78923490

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111017862.1A Pending CN113824894A (en) 2021-08-31 2021-08-31 Exposure control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113824894A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115361507A (en) * 2022-10-21 2022-11-18 安翰科技(武汉)股份有限公司 Imaging method, and automatic exposure control method and device for dyeing imaging
CN115361507B (en) * 2022-10-21 2023-03-24 安翰科技(武汉)股份有限公司 Imaging method, and automatic exposure control method and device for dyeing imaging


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211221