CN116506737A - Method, device, equipment and storage medium for determining exposure parameters - Google Patents

Method, device, equipment and storage medium for determining exposure parameters

Info

Publication number
CN116506737A
CN116506737A (application CN202310423039.3A)
Authority
CN
China
Prior art keywords
gray
value
image
target
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310423039.3A
Other languages
Chinese (zh)
Inventor
刘嘉宇
孙伟
王任翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Cloud Machine Beijing Technology Co ltd
Original Assignee
Aerospace Cloud Machine Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Cloud Machine Beijing Technology Co ltd filed Critical Aerospace Cloud Machine Beijing Technology Co ltd
Priority to CN202310423039.3A priority Critical patent/CN116506737A/en
Publication of CN116506737A publication Critical patent/CN116506737A/en
Pending legal-status Critical Current


Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04N: Pictorial communication, e.g. television
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/71: Circuitry for evaluating the brightness variation
    • H04N23/72: Combination of two or more compensation controls
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/76: Circuitry for compensating brightness variation in the scene by influencing the image signals

Abstract

The method comprises: acquiring a plurality of images of a target object, the plurality of images being captured by an image pickup device under a plurality of groups of preset exposure parameters and including a first image, the first image being captured with the light supplementing lamp turned off; fusing the plurality of images to obtain a second image; determining a first gray value and a second gray value, the first gray value being the average gray value of a target area of the first image and the second gray value being the average gray value of a target area of the second image, where the target area of the first image and the target area of the second image correspond to the same area of the target object; and determining a target exposure parameter of the image pickup device based on the first gray value and the second gray value. The method has the advantages of high stability, accurate measurement, and strong adaptability to the light environment.

Description

Method, device, equipment and storage medium for determining exposure parameters
Technical Field
The present disclosure relates to the field of image processing, and in particular, to a method, an apparatus, a device, and a storage medium for determining exposure parameters.
Background
To obtain a high-definition image, an image pickup device usually performs photometry before the main shot. In existing photometry methods, an image is generally captured by the image pickup device, a target exposure value is then derived from the exposure values of the regions of the image by a statistical algorithm, and a target exposure parameter is obtained from that value.
However, existing photometry methods place high demands on the light environment and cannot be used stably outdoors.
Disclosure of Invention
In one aspect, an embodiment of the present disclosure provides a method for determining an exposure parameter, which is applied to an image capturing device, where the image capturing device includes a light supplementing lamp; the method for determining the exposure parameters comprises the following steps:
acquiring a plurality of images of a target object, wherein the plurality of images are captured by the image pickup device under a plurality of groups of preset exposure parameters, the plurality of images include a first image, and the first image is captured under the condition that the light supplementing lamp is not turned on;
fusing the plurality of images to obtain a second image;
determining a first gray value and a second gray value, wherein the first gray value is an average gray value of a target area of the first image, and the second gray value is an average gray value of a target area of the second image; the target area of the first image and the target area of the second image correspond to the same area of the target object;
a target exposure parameter of the image capturing apparatus is determined based on the first gray value and the second gray value.
In one embodiment, determining the target exposure parameter of the image capturing apparatus based on the first gray value and the second gray value includes:
obtaining mapping information, wherein the mapping information is used for representing the corresponding relation between different first gray values and second gray values and exposure parameters;
and determining the target exposure parameters corresponding to the first gray scale value and the second gray scale value based on the mapping information.
In one embodiment, the mapping information includes a plurality of sets of mapping relationships, each set of mapping relationships including a first gray scale interval, a second gray scale interval, and exposure parameters corresponding to the first gray scale interval and the second gray scale interval; based on the mapping information, determining the target exposure parameters corresponding to the first gray value and the second gray value includes:
determining a target mapping relation from the plurality of groups of mapping relations; wherein the first gray scale interval of the target mapping relationship comprises the first gray scale value, and the second gray scale interval of the target mapping relationship comprises the second gray scale value;
and determining the exposure parameters in the target mapping relation as the target exposure parameters.
In one embodiment, the first gray scale interval includes a first gray scale lower limit value and a first gray scale upper limit value, and the second gray scale interval includes a second gray scale lower limit value and a second gray scale upper limit value; determining a target mapping relationship from the plurality of sets of mapping relationships, including:
comparing the first gray value with a first gray lower limit value and a first gray upper limit value in the same group of mapping relation according to a priority order preset by the plurality of groups of mapping relation, and comparing the second gray value with a second gray lower limit value and a second gray upper limit value in the same group of mapping relation;
and determining a mapping relation to which the first gray lower limit value, the first gray upper limit value, the second gray lower limit value and the second gray upper limit value belong as the target mapping relation in response to the situation that the first gray value is larger than or equal to the first gray lower limit value, smaller than the first gray upper limit value and the second gray value is larger than or equal to the second gray lower limit value and smaller than the second gray upper limit value.
In one embodiment, determining the first gray value and the second gray value comprises: determining position information of the target area of the second image;
the first gray value and the second gray value are determined based on position information of the target region of the second image.
In one embodiment, determining the location information of the target area of the second image comprises: and inputting the second image into a pre-trained model, and outputting the position information of the target area of the second image.
In one embodiment, the target exposure parameters include at least one of exposure time, gain, gamma value, and light supplement lamp brightness.
The embodiment of the disclosure also provides a device for determining the exposure parameters, which is applied to an image pickup device, wherein the image pickup device comprises a light supplementing lamp; the device for determining the exposure parameters comprises:
the image acquisition module is used for acquiring a plurality of images aiming at a target object, wherein the images are images obtained by shooting under a plurality of groups of preset exposure parameters by the camera, the images are provided with first images, and the first images are images obtained by shooting under the condition that the light supplementing lamp is not started;
the fusion module is used for fusing the plurality of images to obtain a second image;
the gray value determining module is used for determining a first gray value and a second gray value, wherein the first gray value is the average gray value of the target area of the first image, and the second gray value is the average gray value of the target area of the second image; the target area of the first image and the target area of the second image correspond to the same area of the target object;
and the exposure parameter determining module is used for determining a target exposure parameter of the image pickup device based on the first gray value and the second gray value.
The embodiment of the disclosure also provides an electronic device, which is characterized by comprising:
at least one processor;
a memory for storing instructions executable by the at least one processor;
wherein the at least one processor is configured to execute the instructions to implement the method as in any of the above embodiments.
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program, characterized in that the method according to any of the above embodiments is implemented when the computer program is executed by a processor.
According to the method for determining exposure parameters, a plurality of images of a target object are acquired, the plurality of images being captured by the image pickup device under a plurality of groups of preset exposure parameters and including a first image captured with the light supplementing lamp turned off; the plurality of images are fused to obtain a second image; a first gray value and a second gray value are determined, the first gray value being the average gray value of a target area of the first image and the second gray value being the average gray value of a target area of the second image, where the two target areas correspond to the same area of the target object; and a target exposure parameter of the image pickup device is determined based on the first gray value and the second gray value. Because the first image is captured with the light supplementing lamp off, its target area may be underexposed, normally exposed, or overexposed, so the first gray value directly reflects the intensity of the ambient light around the target object; because the second image is formed by fusing the plurality of images, the second gray value reflects how sensitive the target object is to the light supplementing lamp. A target exposure parameter that meets the requirements of both the ambient light and the target object can therefore be determined from the first gray value and the second gray value, giving the method high stability, accurate measurement, and strong adaptability to the light environment.
Drawings
Further details, features and advantages of the present disclosure are disclosed in the following description of exemplary embodiments, with reference to the following drawings, wherein:
FIG. 1 is a flow chart of a method for determining exposure parameters provided by an exemplary embodiment of the present disclosure;
FIG. 2 is a flow chart of a method for determining a first gray value and a second gray value provided by an exemplary embodiment of the present disclosure;
FIG. 3 is a flowchart of determining target exposure parameters corresponding to a first gray value and a second gray value based on mapping information according to an exemplary embodiment of the present disclosure;
FIG. 4 is a functional block diagram of an exposure parameter determination apparatus according to an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic block diagram of an electronic device provided by an exemplary embodiment of the present disclosure;
fig. 6 is a block diagram of a computer system according to an exemplary embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below. It should be noted that the terms "first", "second", and the like in this disclosure are merely used to distinguish between different devices, modules, or units and do not define an order or interdependence of the functions they perform.
It should be noted that the modifiers "a", "an", and "a plurality" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The embodiment of the disclosure provides a method for determining exposure parameters, applied to an image pickup device, for determining target exposure parameters before normal shooting so as to improve the quality of the captured images. In one embodiment, the image pickup device may be an industrial camera; for example, it may be a camera mounted on the head or hand of a filling robot and used to locate the filling opening of a vehicle waiting to be filled, the vehicle being the target object. The method for determining exposure parameters may, however, be applied to any image pickup device, and the disclosure is not limited in this respect.
The following description of the embodiments of the present disclosure will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present disclosure. Based on the embodiments in this disclosure, all other embodiments that a person of ordinary skill in the art would obtain without making any inventive effort are within the scope of protection of this disclosure.
Fig. 1 is a flowchart of a method for determining exposure parameters according to an exemplary embodiment of the present disclosure, which is applied to an image capturing apparatus including a light supplementing lamp; as shown in fig. 1, the method for determining the exposure parameters includes the following steps:
s101, acquiring a plurality of images aiming at a target object, wherein the images are images obtained by shooting under a plurality of groups of preset exposure parameters by the image pickup device, the images are provided with first images, and the first images are images obtained by shooting under the condition that the light supplementing lamp is not turned on.
In this step, a plurality of images of the target object captured by the image pickup device are acquired, the exposure parameters for these images having been preset. The image pickup device may be an industrial camera suited to different scenes, for example a camera mounted on the head or hand of a filling robot; in that scene the target object is a vehicle to be filled, and the image pickup device captures a plurality of images of the vehicle under the groups of preset exposure parameters. The embodiment of the disclosure does not limit the specific values of the preset exposure parameters, which may be set for the actual scene, but one image must be captured without supplementary light, namely the first image, so that the intensity of the ambient light around the target object, for example whether it is day or night, and whether the light is strong or weak, can be determined from the first image.
It should be noted that, in the embodiment of the present disclosure, the exposure parameters include, but are not limited to, exposure time, exposure gain, gamma value, on-time of the light-compensating lamp, brightness of the light-compensating lamp, and so on.
The embodiment of the present disclosure does not limit the number of images; there may be, for example, 3, 4, 5, 6, or more. In a specific embodiment, the plurality of images is 5 images whose preset exposure parameters differ from one another. Note that the 5 images being captured under different exposure conditions does not mean that every sub-parameter of the exposure parameters differs; rather, at least one sub-parameter, such as the exposure time, the exposure gain, or the brightness of the light supplementing lamp, differs between every two of the 5 images. For example, the first of the 5 images, i.e. the first image, may have an exposure time of 8.5 ms with the light supplementing lamp off, its brightness being regarded as 255 (darkest); the second image may also have an exposure time of 8.5 ms but with the light supplementing lamp on at a brightness of 200; the third image may have an exposure time of 16 ms and a lamp brightness of 160; the fourth image an exposure time of 4 ms and a lamp brightness of 120; and the fifth image an exposure time of 8 ms and a lamp brightness of 90. The first and second images thus share the same exposure time and differ in only one sub-parameter, the brightness of the light supplementing lamp, while the second, third, fourth, and fifth images differ in both exposure time and lamp brightness.
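The five exposure groups in the example above can be sketched as plain data. This is an illustrative encoding only, not part of the patent; the field names (`exposure_ms`, `fill_light`) are hypothetical, and a `fill_light` of 255 denotes "darkest" (lamp off) per the inverted scale used in the example.

```python
# Hypothetical encoding of the five preset exposure parameter groups
# described in the text. Field names are illustrative, not from the patent.
PRESET_EXPOSURES = [
    {"exposure_ms": 8.5, "fill_light": 255},  # first image: lamp off (255 = darkest)
    {"exposure_ms": 8.5, "fill_light": 200},  # same exposure time, lamp on
    {"exposure_ms": 16.0, "fill_light": 160},
    {"exposure_ms": 4.0, "fill_light": 120},
    {"exposure_ms": 8.0, "fill_light": 90},
]

def differs_in_some_subparameter(a, b):
    """Check the stated constraint: every two groups differ in at least one sub-parameter."""
    return any(a[k] != b[k] for k in a)
```

Running `differs_in_some_subparameter` over all pairs confirms that the first and second groups, which share an exposure time, still count as different exposure conditions because the lamp brightness differs.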
In practice, several groups of suitable preset exposure parameters should be set according to the illumination characteristics of the actual scene, so that at any moment at least one of the images captured of any target object falls within the normal exposure range. Taking the image pickup device on a filling robot as an example, a group of smaller exposure parameters can be set, suitable for high ambient brightness, for example daytime, and for a vehicle whose color is sensitive to the light supplementing lamp, for example a white vehicle; here "smaller" means a shorter exposure time and a lower lamp brightness. A group of larger exposure parameters can also be set, suitable for low ambient brightness, for example at night, and for a vehicle whose color is insensitive to the light supplementing lamp, for example a black vehicle; here "larger" means a longer exposure time and a higher lamp brightness.
S102, fusing the plurality of images to obtain a second image.
In this step, the plurality of images captured in step S101 are fused to obtain a second image whose image quality is higher than that of any single one of them, facilitating subsequent computer processing such as identification of the target area.
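The patent does not fix a particular fusion algorithm. As a minimal sketch only, the fusion step can be illustrated with a per-pixel average over the exposure stack (a real system might instead use an exposure-fusion method such as Mertens et al., e.g. OpenCV's `MergeMertens`); the function name and list-of-lists image representation here are assumptions for illustration.

```python
def fuse_images(images):
    """Naive per-pixel average fusion of equally sized grayscale images.

    `images` is a list of images, each a list of rows of gray values.
    This is one minimal choice of fusion; the patent leaves the algorithm open.
    """
    h, w = len(images[0]), len(images[0][0])
    fused = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average the same pixel across all bracketed exposures.
            fused[y][x] = round(sum(img[y][x] for img in images) / len(images))
    return fused
```

Averaging an underexposed and an overexposed capture of the same scene pulls both toward a mid-range result, which is why the fused second image tends to show more usable detail than any single input.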
S103, determining a first gray level value and a second gray level value, wherein the first gray level value is the average gray level value of the target area of the first image, and the second gray level value is the average gray level value of the target area of the second image; the target region of the first image and the target region of the second image correspond to the same region of the target object.
The purpose of this step is to determine an average gray value of the target area of the first image and an average gray value of the target area of the second image, the first image being taken under natural light, whereby the above-mentioned first gray value may characterize the brightness of the ambient light where the target object is located; the second image is formed by fusing a plurality of images, and the second gray value can represent the sensitivity degree of the target object to the light filling lamp.
The embodiments of the present disclosure are not limited to a specific manner of determining the first gray value and the second gray value, and any manner in which the gray value may be determined may be applied to the embodiments of the present disclosure to determine the first gray value and the second gray value. In one embodiment, as shown in fig. 2, determining the first gray value and the second gray value includes the steps of:
s201, determining position information of the target area of the second image.
In this step, the position of the target area in the second image is determined first; because the image quality of the second image is better than that of each image before fusion, the target area can be determined from it more accurately.
The embodiment of the present disclosure is not limited to a specific manner of determining the target area by the second image, and in one example, determining the position information of the target area of the second image includes: the second image is input into a pre-trained model that outputs positional information of a target region of the second image. As can be seen, the input of the model is a gray scale image and the output is the positional information of the target area.
S202, determining the first gray level value and the second gray level value based on the position information of the target area of the second image.
It will be appreciated that when the position information of the target region in the second image is known, the average gray value of the target region in the second image and the average gray value of the target region in the first image, that is, the second gray value and the first gray value, may be obtained based on the position information when the target region in the second image and the target region in the first image correspond to the same region of the target object. The obtaining of the average gray value of the target area based on the position information of the target area may refer to the prior art in the field, and will not be described herein.
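A minimal sketch of computing the two average gray values from the shared position information follows. The rectangular `(x, y, width, height)` box format is an assumption for illustration; the patent only requires that the same region be used in both images.

```python
def mean_gray(image, box):
    """Average gray value over a rectangular target region.

    `image` is a list of rows of gray values; `box` is (x, y, width, height)
    position information. Because the target areas of the first and second
    images correspond to the same region of the target object, the same box
    is applied to both images.
    """
    x, y, w, h = box
    total = sum(image[r][c] for r in range(y, y + h) for c in range(x, x + w))
    return total / (w * h)
```

The first gray value is then `mean_gray(first_image, box)` and the second gray value `mean_gray(second_image, box)`, where `box` comes from step S201.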
S104, determining a target exposure parameter of the image pickup device based on the first gray level value and the second gray level value.
The first gray value represents the intensity of the ambient light around the target object, and the second gray value represents the sensitivity of the target object to the light supplementing lamp; exposure parameters adapted to both the ambient brightness and the color of the target object, namely the target exposure parameters, can therefore be determined from the first gray value and the second gray value. The target exposure parameter may include at least one of exposure time, gain, gamma value, and light supplementing lamp brightness.
The embodiment of the disclosure is not limited to a specific manner of determining the target exposure parameter of the image capturing apparatus based on the first gray level value and the second gray level value, and in one possible implementation, the determining the target exposure parameter of the image capturing apparatus based on the first gray level value and the second gray level value includes the following steps:
obtaining mapping information, wherein the mapping information is used for representing the corresponding relation between different first gray values and second gray values and exposure parameters; and determining the target exposure parameters corresponding to the first gray scale value and the second gray scale value based on the mapping information.
The mapping information representing the correspondence between the first and second gray values and the exposure parameters may be a mapping function or a mapping table, either of which can be obtained through shooting experiments in the actual scene, so determining the exposure parameters through the mapping information has the advantage of high accuracy. Taking a mapping table as an example, the mapping information comprises multiple groups of mapping relationships occupying different rows of the table, and each group may include a first gray interval, a second gray interval, and the exposure parameters corresponding to those two intervals; that is, a first gray value falling in a certain interval together with a second gray value falling in a certain interval corresponds to a certain group of exposure parameters. In one possible implementation, determining the target exposure parameters corresponding to the first gray value and the second gray value based on the mapping information, as shown in fig. 3, includes the following steps:
s301, determining a target mapping relation from the plurality of groups of mapping relations; the first gray scale interval of the target mapping relationship includes the first gray scale value, and the second gray scale interval of the target mapping relationship includes the second gray scale value.
In one possible embodiment, the first gray interval includes a first gray lower limit value and a first gray upper limit value, and the second gray interval includes a second gray lower limit value and a second gray upper limit value; determining the target mapping relationship from the plurality of groups of mapping relationships includes: comparing, in a preset priority order over the groups, the first gray value with the first gray lower and upper limit values of each group, and the second gray value with the second gray lower and upper limit values of the same group; and, when the first gray value is greater than or equal to the first gray lower limit value and less than the first gray upper limit value, and the second gray value is greater than or equal to the second gray lower limit value and less than the second gray upper limit value, determining the group of mapping relationships to which those limit values belong as the target mapping relationship. In other words, the first gray value and the second gray value determined in step S103 are compared against the limit values of the groups in turn, and the first group whose first gray interval and second gray interval contain the first gray value and the second gray value, respectively, is determined as the target mapping relationship.
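The priority-ordered table scan above can be sketched as follows. This is an illustrative implementation under assumptions: rows are tuples `(lo1, hi1, lo2, hi2, exposure_params)`, intervals are half-open `[lo, hi)` as the greater-or-equal / strictly-less comparisons imply, and the fallback behavior when no row matches is not specified by the patent.

```python
def find_target_mapping(mapping_table, first_gray, second_gray):
    """Scan mapping rows in their preset priority order.

    Each row is (lo1, hi1, lo2, hi2, exposure_params). Returns the exposure
    parameters of the first row whose half-open intervals [lo1, hi1) and
    [lo2, hi2) contain the first and second gray values, respectively.
    """
    for lo1, hi1, lo2, hi2, exposure_params in mapping_table:
        if lo1 <= first_gray < hi1 and lo2 <= second_gray < hi2:
            return exposure_params  # this row is the target mapping relationship
    return None  # no row matched; a real system would need a defined fallback
```

Because the scan stops at the first matching row, a narrow, specific interval pair placed earlier in the table takes precedence over a broad catch-all row placed later.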
S302, determining the exposure parameters in the target mapping relation as the target exposure parameters.
In this step, the exposure parameter included in the determined target mapping relationship is determined as the target exposure parameter.
According to the method for determining the exposure parameters, disclosed by the embodiment of the disclosure, the first image is shot under the condition that the light supplementing lamp is not turned on, and in the case, the target area of the first image may be in an underexposure state, an exposure normal state or an overexposure state, that is, the first gray value can directly reflect the light intensity of the ambient light where the target object is located; the second image is formed by fusing a plurality of images, the image quality of the second image is better than that of each image before fusion, more details about the target object can be reflected, the second gray value can reflect the sensitivity degree of the target object to the light supplementing lamp, and according to the first gray value and the second gray value, the target exposure parameters meeting the requirements of the environment light and the target object can be determined, so that the method has the advantages of high stability, accurate measurement and strong light environment adaptability.
With functional modules divided according to the respective functions, an embodiment of the present disclosure provides an apparatus for determining exposure parameters. Fig. 4 is a schematic block diagram of the functional modules of an exposure parameter determination apparatus provided in an exemplary embodiment of the present disclosure. As shown in fig. 4, the apparatus 400 for determining exposure parameters includes:
an obtaining module 401, configured to obtain a plurality of images of a target object, where the plurality of images are captured by the image capturing device under a plurality of sets of preset exposure parameters and include a first image, the first image being captured without the light supplementing lamp turned on;
a fusion module 402, configured to fuse the plurality of images to obtain a second image;
a gray value determining module 403, configured to determine a first gray value and a second gray value, where the first gray value is the average gray value of a target area of the first image and the second gray value is the average gray value of a target area of the second image, the target area of the first image and the target area of the second image corresponding to the same area of the target object; and

an exposure parameter determination module 404, configured to determine a target exposure parameter of the image capturing device based on the first gray value and the second gray value.
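As an illustrative sketch only, the work of the fusion module 402 and the gray value determining module 403 could be modeled as below. The disclosure does not fix a fusion algorithm, so per-pixel averaging is an assumed stand-in, and the helper names `fuse` and `mean_gray` are invented for this example.

```python
import numpy as np

def fuse(images):
    """Fuse images captured under different preset exposure parameters by
    per-pixel averaging (an assumed stand-in for the fusion step)."""
    stack = np.stack([img.astype(np.float32) for img in images])
    return stack.mean(axis=0).astype(np.uint8)

def mean_gray(image, region):
    """Average gray value over a rectangular target area (x, y, w, h)."""
    x, y, w, h = region
    return float(image[y:y + h, x:x + w].mean())

# toy data: three 4x4 gray images shot under different exposure settings
images = [np.full((4, 4), v, dtype=np.uint8) for v in (40, 80, 120)]
first_image = images[0]          # assumed shot without the fill light on
second_image = fuse(images)      # fused image; every pixel averages to 80
region = (1, 1, 2, 2)            # the same target area is used for both images
first_gray = mean_gray(first_image, region)    # 40.0
second_gray = mean_gray(second_image, region)  # 80.0
```

The key constraint the embodiment states, that both averages are taken over the same region of the target object, is reflected by passing one `region` to both calls.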
In one possible implementation, the exposure parameter determination module 404 is configured to: obtaining mapping information, wherein the mapping information is used for representing the corresponding relation between different first gray values and second gray values and exposure parameters; and determining the target exposure parameters corresponding to the first gray scale value and the second gray scale value based on the mapping information.
In one possible implementation manner, the mapping information includes multiple sets of mapping relationships, and each set of mapping relationship includes a first gray scale interval, a second gray scale interval, and exposure parameters corresponding to the first gray scale interval and the second gray scale interval; the exposure parameter determination module 404 is further configured to: determining a target mapping relation from the plurality of groups of mapping relations; wherein the first gray scale interval of the target mapping relationship comprises the first gray scale value, and the second gray scale interval of the target mapping relationship comprises the second gray scale value; and determining the exposure parameters in the target mapping relation as the target exposure parameters.
In one possible embodiment, the first gray scale interval includes a first gray lower limit value and a first gray upper limit value, and the second gray scale interval includes a second gray lower limit value and a second gray upper limit value; the exposure parameter determination module 404 is further configured to: compare, in the preset priority order of the plurality of sets of mapping relationships, the first gray value with the first gray lower limit value and the first gray upper limit value of the same set of mapping relationships, and the second gray value with the second gray lower limit value and the second gray upper limit value of that set; and, in response to the first gray value being greater than or equal to the first gray lower limit value and smaller than the first gray upper limit value, and the second gray value being greater than or equal to the second gray lower limit value and smaller than the second gray upper limit value, determine the set of mapping relationships to which these limit values belong as the target mapping relationship.
In one possible implementation, the gray value determination module 403 is further configured to: determining position information of the target area of the second image; the first gray value and the second gray value are determined based on position information of the target region of the second image.
In a possible implementation, the gray value determining module 403 is further configured to input the second image into a pre-trained model, and output the position information of the target area of the second image.
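The pre-trained model itself is left unspecified by the disclosure. As an assumed stand-in only, the sketch below locates the target area with a brute-force template match (minimum sum of absolute differences) and returns position information in the same (x, y, w, h) form used above; the function name `locate_target` and the matching criterion are both illustrative, not the model of the embodiment.

```python
import numpy as np

def locate_target(image, template):
    """Brute-force search for the template's best position (minimum sum of
    absolute differences); returns position information as (x, y, w, h)."""
    th, tw = template.shape
    best_sad, best_xy = None, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            patch = image[y:y + th, x:x + tw].astype(int)
            sad = np.abs(patch - template.astype(int)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_xy = sad, (x, y)
    x, y = best_xy
    return (x, y, tw, th)

# toy fused image with a bright 2x2 target at x=1, y=2
second_image = np.zeros((5, 5), dtype=np.uint8)
second_image[2:4, 1:3] = 200
template = np.full((2, 2), 200, dtype=np.uint8)
print(locate_target(second_image, template))  # → (1, 2, 2, 2)
```

A learned detector would replace this matcher in practice; what matters for the embodiment is only that the second, fused image yields position information reusable for both gray value computations.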
In one possible embodiment, the target exposure parameter includes at least one of exposure time, gain, gamma value, and light supplementing lamp brightness.
The embodiment of the present disclosure also provides an electronic device, including: at least one processor; and a memory for storing instructions executable by the at least one processor; wherein the at least one processor is configured to execute the instructions to implement the methods disclosed in the embodiments of the present disclosure.
Fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure. As shown in fig. 5, the electronic device 1800 includes at least one processor 1801 and a memory 1802 coupled to the processor 1801; the processor 1801 may perform the corresponding steps of the methods disclosed in the embodiments of the present disclosure.
The processor 1801 may also be referred to as a central processing unit (CPU) and may be an integrated circuit chip with signal processing capability. The steps of the methods disclosed in the embodiments of the present disclosure may be completed by integrated logic circuits of hardware in the processor 1801 or by instructions in the form of software. The processor 1801 may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in connection with the embodiments of the present disclosure may be embodied directly as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software modules may reside in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers, located in the memory 1802. The processor 1801 reads the information in the memory 1802 and completes the steps of the above methods in combination with its hardware.
In addition, when implemented by software and/or firmware, the various operations/processes according to the present disclosure may be installed from a storage medium or a network onto a computer system having a dedicated hardware structure, for example the computer system 1900 shown in fig. 6, which can perform various functions, including the functions described above, when the various programs are installed. Fig. 6 is a block diagram of a computer system according to an exemplary embodiment of the present disclosure.
Computer system 1900 is intended to represent various forms of digital electronic computing devices, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. It may also represent various forms of mobile devices, such as personal digital processing devices, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the computer system 1900 includes a computing unit 1901, which may perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 1902 or a computer program loaded from a storage unit 1908 into a random access memory (RAM) 1903. In the RAM 1903, various programs and data required for the operation of the computer system 1900 may also be stored. The computing unit 1901, the ROM 1902, and the RAM 1903 are connected to each other via a bus 1904. An input/output (I/O) interface 1905 is also connected to the bus 1904.
Various components in the computer system 1900 are connected to the I/O interface 1905, including: an input unit 1906, an output unit 1907, a storage unit 1908, and a communication unit 1909. The input unit 1906 may be any type of device capable of inputting information to the computer system 1900; it may receive input numeric or character information and generate key signal inputs related to user settings and/or function control of the electronic device. The output unit 1907 may be any type of device capable of presenting information, and may include, but is not limited to, a display, speakers, video/audio output terminals, vibrators, and/or printers. The storage unit 1908 may include, but is not limited to, magnetic disks and optical disks. The communication unit 1909 allows the computer system 1900 to exchange information/data with other devices over a network, such as the Internet, and may include, but is not limited to, modems, network cards, infrared communication devices, wireless communication transceivers and/or chipsets, such as Bluetooth devices, WiFi devices, WiMax devices, cellular communication devices, and/or the like.
The computing unit 1901 may be any of various general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 1901 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 1901 performs the various methods and processes described above. For example, in some embodiments, the methods disclosed by the embodiments of the present disclosure may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 1908. In some embodiments, part or all of the computer program may be loaded and/or installed onto the computer system 1900 via the ROM 1902 and/or the communication unit 1909. In some embodiments, the computing unit 1901 may be configured to perform the above-described methods of the disclosed embodiments by any other suitable means (e.g., by means of firmware).
The disclosed embodiments also provide a computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the above-described method disclosed in the disclosed embodiments.
A computer readable storage medium in embodiments of the present disclosure may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium described above can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specifically, the computer-readable storage medium described above may include one or more wire-based electrical connections, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The disclosed embodiments also provide a computer program product comprising a computer program, wherein the computer program implements the method disclosed in the disclosed embodiments when the computer program is executed by a processor.
In the embodiments of the present disclosure, computer program code for performing the operations of the present disclosure may be written in one or more programming languages or combinations thereof, including, but not limited to, object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules, components or units described in the embodiments of the present disclosure may be implemented by software or by hardware. In some cases, the name of a module, component or unit does not constitute a limitation of the module, component or unit itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The above description is merely an illustration of some embodiments of the present disclosure and of the technical principles applied. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of the features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, solutions formed by substituting the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Although some specific embodiments of the present disclosure have been described in detail by way of example, it should be understood by those skilled in the art that the above examples are for illustration only and are not intended to limit the scope of the present disclosure. It will be appreciated by those skilled in the art that modifications may be made to the above embodiments without departing from the scope and spirit of the disclosure. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A method for determining exposure parameters, characterized in that the method is applied to an image pickup device, the image pickup device comprising a light supplementing lamp; the method for determining exposure parameters comprises the following steps:
acquiring a plurality of images of a target object, wherein the plurality of images are images captured by the image pickup device under a plurality of sets of preset exposure parameters, the plurality of images comprise a first image, and the first image is an image captured without the light supplementing lamp being turned on;
fusing the plurality of images to obtain a second image;
determining a first gray value and a second gray value, wherein the first gray value is an average gray value of a target area of the first image, and the second gray value is an average gray value of a target area of the second image; the target area of the first image and the target area of the second image correspond to the same area of the target object;
and determining a target exposure parameter of the image pickup device based on the first gray value and the second gray value.
2. The method of claim 1, wherein determining a target exposure parameter of the image capture device based on the first gray value and the second gray value comprises:
obtaining mapping information, wherein the mapping information is used for representing the corresponding relation between different first gray values and second gray values and exposure parameters;
and determining the target exposure parameters corresponding to the first gray scale value and the second gray scale value based on the mapping information.
3. The method of claim 2, wherein the mapping information comprises a plurality of sets of mapping relationships, each set of mapping relationships comprising a first gray scale interval, a second gray scale interval, and exposure parameters corresponding to the first gray scale interval and the second gray scale interval; based on the mapping information, determining the target exposure parameters corresponding to the first gray value and the second gray value includes:
determining a target mapping relation from the plurality of groups of mapping relations; wherein the first gray scale interval of the target mapping relationship comprises the first gray scale value, and the second gray scale interval of the target mapping relationship comprises the second gray scale value;
and determining the exposure parameters in the target mapping relation as the target exposure parameters.
4. A method according to claim 3, wherein the first gray scale interval comprises a first gray lower limit value and a first gray upper limit value, and the second gray scale interval comprises a second gray lower limit value and a second gray upper limit value; and determining the target mapping relationship from the plurality of sets of mapping relationships comprises:
comparing, in a preset priority order of the plurality of sets of mapping relationships, the first gray value with the first gray lower limit value and the first gray upper limit value of the same set of mapping relationships, and the second gray value with the second gray lower limit value and the second gray upper limit value of that set;
and, in response to the first gray value being greater than or equal to the first gray lower limit value and smaller than the first gray upper limit value, and the second gray value being greater than or equal to the second gray lower limit value and smaller than the second gray upper limit value, determining the set of mapping relationships to which these limit values belong as the target mapping relationship.
5. The method of claim 1, wherein determining the first gray value and the second gray value comprises: determining position information of the target area of the second image;
and determining the first gray value and the second gray value based on the position information of the target area of the second image.
6. The method of claim 5, wherein determining the position information of the target area of the second image comprises: inputting the second image into a pre-trained model, and outputting the position information of the target area of the second image.
7. The method of any one of claims 1 to 6, wherein the target exposure parameter includes at least one of exposure time, gain, gamma value, and light supplementing lamp brightness.
8. A device for determining exposure parameters, applied to an image pickup device comprising a light supplementing lamp, characterized in that the device comprises:
an image acquisition module, used for acquiring a plurality of images of a target object, wherein the plurality of images are images captured by the image pickup device under a plurality of sets of preset exposure parameters, the plurality of images comprise a first image, and the first image is an image captured without the light supplementing lamp being turned on;
the fusion module is used for fusing the plurality of images to obtain a second image;
the gray value determining module is used for determining a first gray value and a second gray value, wherein the first gray value is the average gray value of the target area of the first image, and the second gray value is the average gray value of the target area of the second image; the target area of the first image and the target area of the second image correspond to the same area of the target object;
and the exposure parameter determining module is used for determining a target exposure parameter of the image pickup device based on the first gray value and the second gray value.
9. An electronic device, comprising:
at least one processor;
a memory for storing instructions executable by the at least one processor;
wherein the at least one processor is configured to execute the instructions to implement the method of any of claims 1-7.
10. A computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the method according to any of claims 1-7.
CN202310423039.3A 2023-04-19 2023-04-19 Method, device, equipment and storage medium for determining exposure parameters Pending CN116506737A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310423039.3A CN116506737A (en) 2023-04-19 2023-04-19 Method, device, equipment and storage medium for determining exposure parameters

Publications (1)

Publication Number Publication Date
CN116506737A true CN116506737A (en) 2023-07-28

Family

ID=87316019

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310423039.3A Pending CN116506737A (en) 2023-04-19 2023-04-19 Method, device, equipment and storage medium for determining exposure parameters

Country Status (1)

Country Link
CN (1) CN116506737A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117793541A (en) * 2024-02-27 2024-03-29 广东朝歌智慧互联科技有限公司 Ambient light adjusting method, device, electronic equipment and readable storage medium


Similar Documents

Publication Publication Date Title
CN108337433B (en) Photographing method, mobile terminal and computer readable storage medium
US9451173B2 (en) Electronic device and control method of the same
US11532076B2 (en) Image processing method, electronic device and storage medium
CN111614892B (en) Face image acquisition method, shooting device and computer-readable storage medium
CN107040726B (en) Double-camera synchronous exposure method and system
CN103227928B (en) White balance adjusting method and device
CN107704798B (en) Image blurring method and device, computer readable storage medium and computer device
KR20160038460A (en) Electronic device and control method of the same
US11159739B2 (en) Apparatus and method for generating moving image data including multiple section images in electronic device
CN112840636A (en) Image processing method and device
CN107909569B (en) Screen-patterned detection method, screen-patterned detection device and electronic equipment
US20180262667A1 (en) Image capturing device and brightness adjusting method
KR20190041586A (en) Electronic device composing a plurality of images and method
CN110648296A (en) Pupil color correction method, correction device, terminal device and storage medium
CN116506737A (en) Method, device, equipment and storage medium for determining exposure parameters
WO2018161568A1 (en) Photographing method and device based on two cameras
CN116055712A (en) Method, device, chip, electronic equipment and medium for determining film forming rate
WO2020119454A1 (en) Method and apparatus for color reproduction of image
US10769416B2 (en) Image processing method, electronic device and storage medium
CN113727085B (en) White balance processing method, electronic equipment, chip system and storage medium
WO2022006739A1 (en) Control method, control apparatus, and infrared camera
CN115334250B (en) Image processing method and device and electronic equipment
KR102499399B1 (en) Electronic device for notifying updata of image signal processing and method for operating thefeof
US11509797B2 (en) Image processing apparatus, image processing method, and storage medium
CN111656759A (en) Image color correction method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination