CN116744135B - Dynamic range measuring method and related device - Google Patents

Publication number: CN116744135B
Application number: CN202211217941.1A
Authority: CN (China)
Legal status: Active (granted)
Other versions: CN116744135A (application publication, in Chinese)
Inventors: 蔡捷帆, 陈祥, 周天一
Assignee: Honor Device Co Ltd

Abstract

The application discloses a dynamic range measuring method and a related device. A plurality of RAW images are acquired, each generated by an image sensor photographing a test chart at the same sensitivity (ISO) under one illumination brightness, so that each RAW image corresponds to one illumination brightness. An RGB image corresponding to each RAW image is generated. A low-light image is determined among the plurality of RGB images, the low-light image being an RGB image whose signal-to-noise ratio is within a preset range. The dynamic range of the image sensor at the ISO is then determined according to the gray value of the low-light image. In the present application, the low-light image is located among multiple RAW images captured by the image sensor at the same ISO under different illumination brightnesses, and the dynamic range of the sensor at that ISO is determined according to the gray value of the low-light image, thereby improving the accuracy of the dynamic range measurement.

Description

Dynamic range measuring method and related device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a dynamic range measurement method and a related device.
Background
The dynamic range is one of the most important parameters of an image sensor. It determines the range of light intensities, from the brightest highlight to the darkest shadow, that the sensor can receive, and therefore the detail, tonal gradation, and character of the captured image.
The dynamic range of a natural scene is the ratio between the highest and lowest light levels of the scene. The dynamic range of an image sensor is the ratio between the highest light level that just saturates the sensor and the lowest light level the sensor can detect while still producing the minimum image quality acceptable to the human eye. When the dynamic range of a scene exceeds that of the image sensor, scene details beyond the sensor's dynamic range are lost in the imaging result.
Under the same image processing algorithm, an image sensor with stronger dynamic range capability produces images with a better dynamic range. Objectively evaluating the dynamic range capability of image sensors therefore helps camera manufacturers select better sensors during the device selection stage.
In view of the above, a scheme for measuring the dynamic range of an image sensor is needed.
Disclosure of Invention
The application provides a dynamic range measuring method and a related device, which are used for improving the accuracy of measuring the dynamic range of an image sensor.
In a first aspect, the present application provides a method of measuring dynamic range. A plurality of RAW images are acquired, each generated by an image sensor photographing a test chart at the same sensitivity (ISO) under one illumination brightness, so that each RAW image corresponds to one illumination brightness. The more RAW images there are, the more uniformly and densely their gray values are distributed, and the more accurate the dynamic range measurement result.
Each acquired RAW image is converted into an RGB image, yielding a plurality of RGB images at the same ISO. Because each RAW image is captured under a different illumination brightness, the gray values of the RAW images differ, and so do the gray values of the converted RGB images. When the gray value of an RGB image is too low, its signal-to-noise ratio (SNR) is too small: image detail is submerged in noise, and the image quality produced by the sensor falls below the minimum acceptable quality.
The signal-to-noise ratio of each RGB image is calculated, and an RGB image whose SNR falls within a preset range is determined to be the low-light image. Once the low-light image is determined, the dynamic range of the image sensor at that ISO can be determined according to its gray value.
In the present application, the low-light image is located among multiple RAW images captured by the image sensor at the same ISO under different illumination brightnesses, and the dynamic range of the sensor at that ISO is determined according to the gray value of the low-light image, thereby improving the accuracy of the dynamic range measurement.
Based on the first aspect, in an optional implementation, a high-light image among the plurality of RGB images is determined in addition to the low-light image; the high-light image in the present application is the image with the lowest brightness among all the over-exposed RGB images. The dynamic range is then calculated using the gray value of the high-light image as the highest light level of the dynamic range (i.e., the highest light level that just saturates the image sensor).
Based on the first aspect, in an optional implementation, the upper gray limit of the test chart may be used directly as the highest light level of the dynamic range (i.e., the highest light level that just saturates the image sensor) when calculating the dynamic range.
Based on the first aspect, in an optional implementation: the dynamic range exhibited by the image sensor differs at different ISO values, so a calculated dynamic range only characterizes the sensor at the ISO at which the RAW images were captured. Dynamic ranges at other ISO values must be calculated from RAW images photographed at those ISO values, using the dynamic range measuring method of the present application. After the dynamic ranges of the image sensor at different ISO values have been calculated (yielding a plurality of dynamic ranges), dynamic range information of the image sensor is generated from them; this information indicates the dynamic range of the sensor at each ISO, making the measurement result richer and more comprehensive.
In a second aspect, the present application provides a dynamic range measuring apparatus comprising:
an acquiring unit, configured to acquire a plurality of RAW images, where each RAW image is generated by an image sensor photographing a test chart at the same sensitivity (ISO) under one illumination brightness, and each RAW image corresponds to one illumination brightness;
a processing unit, configured to generate an RGB image corresponding to each RAW image;
the processing unit is further configured to determine a low-light image among the plurality of RGB images, where the low-light image is an RGB image whose signal-to-noise ratio is within a preset range;
the processing unit is further configured to determine the dynamic range of the image sensor at the ISO according to the gray value of the low-light image.
Based on the second aspect, in an optional implementation, the plurality of RAW images includes at least one over-exposed RAW image.
Based on the second aspect, in an optional implementation, the processing unit is further configured to:
determine a high-light image among the plurality of RGB images, where the high-light image is the image with the lowest brightness among all the over-exposed RGB images;
when determining the dynamic range of the image sensor at the ISO according to the gray value of the low-light image, the processing unit is specifically configured to:
determine the ratio between the gray value of the high-light image and the gray value of the low-light image as the dynamic range of the image sensor at the ISO.
Based on the second aspect, in an optional implementation, when determining the dynamic range of the image sensor at the ISO according to the gray value of the low-light image, the processing unit is specifically configured to:
determine the dynamic range of the image sensor at the ISO according to the ratio between the upper gray limit of the test chart and the gray value of the low-light image.
Based on the second aspect, in an optional implementation, when determining the low-light image among the plurality of RGB images, the processing unit is specifically configured to:
determine an RGB image with a signal-to-noise ratio of 10 dB to be the low-light image.
Based on the second aspect, in an optional implementation, the processing unit is further configured to:
determine the dynamic ranges of the image sensor at different ISO values to obtain a plurality of dynamic ranges; and
generate dynamic range information of the image sensor according to the plurality of dynamic ranges, the dynamic range information indicating the dynamic ranges of the image sensor at the different ISO values.
Based on the second aspect, in an optional implementation, when generating the dynamic range information of the image sensor according to the plurality of dynamic ranges, the processing unit is specifically configured to:
combine the plurality of dynamic ranges into a dynamic range capability table of the image sensor, where the capability table includes a plurality of entries, each entry corresponding to the dynamic range of the image sensor at one ISO.
Based on the second aspect, in an optional implementation, when generating the dynamic range information of the image sensor according to the plurality of dynamic ranges, the processing unit is specifically configured to:
convert the plurality of dynamic ranges into a dynamic range capability curve of the image sensor, where each coordinate on the curve corresponds to the dynamic range of the image sensor at one ISO.
The information interaction and execution processes of the embodiments of this aspect are based on the same concept as the embodiments of the first aspect; for the beneficial effects of this aspect, refer to the first aspect above, and details are not repeated here.
In a third aspect, a computing device is provided that includes a memory and a processor coupled to the memory; the memory is configured to store instructions, and the processor is configured to execute the instructions, to implement the method according to any one of the above aspects.
In a fourth aspect, a computer readable storage medium is provided, in which a computer program is stored which, when run on a processor, implements the method of any of the above aspects.
In a fifth aspect, there is provided a computer program product or computer program comprising computer instructions which, when run on a processor, implement the method of any of the above aspects.
Drawings
To describe the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments are briefly introduced below. The drawings described below show only embodiments of the present application; a person skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of the system framework of a dynamic range measuring method according to the present application;
FIG. 2 is a flowchart of a process for acquiring RAW images according to the present application;
FIG. 3 is a schematic view of a scene in which a plurality of RAW images are captured;
FIG. 4 is a flowchart of a dynamic range measuring method according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a dynamic range measuring apparatus according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a computing device according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides a dynamic range measuring method and a related device, which are used for improving the accuracy of measuring the dynamic range of an image sensor.
Embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application. The terminology used in the description of the embodiments of the application herein is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application. As one of ordinary skill in the art can know, with the development of technology and the appearance of new scenes, the technical scheme provided by the embodiment of the application is also applicable to similar technical problems.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. "At least one of" the following items means any combination of these items, including any combination of a single item or of plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be singular or plural.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The following description is given of some terms or terminology used in connection with the present application, which also forms part of the summary of the application.
The dynamic range of a scene is determined by the ratio of the highest light level of the scene to its lowest light level. The dynamic range of an image sensor is determined by the ratio of the highest light level in the scene that just saturates the sensor to the lowest light level the sensor can detect. When the dynamic range of a scene exceeds that of the image sensor, scene details beyond the sensor's dynamic range are lost in the imaging result.
Under the same image processing algorithm, an image sensor with stronger dynamic range capability produces images with a better dynamic range. Objectively evaluating the dynamic range capability of image sensors therefore helps camera manufacturers select better sensors during the device selection stage.
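As an illustrative sketch (not part of the claimed method), the ratio definition above can be expressed numerically, together with the dB and stops conventions commonly used for sensors; the 10-bit example values are assumptions:

```python
import math

# A dynamic range expressed three common ways: as the plain ratio of the
# highest to the lowest usable light level, in dB (the 20*log10 sensor
# convention), and in photographic stops (log2).
def dynamic_range(high_level: float, low_level: float) -> dict:
    ratio = high_level / low_level
    return {
        "ratio": ratio,
        "db": 20 * math.log10(ratio),
        "stops": math.log2(ratio),
    }

# e.g. a sensor saturating at gray value 1023 with a noise floor near 1
dr = dynamic_range(1023.0, 1.0)
```

The three forms carry the same information; only the unit convention differs.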
In view of the above, the present application provides a dynamic range measuring method and a related device for improving the accuracy of measuring the dynamic range of an image sensor. Referring to fig. 1, fig. 1 is a schematic diagram of the system framework of the dynamic range measuring method according to the present application. As shown in fig. 1, the dynamic range measuring scheme of the present application mainly includes a data acquisition system and a data processing system. The data acquisition system acquires the RAW images, and the data processing system performs the dynamic range measuring method on the acquired RAW images.
Next, an example in which the data acquisition system acquires a RAW image will be described in the present application. Referring to fig. 2, fig. 2 is a flowchart illustrating a process of capturing a RAW image according to the present application. As shown in fig. 2, the process of acquiring a RAW image includes:
101. the photographing device and the test chart card are deployed.
First, a photographing device (equipped with an image sensor) and a test chart are deployed, and the positional relationship between them is adjusted. The photographing device may be any device with photographing and imaging capability, such as a mobile phone or a camera, and is not limited here. In practice, the images produced by a photographing device are often optimized by image processing algorithms, which can strongly interfere with the dynamic range measurement and seriously affect the authenticity of the result. The image optimization functions of the photographing device therefore need to be turned off; for example, the device may be switched to a professional mode.
102. The light source parameters and photographing device parameters are configured, and the test chart is photographed.
The light source parameters, including color temperature and brightness, are configured, and the photographing device parameters, including exposure time and sensitivity, are configured. RAW images of the test chart are then photographed at different brightnesses. For example, referring to fig. 3, fig. 3 is a schematic view of a scene in which a plurality of RAW images are captured. As shown in fig. 3, a plurality of glass sheets 203 with different transmittances may be placed between the image sensor (inside the photographing device 201 in fig. 3) and the test chart 202, so that a single exposure of the test chart yields multiple image areas of different brightness within one image. In this case, the plurality of RAW images of the present application are those image areas: each image area of different brightness within the one image is one RAW image. Alternatively, a uniformly bright plane may be photographed repeatedly while the light source brightness is adjusted, forming a plurality of RAW images under different illumination intensities: each such image is one RAW image of the present application. RAW images may also be obtained in other ways, which the present application does not limit.
Thus, a plurality of RAW images of the test chart, captured by the image sensor at the same sensitivity (ISO), is obtained.
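The glass-sheet variant amounts to slicing one capture into per-filter regions. A minimal sketch; the vertical-strip layout, region count, and synthetic frame are assumptions for illustration, since the real region coordinates come from the physical arrangement of the filters:

```python
import numpy as np

# Split one frame into n_regions vertical strips, one per transmittance
# filter; each strip is then treated as one "RAW image".
def split_into_regions(frame: np.ndarray, n_regions: int) -> list:
    height, width = frame.shape
    strip = width // n_regions
    return [frame[:, i * strip:(i + 1) * strip] for i in range(n_regions)]

# synthetic 8x32 frame with four constant-brightness bands: 0, 1, 2, 3
frame = np.tile(np.repeat(np.arange(4), 8), (8, 1))
regions = split_into_regions(frame, 4)
```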
103. By switching between different ISO settings, the image sensor captures a plurality of RAW images at each ISO.
After the plurality of RAW images at the current ISO has been obtained in step 102, the exposure time is kept unchanged, the ISO is switched, and the test chart is photographed again, yielding one group of RAW images (a plurality of RAW images) for each ISO. In practice, for ease of observation, the light source brightness used in steps 102 and 103 may be adjusted so that the group of RAW images for each ISO includes at least one over-exposed RAW image.
It should be understood that, in practical applications, the RAW image may be acquired by other ways, which is not limited herein. In the present application, only the flow of acquiring a RAW image shown in fig. 2 is described as an example.
After the RAW image is obtained, the data processing system performs the dynamic range measuring method according to the present application based on the acquired RAW image. Referring to fig. 4, fig. 4 is a flow chart illustrating a dynamic range measurement method according to an embodiment of the application. As shown in fig. 4, the method for measuring dynamic range in the embodiment of the present application includes:
301. A plurality of RAW images are acquired.
A plurality of RAW images are acquired. Each RAW image is generated by an image sensor photographing a test chart at the same sensitivity (ISO) under one illumination brightness, and each RAW image corresponds to one illumination brightness. A RAW image in the present application is an image that has not undergone post-processing (e.g., adjustment by an image processing algorithm) after capture, which preserves the accuracy and authenticity of the dynamic range measurement result. The RAW image may be, for example, an image in ARW, DNG, or CRW format. The more RAW images there are, the more uniformly and densely their gray values are distributed, and the more accurate the dynamic range measurement result.
In practice, as in the example of fig. 3, a plurality of glass sheets with different transmittances may be placed between the image sensor and the test chart, so that a single exposure of the test chart yields multiple image areas of different brightness within one image. In this case, the plurality of RAW images of the present application are those image areas: each image area of different brightness within the one image is one RAW image. Alternatively, a uniformly bright plane may be photographed repeatedly while the light source brightness is adjusted, forming a plurality of RAW images under different illumination intensities: each such image is one RAW image of the present application. RAW images may also be obtained in other ways, which the present application does not limit.
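As a stand-in for the physical capture step, the brightness sweep can be simulated; the brightness levels, unit read noise, and 10-bit full scale below are illustrative assumptions, not values from the application:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate one group of RAW captures at a fixed ISO: each frame's signal
# level tracks one illumination brightness, with fixed read noise, clipped
# to a 10-bit range.
def capture_raw_set(brightness_levels, shape=(16, 16), full_scale=1023.0):
    raws = []
    for level in brightness_levels:
        noisy = np.full(shape, float(level)) + rng.normal(0.0, 1.0, shape)
        raws.append(np.clip(noisy, 0.0, full_scale))
    return raws

levels = [2, 8, 32, 128, 512, 1200]   # the last level over-exposes (clips)
raw_set = capture_raw_set(levels)
```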
302. An RGB image corresponding to each RAW image is generated.
Each acquired RAW image is converted into an RGB image, yielding a plurality of RGB images at the same ISO.
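The application does not specify the RAW-to-RGB pipeline; as an assumed minimal stand-in for a full demosaicing step, each 2x2 RGGB Bayer cell can be collapsed into one RGB pixel:

```python
import numpy as np

# Minimal RGGB Bayer-to-RGB conversion: each 2x2 cell becomes one RGB pixel
# (R, average of the two G samples, B), i.e. a half-resolution demosaic.
def bayer_rggb_to_rgb(raw: np.ndarray) -> np.ndarray:
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    return np.stack([r, (g1 + g2) / 2.0, b], axis=-1)

raw = np.arange(16, dtype=float).reshape(4, 4)
rgb = bayer_rggb_to_rgb(raw)
```

A real measurement would use the camera's RAW decoder; for gray-value and SNR statistics on flat chart patches, even a crude conversion like this is often sufficient.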
303. A low light image of the plurality of RGB images is determined.
As described above, each RAW image is generated by the image sensor photographing the test chart at the same sensitivity (ISO) under one illumination brightness, and each RAW image corresponds to one illumination brightness, so the gray values of the RAW images differ, and so do the gray values of the converted RGB images. When the gray value of an RGB image is too low, its signal-to-noise ratio (SNR) is too small: image detail is submerged in noise, and the image quality produced by the sensor falls below the minimum acceptable quality.
In the present application, the signal-to-noise ratio of each RGB image is calculated, and an RGB image whose SNR falls within a preset range is determined to be a low-light image. For example, following ISO 12232, the preset range may be set to SNR = 10, and an RGB image satisfying SNR = 10 is the low-light image. In practice there may be no RGB image that exactly meets SNR = 10; the preset range can then be widened to a band around SNR = 10, for example determining all RGB images with an SNR between 9 and 11 to be low-light images.
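A sketch of the screening step, computing the SNR of a flat patch as the plain ratio mean/std (the SNR = 10 criterion of ISO 12232; note that a ratio of 10 corresponds to 20 dB in the 20*log10 convention, so whichever unit is adopted must be used consistently). The synthetic patches are illustrative:

```python
import numpy as np

# SNR of a flat patch as the plain ratio mean/std.
def snr(patch: np.ndarray) -> float:
    return float(patch.mean() / patch.std())

def pick_low_light(images, target=10.0, tol=1.0):
    """Return the images whose SNR falls in [target - tol, target + tol]."""
    return [img for img in images if abs(snr(img) - target) <= tol]

rng = np.random.default_rng(1)
# synthetic flat patches with unit noise: SNR is roughly the mean level
images = [rng.normal(m, 1.0, (64, 64)) for m in (2.0, 6.0, 10.2, 80.0)]
low_light = pick_low_light(images)
```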
304. The dynamic range of the image sensor at the ISO is determined according to the gray value of the low-light image.
After the low-light image is determined, the dynamic range of the image sensor at that ISO can be determined according to its gray value.
For example, if a plurality of low-light images is determined, the gray value at which the SNR of the RAW image equals 10 can be calculated from the signal-to-noise ratio and gray value of each low-light image, combined with a specific algorithm (for example, an interpolation algorithm or an averaging algorithm). The dynamic range is then calculated with that gray value as the lowest light level of the dynamic range (i.e., the lowest light level that can be detected by the image sensor).
If there is exactly one low-light image with SNR = 10, the dynamic range is calculated with the gray value of that low-light image as the lowest light level of the dynamic range (i.e., the lowest light level that can be detected by the image sensor).
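One instance of the interpolation variant mentioned above, using linear interpolation over assumed (SNR, gray value) pairs:

```python
import numpy as np

# Interpolate the gray value at exactly SNR = 10 from the neighbouring
# low-light images. The (SNR, gray value) pairs are illustrative.
snr_values  = np.array([7.8, 9.1, 10.6, 12.4])    # measured SNR per image
gray_values = np.array([18.0, 21.0, 25.0, 30.0])  # mean gray value per image

# np.interp requires increasing x-coordinates; SNR rises with gray value
# in this regime, so the pairs can be used directly.
gray_at_snr_10 = float(np.interp(10.0, snr_values, gray_values))
```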
In the present application, the low-light image is located among multiple RAW images captured by the image sensor at the same ISO under different illumination brightnesses, and the dynamic range of the sensor at that ISO is determined according to the gray value of the low-light image, thereby improving the accuracy of the dynamic range measurement.
In one possible implementation, in addition to the low-light image, a high-light image among the plurality of RGB images also needs to be determined; the high-light image in the present application is the image with the lowest brightness among all the over-exposed RGB images. The dynamic range is then calculated using the gray value of the high-light image as the highest light level of the dynamic range (i.e., the highest light level that just saturates the image sensor).
In one possible implementation, the dynamic range may be calculated directly using the upper gray limit of the test chart as the highest light level of the dynamic range (i.e., the highest light level that just saturates the image sensor).
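Both numerator variants reduce to the same ratio computation. A sketch that picks the high-light gray value as the lowest mean gray among the over-exposed captures; the gray values assume a hypothetical 10-bit sensor:

```python
import math

FULL_SCALE = 1023.0  # assumed 10-bit saturation level

# Lowest mean gray among the over-exposed (near-saturation) captures,
# i.e. the capture that just saturates the sensor.
def pick_highlight_gray(mean_grays, saturation=FULL_SCALE, margin=1.0):
    overexposed = [g for g in mean_grays if g >= saturation - margin]
    return min(overexposed)

low_gray = 23.4   # gray value of the low-light image (illustrative)
high_gray = pick_highlight_gray([23.4, 110.0, 480.0, 1022.6, 1023.0])
dr_ratio = high_gray / low_gray
dr_db = 20 * math.log10(dr_ratio)   # optional dB form of the same ratio
```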
Further, the dynamic range exhibited by the image sensor differs at different ISO values, so a calculated dynamic range only characterizes the sensor at the ISO at which the RAW images were captured; dynamic ranges at other ISO values must be calculated from RAW images photographed at those ISO values, using the dynamic range measuring method of the present application. After the dynamic ranges of the image sensor at different ISO values have been calculated (yielding a plurality of dynamic ranges), dynamic range information of the image sensor is generated from them; this information indicates the dynamic range of the sensor at each ISO, making the measurement result richer and more comprehensive. In practice, the dynamic range information may be presented as a table, each entry of which records the dynamic range corresponding to one ISO; alternatively, a specific algorithm (such as an interpolation algorithm or an averaging algorithm) may be used to compute, from the plurality of dynamic ranges, a curve of the sensor's dynamic range across ISO values.
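A sketch of assembling per-ISO results into the table and curve forms described above; the ISO-to-dynamic-range pairs are invented for illustration, and interpolating in log2(ISO) is one reasonable choice since ISO steps are multiplicative:

```python
import numpy as np

# Hypothetical measurements: ISO -> dynamic range in dB
measured = {100: 38.5, 400: 34.0, 1600: 29.2, 6400: 24.8}

iso_axis = np.array(sorted(measured), dtype=float)
dr_axis = np.array([measured[int(i)] for i in iso_axis])

# Capability curve: interpolate in log2(ISO) between measured points.
def dr_at_iso(iso: float) -> float:
    return float(np.interp(np.log2(iso), np.log2(iso_axis), dr_axis))

# Capability table: one entry per measured ISO.
capability_table = [(int(i), measured[int(i)]) for i in iso_axis]
```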
Next, in order to better implement the above-mentioned scheme of the embodiment of the present application, the embodiment of the present application further provides a related device for implementing the above-mentioned scheme. Specifically, referring to fig. 5, fig. 5 is a schematic structural diagram of a dynamic range measuring device according to an embodiment of the application. As shown in fig. 5, the dynamic range measuring apparatus includes:
an acquiring unit 401, configured to acquire a plurality of RAW images, where each RAW image is generated by an image sensor photographing a test chart at the same sensitivity (ISO) under one illumination brightness, and each RAW image corresponds to one illumination brightness;
a processing unit 402, configured to generate an RGB image corresponding to each RAW image;
the processing unit 402 is further configured to determine a low-light image among the plurality of RGB images, where the low-light image is an RGB image whose signal-to-noise ratio is within a preset range;
the processing unit 402 is further configured to determine the dynamic range of the image sensor at the ISO according to the gray value of the low-light image.
In one possible design, the plurality of RAW images includes at least one over-exposed RAW image.
Based on the second aspect, in an alternative embodiment, the processing unit 402 is further configured to:
determining a highlight image in the plurality of RGB images, wherein the highlight image is the image with the lowest brightness in all the overexposed RGB images of the plurality of RGB images;
The processing unit 402 is specifically configured to, when determining the dynamic range of the image sensor under ISO according to the gray value of the low-light image:
the ratio between the gray value of the high light image and the gray value of the low light image is determined as the dynamic range of the image sensor under ISO.
Based on the second aspect, in an alternative implementation manner, the processing unit 402 is specifically configured to, when determining the dynamic range of the image sensor under ISO according to the gray value of the low-light image:
The dynamic range of the image sensor under ISO is determined according to the ratio between the upper gray value of the test chart card and the gray value of the low-light image.
Based on the second aspect, in an alternative implementation manner, the processing unit 402 is specifically configured to, when determining a low-light image of the plurality of RGB images:
an RGB image with a signal-to-noise ratio of 10DB was determined to be a low-light image.
Based on the second aspect, in an alternative embodiment, the processing unit 402 is further configured to:
Determining dynamic ranges of the image sensor under different ISO to obtain a plurality of dynamic ranges;
Dynamic range information of the image sensor is generated according to the plurality of dynamic ranges, and the dynamic range information indicates dynamic ranges of the image sensor under different ISO.
Based on the second aspect, in an alternative implementation manner, the processing unit 402 is specifically configured to, when generating the dynamic range information of the image sensor according to the plurality of dynamic ranges:
combine the plurality of dynamic ranges to obtain a dynamic range capability table of the image sensor, wherein the dynamic range capability table comprises a plurality of table entries, and each table entry corresponds to the dynamic range of the image sensor under one ISO.
Based on the second aspect, in an alternative implementation manner, the processing unit 402 is specifically configured to, when generating the dynamic range information of the image sensor according to the plurality of dynamic ranges:
convert the plurality of dynamic ranges into a dynamic range capability curve of the image sensor, wherein each coordinate on the dynamic range capability curve corresponds to the dynamic range of the image sensor under one ISO.
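As an illustrative sketch (the function names and the example numbers are assumptions, not from the patent), the per-ISO measurements can be combined into the capability table and curve described above:

```python
def build_dr_capability_table(dr_by_iso):
    """Combine per-ISO dynamic range measurements into a capability
    table: one (ISO, dynamic range) entry per measured ISO."""
    return sorted(dr_by_iso.items())

def dr_capability_curve(table):
    """Split the table into (ISO axis, dynamic-range axis) lists, so
    each coordinate corresponds to the dynamic range at one ISO."""
    isos = [iso for iso, _ in table]
    drs = [dr for _, dr in table]
    return isos, drs

# Example: dynamic range measured at three ISO settings.
table = build_dr_capability_table({400: 800.0, 100: 1000.0, 1600: 500.0})
print(table)  # → [(100, 1000.0), (400, 800.0), (1600, 500.0)]
```

Sorting by ISO gives the table a stable entry order and makes the curve coordinates monotonic along the ISO axis.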
It should be noted that the information interaction and execution processes between the modules/units of the dynamic range measuring apparatus are based on the same concept as the method embodiment corresponding to fig. 4 of the present application; for details, reference may be made to the description in the foregoing method embodiment, which is not repeated herein.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a computing device according to an embodiment of the present application. The dynamic range measuring apparatus described in the embodiment corresponding to fig. 5 may be disposed on the computing device 500 to implement the functions in the embodiment corresponding to fig. 4. Specifically, the computing device 500 is implemented by one or more servers and may vary considerably in configuration or performance. It may include one or more central processing units (CPU) 522 (e.g., one or more processors), a memory 532, and one or more storage media 530 (e.g., one or more mass storage devices) storing application programs 542 or data 544. The memory 532 and the storage medium 530 may be transitory or persistent. The program stored on the storage medium 530 may include one or more modules (not shown), each of which may include a series of instruction operations for the computing device. Still further, the central processing unit 522 may be arranged to communicate with the storage medium 530 to execute, on the computing device 500, the series of instruction operations in the storage medium 530.
The computing device 500 may also include one or more power supplies 526, one or more wired or wireless network interfaces 550, one or more input/output interfaces 558, and/or one or more operating systems 541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
It should be noted that the information interaction and execution processes between the modules/units of the computing device are based on the same concept as the method embodiment corresponding to fig. 4 of the present application; for details, reference may be made to the description in the foregoing method embodiment, which is not repeated herein.
Embodiments of the present application also provide a computer program product including instructions. The computer program product may be software, or a program product containing instructions, capable of running on a computing device or being stored in any usable medium. When the computer program product runs on at least one computer device, the at least one computer device is caused to perform the method described in the embodiment shown in fig. 4.
An embodiment of the present application also provides a computer-readable storage medium. The computer-readable storage medium may be any usable medium that a computing device can store, or a data storage device, such as a data center, containing one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk), etc. The computer-readable storage medium includes instructions that instruct a computing device to perform the method described in the embodiment corresponding to fig. 4.
The apparatus provided by the embodiment of the present application may be a chip. The chip includes a processing unit and a communication unit; the processing unit may be, for example, a processor, and the communication unit may be, for example, an input/output interface, a pin, or a circuit. The processing unit may execute the computer-executable instructions stored in the storage unit to cause the chip to perform the method described in the embodiment shown in fig. 4. Optionally, the storage unit is a storage unit in the chip, such as a register or a cache; the storage unit may also be a storage unit located outside the chip on the wireless access device side, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, or a random access memory (RAM).
It should be further noted that the above apparatus embodiments are only schematic. The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the apparatus embodiments provided by the present application, the connection relation between modules indicates that they have communication connections with each other, which may be specifically implemented as one or more communication buses or signal lines.
From the above description of the embodiments, it will be apparent to those skilled in the art that the present application may be implemented by software plus necessary general-purpose hardware, or of course by dedicated hardware including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components, and the like. Generally, functions performed by a computer program can easily be implemented by corresponding hardware, and the specific hardware structures used to implement the same function can vary, such as analog circuits, digital circuits, or dedicated circuits. In most cases of the present application, however, a software program implementation is the preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied, essentially or in the part contributing to the prior art, in the form of a software product stored in a readable storage medium, such as a floppy disk, a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk of a computer, and including several instructions for causing a computer device (which may be a personal computer, a training device, a network device, etc.) to perform the methods according to the embodiments of the present application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, training device, or data center to another website, computer, training device, or data center via a wired (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (e.g., infrared, radio, or microwave) connection. The computer-readable storage medium may be any usable medium that a computer can store, or a data storage device, such as a training device or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), etc.

Claims (10)

1. A method of measuring dynamic range, comprising:
acquiring a plurality of RAW images, wherein each RAW image is generated by shooting a test chart card under illumination brightness by an image sensor under the same light sensitivity ISO, and each RAW image corresponds to one illumination brightness respectively;
Generating an RGB image corresponding to each RAW image;
determining a low-light image in the plurality of RGB images, wherein the low-light image is an RGB image with a signal-to-noise ratio within a preset range;
and determining the dynamic range of the image sensor under the ISO according to the gray value of the low-light image.
2. The method of claim 1, wherein the plurality of RAW images comprises at least one overexposed RAW image.
3. The method according to claim 2, wherein the method further comprises:
Determining a highlight image in the plurality of RGB images, wherein the highlight image is the image with the lowest brightness among all the overexposed RGB images of the plurality of RGB images;
the determining the dynamic range of the image sensor under the ISO according to the gray value of the low-light image comprises the following steps:
And determining the ratio between the gray value of the high-light image and the gray value of the low-light image as the dynamic range of the image sensor under the ISO.
4. The method of claim 2, wherein said determining the dynamic range of the image sensor under the ISO from the gray scale value of the low light image comprises:
and determining the ratio between the gray upper limit value of the test chart card and the gray value of the low-light image as the dynamic range of the image sensor under the ISO.
5. The method of any one of claims 1 to 4, wherein the determining a low light image of the plurality of RGB images comprises:
determining an RGB image with a signal-to-noise ratio of 10 dB to be the low-light image.
6. The method according to any one of claims 1 to 4, further comprising:
determining dynamic ranges of the image sensor under different ISO to obtain a plurality of dynamic ranges;
Dynamic range information of the image sensor is generated according to the plurality of dynamic ranges, and the dynamic range information indicates dynamic ranges of the image sensor under the different ISO.
7. The method of claim 6, wherein the generating dynamic range information for the image sensor from the plurality of dynamic ranges comprises:
combining the plurality of dynamic ranges to obtain a dynamic range capability table of the image sensor, wherein the dynamic range capability table comprises a plurality of table entries, and each table entry corresponds to the dynamic range of the image sensor under one ISO.
8. The method of claim 6, wherein the generating dynamic range information for the image sensor from the plurality of dynamic ranges comprises:
converting the plurality of dynamic ranges into a dynamic range capability curve of the image sensor, wherein each coordinate on the dynamic range capability curve corresponds to the dynamic range of the image sensor under one ISO.
9. A computing device comprising a processor and a memory, the processor coupled to the memory,
The memory is used for storing programs;
The processor is configured to execute the program in the memory to cause the computing device to perform the method of any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 8.
CN202211217941.1A 2022-09-30 Dynamic range measuring method and related device Active CN116744135B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211217941.1A CN116744135B (en) 2022-09-30 Dynamic range measuring method and related device

Publications (2)

Publication Number Publication Date
CN116744135A CN116744135A (en) 2023-09-12
CN116744135B true CN116744135B (en) 2024-05-14

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5760829A (en) * 1995-06-06 1998-06-02 United Parcel Service Of America, Inc. Method and apparatus for evaluating an imaging device
CN108965867A (en) * 2018-07-25 2018-12-07 首都师范大学 A kind of camera image calculation method of parameters and device
CN109660789A (en) * 2018-11-26 2019-04-19 维沃移动通信(杭州)有限公司 A kind of measuring device and camera dynamic range measurement method
JP2020048003A (en) * 2018-09-14 2020-03-26 日本放送協会 Dynamic range measuring device and program
CN111213372A (en) * 2017-07-26 2020-05-29 惠普发展公司,有限责任合伙企业 Evaluation of dynamic range of imaging device
CN112954310A (en) * 2021-02-05 2021-06-11 上海研鼎信息技术有限公司 Image quality detection method, device, computer equipment and readable storage medium
CN115049530A (en) * 2022-06-17 2022-09-13 瑞芯微电子股份有限公司 Method, apparatus and system for debugging image signal processor

Similar Documents

Publication Publication Date Title
CN109862282B (en) Method and device for processing person image
KR102636439B1 (en) Method of flicker reduction
US20090309998A1 (en) Electronic image capture with reduced noise
EP3204812B1 (en) Microscope and method for obtaining a high dynamic range synthesized image of an object
CN111062870A (en) Processing method and device
JP2020042760A (en) Information processing method, information processing device, and program
CN109361853A (en) Image processing method, device, electronic equipment and storage medium
CN113299216A (en) Gamma debugging method, device, equipment and storage medium
CN116744135B (en) Dynamic range measuring method and related device
CN114429476A (en) Image processing method, image processing apparatus, computer device, and storage medium
CN113639881A (en) Color temperature testing method and device, computer readable medium and electronic equipment
JP2008503156A (en) Multi-gain data processing
KR101992403B1 (en) Skin moisture estimating method using image
CN108810509A (en) A kind of image color correction method and device
CN110677558A (en) Image processing method and electronic device
CN116744135A (en) Dynamic range measuring method and related device
CN112651945A (en) Multi-feature-based multi-exposure image perception quality evaluation method
KR20170048454A (en) Image processing method and device
CN111917986A (en) Image processing method, medium thereof, and electronic device
CN113038026B (en) Image processing method and electronic device
WO2019126916A1 (en) Testing method and apparatus, and terminal
US11812167B2 (en) Determining pixel intensity values in imaging
KR101653649B1 (en) 3D shape measuring method using pattern-light with uniformity compensation
CN112422841B (en) Image compensation method, image compensation device, computer equipment and storage medium
JP3941546B2 (en) Contrast enhancement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant