CN111861959A - Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm - Google Patents

Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm Download PDF

Info

Publication number
CN111861959A
CN111861959A CN202010681348.7A CN202010681348A CN111861959A CN 111861959 A CN111861959 A CN 111861959A CN 202010681348 A CN202010681348 A CN 202010681348A CN 111861959 A CN111861959 A CN 111861959A
Authority
CN
China
Prior art keywords
ultra
image
wide dynamic
pixel point
long depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010681348.7A
Other languages
Chinese (zh)
Inventor
伍思樾
顾兆泰
李娜娜
李明
安昕
张浠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Optomedic Technology Co Ltd
Original Assignee
Guangdong Optomedic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Optomedic Technology Co Ltd filed Critical Guangdong Optomedic Technology Co Ltd
Priority to CN202010681348.7A priority Critical patent/CN111861959A/en
Publication of CN111861959A publication Critical patent/CN111861959A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20208High dynamic range [HDR] image processing

Abstract

The invention discloses an ultra-wide dynamic image synthesis algorithm with ultra-long depth of field, which specifically comprises the following steps: s1: acquiring at least two images; s2: respectively calculating the definition C, the saturation S and the exposure E of each image; s3: respectively calculating the weight W of each image according to the definition C, the saturation S and the exposure E of each image; s4: carrying out weighted average processing on the weights W of all the images to synthesize an ultra-long depth-of-field ultra-wide dynamic image; by adopting the image synthesis algorithm, the RGB image can be directly synthesized, and the processing on an RAW format is not needed, so that the processing is convenient; the image synthesis algorithm has low operation complexity, does not need tone mapping and can process in real time; the image synthesis algorithm considers definition, saturation and exposure degree simultaneously, can synthesize an ultra-long depth of field and an ultra-wide dynamic image simultaneously, and can well keep color information.

Description

Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm
Technical Field
The invention relates to an image synthesis algorithm, in particular to an ultra-wide dynamic image synthesis algorithm with ultra-long depth of field.
Background
Due to the physical and digital limitations of cameras, cameras have limited depth of field and limited dynamic range, i.e., limited distance range between which a user can see and limited light and dark regions that can be distinguished. In some fields with higher requirements on depth of field and dynamic range, limited depth of field and dynamic range cause not little trouble, for example, when an endoscope camera system is clinically used, tissues with larger working distance difference in an operation visual field are difficult to see clearly at the same time due to depth of field limitation, and the working distance or focal distance needs to be adjusted repeatedly, so that operation is inconvenient; meanwhile, due to the influence of the limited dynamic range, the doctor is difficult to see the objects with large difference in brightness and darkness at the same time, and discomfort is brought to the doctor.
The existing High Dynamic Range (HDR) synthesis algorithm weights a plurality of RAW format images with different exposures according to the exposure time, synthesizes the HDR images to be represented by a matrix with a wide range of values, and finally maps the HDR images to 0-255 through tone mapping (tonemapping) to carry out correct color display. The algorithm has high calculation complexity, and the requirements of real-time display of 50 and 60 frames of the camera system are difficult to realize. In addition, the algorithm needs to be performed on the RAW format image, and for the commonly used RGB image, the RAW format image needs to be converted for operation, which further increases the computational complexity. Moreover, the quality of the synthesized image is mainly related to the tone mapping method, and the tone mapping is easy to have the problems of detail loss, halo, color cast and the like. Most importantly, this algorithm can only solve the problem of limited dynamic range of the camera, but not the problem of limited depth of field.
Therefore, the prior art still needs to be improved and developed.
Disclosure of Invention
The invention aims to provide an ultra-wide dynamic image synthesis algorithm with ultra-long depth of field, so as to provide an ultra-high resolution output image with ultra-long depth of field.
The technical scheme of the invention is as follows: an ultra-wide dynamic image synthesis algorithm with ultra-long depth of field specifically comprises the following steps:
S1: acquiring at least two images;
s2: respectively calculating the definition C, the saturation S and the exposure E of each image;
s3: respectively calculating the weight W of each image according to the definition C, the saturation S and the exposure E of each image;
s4: and carrying out weighted average processing on the weights W of all the images to synthesize the ultra-long depth-of-field ultra-wide dynamic image.
The ultra-long depth of field and ultra-wide dynamic image synthesis algorithm, wherein in S1, the image includes a close-range clear and non-overexposed image and a far-range clear and non-underexposed image.
In the ultra-wide dynamic image synthesis algorithm with ultra-long depth of field, in S2, each pixel point of each image is slid by 3 × 3 laplacian to obtain the definition C (i, j) of each pixel point.
The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field is characterized in that the calculation formula of the definition C (i, j) of each pixel point is as follows:
Figure DEST_PATH_IMAGE002AA
f (i, j) is the gray value of the pixel point with the coordinate (i, j).
In the ultra-wide dynamic image synthesis algorithm with ultra-long depth of field, in S2, the saturation S (i, j) of each pixel is calculated by using the RGB value of each pixel of each image.
The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field is characterized in that the calculation formula of the saturation S (i, j) of each pixel point is as follows:
Figure DEST_PATH_IMAGE004AA
Figure DEST_PATH_IMAGE006AA
Wherein, R (i, j) is a red component of the pixel point with the coordinate (i, j), G (i, j) is a green component of the pixel point with the coordinate (i, j), B (i, j) is a blue component of the pixel point with the coordinate (i, j), and M (i, j) is a gray value of the pixel point with the coordinate (i, j).
The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field is characterized in that the exposure degree E (i, j) of each pixel point is calculated through the RGB value of each pixel point of each image.
The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field is characterized in that the calculation formula of the exposure degree E (i, j) of each pixel point is as follows:
Figure 654871DEST_PATH_IMAGE007
wherein, G is a preset value, R (i, j) is a red component of a pixel point with coordinates (i, j), G (i, j) is a green component of the pixel point with coordinates (i, j), and B (i, j) is a blue component of the pixel point with coordinates (i, j).
In the ultra-wide dynamic image synthesis algorithm with ultra-long depth of field, in S3, after obtaining the definition C, the saturation S and the exposure E of each pixel in each image, respectively, the weight W of each pixel in each image is calculated; the weighted value W (i, j) of each pixel point is obtained by calculation according to C (i, j), S (i, j) and B (i, j), and the calculation formula is as follows:
Figure 841188DEST_PATH_IMAGE010
in the ultra-wide dynamic image synthesis algorithm with ultra-long depth of field, in S4, the weights of the pixels at the same coordinate in all the images are weighted and averaged one by one to obtain the pixel values of the pixels at the same coordinate in the ultra-wide dynamic image with ultra-long depth of field, and all the pixels in the ultra-wide dynamic image with ultra-long depth of field are obtained Point synthesis is carried out to obtain an ultra-long depth-of-field ultra-wide dynamic image, and the weighting formula of each pixel point is as follows:
Figure DEST_PATH_IMAGE012
wherein, Wn is the weight of the pixel point with the coordinate (i, j) In the nth image, and In (i, j) is the pixel value of the pixel point with the coordinate (i, j) In the nth image.
The invention has the beneficial effects that: the invention provides an ultra-wide dynamic image synthesis algorithm with ultra-long depth of field, which can directly synthesize RGB images without processing on RAW format and is convenient to process; the image synthesis algorithm has low operation complexity, does not need tone mapping and can process in real time; the image synthesis algorithm considers definition, saturation and exposure degree simultaneously, can synthesize an ultra-long depth of field and an ultra-wide dynamic image simultaneously, and can well keep color information.
Drawings
Fig. 1 is a flow chart of steps of an ultra-long depth of field and ultra-wide dynamic image synthesis algorithm in the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like, indicate orientations and positional relationships based on those shown in the drawings, and are used only for convenience of description and simplicity of description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be considered as limiting the present invention. Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; may be mechanically connected, may be electrically connected or may be in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, "above" or "below" a first feature means that the first and second features are in direct contact, or that the first and second features are not in direct contact but are in contact with each other via another feature therebetween. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The following disclosure provides many different embodiments or examples for implementing different features of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or uses of other materials.
As shown in fig. 1, an ultra-wide dynamic image synthesis algorithm with ultra-long depth of field specifically includes the following steps:
s1: at least two images are acquired.
At least 1 image1 with clear near view and no overexposure and at least 1 image2 with clear far view and no underexposure are obtained and synthesized.
S2: the sharpness C, saturation S and exposure E of each image are calculated separately.
The definition C slides through each pixel point of the image through a 3 × 3 Laplacian operator to obtain the definition C (i, j) of each pixel point, and the calculation formula is as follows:
Figure DEST_PATH_IMAGE002AAA
f (i, j) is the gray value of the pixel point with the coordinate (i, j). The larger the value of C, the higher the resolution.
The saturation S is calculated through the RGB value of each pixel point to obtain the saturation S (i, j) of each pixel point, and the calculation formula is as follows:
Figure DEST_PATH_IMAGE004AAA
Figure DEST_PATH_IMAGE006AAA
wherein, R (i, j) is a red component of the pixel point with the coordinate (i, j), G (i, j) is a green component of the pixel point with the coordinate (i, j), B (i, j) is a blue component of the pixel point with the coordinate (i, j), and M (i, j) is a gray value of the pixel point with the coordinate (i, j). The larger the S value, the higher the saturation.
The exposure degree E (i, j) of each pixel point is obtained by calculating the RGB value of each pixel point, and the calculation formula is as follows:
Figure 417443DEST_PATH_IMAGE007
Wherein, G is a preset value and can be set as 0.2. The larger the value of E, the better the degree of exposure.
Wherein, the value ranges of R (i, j), G (i, j), B (i, j) and f (i, j) are 0-1, if not, normalization is needed.
S3: and respectively calculating the weight W of each image according to the definition C, the saturation S and the exposure E of each image.
After the definition C, the saturation S and the exposure E of each pixel point in each image are respectively obtained, the weight W of each pixel point in each image is respectively calculated, the weight value W (i, j) of each pixel point is obtained by calculation according to C (i, j), S (i, j) and B (i, j), and the calculation formula is as follows:
Figure 927239DEST_PATH_IMAGE014
s4: and carrying out weighted average processing on the weights W of all the images to synthesize the ultra-long depth-of-field ultra-wide dynamic image.
Assuming that the weights of the pixel points at the same coordinate in the n images are respectively W1 … Wn, performing weighted average on the weights of the pixel points at the same coordinate in the n images to obtain the pixel value of the pixel point at the same coordinate in the ultra-long depth-of-field ultra-wide dynamic image, and synthesizing all the pixel points in the obtained ultra-long depth-of-field ultra-wide dynamic image to obtain the ultra-long depth-of-field ultra-wide dynamic image, wherein the weighting formula of each pixel point is as follows:
Figure DEST_PATH_IMAGE012A
wherein: i1(I, j) … In (I, j) is the pixel value of the (I, j) th to nth images with the coordinate of the pixel point.
In the description herein, references to the description of the terms "one embodiment," "certain embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
It is to be understood that the invention is not limited to the examples described above, but that modifications and variations may be effected thereto by those of ordinary skill in the art in light of the foregoing description, and that all such modifications and variations are intended to be within the scope of the invention as defined by the appended claims.

Claims (10)

1. An ultra-wide dynamic image synthesis algorithm with ultra-long depth of field is characterized by comprising the following steps:
s1: acquiring at least two images;
s2: respectively calculating the definition C, the saturation S and the exposure E of each image;
S3: respectively calculating the weight W of each image according to the definition C, the saturation S and the exposure E of each image;
s4: and carrying out weighted average processing on the weights W of all the images to synthesize the ultra-long depth-of-field ultra-wide dynamic image.
2. The ultra-long depth-of-field ultra-wide dynamic image synthesis algorithm as claimed in claim 1, wherein in the step S1, the image comprises a close-up clear and non-overexposed image and a far-up clear and non-underexposed image.
3. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 1, wherein in S2, the definition C (i, j) of each pixel is obtained by sliding 3 × 3 laplacian across each pixel in each image.
4. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 3, wherein the calculation formula of the definition C (i, j) of each pixel point is as follows:
Figure 890765DEST_PATH_IMAGE001
f (i, j) is the gray value of the pixel point with the coordinate (i, j).
5. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 1, wherein in S2, the saturation S (i, j) of each pixel is calculated from the RGB value of each pixel of each image.
6. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 5, wherein the calculation formula of the saturation S (i, j) of each pixel point is as follows:
Figure 172841DEST_PATH_IMAGE002
Figure 123349DEST_PATH_IMAGE003
Wherein, R (i, j) is a red component of the pixel point with the coordinate (i, j), G (i, j) is a green component of the pixel point with the coordinate (i, j), B (i, j) is a blue component of the pixel point with the coordinate (i, j), and M (i, j) is a gray value of the pixel point with the coordinate (i, j).
7. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 1, wherein the exposure degree E (i, j) of each pixel point is calculated by the RGB value of each pixel point of each image.
8. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 7, wherein the calculation formula of the exposure degree E (i, j) of each pixel point is as follows:
Figure 98258DEST_PATH_IMAGE004
wherein, G is a preset value, R (i, j) is a red component of a pixel point with coordinates (i, j), G (i, j) is a green component of the pixel point with coordinates (i, j), and B (i, j) is a blue component of the pixel point with coordinates (i, j).
9. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 1, wherein in S3, after obtaining the definition C, the saturation S and the exposure E of each pixel in each image, the weight W of each pixel in each image is calculated; the weighted value W (i, j) of each pixel point is obtained by calculation according to C (i, j), S (i, j) and B (i, j), and the calculation formula is as follows:
Figure 446249DEST_PATH_IMAGE006
10. The ultra-wide dynamic image synthesis algorithm with ultra-long depth of field according to claim 1, wherein in S4, the weights of the pixels at the same coordinate in all the images are weighted and averaged one by one to obtain the pixel value of the pixel at the same coordinate in the ultra-wide dynamic image with ultra-long depth of field, and all the pixels in the obtained ultra-wide dynamic image with ultra-long depth of field are synthesized to obtain the ultra-wide dynamic image with ultra-long depth of field, and the weighting formula of each pixel is as follows:
Figure 736416DEST_PATH_IMAGE007
wherein, Wn is the weight of the pixel point with the coordinate (i, j) In the nth image, and In (i, j) is the pixel value of the pixel point with the coordinate (i, j) In the nth image.
CN202010681348.7A 2020-07-15 2020-07-15 Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm Pending CN111861959A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010681348.7A CN111861959A (en) 2020-07-15 2020-07-15 Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010681348.7A CN111861959A (en) 2020-07-15 2020-07-15 Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm

Publications (1)

Publication Number Publication Date
CN111861959A true CN111861959A (en) 2020-10-30

Family

ID=72984318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010681348.7A Pending CN111861959A (en) 2020-07-15 2020-07-15 Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm

Country Status (1)

Country Link
CN (1) CN111861959A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101021945A (en) * 2007-03-23 2007-08-22 北京中星微电子有限公司 Image composing method and device
US20130070965A1 (en) * 2011-09-21 2013-03-21 Industry-University Cooperation Foundation Sogang University Image processing method and apparatus
CN104616273A (en) * 2015-01-26 2015-05-13 电子科技大学 Multi-exposure image fusion method based on Laplacian pyramid decomposition
CN106030614A (en) * 2014-04-22 2016-10-12 史內普艾德有限公司 System and method for controlling a camera based on processing an image captured by other camera
CN106408518A (en) * 2015-07-30 2017-02-15 展讯通信(上海)有限公司 Image fusion method and apparatus, and terminal device
CN106550194A (en) * 2016-12-26 2017-03-29 珠海格力电器股份有限公司 Photographic method, device and mobile terminal
CN107220956A (en) * 2017-04-18 2017-09-29 天津大学 A kind of HDR image fusion method of the LDR image based on several with different exposures
CN108921806A (en) * 2018-08-07 2018-11-30 Oppo广东移动通信有限公司 A kind of image processing method, image processing apparatus and terminal device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101021945A (en) * 2007-03-23 2007-08-22 北京中星微电子有限公司 Image composing method and device
US20130070965A1 (en) * 2011-09-21 2013-03-21 Industry-University Cooperation Foundation Sogang University Image processing method and apparatus
CN106030614A (en) * 2014-04-22 2016-10-12 史內普艾德有限公司 System and method for controlling a camera based on processing an image captured by other camera
CN104616273A (en) * 2015-01-26 2015-05-13 电子科技大学 Multi-exposure image fusion method based on Laplacian pyramid decomposition
CN106408518A (en) * 2015-07-30 2017-02-15 展讯通信(上海)有限公司 Image fusion method and apparatus, and terminal device
CN106550194A (en) * 2016-12-26 2017-03-29 珠海格力电器股份有限公司 Photographic method, device and mobile terminal
CN107220956A (en) * 2017-04-18 2017-09-29 天津大学 A kind of HDR image fusion method of the LDR image based on several with different exposures
CN108921806A (en) * 2018-08-07 2018-11-30 Oppo广东移动通信有限公司 A kind of image processing method, image processing apparatus and terminal device

Similar Documents

Publication Publication Date Title
EP1924966B1 (en) Adaptive exposure control
JP6111336B2 (en) Image processing method and apparatus
US9558543B2 (en) Image fusion method and image processing apparatus
JP5284537B2 (en) Image processing apparatus, image processing method, image processing program, and imaging apparatus using the same
US8659672B2 (en) Image processing apparatus and image pickup apparatus using same
US8860816B2 (en) Scene enhancements in off-center peripheral regions for nonlinear lens geometries
JP5054248B2 (en) Image processing apparatus, image processing method, image processing program, and imaging apparatus
JP5672796B2 (en) Image processing apparatus and image processing method
JP5677113B2 (en) Image processing device
JP2015122110A (en) High dynamic range image generation and rendering
US10943333B2 (en) Method and apparatus of sharpening of gastrointestinal images based on depth information
JP5411786B2 (en) Image capturing apparatus and image integration program
JP5804857B2 (en) Image processing apparatus, image processing method, and program
CN113409247A (en) Multi-exposure fusion image quality evaluation method
CN111861959A (en) Ultra-long depth of field and ultra-wide dynamic image synthesis algorithm
JP2020145553A (en) Image processing apparatus, image processing method and program
JP5952574B2 (en) Image processing apparatus and control method thereof
CN114979500B (en) Image processing method, image processing apparatus, electronic device, and readable storage medium
JP4125191B2 (en) Image processing apparatus and method
CN114283100A (en) High dynamic range image synthesis and tone mapping method and electronic equipment
JP5952573B2 (en) Image processing apparatus and control method thereof
JP2021082069A (en) Image processing system, imaging apparatus, image processing method, and program
Go et al. Image fusion for single-shot high dynamic range imaging with spatially varying exposures
TWI243598B (en) Device and method to correct an image received at a wide angle
TW201121336A (en) Method for correcting high dynamic range synthetic images

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201030

RJ01 Rejection of invention patent application after publication