WO2017080179A1 - Method, system and device for overcoming abrupt color changes in image fusion - Google Patents

Method, system and device for overcoming abrupt color changes in image fusion

Info

Publication number
WO2017080179A1
WO2017080179A1 (Application PCT/CN2016/083367)
Authority
WO
WIPO (PCT)
Prior art keywords
image
fused
adjacent
panoramic
fusion
Prior art date
Application number
PCT/CN2016/083367
Other languages
English (en)
French (fr)
Inventor
周珣
Original Assignee
乐视控股(北京)有限公司
乐卡汽车智能科技(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐卡汽车智能科技(北京)有限公司
Publication of WO2017080179A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00: Geometric image transformations in the plane of the image
    • G06T 3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 5/94: Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/10012: Stereo images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20021: Dividing image into blocks, subimages or windows
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Definitions

  • The embodiments of the present invention relate to the field of image stitching technology and, in particular, to a method, system, and device for overcoming abrupt color changes in image fusion.
  • Image stitching is a technique for joining multiple images with overlapping portions into a panoramic image, where the overlapping images are captured from different viewing angles.
  • The images at different viewing angles are acquired simultaneously by multiple (two or more) imaging devices (e.g., cameras).
  • Typical applications of this type of image stitching technique are multi-camera panoramic cameras, panoramic parking systems, and the like.
  • Image fusion, one of the key technologies of image stitching, superimposes the overlapping field-of-view regions of the images to be fused (the fusion regions) to reconstruct a seamless panoramic image.
  • Because the multiple imaging devices expose independently and the illumination conditions of the scene differ across viewing angles, adjusting exposure to the scene inevitably causes the captured images to be fused to differ in brightness and chromaticity.
  • If images with such brightness and chromaticity differences are superimposed and fused directly, the differences are amplified by contrast, and the fused panoramic image exhibits abrupt color changes that degrade the visual result.
  • The greater the brightness and chromaticity difference between the images to be fused, the more obvious the color discontinuity in the stitched panoramic image.
  • The solutions provided by the prior art mostly use image processing algorithms to adjust brightness and chromaticity.
  • Although the image processing algorithms may differ, and the processing point may differ (for example, processing the images to be fused before fusion, or smoothing the panoramic image after fusion), they all share the same technical defects: limited effectiveness, excessive consumption of system resources, and reduced processing efficiency.
  • The present invention discloses a method for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, which includes the following steps: quantizing the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image; calculating the deviation amount of the fusion region of adjacent images to be fused; and performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range, wherein N ≥ 2.
  • The invention also discloses a system for overcoming abrupt color changes in image fusion, suitable for a panoramic device comprising N imaging devices, comprising:
  • a quantization unit configured to quantize the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image;
  • a data processing unit configured to calculate the deviation amount of the fusion region of adjacent images to be fused;
  • a gain adjustment unit configured to perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
  • The invention also discloses a panoramic device, comprising:
  • N imaging devices, wherein N ≥ 2;
  • N analog-to-digital converters, each connected to a corresponding one of the N imaging devices, for quantizing the image electrical signals collected by the imaging devices to form the images to be fused of the panoramic image;
  • a processor connected to the N analog-to-digital converters, for calculating the deviation amount of the fusion region of adjacent images to be fused and performing gain adjustment on the images to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
  • Embodiments of the present application provide a system for overcoming abrupt color changes in image fusion, suitable for a panoramic device including N imaging devices, including a memory, one or more processors, and one or more programs, wherein the one or more programs, when executed by the one or more processors, perform the above-described operations of quantization, deviation calculation, and gain adjustment.
  • Embodiments of the present application provide a computer-readable storage medium having stored thereon computer-executable instructions that, in response to execution, cause a system for overcoming abrupt color changes in image fusion to perform operations, the operations including: quantizing the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image; calculating the deviation amount of the fusion region of adjacent images to be fused; and performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
  • With the technical solution of the present application, the gain adjustment can be performed in hardware, so that the output brightness and chromaticity of the fusion regions of all adjacent images to be fused remain substantially consistent, overcoming the effectiveness, excessive resource consumption, and processing efficiency problems of existing image fusion techniques that adjust brightness and chromaticity through image processing algorithms.
  • FIG. 1 is a flowchart of one embodiment of the method for overcoming abrupt color changes in image fusion provided by an embodiment of the present application.
  • FIG. 2 is a flowchart of another preferred embodiment of the method provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a system for overcoming abrupt color changes in image fusion according to an embodiment of the present application.
  • FIG. 4 is a block diagram of a panoramic device according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a system for overcoming abrupt color changes in image fusion according to an embodiment of the present application.
  • FIG. 6 illustrates a computer program product for overcoming abrupt color changes in image fusion provided by an embodiment of the present application.
  • As its main subject matter, an embodiment of the present application discloses a method for overcoming abrupt color changes in image fusion, which applies to the technical processing of the image fusion stage of the image stitching process, so as to overcome the various defects of the image processing algorithms used in existing image fusion techniques.
  • The embodiments of the present application apply to panoramic imaging systems that include N (two or more) imaging devices, such as a panoramic parking system, in which multiple imaging devices (for example, cameras) simultaneously acquire image electrical signals that are finally stitched into a complete panoramic image.
  • FIG. 1 shows a flowchart of one embodiment, which includes: S1: Quantize the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image.
  • First, this step processes the image electrical signals collected by the imaging devices; these signals can be obtained from the photosensitive plane of the imaging device (for example, an image sensor).
  • Second, the image electrical signals are quantized to obtain the images to be fused.
  • The images to be fused referred to here are the digital images that need to undergo image fusion processing in order to be stitched into a panoramic image.
  • The quantization can be implemented with an analog-to-digital converter, which yields each image to be fused.
  • Those skilled in the art can select an 8-bit, 16-bit, 24-bit, or higher quantization word length according to the actual precision requirements.
  • Each image to be fused may also undergo a series of processing steps that precede image fusion in the image stitching process, for example noise reduction and registration. In other words, the technical solution provided by the embodiments of the present application does not affect the application of existing techniques to the other stages of image stitching; conversely, those other stages can be combined with the technical solution provided by the embodiments of the present application.
  • It should be noted that step S1 is performed frame by frame for each frame of the panoramic image; otherwise the panoramic image cannot be reconstructed.
  • S2: Calculate the deviation amount of the fusion region of adjacent images to be fused. In this step, the deviation amount is a statistic of the deviation between the quantized values at corresponding positions in the fusion (overlap) region: it may be, for example, the sum or sum of squares of the differences between the pixel values of all points of the adjacent images to be fused in that region; the difference between the mean pixel values of all points in that region; a deviation computed from pixels sampled in the region according to some rule; or any other statistic chosen by those skilled in the art according to actual needs.
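  • As a minimal sketch of such a statistic (the function and parameter names below are illustrative assumptions, not terminology from the disclosure), the deviation amount of the fusion region of two adjacent images to be fused could be computed as follows:

```python
import numpy as np

def deviation_amount(region_a: np.ndarray, region_b: np.ndarray,
                     mode: str = "mean_diff", step: int = 4) -> float:
    """Deviation statistic of the fusion (overlap) region of two adjacent
    images to be fused; both arrays hold quantized pixel values of the same
    shape (H x W, or H x W x C)."""
    a = region_a.astype(np.float64)
    b = region_b.astype(np.float64)
    if mode == "sum_diff":    # sum of the per-pixel differences
        return float(np.sum(a - b))
    if mode == "sum_sq":      # sum of the squared per-pixel differences
        return float(np.sum((a - b) ** 2))
    if mode == "mean_diff":   # difference of the mean pixel values
        return float(a.mean() - b.mean())
    if mode == "sampled":     # sample pixels on a regular grid, then compare means
        return float(a[::step, ::step].mean() - b[::step, ::step].mean())
    raise ValueError(f"unknown mode: {mode}")
```

  • A signed statistic such as the difference of means is convenient for the gain stage, because its sign indicates which of the two adjacent images is darker.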
  • S3: Perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
  • Using gain adjustment to bring the deviation amount of the fusion region of adjacent images to be fused within a preset range handles the brightness and chromaticity differences in hardware (a gain amplifier, or a hardware processor capable of performing gain adjustment), avoiding the drawbacks of the image processing algorithms of the prior art; and when the deviation amount of the fusion region is extremely small, the brightness and chromaticity differences in the overlapping region of the adjacent images are likewise extremely small, achieving the purpose of overcoming abrupt color changes in the stitched panoramic image.
  • The best technical effect is achieved when the deviation amount of the fusion region of the adjacent images to be fused is zero; since perfect agreement of all pixel values in the fusion region is unrealistic, the deviation can only approach zero within an allowable error range. For example, with a pixel gray level of 255, the allowable error for a single pixel may be 5 or less, or 10 or less.
  • Within such a preset tolerance the difference is so small that the fusion effect is not visually affected; the range can be set by those skilled in the art according to the actual precision requirements and does not affect the implementation of the technical solution of the embodiments of the present application.
  • It should also be noted that step S3 is performed frame by frame for each frame of the panoramic image.
  • Steps S1 and S3 are performed frame by frame as described above, but S2 can be executed at different frequencies according to requirements.
  • The deviation amount may be computed for every frame, that is, steps S1–S3 are performed for every frame of the panoramic image; such an adjustment scheme undoubtedly yields the best fusion quality for dynamic panoramic playback, but for continuously displaying camera systems, such as surround-view driving recorders or surveillance, it also incurs a relatively large system overhead.
  • Alternatively, the deviation amount need not be computed frame by frame; two cases can be distinguished.
  • In the first case, step S2 is re-executed only when a change in the imaging parameters reaches a preset threshold, for example when an occlusion appears, a new light source causes a large change in brightness, or the shooting scene is switched.
  • In the second case, the deviation amount is computed once every time interval T, that is, step S2 is performed once every interval T, while during the rest of the time the previous gain adjustment value is kept and step S3 is performed frame by frame.
  • In this second case, the deviation amount is recomputed periodically even when the change in the imaging parameters has not reached the preset threshold, providing staged continuous correction and balancing quality against system overhead.
  • The processing of the above two cases can be used exclusively, that is, choosing to perform S2 either periodically or based on the threshold; the two cases can of course also be combined for a better technical effect.
  • The different options may also be mixed, for example performing step S2 frame by frame when the parameters change frequently, and based on the time interval T and/or the threshold when they do not.
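  • A minimal sketch of such a scheduling choice follows; the interval, threshold value, and the exposure-like metric are illustrative assumptions rather than values taken from the disclosure:

```python
import time

class DeviationScheduler:
    """Decides, per frame, whether step S2 (recomputing the deviation amount)
    should run; otherwise the previously stored gain values are reused in S3."""

    def __init__(self, interval_t: float = 2.0, change_threshold: float = 8.0,
                 per_frame: bool = False):
        self.interval_t = interval_t          # seconds between periodic S2 runs
        self.change_threshold = change_threshold
        self.per_frame = per_frame            # frame-by-frame mode (best quality)
        self._last_run = -float("inf")
        self._last_metric = None

    def should_run_s2(self, imaging_metric: float) -> bool:
        """imaging_metric: any scalar tracking the imaging parameters,
        for example a mean exposure or brightness value."""
        now = time.monotonic()
        first_frame = self._last_metric is None
        changed = (not first_frame and
                   abs(imaging_metric - self._last_metric) >= self.change_threshold)
        periodic = (now - self._last_run) >= self.interval_t
        if self.per_frame or first_frame or changed or periodic:
            self._last_run = now
            self._last_metric = imaging_metric
            return True
        return False
```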
  • The frame-by-frame step S3 in the above embodiment can be implemented in different ways.
  • One option is to take the midpoint of the deviation amount of the fusion region of two adjacent images to be fused and then adjust both adjacent images by that midpoint, so that the deviation between them approaches 0.
  • As a preferred embodiment, the embodiments of the present application also provide a method for overcoming abrupt color changes in image fusion as shown in FIG. 2.
  • Its steps S1 and S2 are the same as in the embodiment shown in FIG. 1, but its step S3 is implemented as follows.
  • S31: Determine one of the images to be fused as the first reference image; the first reference image may be determined according to a preset rule.
  • Specifically, the first reference image may be determined according to a hardware designation made in advance, for example by designating the image to be fused captured and quantized by a particular imaging device as the first reference image for the fusion of every frame of the panoramic image; the imaging device may be designated according to the actual situation, for example according to which imaging device produces the image to be fused closest to the imaging core position, or which imaging device has the larger lighting field.
  • Alternatively, the first reference image may be determined according to image parameter settings made in advance: for example, the quantized values of each image to be fused may be analyzed and the image closest to the target parameters of the device's currently selected scene mode determined as the first reference image; or the image to be fused with the smallest deviation amount relative to the other images to be fused may be selected as the first reference image.
  • S32: Obtain a first deviation amount of the fusion region between the first reference image and each image to be fused adjacent to it; the first deviation amount may be computed for the current panoramic image frame, or it may be the result of the most recently executed deviation amount calculation for the fusion region of adjacent images to be fused.
  • There may be more than one image to be fused adjacent to the first reference image P0; the deviation amounts D1 … Dm between each adjacent image to be fused P1 … Pm and the first reference image P0 are then calculated separately, and the subsequent steps are performed for each of D1 … Dm.
  • S33: Perform gain adjustment on the whole of each image to be fused adjacent to the first reference image using the corresponding first deviation amount, and take the gain-adjusted images as second reference images.
  • That is, P1 is gain-adjusted using D1, Pm is gain-adjusted using Dm, and so on; the adjusted P1 / … / Pm are all second reference images.
  • The object of the gain adjustment is the quantized values of all pixels of each image to be fused (P1, …, Pm), for example adding x to or subtracting y from the quantized values of all pixels of P1.
  • S34: Obtain a second deviation amount of the fusion region between each second reference image and each image to be fused adjacent to it; again, the second deviation amount may be computed for the current panoramic image frame, or it may be the result of the most recently executed deviation amount calculation.
  • There may be more than one image to be fused adjacent to the second reference image P1, for example P11 … P1n, and more than one adjacent to the second reference image Pm, for example Pm1 … Pmk; the quantized-value deviation amounts D11 … D1n and Dm1 … Dmk of the fusion regions between each adjacent image to be fused and the corresponding second reference image are likewise calculated separately, and the subsequent steps are performed for each.
  • S35: Perform gain adjustment on the whole of each image to be fused adjacent to a second reference image using the corresponding second deviation amount, and take the gain-adjusted images as third reference images.
  • That is, P11 is gain-adjusted using D11, P1n is gain-adjusted using D1n, and so on; Pm1 is gain-adjusted using Dm1, Pmk is gain-adjusted using Dmk, and so on; the adjusted P11 / … / P1n and Pm1 / … / Pmk are all third reference images. The process continues in the same way until all images to be fused have been traversed.
  • Optionally, after step S31 and before step S32, a step of applying gain compensation to the first reference image may be performed; the gain may be determined according to the target parameters of the currently selected scene mode, so that after the traversal all of the images to be fused have better quantized-value parameters.
  • The embodiments of the present application further provide a system for overcoming abrupt color changes in image fusion, suitable for a panoramic device including N imaging devices, N ≥ 2. As shown in FIG. 3, it includes:
  • a quantization unit for quantizing the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image; this unit may be implemented with an analog-to-digital converter;
  • a data processing unit configured to calculate the deviation amount of the fusion region of adjacent images to be fused;
  • a gain adjustment unit configured to perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
  • The embodiments of the present application further provide a panoramic device, as shown in FIG. 4.
  • The panoramic device comprises N imaging devices 10(1), 10(2), …, 10(N).
  • N is greater than or equal to 2, that is, the panoramic device includes a plurality of imaging devices.
  • The imaging devices may specifically be image sensors for performing image acquisition.
  • N analog-to-digital converters 20(1), 20(2), …, 20(N) are respectively connected to the imaging devices 10(1), 10(2), …, 10(N) and quantize the image electrical signals collected by the imaging devices 10(1), 10(2), …, 10(N) to form the images to be fused of the panoramic image.
  • The processor 30 is connected to the analog-to-digital converters 20(1), 20(2), …, 20(N), calculates the deviation amount of the fusion region of adjacent images to be fused, and performs gain adjustment on the images to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range (approaching zero).
  • The panoramic device may also include, or be externally connected to, a processor and a display for performing the panoramic fusion, which may be implemented differently according to the type of panoramic device or the requirements and is not described further here.
  • The embodiments of the present application further provide a system for overcoming abrupt color changes in image fusion, suitable for a panoramic device including N imaging devices, N ≥ 2. As shown in FIG. 5, it includes:
  • a memory 91, one or more processors 92, and one or more programs 93.
  • The one or more programs 93, when executed by the one or more processors 92, perform the following operations:
  • S1: Quantize the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image; S2: Calculate the deviation amount of the fusion region of adjacent images to be fused; S3: Perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
  • The program product 100 can include a signal bearing medium 101.
  • The signal bearing medium 101 can include one or more instructions 102 that, when executed by, for example, a processor, can provide the functionality described above with respect to FIGS. 1–5.
  • For example, the instructions 102 may include: one or more instructions for quantizing the image electrical signals collected by the imaging devices to form the images to be fused of the panoramic image; one or more instructions for calculating the deviation amount of the fusion region of adjacent images to be fused; and one or more instructions for performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
  • A system for overcoming abrupt color changes in image fusion can perform one or more of the steps shown in FIG. 1 in response to the instructions 102.
  • The signal bearing medium 101 can include a computer-readable medium 103 such as, but not limited to, a hard disk drive, a compact disc (CD), a digital versatile disc (DVD), a digital tape, a memory, and the like.
  • The signal bearing medium 101 can include a recordable medium 104 such as, but not limited to, a memory, a read/write (R/W) CD, an R/W DVD, and the like.
  • The signal bearing medium 101 can include a communication medium 105 such as, but not limited to, a digital and/or analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communication link, a wireless communication link, etc.).
  • The program product 100 can be conveyed by an RF signal bearing medium 101 to one or more modules of the above-described device, where the signal bearing medium 101 is conveyed by a wireless communication medium (e.g., a wireless communication medium conforming to the IEEE 802.11 standard).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method and system for overcoming abrupt color changes in image fusion, and a panoramic device. The method applies to a panoramic device comprising N imaging devices and includes: quantizing the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image; calculating the deviation amount of the fusion region of adjacent images to be fused; and performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range. With the technical solution provided by the invention, the output brightness and chromaticity of the fusion regions of all adjacent images to be fused remain substantially consistent, overcoming the effectiveness, excessive resource consumption, and processing efficiency problems of adjusting brightness and chromaticity through image processing algorithms.

Description

Method, system and device for overcoming abrupt color changes in image fusion
This patent application claims priority to the Chinese patent application filed on November 11, 2015 under application number 2015107662572, which is incorporated herein by reference in its entirety.
Technical Field
The embodiments of the present application relate to the field of image stitching technology and, in particular, to a method, system, and device for overcoming abrupt color changes in image fusion.
Background
Image stitching is a technique for joining several images with overlapping portions into a panoramic image, where the overlapping images are captured from different viewing angles.
The images at different viewing angles are acquired simultaneously by multiple (two or more) imaging devices (for example, cameras); typical applications of this type of image stitching are multi-camera panoramic cameras, panoramic parking systems, and the like. As one of the key technologies of image stitching, image fusion superimposes the overlapping field-of-view regions of the images to be fused, that is, the fusion regions, to obtain a seamlessly reconstructed panoramic image. As the above description of image stitching makes clear, however, the simultaneous imaging of multiple imaging devices amounts to independent exposures, and the illumination conditions of the scene differ across viewing angles; when each imaging device adjusts its exposure to the scene, the captured images to be fused inevitably differ in brightness and chromaticity. If such images are superimposed and fused directly, the differences are amplified by contrast, and the fused panoramic image ends up with abrupt color changes that degrade the visual result. Clearly, the larger the brightness and chromaticity difference between the images to be fused, the more obvious the color discontinuity in the stitched panoramic image.
To address this problem of image fusion, most of the solutions provided by the prior art use image processing algorithms to adjust brightness and chromaticity. Although the image processing algorithms may differ and the processing point may differ, for example processing the images to be fused before fusion or smoothing the panoramic image after fusion, they all share the following technical defects:
1. They can hardly solve the problem of brightness and chromaticity differences between the images to be fused at its root, and they are inherently limited by the effectiveness of the image processing algorithm;
2. Image processing algorithms are usually rather complex and consume a large amount of system resources, and this consumption rises sharply as image resolution increases;
3. The consumption of system resources in turn reduces processing efficiency.
Therefore, how to make the image transition in the fusion region more natural in a simple and effective way is currently a technical difficulty in the field, and a new technique for overcoming abrupt color changes in image fusion is needed to solve the above problems.
Summary of the Invention
To solve this problem, the invention discloses a method for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, including the following steps:
quantizing the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image;
calculating the deviation amount of the fusion region of adjacent images to be fused; and
performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range;
wherein N ≥ 2.
The invention also discloses a system for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, including:
a quantization unit configured to quantize the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image;
a data processing unit configured to calculate the deviation amount of the fusion region of adjacent images to be fused; and
a gain adjustment unit configured to perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range;
wherein N ≥ 2.
The invention also discloses a panoramic device, including:
N imaging devices, wherein N ≥ 2;
N analog-to-digital converters, each connected to a corresponding one of the N imaging devices and configured to quantize the image electrical signals collected by the imaging devices to form the images to be fused of a panoramic image; and
a processor connected to the N analog-to-digital converters and configured to calculate the deviation amount of the fusion region of adjacent images to be fused, and to perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
An embodiment of the present application provides a system for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, including a memory, one or more processors, and one or more programs, wherein the one or more programs, when executed by the one or more processors, perform the following operations:
quantizing the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image;
calculating the deviation amount of the fusion region of adjacent images to be fused; and
performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range;
wherein N ≥ 2.
An embodiment of the present application provides a computer-readable storage medium having stored thereon computer-executable instructions that, in response to execution, cause a system for overcoming abrupt color changes in image fusion to perform operations, the operations including:
quantizing the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image;
calculating the deviation amount of the fusion region of adjacent images to be fused; and
performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range;
wherein N ≥ 2.
With the technical solution of the present application, the gain adjustment can be performed in hardware, so that the output brightness and chromaticity of the fusion regions of all adjacent images to be fused remain substantially consistent, overcoming the effectiveness, excessive resource consumption, and processing efficiency problems of existing image fusion techniques that adjust brightness and chromaticity through image processing algorithms.
A series of concepts in simplified form are introduced in this Summary and are explained in further detail in the Detailed Description. This Summary is not intended to identify the key or essential features of the claimed technical solution, nor to determine the scope of protection of the claimed technical solution.
The advantages and features of the present application are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
FIG. 1 is a flowchart of one embodiment of the method for overcoming abrupt color changes in image fusion provided by an embodiment of the present application.
FIG. 2 is a flowchart of another preferred embodiment of the method provided by an embodiment of the present application.
FIG. 3 is a schematic diagram of a system for overcoming abrupt color changes in image fusion provided by an embodiment of the present application.
FIG. 4 is a block diagram of a panoramic device provided by an embodiment of the present application.
FIG. 5 is a schematic diagram of a system for overcoming abrupt color changes in image fusion provided by an embodiment of the present application.
FIG. 6 illustrates a computer program product for overcoming abrupt color changes in image fusion provided by an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are given to provide a more thorough understanding of the embodiments of the present application. It will be apparent to those skilled in the art, however, that the embodiments of the present application can be implemented without one or more of these details. In other instances, some technical features well known in the art are not described in order to avoid obscuring the present application.
In order to provide a thorough understanding of the present application, detailed structures are set forth in the following description. Obviously, the implementation of the embodiments of the present application is not limited to the specific details familiar to those skilled in the art. The preferred embodiments of the present application are described in detail below, but beyond these detailed descriptions the present application may also have other implementations.
As its main subject matter, an embodiment of the present application discloses a method for overcoming abrupt color changes in image fusion, applicable to the technical processing of the image fusion stage of the image stitching process, so as to overcome the various defects of the image processing algorithms used in existing image fusion techniques.
Before the method provided by the embodiments of the present application is described in detail with reference to specific embodiments, the following statements are made regarding the technical solution of the embodiments of the present application:
The embodiments of the present application apply to panoramic imaging systems that include N (two or more) imaging devices, such as a panoramic parking system, in which multiple imaging devices (for example, cameras) simultaneously acquire image electrical signals that are finally stitched into a complete panoramic image.
The statements above apply equally to all embodiments of the present application. The embodiments of the present application are described in detail below with reference to the accompanying drawings.
FIG. 1 shows a flowchart of one embodiment, which includes:
S1: Quantize the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image.
First, this step processes the image electrical signals collected by the imaging devices; these signals can be obtained from the photosensitive plane of the imaging device (for example, an image sensor).
Second, this step quantizes the above image electrical signals to obtain the images to be fused. Those skilled in the art will understand that the images to be fused referred to here are the digital images that need to undergo image fusion processing in order to be stitched into a panoramic image.
The quantization can be implemented with an analog-to-digital converter, which yields each image to be fused. Those skilled in the art can select an 8-bit, 16-bit, 24-bit, or higher quantization word length to quantize the image electrical signals according to the actual precision requirements.
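The following sketch only illustrates the effect of the quantization word length; the sample amplitudes and voltage range are made up for the example and are not taken from the disclosure:

```python
import numpy as np

def quantize(signal: np.ndarray, v_max: float, bits: int = 8) -> np.ndarray:
    """Map analog amplitudes in [0, v_max] onto 2**bits discrete levels,
    as an N-bit analog-to-digital converter would."""
    levels = (1 << bits) - 1                      # 255 for 8 bit, 65535 for 16 bit
    scaled = np.clip(signal, 0.0, v_max) / v_max  # normalize to [0, 1]
    return np.round(scaled * levels).astype(np.uint32)

# The same three sample amplitudes at 8-bit and 16-bit word lengths.
samples = np.array([0.10, 0.55, 0.90])            # volts, illustrative values only
print(quantize(samples, v_max=1.0, bits=8))       # -> [  26  140  230]
print(quantize(samples, v_max=1.0, bits=16))      # -> [ 6554 36044 58982]
```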
In addition, each image to be fused may also undergo a series of processing steps that precede image fusion in the image stitching process, for example noise reduction and registration. In other words, the technical solution provided by the embodiments of the present application does not affect the application of existing techniques to the other stages of image stitching; conversely, those other stages of image stitching can be combined with the technical solution provided by the embodiments of the present application.
It should be noted that step S1 is performed frame by frame for each frame of the panoramic image; otherwise the panoramic image cannot be reconstructed.
S2: Calculate the deviation amount of the fusion region of adjacent images to be fused.
In this step, the deviation amount is a statistic of the deviation between the quantized values at corresponding positions in the fusion (overlap) region. It may be, for example, the sum or the sum of squares of the differences between the pixel values of all points of the adjacent images to be fused in that region; it may be the difference between the mean pixel values of all points in that region; it may be a deviation computed from pixel values sampled in the region according to some rule; or it may be any other statistic chosen by those skilled in the art according to actual needs.
S3: Perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
Using gain adjustment to bring the deviation amount of the fusion region of adjacent images to be fused within a preset range handles the brightness and chromaticity differences in hardware (a gain amplifier, or a hardware processor capable of performing gain adjustment), overcoming the drawbacks of the image processing algorithms of the prior art. Moreover, when the deviation amount of the fusion region of adjacent images to be fused is extremely small, the brightness and chromaticity differences in their overlapping region are also extremely small, effectively achieving the purpose of the embodiments of the present application of overcoming abrupt color changes in the stitched panoramic image.
Clearly, the best technical effect is achieved when the deviation amount of the fusion region of the adjacent images to be fused is zero. Those skilled in the art will appreciate, however, that in the field of image processing it is unrealistic to expect perfect agreement of all pixel values in the fusion region, so the deviation can only approach zero and there is necessarily an allowable error range. For example, with a pixel gray level of 255, the allowable error for a single pixel may be 5 or less, or 10 or less. As long as the deviation is within this preset allowable error range, the difference is so small that the fusion effect is not visually affected. The range can be set by those skilled in the art according to the actual precision requirements without affecting the implementation of the technical solution of the embodiments of the present application.
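A minimal sketch of such an adjustment is shown below, using a simple additive offset on the quantized values and a mean-difference deviation; the function names and the choice of statistic are assumptions made for illustration, not a definitive implementation of the hardware gain adjustment described above:

```python
import numpy as np

def adjust_gain(image_b: np.ndarray, overlap_a: np.ndarray, overlap_b: np.ndarray,
                tolerance: float = 5.0, max_value: int = 255) -> np.ndarray:
    """Offset all quantized values of image B so that the deviation of its
    fusion region from adjacent image A falls within the preset tolerance
    (for example 5 gray levels out of 255)."""
    deviation = overlap_a.astype(np.float64).mean() - overlap_b.astype(np.float64).mean()
    if abs(deviation) <= tolerance:       # already within the preset range
        return image_b
    offset = int(np.round(deviation))     # e.g. +4 if B is 4 levels darker than A
    adjusted = image_b.astype(np.int64) + offset
    return np.clip(adjusted, 0, max_value).astype(image_b.dtype)
```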
It should likewise be noted that step S3 is also performed frame by frame for each frame of the panoramic image.
In this way, the processing of steps S1–S3 achieves the object of the embodiments of the present application.
Further, during image stitching, steps S1 and S3 are performed frame by frame as described above, while S2 can be executed at different frequencies according to requirements.
For example, the deviation amount may be computed for every frame, that is, steps S1–S3 are performed for every frame of the panoramic image. Such an adjustment scheme undoubtedly yields the best fusion quality for dynamic panoramic playback, but for continuously displaying camera systems, such as surround-view driving recorders or surveillance, it also incurs a relatively large system overhead.
Alternatively, considering that the illumination in some environments does not change particularly drastically, the deviation amount need not be computed frame by frame; two specific cases can be distinguished.
In the first case, step S2 is re-executed when a change in the imaging parameters reaches a preset threshold. Specifically, in some scenes, such as outdoors during the day or indoors where the lights are not frequently switched, the brightness seen by each imaging device does not change frequently or drastically. In that case, if the previous adjustment has already determined from the deviation amount that adjacent image A is 4 brightness levels darker than image B, and the processor has increased the brightness of all pixels of A by 4, that gain can simply be maintained while performing step S3, without repeatedly re-executing step S2 to recompute the deviation amount, until a change in the imaging parameters reaches the preset threshold, for example when an occlusion appears, a new light source causes a large change in brightness, or the shooting scene is switched. This strikes a balance between quality and system overhead.
In the second case, the deviation amount is computed once every time interval T, that is, step S2 is performed once every interval T, while during the rest of the time the previous gain adjustment value is kept and step S3 is performed frame by frame. In this way the deviation amount is recomputed periodically even when the change in the imaging parameters has not reached the preset threshold, providing staged continuous correction and likewise balancing quality against system overhead.
Those skilled in the art will appreciate that the above two cases can be used exclusively, that is, choosing to perform S2 either periodically or based on the threshold. They can of course also be combined for a better technical effect. The different options may even be mixed, for example performing step S2 frame by frame when the parameters change frequently and based on the time interval T and/or the threshold when they do not.
Still further, the frame-by-frame step S3 in the above embodiment can be implemented in different ways. One option is to take the midpoint of the deviation amount of the fusion region of two adjacent images to be fused and then adjust both adjacent images by that midpoint, so that the deviation between them approaches 0.
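A sketch of this midpoint option follows, again with a mean-difference deviation and additive offsets assumed purely for illustration:

```python
import numpy as np

def split_adjust(img_a: np.ndarray, img_b: np.ndarray,
                 overlap_a: np.ndarray, overlap_b: np.ndarray,
                 max_value: int = 255):
    """Move both adjacent images to be fused toward each other by half of the
    measured deviation, so the residual deviation of their fusion region
    approaches zero."""
    deviation = overlap_a.astype(np.float64).mean() - overlap_b.astype(np.float64).mean()
    half = deviation / 2.0                 # the midpoint of the deviation amount
    a_adj = np.clip(img_a.astype(np.float64) - half, 0, max_value)
    b_adj = np.clip(img_b.astype(np.float64) + half, 0, max_value)
    return a_adj.astype(img_a.dtype), b_adj.astype(img_b.dtype)
```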
As a preferred embodiment, the embodiments of the present application also provide a method for overcoming abrupt color changes in image fusion as shown in FIG. 2. Its steps S1 and S2 are the same as in the embodiment shown in FIG. 1, but its step S3 is implemented as follows.
S31: Determine one of the images to be fused as the first reference image.
The first reference image can be determined according to a preset rule.
Specifically, the first reference image can be determined according to a hardware designation made in advance, for example by designating the image to be fused captured and quantized by a particular imaging device as the first reference image for the fusion of every frame of the panoramic image. The imaging device can be designated according to the actual situation, for example according to which imaging device produces the image to be fused closest to the imaging core position, or which imaging device has the larger lighting field.
Alternatively, the first reference image can be determined according to image parameter settings made in advance. For example, the quantized values of each image to be fused can be analyzed, and the image whose analysis result is closest to the target parameters of the device's currently selected scene mode can be determined as the first reference image; or, as another example, the image to be fused with the smallest deviation amount relative to the other images to be fused can be selected as the first reference image; and so on.
Of course, this solution does not exclude selecting the first reference image at random.
S32: Obtain a first deviation amount of the fusion region between the first reference image and each image to be fused adjacent to it.
It will be understood that the first deviation amount obtained in this step may be computed for the current panoramic image frame, or it may be the result of the most recently executed deviation amount calculation for the fusion region of adjacent images to be fused.
Further, there may be more than one image to be fused adjacent to the first reference image, say P0; in that case the deviation amounts D1 … Dm of the fusion regions between each adjacent image to be fused P1 … Pm and the first reference image P0 are calculated separately, and the subsequent steps are performed for each of the deviation amounts D1 … Dm.
S33: Perform gain adjustment on the whole of each image to be fused adjacent to the first reference image using the corresponding first deviation amount, and take the gain-adjusted images as second reference images.
That is, P1 is gain-adjusted using D1, Pm is gain-adjusted using Dm, and so on; the adjusted P1 / … / Pm are all second reference images.
Specifically, the object of the gain adjustment is the quantized values of all pixels of each image to be fused (P1, …, Pm), for example adding x to or subtracting y from the quantized values of all pixels of P1.
S34: Obtain a second deviation amount of the fusion region between each second reference image and each image to be fused adjacent to it.
Likewise, the second deviation amount obtained in this step may be computed for the current panoramic image frame, or it may be the result of the most recently executed deviation amount calculation for the fusion region of adjacent images to be fused.
Also, there may be more than one image to be fused adjacent to the second reference image P1, for example P11 … P1n, and more than one adjacent to the second reference image Pm, for example Pm1 … Pmk; the quantized-value deviation amounts D11 … D1n and Dm1 … Dmk of the fusion regions between each adjacent image to be fused and the corresponding second reference image are likewise calculated separately, and the subsequent steps are performed for each.
S35: Perform gain adjustment on the whole of each image to be fused adjacent to a second reference image using the corresponding second deviation amount, and take the gain-adjusted images as third reference images.
That is, P11 is gain-adjusted using D11, P1n is gain-adjusted using D1n, and so on; Pm1 is gain-adjusted using Dm1, Pmk is gain-adjusted using Dmk, and so on; the adjusted P11 / … / P1n and Pm1 / … / Pmk are all third reference images.
The process continues in the same manner until all of the images to be fused have been traversed.
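The traversal of steps S31–S35 can be pictured as a breadth-first propagation of gains outward from the first reference image; the sketch below is one possible reading, with mean-difference deviations and additive gains assumed for illustration (the container layout and names are hypothetical):

```python
from collections import deque
import numpy as np

def propagate_gain(images: dict, adjacency: dict, overlap_means: dict,
                   reference: str) -> dict:
    """Steps S31-S35 as a breadth-first traversal.

    images:        name -> quantized image array (P0, P1, ..., P11, ...)
    adjacency:     name -> list of names of adjacent images to be fused
    overlap_means: (a, b) -> (mean of a's pixels, mean of b's pixels), measured
                   in their shared fusion region on the unadjusted images
    reference:     name of the first reference image (P0)
    """
    offset = {reference: 0.0}              # accumulated additive gain per image
    queue = deque([reference])
    while queue:
        ref = queue.popleft()
        for nb in adjacency[ref]:
            if nb in offset:               # already traversed
                continue
            m_ref, m_nb = overlap_means[(ref, nb)]
            # Deviation of the fusion region, measured against the reference
            # after its own accumulated gain; the neighbour inherits it (D1, D11, ...).
            offset[nb] = (m_ref + offset[ref]) - m_nb
            queue.append(nb)               # the neighbour becomes the next reference
    return {name: images[name].astype(np.float64) + offset.get(name, 0.0)
            for name in images}
```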
As a preferred embodiment, after step S31 and before step S32, a step of applying gain compensation to the first reference image may be performed; the gain can be determined according to the target parameters of the currently selected scene mode, so that after the traversal all of the images to be fused have better quantized-value parameters.
The method for overcoming abrupt color changes in image fusion provided by the embodiments of the present application has been described above with reference to specific embodiments. Correspondingly, the embodiments of the present application also provide a system for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, N ≥ 2. As shown in FIG. 3, it includes:
a quantization unit configured to quantize the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image, where this unit can be implemented with an analog-to-digital converter;
a data processing unit configured to calculate the deviation amount of the fusion region of adjacent images to be fused; and
a gain adjustment unit configured to perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
Those skilled in the art will appreciate that, in the system for overcoming abrupt color changes in image fusion provided by the embodiments of the present application, the specific implementations and possible embodiments of each unit are the same as those set forth for the corresponding method and achieve the same technical effects, and are not repeated here.
Further, the embodiments of the present application also provide a panoramic device, as shown in FIG. 4.
The panoramic device includes N imaging devices 10(1), 10(2), …, 10(N).
N is greater than or equal to 2; that is, the panoramic device includes multiple imaging devices.
The imaging devices may specifically be image sensors used for image acquisition.
N analog-to-digital converters 20(1), 20(2), …, 20(N) are each connected to a corresponding imaging device 10(1), 10(2), …, 10(N) and quantize the image electrical signals collected by the imaging devices 10(1), 10(2), …, 10(N) to form the images to be fused of the panoramic image.
The processor 30 is connected to the analog-to-digital converters 20(1), 20(2), …, 20(N); it calculates the deviation amount of the fusion region of adjacent images to be fused and performs gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range (approaching zero).
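Purely as an illustration of how the processor 30 might sequence steps S1–S3 per frame, one frame of the pipeline in FIG. 4 could be organized as below; every argument is a placeholder for the hardware blocks described in the text (and the scheduler is assumed to behave like the one sketched earlier), not an actual driver API:

```python
def process_frame(sensors, adcs, scheduler, stored_gains,
                  compute_deviations, apply_gains):
    """One frame: capture (imaging devices 10), quantize (converters 20),
    and deviation/gain handling (processor 30)."""
    signals = [sensor.read() for sensor in sensors]           # acquire electrical signals
    images = [adc(sig) for adc, sig in zip(adcs, signals)]    # S1: quantize
    brightness = sum(float(img.mean()) for img in images)     # crude imaging metric
    if scheduler.should_run_s2(brightness):
        stored_gains = compute_deviations(images)             # S2: deviation amounts
    return apply_gains(images, stored_gains), stored_gains    # S3: gain adjustment
```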
Those skilled in the art will likewise appreciate that, in the panoramic device provided by the embodiments of the present application, the specific implementations and possible embodiments of the processor 30 are the same as those set forth for the corresponding method and achieve the same technical effects, and are not repeated here.
It should be pointed out, however, that the panoramic device may also include, or be externally connected to, a processor and a display for performing the panoramic fusion; this can be implemented differently according to the type of panoramic device or the requirements, and is not described further here.
Further, the embodiments of the present application also provide a system for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, N ≥ 2. As shown in FIG. 5, it includes:
a memory 91, one or more processors 92, and one or more programs 93.
The one or more programs 93, when executed by the one or more processors 92, perform the following operations:
S1: Quantize the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image;
S2: Calculate the deviation amount of the fusion region of adjacent images to be fused;
S3: Perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
Further, the embodiments of the present application provide a computer program product 100 that can be used to overcome abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, N ≥ 2. As shown in FIG. 6, the program product 100 may include a signal bearing medium 101. The signal bearing medium 101 may include one or more instructions 102 that, when executed by, for example, a processor, may provide the functionality described above with respect to FIGS. 1–5. For example, the instructions 102 may include: one or more instructions for quantizing the image electrical signals collected by each imaging device to form the images to be fused of the panoramic image; one or more instructions for calculating the deviation amount of the fusion region of adjacent images to be fused; and one or more instructions for performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range. Thus, for example, with reference to FIG. 3, the system for overcoming abrupt color changes in image fusion can perform one or more of the steps shown in FIG. 1 in response to the instructions 102.
In some implementations, the signal bearing medium 101 may include a computer-readable medium 103, such as, but not limited to, a hard disk drive, a compact disc (CD), a digital versatile disc (DVD), a digital tape, a memory, and the like. In some implementations, the signal bearing medium 101 may include a recordable medium 104, such as, but not limited to, a memory, a read/write (R/W) CD, an R/W DVD, and the like. In some implementations, the signal bearing medium 101 may include a communication medium 105, such as, but not limited to, a digital and/or analog communication medium (for example, a fiber optic cable, a waveguide, a wired communication link, a wireless communication link, and the like). Thus, for example, the program product 100 may be conveyed by an RF signal bearing medium 101 to one or more modules of the above-described device, where the signal bearing medium 101 is conveyed by a wireless communication medium (for example, a wireless communication medium conforming to the IEEE 802.11 standard).
From the above description of the embodiments, those skilled in the art can clearly understand that the embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, or of course by hardware. Based on this understanding, the above technical solution, in essence or in the part that contributes over the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments or in certain parts of the embodiments.
The present application has been described by means of the above embodiments, but it should be understood that the above embodiments are given only for the purposes of illustration and description and are not intended to limit the present application to the scope of the described embodiments. Those skilled in the art will further appreciate that the present application is not limited to the above embodiments, and that many more variations and modifications can be made in accordance with the teachings of the embodiments of the present application, all of which fall within the scope claimed by the present application. The scope of protection of the embodiments of the present application is defined by the appended claims and their equivalents.

Claims (10)

  1. A method for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, characterized by comprising the following steps:
    quantizing the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image;
    calculating the deviation amount of the fusion region of adjacent images to be fused; and
    performing gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range;
    wherein N ≥ 2.
  2. The method according to claim 1, characterized by further comprising performing the step of calculating the deviation amount of the fusion region of adjacent images to be fused for each frame of the panoramic image.
  3. The method according to claim 1, characterized by further comprising performing the step of calculating the deviation amount of the fusion region of adjacent images to be fused at a time interval T.
  4. The method according to claim 1, characterized by further comprising performing the step of calculating the deviation amount of the fusion region of adjacent images to be fused when a change in the imaging parameters reaches a preset threshold.
  5. The method according to any one of claims 1 to 4, characterized in that the step of performing gain adjustment on each image to be fused comprises:
    determining one of the images to be fused as a first reference image;
    obtaining a first deviation amount of the fusion region between an image to be fused adjacent to the first reference image and the first reference image;
    performing gain adjustment on the whole of the image to be fused adjacent to the first reference image using the first deviation amount, and taking the gain-adjusted image as a second reference image;
    obtaining a second deviation amount of the fusion region between an image to be fused adjacent to the second reference image and the second reference image;
    performing gain adjustment on the whole of the image to be fused adjacent to the second reference image using the second deviation amount, and taking the gain-adjusted image as a third reference image;
    and so on, until all of the images to be fused have been traversed.
  6. The method according to claim 5, characterized in that the first reference image is determined according to a hardware designation made in advance.
  7. The method according to claim 5, characterized in that the first reference image is determined according to image parameter settings made in advance.
  8. The method according to claim 5, characterized by further comprising, after the first reference image is determined, a step of applying gain compensation to the first reference image.
  9. A system for overcoming abrupt color changes in image fusion, applicable to a panoramic device comprising N imaging devices, characterized by comprising:
    a quantization unit configured to quantize the image electrical signals collected by each imaging device to form the images to be fused of a panoramic image;
    a data processing unit configured to calculate the deviation amount of the fusion region of adjacent images to be fused; and
    a gain adjustment unit configured to perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range;
    wherein N ≥ 2.
  10. A panoramic device, comprising:
    N imaging devices, wherein N ≥ 2;
    N analog-to-digital converters, each connected to a corresponding one of the N imaging devices and configured to quantize the image electrical signals collected by the imaging devices to form the images to be fused of a panoramic image; and
    a processor connected to the N analog-to-digital converters and configured to calculate the deviation amount of the fusion region of adjacent images to be fused, and to perform gain adjustment on each image to be fused so that the deviation amount of the fusion region of the adjacent images to be fused is within a preset range.
PCT/CN2016/083367 2015-11-11 2016-05-25 Method, system and device for overcoming abrupt color changes in image fusion WO2017080179A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510766257.2 2015-11-11
CN201510766257.2A CN105894449B (zh) 2015-11-11 2015-11-11 Method and system for overcoming abrupt color changes in image fusion

Publications (1)

Publication Number Publication Date
WO2017080179A1 true WO2017080179A1 (zh) 2017-05-18

Family

ID=57002292

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/083367 WO2017080179A1 (zh) 2015-11-11 2016-05-25 Method, system and device for overcoming abrupt color changes in image fusion

Country Status (3)

Country Link
US (1) US20170132820A1 (zh)
CN (1) CN105894449B (zh)
WO (1) WO2017080179A1 (zh)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107872658A (zh) * 2016-09-28 2018-04-03 株式会社理光 投影接缝区域亮度调节方法、以及投影接缝区域亮度调节装置
CN106937102B (zh) * 2016-12-25 2019-01-22 惠州市德赛西威汽车电子股份有限公司 一种全景倒车系统色彩平衡调节方法
CN107085842B (zh) * 2017-04-01 2020-04-10 上海讯陌通讯技术有限公司 自学习多路图像融合的实时矫正方法及系统
TW201839717A (zh) * 2017-04-19 2018-11-01 睿緻科技股份有限公司 影像拼接方法及其影像拼接裝置
US10614609B2 (en) 2017-07-19 2020-04-07 Mediatek Inc. Method and apparatus for reduction of artifacts at discontinuous boundaries in coded virtual-reality images
EP3499870B1 (en) * 2017-12-14 2023-08-23 Axis AB Efficient blending using encoder
US10764496B2 (en) * 2018-03-16 2020-09-01 Arcsoft Corporation Limited Fast scan-type panoramic image synthesis method and device
CN111225180B (zh) * 2018-11-26 2021-07-20 浙江宇视科技有限公司 画面处理方法及装置
KR20200111446A (ko) * 2019-03-19 2020-09-29 삼성전자주식회사 합성 이미지를 생성하는 전자 장치 및 방법
CN112312108B (zh) * 2019-08-02 2022-09-13 浙江宇视科技有限公司 一种白平衡的异常确定方法、装置、存储介质及电子设备
CN110751608B (zh) * 2019-10-23 2022-08-16 北京迈格威科技有限公司 一种夜景高动态范围图像融合方法、装置和电子设备
CN111179165B (zh) * 2019-11-29 2023-07-28 南京泓众电子科技有限公司 一种全景图像生成方法及装置
TWI760733B (zh) 2020-04-27 2022-04-11 晶睿通訊股份有限公司 影像校正方法及其影像校正裝置
CN111669559A (zh) * 2020-05-11 2020-09-15 安徽百诚慧通科技有限公司 多通道ccd图像亮度和色度差异校正方法、装置及存储介质
CN111669519B (zh) * 2020-05-11 2022-10-28 安徽百诚慧通科技股份有限公司 一种多通道ccd图像非均匀性校正方法、装置及存储介质
CN111970481A (zh) * 2020-07-07 2020-11-20 深圳英飞拓智能技术有限公司 一种基于5g传输超高清解码拼接视频方法及系统
CN112837254A (zh) * 2021-02-25 2021-05-25 普联技术有限公司 一种图像融合方法、装置、终端设备及存储介质
CN113516155A (zh) * 2021-04-12 2021-10-19 佛山市顺德区美的洗涤电器制造有限公司 用于处理图像的方法、处理器、控制装置及家用电器
CN115908210A (zh) * 2021-08-05 2023-04-04 中兴通讯股份有限公司 图像处理方法、电子设备、计算机可读存储介质
CN117333372B (zh) * 2023-11-28 2024-03-01 广东海洋大学 一种海洋生物图像的融合拼接方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101262567A (zh) * 2008-04-07 2008-09-10 北京中星微电子有限公司 自动曝光方法与装置
CN102063712A (zh) * 2010-11-04 2011-05-18 北京理工大学 基于子带结构的多曝光图像融合方法
US8411938B2 (en) * 2007-11-29 2013-04-02 Sri International Multi-scale multi-camera adaptive fusion with contrast normalization
CN103186894A (zh) * 2013-03-22 2013-07-03 南京信息工程大学 一种自适应分块的多聚焦图像融合方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11205648A (ja) * 1998-01-09 1999-07-30 Olympus Optical Co Ltd 画像合成装置
JP5299488B2 (ja) * 2011-09-13 2013-09-25 カシオ計算機株式会社 画像処理装置、画像処理方法及びプログラム
JP5729237B2 (ja) * 2011-09-26 2015-06-03 カシオ計算機株式会社 画像処理装置、画像処理方法、及びプログラム
JP6084434B2 (ja) * 2012-10-31 2017-02-22 クラリオン株式会社 画像処理システム及び画像処理方法
CN103338343A (zh) * 2013-05-29 2013-10-02 山西绿色光电产业科学技术研究院(有限公司) 以全景图像为基准的多路图像无缝拼接方法及装置
CN104618648B (zh) * 2015-01-29 2018-11-09 桂林长海发展有限责任公司 一种全景视频拼接系统及拼接方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8411938B2 (en) * 2007-11-29 2013-04-02 Sri International Multi-scale multi-camera adaptive fusion with contrast normalization
CN101262567A (zh) * 2008-04-07 2008-09-10 北京中星微电子有限公司 自动曝光方法与装置
CN102063712A (zh) * 2010-11-04 2011-05-18 北京理工大学 基于子带结构的多曝光图像融合方法
CN103186894A (zh) * 2013-03-22 2013-07-03 南京信息工程大学 一种自适应分块的多聚焦图像融合方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG, WEI ET AL.: "Overview of Digital Image Mosaic", JOURNAL OF CHINESE COMPUTER SYSTEMS, vol. 27, no. 7, 31 July 2006 (2006-07-31) *

Also Published As

Publication number Publication date
CN105894449A (zh) 2016-08-24
CN105894449B (zh) 2019-11-08
US20170132820A1 (en) 2017-05-11

Similar Documents

Publication Publication Date Title
WO2017080179A1 (zh) Method, system and device for overcoming abrupt color changes in image fusion
US10205896B2 (en) Automatic lens flare detection and correction for light-field images
US8737736B2 (en) Tone mapping of very large aerial image mosaic
US11070741B2 (en) High dynamic range video shooting method and device
US9036047B2 (en) Apparatus and techniques for image processing
WO2020051898A1 (zh) 双光图像自动曝光方法、装置、双光图像相机及机器存储介质
US20150373247A1 (en) Method and apparatus for dynamic range expansion of ldr video sequence
TW201841139A (zh) 用於處理影像性質圖的方法及裝置
US20130169825A1 (en) Method and apparatus for measuring response curve of an image sensor
KR101215666B1 (ko) 오브젝트 색상 보정 방법, 시스템 및 컴퓨터 프로그램 제품
WO2013114803A1 (ja) 画像処理装置及びその画像処理方法、並びにコンピュータ・プログラム、および画像処理システム
TWI698129B (zh) 影像處理方法及其電子裝置
KR20100067268A (ko) 유효 영역을 이용한 자동 화이트 밸런스 조정 장치 및 방법
KR20150040559A (ko) 관심영역 기반의 화질개선 장치와 그를 위한 컴퓨터로 읽을 수 있는 기록 매체
US20220261970A1 (en) Methods, systems and computer program products for generating high dynamic range image frames
KR102207940B1 (ko) 영상 노이즈 제거 방법 및 시스템
KR101904108B1 (ko) 평면 모델링을 통한 깊이 영상의 가변적 블록 부호화 방법 및 장치
US20220408013A1 (en) DNN Assisted Object Detection and Image Optimization
JP2016127505A (ja) 画像処理装置及び画像処理方法
CN106920217B (zh) 图像矫正的方法及装置
CN111147760B (zh) 一种光场相机及其光度调整方法、装置及电子设备
US11716521B2 (en) Using IR sensor with beam splitter to obtain depth
US11765309B2 (en) Video capturing subject using IR light
CN107613221B (zh) 图像处理方法及装置、计算机可读存储介质、终端
US20210304374A1 (en) Tone mapping method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16863360

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16863360

Country of ref document: EP

Kind code of ref document: A1