WO2012034467A1 - 一种天基大视场成像点源/斑状目标的图像恢复增强方法 (Image Restoration and Enhancement Method for Point-Source/Spot Targets in Space-Based Large-Field-of-View Imaging) - Google Patents

一种天基大视场成像点源/斑状目标的图像恢复增强方法 (Image Restoration and Enhancement Method for Point-Source/Spot Targets in Space-Based Large-Field-of-View Imaging) Download PDF

Info

Publication number
WO2012034467A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
sub-block
target
point spread
Prior art date
Application number
PCT/CN2011/078380
Other languages
English (en)
French (fr)
Inventor
张天序
武道龙
陈建冲
关静
余铮
陈浩
左芝勇
张蒲涛
Original Assignee
华中科技大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华中科技大学 (Huazhong University of Science and Technology)
Publication of WO2012034467A1 publication Critical patent/WO2012034467A1/zh
Priority to US13/731,100 priority Critical patent/US8737761B2/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation

Definitions

  • The invention belongs to the interdisciplinary field combining aerospace technology and image processing, and particularly relates to a method for restoring and enhancing point-source/spot targets in space-based large-field-of-view imaging.
  • In space-based earth observation, limitations of the imaging sensor's design and manufacturing technology cause image quality degradation such as image distortion or blurring.
  • For spot/point targets in particular, this weakens the point-source or spot target signal and leads to missed detection of weak point-source or spot targets.
  • the blur produced by the imaging sensor itself can be approximated by a point spread function model.
  • The spatially invariant point-spread-function degradation model used for traditional small-field-of-view imaging sensors cannot accurately describe the imaging distortion produced by a large-field-of-view imaging sensor. More importantly, after the imaging system is built, its distortion increases over time. How to develop digital processing methods that adaptively correct this distortion and blur and improve performance has therefore become an important research topic; it is a weak point in current technological development that remains to be solved.
  • the distortion caused by the large field of view imaging sensor should be characterized by a spatially varying point spread function, and its imaging distortion has characteristics that vary with time and space.
  • Assume that, within a given exposure time, the point spread function of the imager distortion is PSF(x, y); it can be obtained as the convolution of two point spread functions.
  • First, the image blur varies with the distance from the lens axis; the equivalent point spread function is PSF_1(r), where r is the distance from the lens center or focal-plane center O, as shown in Figure 2.
  • Given the fabrication technology level of the imaging sensor, at the focal-plane center the image blur caused by the diffraction limit is relatively small, i.e. PSF_1(r) is relatively small; the farther from the optical-lens axis, the larger the fabrication distortion of the imaging sensor's optical lens and the greater the blur of the image, so PSF_1(r) is correspondingly larger (Figure (2b)).
  • Second, the point-source target may fall entirely on one photosensitive element, giving a one-pixel point image, but it may also fall on 2-4 photosensitive elements (as shown in Figures (3b)-(3d)); in that case all of these elements respond, turning a single point image into a 2-4 pixel image and causing a blurring effect. Similarly, Figure (3e) shows an example in which a spot target larger than one photosensitive element falls on several photosensitive elements and forms additional blur.
  • This method proposes a model of spatially variable point spread function, and considers the random degradation blur effect produced by the imaging focal plane discrete photoreceptor array, and proposes and implements an effective point source/spot target recovery enhancement method.
  • the invention provides a space-based large field of view imaging point source/spot target recovery enhancement method, The steps include:
  • the spatially variable degradation image of the imaging sensing device due to the limitation of the design and manufacturing technology level is divided into a plurality of spatially invariant image sub-blocks, and the overall point spread function of each image sub-block is constructed.
  • Step 2: the spatially variant image degradation model established in step 1 is used as a constraint, and each image sub-block is corrected separately;
  • X is the support domain of the target image;
  • Y is the support domain of the observed image;
  • f_i^n(x) is the corrected image obtained at the nth iteration;
  • h_i^n(x) is the point spread function obtained at the nth iteration;
  • g_i is the actually observed image data, i.e. f_i.
  • Stitching step: a stitching algorithm is used to stitch the corrected image sub-blocks together to obtain a complete corrected image f.
  • The invention analyzes the origin of the spatially variant point spread function of large-field-of-view imaging sensors, and proposes and implements a restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging, by which degraded point-source/spot targets are effectively restored and enhanced (Figure 8).
  • FIG. 1 is a flow chart of a method for recovering a point source/spotted target of a space-based large field of view image of the present invention
  • Figure 2 is a schematic diagram of a spatially variable point spread function model caused by a simulated imaging sensor optical lens.
  • Figure (2a) shows the distance r from the lens center or focal-plane center O.
  • Figure (2b) is a schematic diagram of the standard deviation of the equivalent image-blur point spread function PSF_1(r) as a function of the distance from the lens axis.
  • The region labeled A in Figure (2c) is least affected by blur, so its point-spread-function support is the smallest; as the distance from the center increases, the blur effect grows, so the point-spread-function supports of the regions labeled B, C, D, E, F become larger in turn, and regions with the same label still have different point spread functions. The schematic in Figure (2d) shows that in the central region the PSF is small and σ is small, while at the periphery the PSF is large and σ is large.
  • Figure (2e) is a partial simulation of the point spread function PSF (r) far from the lens axis.
  • Figure (3a) is a schematic diagram of a discrete array of photosensitive elements of a digital imaging sensing device
  • Figures (3b)-(3d) are schematic diagrams of a point-source target falling on discrete photosensitive elements of the imaging focal plane.
  • The squares represent photosensitive elements, and the bold circles are point-source or spot targets.
  • Figure (3b) shows a point-source target falling on one photosensitive element.
  • Figure (3c) shows a point-source target falling on two photosensitive elements.
  • Figure (3d) shows a point-source target falling on four photosensitive elements.
  • Figure (3e) is an example where the patchy target falls on nine photoreceptors.
  • Figure (4a) is a simulated image of point-source targets falling on discrete photosensitive elements of the imaging focal plane, where target 1 falls on one element, target 2 on two elements, and targets 3 and 4 on four elements, causing the point images to enlarge and blur.
  • Figure (4b) is a simulated image of four spot targets falling on nine photosensitive elements at different image-plane positions.
  • Figure (5a) is the spatially variant degraded simulation image produced when the point-source targets falling on discrete photosensitive elements of the imaging focal plane (Figure 4a) are further distorted by the optical lens of the imaging sensor; Figure (5b) is the spatially variant degraded simulation image produced when the spot targets falling on nine photosensitive elements (Figure 4b) are further distorted by the optical lens of the imaging sensor.
  • The overall effect is that the point-source or spot target images become blurred.
  • Figure (6a) is a three-dimensional display of point source target 3 in Figure (5a).
  • Figure (6b) is a three-dimensional display of the point source target 4 in Figure (5a).
  • Figure (7a) is a three-dimensional display of spot target 1 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
  • Figure (7b) is a three-dimensional display of spot target 2 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
  • Figure (7c) is a three-dimensional display of spot target 3 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
  • Figure (7d) is a three-dimensional display of spot target 4 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
  • Figure (8a) is the corrected image of Figure (5a) obtained with the conventional spatially invariant correction algorithm.
  • Figure (8b) is the corrected image of Figure (5b) obtained with the conventional spatially invariant correction algorithm.
  • Figure (9a) is the corrected image of Figure (5a) obtained with the spatially variant correction algorithm.
  • Figure (9b) is the corrected image of Figure (5b) obtained with the spatially variant correction algorithm.
  • The correction effect is improved and the target energy concentration is better.
  • Figure (10a) is a three-dimensional display of point-source target 3 in Figure (9a) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
  • Figure (10b) is a three-dimensional display of point-source target 4 in Figure (9a) after correction by the restoration and enhancement method of the present invention.
  • Figure (11a) is a three-dimensional display of spot target 1 in Figure (9b) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
  • Figure (11b) is a three-dimensional display of spot target 2 in Figure (9b) after correction by the restoration and enhancement method of the present invention.
  • Figure (11c) is a three-dimensional display of spot target 3 in Figure (9b) after correction by the restoration and enhancement method of the present invention.
  • Figure (11d) is a three-dimensional display of spot target 4 in Figure (9b) after correction by the restoration and enhancement method of the present invention.
  • Figure 12 is a schematic diagram of the data overlap region obtained by extending the boundary of an isoplanatic region appropriately outward.
  • Figure 13 is a schematic diagram of the data overlap region in one dimension obtained by extending the boundary of an isoplanatic region appropriately outward.
  • the spatially variable degradation image of the imaging sensing device due to the limitation of the design and manufacturing technology level is divided into a plurality of spatially invariant image sub-blocks, and the overall point spread function of each image sub-block is constructed.
  • X is the support domain of the target image;
  • Y is the support domain of the observed image, with x ∈ X and y ∈ Y;
  • f_i^n(x) is the corrected image obtained at the nth iteration for the ith image sub-block;
  • h_i^n(x) is the point spread function obtained at the nth iteration for the ith image sub-block;
  • g_i is the image data actually observed for the ith image sub-block, i.e. f_i.
  • The stitching step uses a stitching algorithm to stitch the corrected image sub-blocks together to obtain a complete corrected image.
  • The principle is that, when the image is divided into blocks in step (1), the contours of the isoplanatic regions are extended appropriately outward, weighting coefficients are constructed according to the distance from each pixel to the boundary of the overlap region, and the overlap-region data are used to complete a gradual blend that removes the visual seams of the stitched image.
  • Stitching uses the method shown in Figure 10:
  • Regions [1], [2], [3] are the parts that, at a given scale level, do not overlap any other region, so no further processing is needed and the original values are used directly.
  • Region [4] is the part where two blocks overlap. If it is left unprocessed, a fairly obvious blocking artifact appears, and simply averaging the values in this region also gives a poor result, so a weighted-average method is used.
  • Taking overlap in one dimension as an example, the two adjacent image blocks are denoted x and y, and the stitched image is denoted Z, as shown in Figure 11.
  • The sizes of x and y are M and N respectively, and the block boundary of the image is extended outward by L, i.e. the block boundary lies at column M-L of x and at column L of y.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Description

Image Restoration and Enhancement Method for Point-Source/Spot Targets in Space-Based Large-Field-of-View Imaging [Technical Field]
The present invention belongs to the interdisciplinary field combining aerospace technology and image processing, and specifically relates to a restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging.
[Background Art]
In space-based earth observation, limitations in the design and manufacturing technology of the imaging detection equipment degrade image quality, for example through image distortion or blurring. For spot/point targets in particular, this weakens the point-source or spot target signal, so that point-source or spot targets with weak signals may be missed. The blur introduced by the imaging sensor itself can be approximately characterized by a point-spread-function model. However, the spatially invariant point-spread-function degradation model used for traditional small-field-of-view imaging sensors cannot accurately describe the imaging distortion produced by a large-field-of-view imaging sensor. More importantly, the distortion of the system increases over time after the imaging system is built. How to develop digital processing methods that adaptively correct this distortion and blur and improve system performance has therefore become an important research topic; it is a weak point in current technological development that remains to be solved. The distortion introduced by a large-field-of-view imaging sensor should be characterized by a spatially varying point spread function, and its imaging distortion varies with both time and space.
Assume that, within a given exposure time, the point spread function of the imager distortion is PSF(x, y).
It can be obtained as the convolution of two point spread functions:
1. The image blur varies with the distance from the lens axis, and the equivalent point spread function is PSF_1(r), where r is the distance from the lens center or focal-plane center O, as shown in Figure 2. Given the fabrication technology level of the imaging sensor, at the focal-plane center the image blur caused by the diffraction limit is relatively small, i.e. PSF_1(r) is relatively small; the farther from the optical-lens axis, the larger the fabrication distortion of the imaging sensor's optical lens and the more the image is blurred during imaging, so PSF_1(r) is correspondingly larger (Figure (2b)).
2. As shown in Figure (3a), at a photosensitive element located at (x_i, y_i) the point spread function is PSF_1(r), with r = sqrt(x_i^2 + y_i^2). A point-source target may fall entirely on that photosensitive element, giving a one-pixel point image, but it may also fall on 2-4 photosensitive elements (Figures (3b)-(3d)); in that case all of these elements respond, turning a single point image into a 2-4 pixel point image and producing a blurring effect. Similarly, Figure (3e) shows an example in which a spot target larger than one photosensitive element falls on several elements and forms additional blur. That is, depending on the imaging position, point-source or spot targets of the same size form randomly additionally-blurred images on the discrete photosensitive-element array of the focal plane. This can be represented by a random point spread function PSF_2(x, y), which originates from the discrete photosensitive-element array of the digital imaging sensor. The total point spread function is then expressed as
PSF(x, y) = PSF_1(x, y) * PSF_2(x, y)
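As a minimal illustration of this two-factor model, the sketch below builds a radius-dependent Gaussian optical PSF and convolves it with a box-shaped detector-footprint PSF to obtain the total PSF for one sub-block. It is not from the patent: the Gaussian form of PSF_1, the linear growth of its width with r, the 2x2 footprint, and the function names are assumptions chosen only for illustration.

```python
import numpy as np
from scipy.signal import convolve2d

def optical_psf(r, size=15):
    """PSF_1: Gaussian whose width grows with distance r from the optical axis.
    The linear growth of sigma with r is a hypothetical calibration law."""
    sigma = 0.5 + 0.01 * r
    y, x = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
    psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return psf / psf.sum()

def detector_psf(footprint=(2, 2)):
    """PSF_2: uniform response over the photosensitive elements the point falls on
    (here a 2x2 footprint, i.e. the point image spreads over four elements)."""
    psf = np.ones(footprint, dtype=float)
    return psf / psf.sum()

def total_psf(r, footprint=(2, 2)):
    """PSF(x, y) = PSF_1(x, y) * PSF_2(x, y), with * denoting convolution."""
    return convolve2d(optical_psf(r), detector_psf(footprint), mode="full")

# Example: total PSF for a sub-block centred 200 pixels from the focal-plane centre.
psf_block = total_psf(r=200.0)
print(psf_block.shape, psf_block.sum())  # kernel size and ~1.0 normalisation
```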
In summary, new digital image processing techniques need to be developed to mitigate and correct the imaging blur caused by the imaging system itself as described above.
[Summary of the Invention]
The object of the present invention is to provide a restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging. The method proposes a spatially variant point-spread-function model, takes into account the random degradation blur produced by the discrete photosensitive-element array of the imaging focal plane, and proposes and implements an effective restoration and enhancement method for point-source/spot targets.
The restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging provided by the present invention comprises the following steps:
Ground preparation phase:
(1) Divide the spatially variant degraded image, caused by limitations in the design and manufacturing technology of the imaging sensor, into a number of spatially invariant image sub-blocks, and construct the overall point spread function of each image sub-block.
(1.1) According to the distance of the photosensitive elements from the lens axis, divide the grayscale image on the focal plane into M sub-blocks; each image sub-block is denoted f_i (i is the sub-block index, i = 1, 2, ..., M), and within each image sub-block the degradation can be regarded approximately as an isoplanatic region;
(1.2) Measure in advance on the ground, with optical equipment, the point spread function PSF_i1(r) within each image sub-block, and store it in a database;
(1.3) For each image sub-block, construct the point-spread-function model PSF_i2(x, y) of the random degradation blur effect of the discrete photosensitive-element array of the imaging focal plane;
(1.4) Construct the overall point-spread-function model of each sub-block: PSF_i(x, y) = PSF_i1(x, y) * PSF_i2(x, y).
On-line processing phase:
(2) Using the spatially variant image degradation model established in step 1 as a constraint, correct each image sub-block separately;
(2.1) Correct each image sub-block with a maximum-likelihood estimation algorithm. Take PSF_i(x, y) as the initial value of the point-spread-function iteration, i.e. h_i^0 = PSF_i(x, y); take f_i as the initial value of the target-image iteration, i.e. f_i^0 = f_i; set the number of iterations to N and iterate. The iteration formulas are as follows:

    h_i^n(x) = h_i^(n-1)(x) · { [ g_i(y) / ( f_i^(n-1)(x) * h_i^(n-1)(x) ) ] * f_i^(n-1)(-x) },  x ∈ X, y ∈ Y, 1 ≤ n ≤ N      (1)

    f_i^n(x) = f_i^(n-1)(x) · { [ g_i(y) / ( f_i^(n-1)(x) * h_i^n(x) ) ] * h_i^n(-x) },  x ∈ X, y ∈ Y, 1 ≤ n ≤ N      (2)

where * denotes convolution. In Eqs. (1) and (2), X is the support domain of the target image, Y is the support domain of the observed image, x ∈ X, y ∈ Y; f_i^n(x) is the corrected image obtained at the nth iteration, h_i^n(x) is the point spread function obtained at the nth iteration, and g_i is the actually observed image data, i.e. f_i.
(2.2) When the specified number of iterations n = N is reached, the corrected image of each image sub-block is obtained as f_i^N(x), together with the overall point spread function h_i^N(x).
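For illustration, a compact sketch of this per-block maximum-likelihood (blind Richardson-Lucy style) iteration is given below. It is one plausible reading of Eqs. (1) and (2) rather than the patented implementation; the helper names, the zero-padding of the PSF to the sub-block size, and the normalisation steps are assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

def _pad_to(kernel, shape):
    """Embed a small PSF kernel, centred, in a zero array of the sub-block's shape."""
    out = np.zeros(shape, dtype=float)
    r0 = (shape[0] - kernel.shape[0]) // 2
    c0 = (shape[1] - kernel.shape[1]) // 2
    out[r0:r0 + kernel.shape[0], c0:c0 + kernel.shape[1]] = kernel
    return out

def rl_blind_restore(g_i, psf_i, n_iter=20, eps=1e-12):
    """Blind Richardson-Lucy style restoration of one image sub-block.

    g_i   : observed (degraded) sub-block, also used as f_i^0
    psf_i : total sub-block PSF from step (1.4), used as h_i^0
    Returns (f, h) after n_iter alternating updates in the spirit of Eqs. (1) and (2).
    """
    f = g_i.astype(float).copy()            # f_i^0 = f_i (observed data)
    h = _pad_to(psf_i, g_i.shape)           # h_i^0 = PSF_i(x, y)
    h /= h.sum() + eps
    for _ in range(n_iter):
        # Eq. (1): update the PSF estimate h_i^n
        ratio = g_i / (fftconvolve(f, h, mode="same") + eps)
        h = h * fftconvolve(ratio, f[::-1, ::-1], mode="same")
        h = np.clip(h, 0.0, None)
        h /= h.sum() + eps                  # keep the PSF non-negative and normalised
        # Eq. (2): update the image estimate f_i^n using the new h_i^n
        ratio = g_i / (fftconvolve(f, h, mode="same") + eps)
        f = f * fftconvolve(ratio, h[::-1, ::-1], mode="same")
        f = np.clip(f, 0.0, None)
    return f, h
```

In the pipeline of step (2), each sub-block f_i would be restored this way with its own PSF_i, and the results passed on to the stitching step (3).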
(3) Stitching step: use a stitching algorithm to stitch the corrected image sub-blocks f_i^N together to obtain the complete corrected image f.
The present invention analyzes the origin of the spatially variant point spread function of large-field-of-view imaging sensors, and proposes and implements a restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging, by which degraded point-source/spot targets can be effectively restored and enhanced (see Figure 8).
[Brief Description of the Drawings]
Figure 1 is a flow chart of the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging according to the present invention;
Figure 2 is a schematic diagram of the simulated spatially variant point-spread-function model caused by the optical lens of the imaging sensor. Figure (2a) shows the distance r from the lens center or focal-plane center O; Figure (2b) is a schematic diagram of the standard deviation of the equivalent image-blur point spread function PSF_1(r) as a function of the distance from the lens axis; in Figure (2c), the region labeled A is least affected by blur, so its point-spread-function support is the smallest, and as the distance from the center increases the blur effect grows, so the point-spread-function supports of the regions labeled B, C, D, E, F become larger in turn, while regions with the same label still have different point spread functions; the schematic in Figure (2d) shows that in the central region the PSF is small and σ is small, while at the periphery the PSF is large and σ is large. Figure (2e) is a simulation of the point spread function PSF(r) for parts far from the lens axis.
Figure (3a) is a schematic diagram of the discrete photosensitive-element array of a digital imaging sensor; Figures (3b)-(3d) are schematic diagrams of a point-source target falling on discrete photosensitive elements of the imaging focal plane. The squares represent photosensitive elements, and the bold circles are point-source or spot targets. Figure (3b) shows a point-source target falling on one photosensitive element, Figure (3c) on two photosensitive elements, and Figure (3d) on four photosensitive elements. Figure (3e) is an example of a spot target falling on nine photosensitive elements.
Figure (4a) is a simulated image of point-source targets falling on discrete photosensitive elements of the imaging focal plane, where target 1 falls on one element, target 2 on two elements, and targets 3 and 4 on four elements, causing the point images to enlarge, i.e. blur. Figure (4b) is a simulated image of four spot targets falling on nine photosensitive elements at different image-plane positions.
Figure (5a) is the spatially variant degraded simulation image produced when the point-source targets falling on discrete photosensitive elements of the imaging focal plane (Figure 4a) are further distorted by the optical lens of the imaging sensor; Figure (5b) is the spatially variant degraded simulation image produced when the spot targets falling on nine photosensitive elements (Figure 4b) are further distorted by the optical lens of the imaging sensor. The overall effect is that the point-source or spot target images become blurred.
Figure (6a) is a three-dimensional display of point-source target 3 in Figure (5a).
Figure (6b) is a three-dimensional display of point-source target 4 in Figure (5a).
Figure (7a) is a three-dimensional display of spot target 1 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
Figure (7b) is a three-dimensional display of spot target 2 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
Figure (7c) is a three-dimensional display of spot target 3 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
Figure (7d) is a three-dimensional display of spot target 4 in Figure (5b) (falling on 9 photosensitive elements, with spatially variant degradation caused by the optical lens of the imaging sensor).
Figure (8a) is the corrected image of Figure (5a) obtained with the conventional spatially invariant correction algorithm, and Figure (8b) is the corrected image of Figure (5b) obtained with the conventional spatially invariant correction algorithm; the correction effect is poor.
Figure (9a) is the corrected image of Figure (5a) obtained with the spatially variant correction algorithm, and Figure (9b) is the corrected image of Figure (5b) obtained with the spatially variant correction algorithm; the correction effect is improved and the target energy concentration is better.
Figure (10a) is a three-dimensional display of point-source target 3 in Figure (9a) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
Figure (10b) is a three-dimensional display of point-source target 4 in Figure (9a) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
Figure (11a) is a three-dimensional display of spot target 1 in Figure (9b) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
Figure (11b) is a three-dimensional display of spot target 2 in Figure (9b) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
Figure (11c) is a three-dimensional display of spot target 3 in Figure (9b) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
Figure (11d) is a three-dimensional display of spot target 4 in Figure (9b) after correction by the restoration and enhancement method for point-source/spot targets in space-based large-field-of-view imaging of the present invention.
Figure 12 is a schematic diagram of the data overlap region obtained by extending the boundary of an isoplanatic region appropriately outward. Figure 13 is a schematic diagram of the data overlap region in one dimension obtained by extending the boundary of an isoplanatic region appropriately outward.
[Detailed Description of the Embodiments]
The present invention is described in further detail below with reference to the accompanying drawings and examples.
Ground preparation phase:
(1) Divide the spatially variant degraded image, caused by limitations in the design and manufacturing technology of the imaging sensor, into a number of spatially invariant image sub-blocks, and construct the overall point spread function of each image sub-block.
(1.1) According to the distance of the photosensitive elements from the lens axis, divide the grayscale image on the focal plane, i.e. the spatially variant degraded image, into M sub-blocks (the value of M is determined by the distance of the photosensitive elements from the lens axis; in general a degraded image of size 1×1 to 32×32 can be regarded as spatially invariant, so M can be taken as the size of the input image divided by 32×32; a partitioning sketch is given after step (1.4) below). Each image sub-block is denoted f_i (i is the sub-block index, i = 1, 2, ..., M), and within each image sub-block the degradation can be regarded approximately as an isoplanatic region;
(1.2) Measure in advance on the ground, with optical equipment, the point spread function PSF_i1(r) within each image sub-block, and store it in a database;
(1.3) For each image sub-block, construct the point-spread-function model PSF_i2(x, y) of the random degradation blur effect of the discrete photosensitive-element array of the imaging focal plane;
(1.4) Construct the overall point spread function of each sub-block. Its model can be expressed as PSF_i(x, y) = PSF_i1(x, y) * PSF_i2(x, y).
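The partitioning referred to in step (1.1) can be sketched as follows: a 32×32 tiling of the focal-plane image, with each tile extended by a margin of L pixels per side so that neighbouring sub-blocks overlap for the later weighted stitching. The tile size beyond the 32×32 guideline quoted above, the margin value, and the function name are illustrative assumptions rather than values fixed by the patent.

```python
import numpy as np

def split_into_subblocks(image, block=32, margin=8):
    """Split a focal-plane image into approximately isoplanatic sub-blocks.

    Each sub-block is the nominal block x block tile extended outward by
    `margin` pixels on every side (clipped at the image border), so that
    neighbouring sub-blocks overlap for the weighted stitching step.
    Returns a list of (row_slice, col_slice, sub_image) tuples.
    """
    H, W = image.shape
    tiles = []
    for r in range(0, H, block):
        for c in range(0, W, block):
            rs = slice(max(r - margin, 0), min(r + block + margin, H))
            cs = slice(max(c - margin, 0), min(c + block + margin, W))
            tiles.append((rs, cs, image[rs, cs]))
    return tiles

# Example: a 256x256 degraded image yields (256/32)^2 = 64 sub-blocks.
tiles = split_into_subblocks(np.zeros((256, 256)))
print(len(tiles))  # 64
```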
On-line processing phase:
(2) Using the spatially variant image degradation model established in step 1 as a constraint, correct each image sub-block separately;
(2.1) Correct each image sub-block with the maximum-likelihood estimation algorithm. Take PSF_i(x, y) as the initial value of the point-spread-function iteration, i.e. h_i^0 = PSF_i(x, y); take f_i as the initial value of the target-image iteration, i.e. f_i^0 = f_i; set the number of iterations to N and iterate. The iteration formulas are as follows:
    h_i^n(x) = h_i^(n-1)(x) · { [ g_i(y) / ( f_i^(n-1)(x) * h_i^(n-1)(x) ) ] * f_i^(n-1)(-x) },  x ∈ X, y ∈ Y, 1 ≤ n ≤ N      (1)

    f_i^n(x) = f_i^(n-1)(x) · { [ g_i(y) / ( f_i^(n-1)(x) * h_i^n(x) ) ] * h_i^n(-x) },  x ∈ X, y ∈ Y, 1 ≤ n ≤ N      (2)
In Eqs. (1) and (2), X is the support domain of the target image, Y is the support domain of the observed image, x ∈ X, y ∈ Y; f_i^n(x) is the corrected image obtained at the nth iteration for the ith image sub-block, h_i^n(x) is the point spread function obtained at the nth iteration for the ith image sub-block, and g_i is the image data actually observed for the ith image sub-block, i.e. f_i.
(2.2) When the specified number of iterations n = N is reached, the corrected image of each image sub-block is obtained as f_i^N(x), together with the overall point spread function h_i^N(x).
(3) Stitching step: use a stitching algorithm to stitch the corrected image sub-blocks f_i^N together to obtain the complete corrected image f.
The principle is as follows. When the image is divided into blocks in step (1), the boundaries of the isoplanatic regions are extended appropriately outward, weighting coefficients are constructed according to the distance from each pixel in the overlap region to the boundary, and the overlap-region data are used to complete a gradual blend, so as to remove the visual seams of the stitched image. The method shown in Figure 10 is used: regions [1], [2], [3] are the parts that, at a given scale level, do not overlap any other region, so no further processing is needed and the original values can be used directly; region [4] is the part where two blocks overlap, and if it is left unprocessed a fairly obvious blocking artifact appears, while simply averaging the values in this region also gives a poor result, so a weighted-average method is adopted. Taking overlap in one dimension as an example to simplify the explanation of the weighting coefficients, let the two adjacent image blocks be denoted x and y, and the stitched image be denoted Z, as shown in Figure 11. The sizes of x and y are M and N respectively, and the block boundary of the image is extended outward by L, i.e. the block boundary lies at column M-L of x and at column L of y. After x and y have undergone the image-correction processing, the l columns exhibiting blocking artifacts are removed, so the width of the remaining overlap region of the two image blocks is 2d = 2(L-l), and the block boundary shifts correspondingly in x and in y. The weighting coefficients of the transition elements in the overlap region of the two images are generated recursively in steps of 1/(2d).
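The one-dimensional blending just described can be sketched as follows. The linear ramp of weights across the 2d-pixel overlap is one plausible reading of the recursive 1/(2d) weighting; the function name and the toy example values are assumptions for illustration.

```python
import numpy as np

def blend_1d(x, y, d):
    """Stitch two corrected blocks x and y whose last/first 2*d columns overlap.

    Inside the overlap the weight of x decreases linearly in steps of about
    1/(2*d) while the weight of y increases correspondingly, which removes the
    visible seam that a hard cut or a plain average would leave.
    """
    overlap = 2 * d
    w = np.arange(1, overlap + 1) / (overlap + 1.0)   # ramps from near 0 to near 1
    left = x[:, :-overlap]                             # region [1]/[2]: keep original values
    right = y[:, overlap:]                             # region [3]: keep original values
    blended = (1.0 - w) * x[:, -overlap:] + w * y[:, :overlap]   # region [4]: weighted average
    return np.hstack([left, blended, right])

# Example: two 1x16 blocks overlapping by 2*d = 4 columns.
a = np.ones((1, 16)) * 10.0
b = np.ones((1, 16)) * 20.0
z = blend_1d(a, b, d=2)
print(z.shape)   # (1, 28): 12 + 4 + 12 columns
```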

Claims

Claims
1. An image restoration and enhancement method for point-source or spot targets in space-based large-field-of-view imaging, comprising the steps of:
(1) dividing the spatially variant degraded image into a plurality of spatially invariant image sub-blocks, and constructing the point spread function of each image sub-block:
(1.1) dividing the grayscale image on the focal plane into M sub-blocks, each image sub-block being denoted f_i, where i is the sub-block index, i = 1, 2, ..., M, and the degradation within each image sub-block can be regarded approximately as an isoplanatic region, the value of M being determined by the distance of the photosensitive elements from the lens axis;
(1.2) measuring in advance on the ground, with optical equipment, the point spread function PSF_i1(r) within each image sub-block and storing it in a database;
(1.3) constructing the point spread function PSF_i2(x, y) of the random degradation blur effect within each image sub-block corresponding to the discrete photosensitive-element array of the imaging focal plane;
(1.4) constructing the overall point spread function of each sub-block:
PSF_i(x, y) = PSF_i1(x, y) * PSF_i2(x, y);
(2) correcting each image sub-block separately with a spatially invariant image correction method to obtain a corrected image of each image sub-block;
(3) stitching step: stitching the corrected image sub-blocks together with a stitching algorithm to obtain the complete corrected image f.
2. The image restoration and enhancement method for point-source or spot targets in space-based large-field-of-view imaging according to claim 1, wherein the correction process in step (2) is specifically:
(2.1) correcting each image sub-block with a maximum-likelihood estimation algorithm:
taking PSF_i(x, y) as the initial value of the point-spread-function iteration, i.e. h_i^0 = PSF_i(x, y), taking f_i as the initial value of the target-image iteration, i.e. f_i^0 = f_i, setting the number of iterations to N, and iterating with the following formulas:

    h_i^n(x) = h_i^(n-1)(x) · { [ g_i(y) / ( f_i^(n-1)(x) * h_i^(n-1)(x) ) ] * f_i^(n-1)(-x) },  x ∈ X, y ∈ Y, 1 ≤ n ≤ N

    f_i^n(x) = f_i^(n-1)(x) · { [ g_i(y) / ( f_i^(n-1)(x) * h_i^n(x) ) ] * h_i^n(-x) },  x ∈ X, y ∈ Y, 1 ≤ n ≤ N

in which X is the support domain of the target image, Y is the support domain of the observed image, x ∈ X, y ∈ Y; f_i^n(x) is the corrected image obtained at the nth iteration, h_i^n(x) is the point spread function obtained at the nth iteration, and g_i is the actually observed image data, i.e. f_i;
(2.2) when the number of iterations is reached, obtaining the corrected image of each image sub-block, i.e. f_i^N(x), together with the overall point spread function h_i^N(x).
3. The restoration and enhancement method for point-source or spot targets in space-based large-field-of-view imaging according to claim 1 or 2, wherein, in the stitching process of step (3), in order to remove the visual seams of the stitched image, the boundaries of the isoplanatic regions are extended appropriately outward when the image is divided into blocks, weighting coefficients are constructed according to the distance from each pixel in the overlap region to the boundary, and a gradual blend is completed using the overlap-region data.
PCT/CN2011/078380 2010-09-19 2011-08-12 一种天基大视场成像点源/斑状目标的图像恢复增强方法 WO2012034467A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/731,100 US8737761B2 (en) 2010-09-19 2012-12-30 Method for restoring and enhancing space based image of point or spot objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2010102855260A CN101937561A (zh) 2010-09-19 2010-09-19 一种天基大视场成像点源/斑状目标的图像恢复增强方法
CN201010285526.0 2010-09-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/731,100 Continuation-In-Part US8737761B2 (en) 2010-09-19 2012-12-30 Method for restoring and enhancing space based image of point or spot objects

Publications (1)

Publication Number Publication Date
WO2012034467A1 true WO2012034467A1 (zh) 2012-03-22

Family

ID=43390875

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2011/078380 WO2012034467A1 (zh) 2010-09-19 2011-08-12 一种天基大视场成像点源/斑状目标的图像恢复增强方法

Country Status (3)

Country Link
US (1) US8737761B2 (zh)
CN (1) CN101937561A (zh)
WO (1) WO2012034467A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9530079B2 (en) 2012-11-09 2016-12-27 Nikon Corporation Point spread function classification using structural properties
US9589328B2 (en) 2012-11-09 2017-03-07 Nikon Corporation Globally dominant point spread function estimation

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101667798B1 (ko) * 2010-03-29 2016-10-20 삼성전자 주식회사 영상 처리 장치 및 이의 영상 처리 방법
CN101937561A (zh) * 2010-09-19 2011-01-05 华中科技大学 一种天基大视场成像点源/斑状目标的图像恢复增强方法
CN102116876B (zh) * 2011-01-14 2013-04-17 中国科学院上海技术物理研究所 基于轨迹编目模型的空间点目标天基探测方法
WO2012105222A1 (ja) * 2011-01-31 2012-08-09 パナソニック株式会社 画像復元装置、撮像装置及び画像復元方法
CN103150705B (zh) * 2012-12-06 2016-05-25 华中科技大学 一种弱小目标图像的自适应恢复增强方法
CN104299190B (zh) * 2014-02-25 2017-02-15 河南科技大学 线阵扫描图像的矫正方法与装置
WO2018098607A1 (en) * 2016-11-29 2018-06-07 SZ DJI Technology Co., Ltd. Method and system of adjusting image focus
CN109697705B (zh) * 2018-12-24 2019-09-03 北京天睿空间科技股份有限公司 适于视频拼接的色差矫正方法
DE102020211896A1 (de) 2020-09-23 2022-03-24 Conti Temic Microelectronic Gmbh Verfahren zur Erzeugung eines Bildes einer Fahrzeugumgebung und Surroundview-System zur Erzeugung eines Bildes einer Fahrzeugumgebung
CN114066764B (zh) * 2021-11-23 2023-05-09 电子科技大学 基于距离加权色偏估计的沙尘退化图像增强方法及装置
CN116866589B (zh) * 2023-09-05 2023-12-26 成都大熊猫繁育研究基地 一种野外红外相机无线网络的视频图像压缩方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7065256B2 (en) * 2001-02-08 2006-06-20 Dblur Technologies Ltd. Method for processing a digital image
CN101261176A (zh) * 2008-04-03 2008-09-10 华中科技大学 基于序列图像校正的气动光学传输效应测评方法与装置
CN101281643A (zh) * 2008-04-23 2008-10-08 浙江大学 一种退化函数随空间变化图像的分块复原和拼接方法
WO2010045632A2 (en) * 2008-10-17 2010-04-22 Massachusetts Institute Of Technology Optical superresolution using multiple images
CN101937561A (zh) * 2010-09-19 2011-01-05 华中科技大学 一种天基大视场成像点源/斑状目标的图像恢复增强方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5412200A (en) * 1993-03-01 1995-05-02 Rhoads; Geoffrey B. Wide field distortion-compensating imaging system and methods
US5448053A (en) * 1993-03-01 1995-09-05 Rhoads; Geoffrey B. Method and apparatus for wide field distortion-compensated imaging
KR100247938B1 (ko) * 1997-11-19 2000-03-15 윤종용 영상처리 시스템의 디지탈 초점 조절방법 및 장치
JP2002300461A (ja) * 2001-03-30 2002-10-11 Minolta Co Ltd 画像復元装置、画像復元方法、プログラム及び記録媒体
KR101341096B1 (ko) * 2007-09-12 2013-12-13 삼성전기주식회사 영상 복원 장치 및 방법
KR101399012B1 (ko) * 2007-09-12 2014-05-26 삼성전기주식회사 영상 복원 장치 및 방법
US20090086174A1 (en) * 2007-09-28 2009-04-02 Sanyo Electric Co., Ltd. Image recording apparatus, image correcting apparatus, and image sensing apparatus
JP2010079875A (ja) * 2008-08-27 2010-04-08 Sony Corp 情報処理装置、情報処理方法、及びプログラム

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7065256B2 (en) * 2001-02-08 2006-06-20 Dblur Technologies Ltd. Method for processing a digital image
CN101261176A (zh) * 2008-04-03 2008-09-10 华中科技大学 基于序列图像校正的气动光学传输效应测评方法与装置
CN101281643A (zh) * 2008-04-23 2008-10-08 浙江大学 一种退化函数随空间变化图像的分块复原和拼接方法
WO2010045632A2 (en) * 2008-10-17 2010-04-22 Massachusetts Institute Of Technology Optical superresolution using multiple images
CN101937561A (zh) * 2010-09-19 2011-01-05 华中科技大学 一种天基大视场成像点源/斑状目标的图像恢复增强方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HE, XIAOHAI ET AL.: "Study on Generalized Point Spread Function for 3D Image Restoration", JOURNAL OF DATA ACQUISITION & PROCESSING, vol. 15, no. 3, September 2000 (2000-09-01), pages 335 - 339 *
HONG, HANYU ET AL.: "Study on circulation iterative restoration algorithm of degraded images with aero-optical effects", JOURNAL OF HUAZHONG UNIVERSITY OF SCIENCE AND TECHNOLOGY, vol. 33, no. 9, September 2005 (2005-09-01), pages 15 - 18 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9530079B2 (en) 2012-11-09 2016-12-27 Nikon Corporation Point spread function classification using structural properties
US9589328B2 (en) 2012-11-09 2017-03-07 Nikon Corporation Globally dominant point spread function estimation

Also Published As

Publication number Publication date
US8737761B2 (en) 2014-05-27
CN101937561A (zh) 2011-01-05
US20130121609A1 (en) 2013-05-16

Similar Documents

Publication Publication Date Title
WO2012034467A1 (zh) 一种天基大视场成像点源/斑状目标的图像恢复增强方法
EP2535864B1 (en) Image processing device and method
KR101364421B1 (ko) 멀티-레졸류션 절차를 이용하여 강건한 깊이 맵을 생성하는 시스템 및 방법
US20190180418A1 (en) Scene-based nonuniformity correction using a convolutional recurrent neural network
CN103369233B (zh) 用于通过利用自适应核来执行深度估计的系统和方法
WO2007113799A2 (en) Digital filtering with noise gain limit
KR100996897B1 (ko) 직선 핏팅에 의한 광각렌즈의 원주방향의 왜곡영상 보정 방법
JP2018055516A (ja) 画像処理方法、画像処理装置、撮像装置、画像処理プログラム、および、記憶媒体
CN111583119B (zh) 一种正射影像拼接方法、设备以及计算机可读介质
JP7174298B2 (ja) 差異検出装置、差異検出方法及びプログラム
CN106651790A (zh) 图像去模糊的方法、装置和设备
JP7398938B2 (ja) 情報処理装置およびその学習方法
CN110852958B (zh) 基于物体倾斜角度的自适应校正方法和装置
JP7403995B2 (ja) 情報処理装置、制御方法およびプログラム
JP2017103756A (ja) 画像データ処理装置及び方法
CN107590790B (zh) 一种基于对称边缘填充的简单透镜边缘区域去模糊方法
TWI805282B (zh) 使用焦點資訊深度估計的方法和裝置
JP2020030569A (ja) 画像処理方法、画像処理装置、撮像装置、レンズ装置、プログラム、および、記憶媒体
CN113191959B (zh) 一种基于退化标定的数字成像系统极限像质提升方法
Dhillon et al. Exhibition of noise-aided stochastic resonance by discontinuity detectors in smartphone images
US20240185405A1 (en) Information processing apparatus, information processing method, and program
유재형 Zero-shot defocus deblurring using dual-pixel image
JP2009146231A (ja) 超解像法および超解像プログラム
JP2023047600A (ja) 情報処理装置、情報処理方法及びプログラム
JP2024038523A (ja) 画像処理方法、学習済みモデルの製造方法、プログラム、および、画像処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11824537

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11824537

Country of ref document: EP

Kind code of ref document: A1