CN106940877B - Image processing apparatus and method - Google Patents

Image processing apparatus and method

Info

Publication number
CN106940877B
CN106940877B (application CN201610007111.4A)
Authority
CN
China
Prior art keywords
pixel
image
pixels
texture
foreground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610007111.4A
Other languages
Chinese (zh)
Other versions
CN106940877A (en)
Inventor
范伟
刘威
刘伟
孙俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to CN201610007111.4A
Publication of CN106940877A
Application granted
Publication of CN106940877B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 — Geometric image transformations in the plane of the image
    • G06T 3/40 — Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 — Image mosaicing, e.g. composing plane images from plane sub-images
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 — Image enhancement or restoration
    • G06T 5/40 — Image enhancement or restoration using histogram techniques

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image processing apparatus and an image processing method. The image processing apparatus includes a texture adjustment unit configured to perform texture adjustment processing on the foreground pixels and background pixels in the overlapping area of a first image and a second image that are stitched together, so that the texture in the overlapping area caused by stitching changes gradually between the first image and the second image. The texture adjustment unit includes a determination module that determines texture-adjusted pixel values for pixels in the overlap region and is configured to: for a foreground pixel, determine its texture-adjusted pixel value based on the pixel value of the corresponding pixel obtained by interpolating pixels of the first image; for a background pixel, determine its texture-adjusted pixel value based on a weighted combination of the pixel value of the corresponding pixel obtained by interpolating pixels of the first image and the pixel value of the corresponding pixel of the second image.

Description

Image processing apparatus and method
Technical Field
The present invention generally relates to the field of image processing, and more particularly, to an apparatus and method for image stitching.
Background
Image stitching is the technique of seamlessly joining images taken at different angles or positions to form a high-resolution panorama. When the field of view of a camera is limited, image stitching can provide a complete panoramic image, overcoming that limitation.
However, during stitching, differences between cameras (such as in white balance) and in illumination intensity make the brightness and color of the stitched image non-uniform at the seam: the result shows alternating light and dark bands, and the seam exhibits texture differences, which is highly inconvenient for the observer.
Disclosure of Invention
In view of the above-mentioned situation of the prior art, an object of the present invention is to provide an image processing apparatus and method to solve the problems of the prior art.
According to an aspect of the present invention, there is provided an image processing apparatus including a texture adjustment unit configured to perform texture adjustment processing on foreground pixels and background pixels in an overlapping region of a first image and a second image stitched together, so that the texture in the overlapping region caused by the stitching changes gradually between the first image and the second image. The texture adjustment unit comprises a determination module for determining texture-adjusted pixel values for pixels in the overlap region, and the determination module is configured to: determine, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixels include the foreground pixels of the first image and the foreground pixels of the second image; and determine, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of the corresponding pixel of the second image, wherein the background pixels include all pixels other than the foreground pixels.
According to another aspect of the present invention, there is also provided an image processing apparatus including a processor configured to perform texture adjustment processing on two images stitched together so that a texture of an overlapping area between the two images due to the stitching is gradually changed. The texture adjustment process includes determining texture adjusted pixel values for pixels in the overlap region by: determining, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixel includes a foreground pixel of the first image and a foreground pixel of the second image; determining, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of a corresponding pixel of the second image, wherein the background pixel includes other pixels than the foreground pixel.
According to still another aspect of the present invention, there is also provided an image processing method including performing texture adjustment processing on foreground pixels and background pixels in an overlapping region of two images stitched together so that a texture in the overlapping region due to stitching between the two images is gradually changed. The texture adjustment process includes determining texture adjusted pixel values for pixels in the overlap region by: determining, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixel includes a foreground pixel of the first image and a foreground pixel of the second image; determining, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of a corresponding pixel of the second image, wherein the background pixel includes other pixels than the foreground pixel.
According to other aspects of the invention, embodiments of the invention also provide a computer program product in the form of a computer readable medium having computer program code recorded thereon for implementing the above-described method.
According to the method and apparatus provided by embodiments of the invention, texture and/or tone adjustment is performed on the overlapping area of the two images stitched together, eliminating the obvious distortion that stitching causes in that area.
These and other advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments of the invention, taken in conjunction with the accompanying drawings.
Drawings
The invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like reference numerals are used to designate like or similar parts throughout the figures thereof, and in which
Fig. 1 is a block diagram of an image processing apparatus 10 according to a first embodiment of the present invention;
FIG. 2A is a schematic diagram of the image of the overlap region of the upper image after spatial transformation;
FIG. 2B is a schematic diagram of the image of the overlap region of the lower image;
fig. 3 is a block diagram of an image processing apparatus 20 according to a second embodiment of the present invention;
FIG. 4 shows a schematic diagram of the classification of pixels in the overlapping regions of stitched together images into three classes;
FIG. 5A is a schematic diagram of image stitching performed by a conventional image processing apparatus;
fig. 5B is a schematic diagram of stitching the same images using the image processing apparatus 20 according to the second embodiment of the present invention;
FIG. 6 is a flow diagram of an image processing method 60 according to one embodiment of the invention;
FIG. 7 is a flow diagram of an image processing method 70 according to another embodiment of the invention; and
fig. 8 is a block diagram of an exemplary architecture of a general-purpose personal computer in which methods and/or apparatus according to embodiments of the invention may be implemented.
Detailed Description
Embodiments of the present invention are described below with reference to the drawings. Elements and features described in one drawing or one embodiment of the invention may be combined with elements and features shown in one or more other drawings or embodiments. It should be noted that the figures and description omit representation and description of components and processes that are not relevant to the present invention and that are known to those of ordinary skill in the art for the sake of clarity.
It will be understood by those skilled in the art that the terms "first", "second", etc. in the present invention are only used for distinguishing different units, modules or steps, etc., and do not represent any specific technical meaning or necessary logical sequence between them, nor represent the importance of the different units, modules or steps defined by them.
Fig. 1 is a block diagram of an image processing apparatus 10 according to a first embodiment of the present invention. As shown in fig. 1, the image processing apparatus 10 includes a texture adjustment unit 11 and/or a tone adjustment unit 12. The texture adjustment unit 11 is configured to perform texture adjustment processing on foreground pixels and background pixels in the overlapping region of two images stitched together, so that the texture in the overlapping region caused by the stitching changes gradually; the tone adjustment unit 12 is configured to match the histogram of the background pixels of the first of the two images against that of the second, so that the tone difference between the two images is within a predetermined threshold.
Hereinafter, the image processing apparatus 10 according to the first embodiment of the present invention will be described in detail, taking the stitching of two images as an example. It should be understood, however, that the image processing apparatus 10 of the present invention is not limited to stitching two images; stitching of multiple images may be achieved, for example, by applying pairwise stitching multiple times.
In order to stitch the images, after the image processing apparatus 10 acquires the images to be stitched, it is necessary to register the input images.
In image registration, a stitch line first needs to be determined. Because image registration uses approximate calculation, parallax occurs in the overlapping portion of the images. The direct result is that inconsistencies appear in the overlapping portions of the stitched images, producing a visible seam in the result. In the present embodiment, to simplify the description, it is assumed hereinafter that the two images to be stitched are arranged one above the other, and they are referred to as the upper image and the lower image. For an up-down stitch, the stitch line is generally horizontal. It should be understood that the stitch line may be straight or curved. Methods of determining a stitch line are well known in the art and are not described in detail herein.
After the stitch line is determined, the coordinate points of the pixels in the overlapping area of the two images can be accurately aligned by means of a spatial transformation. In the following, it is assumed that the upper image is mapped into the coordinates of the lower image by a spatial transformation, thereby registering the images. Fig. 2A and 2B show the images of the overlapping area of the two images to be stitched, where fig. 2A is a schematic diagram of the overlap region of the upper image after spatial transformation, and fig. 2B is a schematic diagram of the overlap region of the lower image. Note that the pixel value of each pixel of the overlap-area image shown in fig. 2A is obtained by interpolating (i.e., spatially transforming) the pixels of the upper image. Image registration methods for stitching are well known in the art and are not described in detail herein.
After the images to be stitched are registered, they can be stitched by re-determining the pixel values of the registered images. In the present embodiment, in order to eliminate the distortion caused by differences in texture and/or tone when the upper and lower images are stitched, the pixel values and/or the tone (for example, the gray value) of the pixels in the overlap region must be re-determined so that the texture and/or the tone in the overlap region changes gradually.
In the texture and/or tone adjustment, one of the images may be used as the reference and the other image adjusted, so that the tones of the two images and/or the pixel values of the pixels in the overlapping region tend to coincide. Alternatively, the tones of both images and/or the pixel values of the pixels in the overlapping area may be adjusted simultaneously with reference to a separate reference image.
The operations of the texture adjustment unit 11 and the tone adjustment unit 12 according to the embodiment of the present invention are described below, taking the adjustment of only one image as an example: the upper image is the target image and the lower image is the reference image.
To perform texture adjustment, the pixels in the overlapping region of the two images must first be segmented into foreground pixels and background pixels. This segmentation may employ conventional binarization methods such as global thresholding (e.g., the Otsu algorithm) or local thresholding (e.g., the Niblack or Sauvola algorithm).
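As a rough illustration of the global-thresholding option, the sketch below implements the Otsu algorithm with NumPy. This is not the patent's own code; the function name and the toy image are illustrative, and it assumes dark foreground (e.g. text) on a light background, as in the document-stitching example of fig. 4.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu threshold for a uint8 grayscale image.

    Picks the gray level that maximizes the between-class variance,
    splitting pixels into foreground and background classes.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0, sum0 = 0.0, 0.0
    for t in range(256):
        w0 += hist[t]                 # weight of class "<= t"
        if w0 == 0:
            continue
        w1 = total - w0               # weight of class "> t"
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0               # mean of the dark class
        mu1 = (sum_all - sum0) / w1   # mean of the bright class
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Dark text (foreground) on a light background: pixels <= threshold
# are taken as foreground pixels, the rest as background pixels.
gray = np.array([[20, 30, 220], [25, 210, 230], [200, 215, 225]], dtype=np.uint8)
t = otsu_threshold(gray)
foreground_mask = gray <= t
```

On real scanned pages a local method (Niblack, Sauvola) would handle uneven illumination better; the global version is shown only because it is the simplest to state.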
By the segmentation of the foreground pixels and the background pixels, the foreground pixels in the overlap region include the foreground pixels of the upper image and the foreground pixels of the lower image, and the background pixels in the overlap region include other pixels except the foreground pixels.
After the segmentation into foreground and background pixels, the texture adjustment unit 11 may adjust the texture of the foreground pixels and of the background pixels separately, choosing a different pixel-value fusion strategy for each. To make the texture difference gradual, the background pixels in the overlapping region that are near the stitch line should take pixel values close to those of the corresponding region of the lower image, while background pixels far from the stitch line should take values close to those of the corresponding region of the upper image outside the overlap. Therefore, the background pixels in the overlapping region, particularly those close to the stitch line, must have their pixel values re-determined by fusing them with the pixel values of the corresponding pixels of the lower image. For the foreground pixels in the overlapping region, on the other hand, the stitched pixel value should stay close to the pixel value of the target image, so the pixel value of the target image itself may simply be taken as the stitched foreground pixel value.
For the tone adjustment, the tone adjustment unit 12 may transform the histogram of the upper image (i.e., the target image) with reference to the histogram of the lower image so that the histograms of the upper and lower images are the same or similar, thereby making the two images have similar tones.
In the present embodiment, only the tone of the background pixels in the upper image may be adjusted. Specifically, the tone adjustment unit 12 calculates the cumulative distribution functions of the histograms of the background pixels of the upper and lower images. Let F1 denote the histogram cumulative distribution function of the reference image and F2 that of the target image. Then, for each gray level G1 ∈ [0, 255], the tone adjustment unit 12 finds the corresponding value G2 such that F1(G1) = F2(G2); that is, the histogram matching function satisfies M(G1) = G2. Finally, the tone adjustment unit 12 applies the mapping function M to adjust the background pixels of the target image.
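The matching step can be sketched as follows. This is an illustrative NumPy sketch, not the patent's code: it follows the standard discrete construction that sends each target gray level through the target CDF and then through the (approximate) inverse of the reference CDF, which realizes the relation F1(G1) = F2(G2) described above.

```python
import numpy as np

def match_histogram(target, reference):
    """Remap target gray levels so the target's cumulative histogram
    follows the reference's.

    For each gray level with target CDF value F2(g), find the smallest
    reference level whose CDF F1 reaches that value - a discrete
    version of g -> F1^{-1}(F2(g)).
    """
    t_hist = np.bincount(target.ravel(), minlength=256).astype(np.float64)
    r_hist = np.bincount(reference.ravel(), minlength=256).astype(np.float64)
    f_tgt = np.cumsum(t_hist) / t_hist.sum()   # F2: target CDF
    f_ref = np.cumsum(r_hist) / r_hist.sum()   # F1: reference CDF
    mapping = np.searchsorted(f_ref, f_tgt, side='left')
    return np.clip(mapping, 0, 255)[target].astype(np.uint8)

# A uniformly bright target matched against a darker reference is
# pulled down to the reference's gray level.
target = np.full((4, 4), 200, dtype=np.uint8)
reference = np.full((4, 4), 80, dtype=np.uint8)
adjusted = match_histogram(target, reference)
```

In the embodiment, `target` and `reference` would be the background pixels of the upper and lower images respectively, with the mapping applied only to the background pixels of the target image.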
In one possible embodiment, the darker tone image of the upper and lower images can be used as the reference image.
In this way, the tone adjustment unit 12 can make the tone difference between the two images within the predetermined threshold by adjusting the tone of the background pixel. However, the present invention is not limited to this, and the color tone adjusting unit 12 may also adjust the color tones of the foreground pixels and the background pixels at the same time.
As shown in fig. 1, the image processing apparatus 10 according to the present embodiment may be configured with both the texture adjustment unit 11 and the tone adjustment unit 12; alternatively, it may be configured with only one of them. That is, either tone adjustment or texture adjustment alone may be performed on the images stitched together, or both may be performed.
As described above, the image processing apparatus 10 according to the first embodiment of the present invention can eliminate significant distortion in an overlapping region caused by image stitching by performing texture adjustment on the overlapping region of two images stitched together; and/or, the image processing apparatus 10 according to the first embodiment of the present invention may make the overall tone of the stitched image harmonized by performing tone adjustment on the two images stitched together.
In one possible embodiment, the texture of only a portion of the overlap area may be adjusted. For example, the texture of only a portion of the overlapping region from the patchwork line may be adjusted.
Fig. 3 shows an image processing apparatus 20 according to a second embodiment of the present invention. The image processing apparatus 20 in the present embodiment includes a texture adjustment unit 21 and a tone adjustment unit 22.
The tone adjustment unit 22 may be configured in the same way as the tone adjustment unit 12 of the first embodiment described in conjunction with fig. 1, and is not described again here.
In this embodiment, the texture adjustment unit 21 comprises a determining module 211 for determining texture adjusted pixel values for pixels in the overlap region.
Specifically, for a foreground pixel, the determining module 211 determines the texture-adjusted pixel value of the foreground pixel based on the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image, where the foreground pixels include the foreground pixels of the upper image and the foreground pixels of the lower image.
That is, for a foreground pixel in the overlapping region, the determining module 211 takes the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image as the texture-adjusted pixel value, regardless of whether the foreground pixel to be adjusted belongs to the upper image or the lower image.
For a background pixel, the determining module 211 determines the texture-adjusted pixel value based on a weighted combination of the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image and the pixel value of the corresponding pixel of the lower image, where the background pixels include all pixels in the overlapping region other than the foreground pixels.
That is, for the background pixel, the determining module 211 may determine the pixel value of the background pixel in the overlapping area by the following formula (1).
I = beta * I(d) + (1 - beta) * I(u)        (1)
beta = max(1 - d2 / belt2, 0)
where I(d) is the pixel value of the corresponding pixel of the lower image, I(u) is the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image, beta is the weighting factor, d2 is the distance from the background pixel to the stitch line, and belt2 is the width of the portion of the overlap region above the stitch line.
It can be seen that, during the fusion of background pixel values, the closer a background pixel is to the stitch line, the larger the weight of the lower-image pixel value I(d).
In one possible embodiment, texture adjustments may be made to only a portion of the overlap region. In this case, for example, belt2 in formula (1) may be set to 100.
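Formula (1) can be written as a small sketch; the function and variable names are illustrative (not the patent's), and belt2 defaults to 100 as in the embodiment above.

```python
def background_fusion(i_d, i_u, d2, belt2=100.0):
    """Formula (1): fuse a background pixel in the overlap region.

    i_d   : pixel value of the corresponding lower-image pixel, I(d)
    i_u   : interpolated pixel value from the upper image, I(u)
    d2    : distance from the background pixel to the stitch line
    belt2 : width of the blended band above the stitch line
    """
    beta = max(1.0 - d2 / belt2, 0.0)
    return beta * i_d + (1.0 - beta) * i_u

# On the stitch line (d2 = 0) the lower image dominates entirely;
# past belt2 the interpolated upper image takes over completely.
on_line = background_fusion(i_d=50.0, i_u=250.0, d2=0.0)     # beta = 1.0
halfway = background_fusion(i_d=50.0, i_u=250.0, d2=50.0)    # beta = 0.5
far_away = background_fusion(i_d=50.0, i_u=250.0, d2=150.0)  # beta = 0.0
```

The linear ramp in beta is what makes the texture change gradual: each unit of distance from the stitch line shifts a fixed fraction of the weight from the lower image to the upper image.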
In one possible embodiment, the pixels of the overlap area may be classified into three types, i.e., foreground pixels, transition pixels, and other pixels, and texture adjustment may be performed on the three types of pixels, respectively.
Fig. 4 shows a schematic diagram of classifying the pixels in the overlapping region of the stitched images into three classes. The foreground pixel class contains the foreground pixels of the upper image and of the lower image, such as the black text in fig. 4. The transition pixel class contains the background pixels that are very close to foreground pixels; transition pixels may be determined, for example, by a preset distance threshold, and appear in fig. 4 as the gray outline hugging the text edges. The other-pixel class contains the remaining background pixels, i.e. the white background in fig. 4.
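The three-way split can be sketched as below. This is a brute-force illustration with made-up names, not the patent's implementation; on real images one would compute the nearest-foreground distance with a distance transform (e.g. scipy.ndimage.distance_transform_edt) rather than a per-pixel search.

```python
import numpy as np

def classify_overlap_pixels(foreground_mask, belt=5):
    """Split overlap-region pixels into foreground / transition / other.

    A background pixel whose Euclidean distance to the nearest
    foreground pixel is <= belt is a transition pixel; the remaining
    background pixels fall into the "other" class.
    """
    h, w = foreground_mask.shape
    fg = np.argwhere(foreground_mask)           # (N, 2) foreground coords
    labels = np.full((h, w), 'other', dtype=object)
    for y in range(h):
        for x in range(w):
            if foreground_mask[y, x]:
                labels[y, x] = 'foreground'
            elif fg.size and np.sqrt(((fg - (y, x)) ** 2).sum(axis=1)).min() <= belt:
                labels[y, x] = 'transition'
    return labels

# A single foreground pixel: the ring of background pixels within
# distance belt of it becomes the transition class.
mask = np.zeros((3, 7), dtype=bool)
mask[1, 1] = True
labels = classify_overlap_pixels(mask, belt=2)
```

With the width-5 transition band mentioned below, `belt=5` would reproduce the classification used in this embodiment.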
For texture adjustment of a foreground pixel, as described above, the determining module 211 takes the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image as the texture-adjusted value, whether the pixel belongs to the upper image or the lower image.
For the transition pixel, the determining module 211 performs pixel value fusion on the pixel values of the corresponding pixels of the upper and lower images, thereby determining the texture-adjusted pixel value of the transition pixel.
Specifically, the determining module 211 may determine the pixel value of the transition pixel in the overlap region by the following formula (2).
I = alpha * I(u) + (1 - alpha) * I(d)        (2)
alpha = max(1 - d1 / belt, 0)
where I(d) is the pixel value of the corresponding pixel of the lower image, I(u) is the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image, alpha is the weighting factor, d1 is the shortest distance between the transition pixel and the foreground pixels, and belt is the width of the transition region.
In one possible example, the width of the transition region may be set to 5, i.e. pixels having a distance of less than or equal to 5 from the nearest foreground pixel are transition pixels.
As equation (2) shows, the closer a transition pixel is to a foreground pixel, the larger the weight of the interpolated upper-image pixel value I(u) in the fusion; conversely, the farther the transition pixel is from the foreground pixels, the smaller that weight.
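As with formula (1), formula (2) reduces to a few lines; the sketch below uses illustrative names and the width-5 transition band from the example above.

```python
def transition_fusion(i_u, i_d, d1, belt=5.0):
    """Formula (2): fuse a transition pixel next to the foreground.

    i_u  : interpolated pixel value from the upper image, I(u)
    i_d  : pixel value of the corresponding lower-image pixel, I(d)
    d1   : shortest distance from the transition pixel to a foreground pixel
    belt : width of the transition region
    """
    alpha = max(1.0 - d1 / belt, 0.0)
    return alpha * i_u + (1.0 - alpha) * i_d

# Touching the foreground (d1 = 0), the interpolated upper-image value
# is kept, matching the foreground rule; at the edge of the band
# (d1 = belt) the value has fully faded to the lower image.
at_foreground = transition_fusion(i_u=10.0, i_d=200.0, d1=0.0)  # alpha = 1.0
at_band_edge = transition_fusion(i_u=10.0, i_d=200.0, d1=5.0)   # alpha = 0.0
```

Note that the ramp runs in the opposite direction from formula (1): here the upper image dominates near the anchor (the foreground), whereas in formula (1) the lower image dominates near the anchor (the stitch line).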
In this way, when determining the texture-adjusted pixel value of a transition pixel, the determining module 211 chooses the fusion weights of the interpolated upper-image value I(u) and the lower-image value I(d) according to the shortest distance between the transition pixel and the foreground pixels, so that the texture-adjusted values change gradually, moving away from the nearest foreground pixel, from I(u) toward I(d).
For the other pixels, i.e. the background pixels that are not transition pixels, the determining module 211 may determine the pixel values in the overlap region by formula (1) above, as described earlier. That is, the closer such a pixel is to the stitch line, the smaller the fusion weight of the interpolated upper-image value I(u); conversely, the farther it is from the stitch line, the larger that weight.
In this way, when determining the texture-adjusted pixel values of the other pixels, the determining module 211 chooses the fusion weights of I(u) and I(d) according to the distance from the stitch line, so that the texture-adjusted values change gradually, moving away from the stitch line, from the lower-image value I(d) toward the interpolated upper-image value I(u).
Fig. 5A is a schematic diagram of image stitching performed with a conventional image processing apparatus, and fig. 5B is a schematic diagram of stitching the same images with the image processing apparatus 20 according to the second embodiment of the present invention. Comparing the two, it can be seen that the image processing apparatus 20 better eliminates the texture and tone differences caused by stitching: it fuses the pixel values of the foreground, transition, and other pixels of the overlap region separately, and its tone adjustment harmonizes the overall tone of the stitched image.
Embodiments of the image processing apparatus according to the present invention are described above in conjunction with figs. 1 to 5B; in the course of that description, an image processing method was in effect also described. The method is briefly summarized below with reference to figs. 6 and 7; for details, refer to the description of the image processing apparatus.
FIG. 6 is a flow diagram of an image processing method 60 according to one embodiment of the invention.
The image processing method 60 comprises the following steps: step S62, performing texture adjustment processing on foreground pixels and background pixels in the overlapping area of two images stitched together, so that the texture in the overlapping area caused by the stitching changes gradually; and/or step S64, matching the histogram of the background pixels of the first image with the histogram of the background pixels of the second image, so that the tone difference between the two images is within a predetermined threshold.
As shown in fig. 6, according to the image processing method 60 of the present embodiment, either tone adjustment or texture adjustment alone may be performed on the stitched images, that is, only step S62 or only step S64 may be performed; alternatively, both steps may be performed. When both steps S62 and S64 are performed, the invention does not limit their order. The method can be implemented, for example, through the processes performed by the image processing apparatus 10 of the first embodiment described in conjunction with figs. 1 to 2B; the details are not repeated here.
In one possible embodiment, texture adjustments may be made to only a portion of the overlap area.
In one possible embodiment, the darker-tone image of the upper and lower images can be used as the reference image when performing the tone adjustment.
In one possible embodiment, when performing the tone adjustment, the tones of the foreground pixel and the background pixel in the stitched together image may be adjusted simultaneously.
Fig. 7 is a flow diagram of an image processing method 70 according to another embodiment of the invention. In the present embodiment, the image processing method 70 is described taking two images to be stitched in an up-down arrangement as an example.
The method starts at step S71, and at step S72 the pixels of the upper and lower images are each segmented into foreground and background pixels. The foreground pixels comprise the foreground pixels of the upper image and of the lower image; the background pixels are all other pixels.
In step S73, the histogram of the upper image background pixels is matched with the histogram of the lower image background pixels. For example, the processing can be realized by the process executed by the tone processing unit 12 according to the first embodiment described in conjunction with fig. 1, and the details are not described herein again.
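The histogram matching of step S73 is not spelled out in code in the patent. One hedged way to sketch it, assuming grayscale images and boolean background masks (the quantile-mapping approach and every name below are illustrative choices, not the patent's prescribed implementation), is:

```python
import numpy as np

def match_background_histogram(src, ref, src_bg_mask, ref_bg_mask):
    """Remap `src`'s background pixels so that their histogram
    approximates that of `ref`'s background (quantile matching).
    Pixels outside the background mask are left untouched."""
    quantiles = np.linspace(0.0, 1.0, 256)
    src_q = np.quantile(src[src_bg_mask], quantiles)
    ref_q = np.quantile(ref[ref_bg_mask], quantiles)
    out = src.astype(np.float64).copy()
    out[src_bg_mask] = np.interp(src[src_bg_mask], src_q, ref_q)
    return out

# Toy example: two all-background patches whose tones differ by 50.
src = np.arange(16, dtype=np.float64).reshape(4, 4) + 92.0
ref = np.arange(16, dtype=np.float64).reshape(4, 4) + 142.0
all_bg = np.ones((4, 4), dtype=bool)
matched = match_background_histogram(src, ref, all_bg, all_bg)
```

Because only the background masks feed the quantile estimates, foreground content (e.g., text strokes) does not skew the tone mapping, which is the point of matching background-only histograms.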
In step S74, image registration is performed to determine a stitch line.
In step S75, for a foreground pixel in the overlap region, the texture-adjusted pixel value of the foreground pixel is determined based on the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image.
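The "corresponding pixel obtained by interpolating the pixels of the upper image" presupposes that image registration (step S74) yields sub-pixel coordinates in the upper image. A minimal sketch of such sampling, assuming bilinear interpolation (the patent does not mandate a particular interpolation scheme), could look like:

```python
import numpy as np

def bilinear_sample(image, y, x):
    """Sample a grayscale image at sub-pixel location (y, x) using
    bilinear interpolation of its four integer-grid neighbours."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1 = min(y0 + 1, image.shape[0] - 1)  # clamp to the image border
    x1 = min(x0 + 1, image.shape[1] - 1)
    dy, dx = y - y0, x - x0
    top = (1 - dx) * image[y0, x0] + dx * image[y0, x1]
    bottom = (1 - dx) * image[y1, x0] + dx * image[y1, x1]
    return (1 - dy) * top + dy * bottom

img = np.array([[0.0, 10.0], [20.0, 30.0]])
center = bilinear_sample(img, 0.5, 0.5)  # midpoint of the four samples
```

For a foreground pixel, this interpolated upper-image value is used directly as the texture-adjusted value.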
In step S76, for a transition pixel in the overlap region, the weights used, at the time of pixel value fusion, for the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image and for the pixel value of the corresponding pixel of the lower image are determined based on the shortest distance from the transition pixel to a foreground pixel, so that, starting from the nearest foreground pixel, the texture-adjusted pixel value of the transition pixel gradually changes from the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image toward the pixel value of the corresponding pixel of the lower image. A background pixel is judged to be a transition pixel when its shortest distance to a foreground pixel is less than a preset value.
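Step S76 can be sketched as follows, assuming a linear fall-off of the upper-image weight with the shortest Euclidean distance to a foreground pixel (the brute-force distance search and the linear ramp are illustrative choices; a real implementation would use a distance transform):

```python
import numpy as np

def transition_weights(fg_mask, max_dist):
    """Weight of the upper image's interpolated value at each pixel:
    1.0 at foreground pixels, falling off linearly with the shortest
    distance to a foreground pixel, and 0.0 at or beyond `max_dist`
    (i.e., for non-transition background pixels)."""
    fg = np.argwhere(fg_mask)  # coordinates of all foreground pixels
    h, w = fg_mask.shape
    weights = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            if fg_mask[y, x]:
                weights[y, x] = 1.0  # foreground keeps the upper image value
                continue
            # Brute-force shortest Euclidean distance to the foreground.
            d = np.sqrt(((fg - np.array([y, x])) ** 2).sum(axis=1)).min()
            if d < max_dist:  # this background pixel is a transition pixel
                weights[y, x] = 1.0 - d / max_dist
    return weights

fg_mask = np.zeros((3, 4), dtype=bool)
fg_mask[1, 1] = True
w = transition_weights(fg_mask, max_dist=2.0)
# The fused value would then be w * upper_interp + (1 - w) * lower.
```

The preset value from the text corresponds to `max_dist`: background pixels within that distance of the foreground are transition pixels and receive a nonzero upper-image weight.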
In step S77, for the other pixels in the overlap region, the weights used, at the time of pixel value fusion, for the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image and for the pixel value of the corresponding pixel of the lower image are determined based on the distances of the other pixels from the stitch line, so that, starting from the stitch line, the texture-adjusted pixel values of the other pixels gradually change toward the pixel value of the corresponding pixel obtained by interpolating the pixels of the upper image. The other pixels are the background pixels other than the transition pixels.
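Step S77 can be sketched under the assumption that rows of the overlap are indexed away from the stitch line and that the upper-image weight varies linearly across the overlap (both assumptions made here for illustration only):

```python
import numpy as np

def blend_overlap(upper_rows, lower_rows):
    """Blend the overlap row by row: row 0 lies on the stitch line and
    takes the lower image's values; the weight of the (interpolated)
    upper image rises linearly to 1 at the far edge of the overlap."""
    n = upper_rows.shape[0]
    w = np.linspace(0.0, 1.0, n)[:, None]  # per-row upper-image weight
    return w * upper_rows + (1.0 - w) * lower_rows

upper = np.full((3, 2), 100.0)  # stand-in for interpolated upper-image values
lower = np.zeros((3, 2))        # stand-in for lower-image values
blended = blend_overlap(upper, lower)
```

This matches the stated behaviour that the smaller a pixel's distance from the stitch line, the smaller the weight given to the interpolated upper-image value.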
It should be understood that the present embodiment does not limit the execution order of steps S75-S77. For example, steps S75-S77 can be realized by the process executed by the texture processing unit 21 according to the second embodiment described in conjunction with FIG. 3, and further details are not repeated herein.
It can be seen that, in this embodiment, the tones of the pixels of the images stitched together may be adjusted first, and then, for the overlap region, texture adjustment may be performed on the tone-adjusted pixels according to their pixel classifications; the reverse is also possible, in other words, the texture adjustment processing may be performed first and the tone adjustment processing performed afterwards.
Furthermore, an embodiment of the present invention further relates to an image processing apparatus (e.g., a computer) including a processor, wherein the processor is configured to implement the functions of the respective functional components of the image processing apparatus described above with reference to fig. 1 to 5B, or to execute the operation steps of the image processing method described above with reference to fig. 6 to 7.
In the embodiments of the apparatus, method, etc. of the present application, it is apparent that each component (unit, sub-unit, module, sub-module, etc.) or each step may be decomposed, combined, and/or recombined after decomposition. These decompositions and/or recombinations are to be considered equivalents of the present application. Also, in the above description of specific embodiments of the application, features described and/or illustrated with respect to one embodiment may be used in the same or a similar manner in one or more other embodiments, in combination with or instead of the features in those other embodiments.
While the principles of the invention have been described in connection with specific embodiments thereof, it will be understood by those skilled in the art, after reading the description of the invention and applying their basic programming skills, that all or any of the steps or elements of the method and apparatus of the invention may be implemented in any computing device (including processors, storage media, etc.) or network of computing devices, in hardware, firmware, software, or any combination thereof.
Thus, the objects of the invention may also be achieved by running a program or a set of programs on any computing device. The computing device may be a general purpose device as is well known. The object of the invention is thus also achieved solely by providing a program product comprising program code for implementing the method or device. That is, such a program product also constitutes the present invention, and a storage medium storing such a program product also constitutes the present invention. Obviously, the storage medium may be any known storage medium or any storage medium developed in the future.
In the case where the embodiment of the present invention is implemented by software and/or firmware, a program constituting the software is installed from a storage medium or a network to a computer having a dedicated hardware structure, such as a general-purpose computer 800 shown in fig. 8, which is capable of executing various functions and the like when various programs are installed.
In fig. 8, a Central Processing Unit (CPU) 801 executes various processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage section 808 into a Random Access Memory (RAM) 803. In the RAM 803, data necessary for the CPU 801 to execute the various processes is also stored as needed. The CPU 801, the ROM 802, and the RAM 803 are connected to one another via a bus 804. An input/output interface 805 is also connected to the bus 804.
The following components are connected to the input/output interface 805: an input section 806 (including a keyboard, a mouse, and the like), an output section 807 (including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), a speaker, and the like), a storage section 808 (including a hard disk and the like), and a communication section 809 (including a network interface card such as a LAN card, a modem, and the like). The communication section 809 performs communication processing via a network such as the internet. A drive 810 may also be connected to the input/output interface 805 as desired. A removable medium 811 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on the drive 810 as necessary, so that a computer program read out therefrom is installed into the storage section 808 as needed.
In the case where the above-described series of processes is realized by software, a program constituting the software is installed from a network such as the internet or a storage medium such as the removable medium 811.
It will be understood by those skilled in the art that such a storage medium is not limited to the removable medium 811 shown in fig. 8 in which the program is stored, distributed separately from the apparatus to provide the program to the user. Examples of the removable medium 811 include a magnetic disk (including a floppy disk (registered trademark)), an optical disk (including a compact disk read only memory (CD-ROM) and a Digital Versatile Disk (DVD)), a magneto-optical disk (including a Mini Disk (MD) (registered trademark)), and a semiconductor memory. Alternatively, the storage medium may be the ROM 802, a hard disk included in the storage section 808, or the like, in which programs are stored and which are distributed to users together with the apparatus including them.
The invention also provides a program product storing machine-readable instruction codes. The instruction codes, when read and executed by a machine, can perform the methods according to the embodiments of the invention described above.
Accordingly, a storage medium carrying the above-described program product having machine-readable instruction code stored thereon is also included in the present disclosure. Storage media includes, but is not limited to, floppy disks, optical disks, magneto-optical disks, memory cards, memory sticks, and the like.
Finally, it should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Furthermore, without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description, the embodiments of the present invention provide the following technical solutions, but are not limited thereto.
Supplementary note 1. an image processing apparatus including a texture adjusting unit and/or a tone adjusting unit, wherein,
the texture adjustment unit is configured to perform texture adjustment processing on foreground pixels and background pixels in an overlapping region of two images stitched together so that a texture in the overlapping region due to stitching between the two images is gradually changed;
the hue adjustment unit is configured to match a histogram of background pixels of a first image and a histogram of background pixels of a second image of the two images stitched together such that a hue difference between the two images is within a predetermined threshold.
Supplementary note 2 the image processing apparatus according to supplementary note 1, wherein the two images include a first image and a second image, the texture adjusting unit includes:
a determination module to determine texture adjusted pixel values for pixels in the overlap region, the determination module configured to:
determining, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixel includes a foreground pixel of the first image and a foreground pixel of the second image;
determining, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of a corresponding pixel of the second image, wherein the background pixel includes other pixels than the foreground pixel.
Note 3. the image processing apparatus according to note 2, wherein the background pixels include transition pixels, and for the transition pixels, the determination module is further configured to:
determining a weight of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image in determining a texture-adjusted pixel value of the transition pixel based on a shortest distance of the transition pixel from the foreground pixel;
and under the condition that the shortest distance between the background pixel and the foreground pixel is less than a preset value, judging that the background pixel is the transition pixel.
Note 4 the image processing apparatus according to note 3, wherein the smaller the shortest distance of the transition pixel from the foreground pixel is, the greater the weight of the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image is when determining the texture-adjusted pixel value of the transition pixel.
Supplementary note 5. the image processing apparatus according to supplementary note 3 or 4, wherein the determining module is further configured to:
in determining the texture-adjusted pixel value of the transition pixel, the weights of the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image and of the pixel value of the corresponding pixel of the second image are selected so that, starting from the nearest foreground pixel, the texture-adjusted pixel value of the transition pixel gradually changes from the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image to the pixel value of the corresponding pixel of the second image.
Supplementary note 6. the image processing apparatus according to supplementary note 3, wherein the background pixels further include other pixels, the determining module is further configured to:
determining, based on the distances of the other pixels from the stitch line, the weight of the pixel value of the corresponding pixel obtained by interpolating pixels of the first image when determining the texture-adjusted pixel values of the other pixels;
wherein the other pixels include other pixels of the background pixel except the transition pixel.
Note 7 the image processing apparatus according to note 6, wherein the smaller the distance of the other pixel from the stitch line, the smaller the weight of the pixel value of the corresponding pixel obtained by interpolating the pixel of the first image when determining the texture-adjusted pixel value of the other pixel.
Note 8. the image processing apparatus according to note 6 or 7, wherein the determination module is further configured to:
in determining the texture-adjusted pixel values of the other pixels, the weights of the pixel values of the corresponding pixels obtained by interpolating the pixels of the first image and of the pixel values of the corresponding pixels of the second image are selected such that, starting from the stitch line, the texture-adjusted pixel values of the other pixels gradually change from the pixel values of the corresponding pixels of the second image to the pixel values of the corresponding pixels obtained by interpolating the pixels of the first image.
Supplementary note 9. an image processing apparatus, comprising a processor configured to perform at least one of:
performing texture adjustment processing on two images stitched together so that a texture in an overlapping area generated by the stitching between the two images is gradually changed;
a background portion of at least one of two images stitched together is subjected to a tone adjustment process such that a difference in tone between the two images is within a predetermined threshold.
Supplementary note 10. an image processing method, the method comprising:
performing texture adjustment processing on foreground pixels and background pixels in an overlapping area of two images stitched together so that a texture in the overlapping area generated by the stitching between the two images is gradually changed; and/or
The histogram of background pixels of a first image and the histogram of background pixels of a second image of two images stitched together are matched such that the tonal difference between the two images is within a predetermined threshold.
Note 11 the image processing method according to note 10, wherein the two images include a first image and a second image, and the texture adjustment process includes:
determining, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixel includes a foreground pixel of the first image and a foreground pixel of the second image;
determining, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of a corresponding pixel of the second image, wherein the background pixel includes other pixels than the foreground pixel.
Note 12 the image processing method according to note 11, wherein the background pixels include transition pixels, and the texture adjustment process further includes, for the transition pixels:
determining a weight of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image in determining a texture-adjusted pixel value of the transition pixel based on a shortest distance of the transition pixel from the foreground pixel;
and under the condition that the shortest distance between the background pixel and the foreground pixel is less than a preset value, judging that the background pixel is the transition pixel.
Note 13 the image processing method according to note 12, wherein the smaller the shortest distance of the transition pixel from the foreground pixel is, the greater the weight of the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image is when determining the texture-adjusted pixel value of the transition pixel.
Note 14. The image processing method according to note 12 or 13, wherein, in determining the texture-adjusted pixel value of the transition pixel, the weights of the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image and of the pixel value of the corresponding pixel of the second image are selected such that, starting from the nearest foreground pixel, the texture-adjusted pixel value of the transition pixel gradually changes from the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image to the pixel value of the corresponding pixel of the second image.
Note 15. The image processing method according to note 12, wherein the background pixels further include other pixels, and the texture adjustment processing further includes:
determining, based on the distances of the other pixels from the stitch line, the weight of the pixel value of the corresponding pixel obtained by interpolating pixels of the first image when determining the texture-adjusted pixel values of the other pixels;
wherein the other pixels include other pixels of the background pixel except the transition pixel.
Note 16. The image processing method according to note 15, wherein the smaller the distance of the other pixel from the stitch line, the smaller the weight of the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image when determining the texture-adjusted pixel value of the other pixel.
Supplementary note 17. The image processing method according to note 15 or 16, wherein, in determining the texture-adjusted pixel values of the other pixels, the weights of the pixel values of the corresponding pixels obtained by interpolating the pixels of the first image and of the pixel values of the corresponding pixels of the second image are selected such that, starting from the stitch line, the texture-adjusted pixel values of the other pixels gradually change from the pixel values of the corresponding pixels of the second image to the pixel values of the corresponding pixels obtained by interpolating the pixels of the first image.
The above embodiments are only intended to illustrate the invention and are not to be construed as limiting it. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention; therefore, all equivalent technical solutions also fall within the scope of the invention, which is defined by the claims.

Claims (12)

1. An image processing apparatus, the image processing apparatus comprising:
a texture adjustment unit configured to perform texture adjustment processing on foreground pixels and background pixels in an overlapping region of a first image and a second image stitched together so that a texture in the overlapping region due to stitching between the first image and the second image is gradually changed,
wherein the texture adjustment unit comprises a determination module for determining texture adjusted pixel values for pixels in the overlap region, and the determination module is configured to:
determining, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixel includes a foreground pixel of the first image and a foreground pixel of the second image;
determining, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of a corresponding pixel of the second image, wherein the background pixel includes other pixels than the foreground pixel.
2. The image processing apparatus according to claim 1, further comprising:
a tone adjustment unit configured to match a histogram of background pixels of the first image with a histogram of background pixels of the second image such that a tone difference between the first image and the second image is within a predetermined threshold.
3. The image processing apparatus of claim 1, wherein the background pixels comprise transition pixels, for which the determination module is further configured to:
determining a weight of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image in determining a texture-adjusted pixel value of the transition pixel based on a shortest distance of the transition pixel from the foreground pixel;
and under the condition that the shortest distance between the background pixel and the foreground pixel is less than a preset value, judging that the background pixel is the transition pixel.
4. The image processing apparatus according to claim 3, wherein the smaller the shortest distance of the transition pixel from the foreground pixel is, the greater the weight of the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image is when determining the texture-adjusted pixel value of the transition pixel.
5. The image processing apparatus of claim 3 or 4, wherein the determination module is further configured to:
in determining the texture-adjusted pixel value of the transition pixel, the weights of the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image and of the pixel value of the corresponding pixel of the second image are selected so that, starting from the nearest foreground pixel, the texture-adjusted pixel value of the transition pixel gradually changes from the pixel value of the corresponding pixel obtained by interpolating the pixels of the first image to the pixel value of the corresponding pixel of the second image.
6. The image processing apparatus of claim 3, wherein the background pixels further comprise other pixels, the determination module further configured to:
determining a weight of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image based on distances of the other pixels from a stitching line of the first image and the second image stitched together when determining texture-adjusted pixel values of the other pixels;
wherein the other pixels include other pixels of the background pixel except the transition pixel.
7. The image processing apparatus according to claim 6, wherein the smaller the distance of the other pixel from the stitch line, the smaller the weight of the pixel value of the corresponding pixel obtained by interpolating the pixel of the first image when determining the texture-adjusted pixel value of the other pixel.
8. The image processing apparatus of claim 6 or 7, the determination module further configured to:
in determining the texture-adjusted pixel values of the other pixels, the weights of the pixel values of the corresponding pixels obtained by interpolating the pixels of the first image and of the pixel values of the corresponding pixels of the second image are selected such that, starting from the stitch line, the texture-adjusted pixel values of the other pixels gradually change from the pixel values of the corresponding pixels of the second image to the pixel values of the corresponding pixels obtained by interpolating the pixels of the first image.
9. An image processing apparatus comprising a processor configured to:
performing texture adjustment processing on a first image and a second image which are stitched together so that a texture in an overlapping area between the first image and the second image due to the stitching is gradually changed,
wherein the texture adjustment process comprises determining texture adjusted pixel values for pixels in the overlap region by:
determining, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixel includes a foreground pixel of the first image and a foreground pixel of the second image;
determining, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of a corresponding pixel of the second image, wherein the background pixel includes other pixels than the foreground pixel.
10. The image processing device of claim 9, wherein the processor is further configured to:
matching the histogram of background pixels of the first image with the histogram of background pixels of the second image such that a tonal difference between the first image and the second image is within a predetermined threshold.
11. A method of image processing, the method comprising:
performing texture adjustment processing on foreground pixels and background pixels in an overlapping region of a first image and a second image which are stitched together so that texture in the overlapping region due to stitching between the first image and the second image is gradually changed,
wherein the texture adjustment process comprises determining texture adjusted pixel values for pixels in the overlap region by:
determining, for a foreground pixel, a texture-adjusted pixel value of the foreground pixel based on a pixel value of a corresponding pixel obtained by interpolating pixels of the first image, wherein the foreground pixel includes a foreground pixel of the first image and a foreground pixel of the second image;
determining, for a background pixel, a texture-adjusted pixel value of the background pixel based on a weighted combination of a pixel value of a corresponding pixel obtained by interpolating pixels of the first image and a pixel value of a corresponding pixel of the second image, wherein the background pixel includes other pixels than the foreground pixel.
12. The image processing method according to claim 11, further comprising:
matching the histogram of background pixels of the first image with the histogram of background pixels of the second image such that a tonal difference between the first image and the second image is within a predetermined threshold.
CN201610007111.4A 2016-01-05 2016-01-05 Image processing apparatus and method Active CN106940877B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610007111.4A CN106940877B (en) 2016-01-05 2016-01-05 Image processing apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610007111.4A CN106940877B (en) 2016-01-05 2016-01-05 Image processing apparatus and method

Publications (2)

Publication Number Publication Date
CN106940877A CN106940877A (en) 2017-07-11
CN106940877B true CN106940877B (en) 2021-04-20

Family

ID=59468521

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610007111.4A Active CN106940877B (en) 2016-01-05 2016-01-05 Image processing apparatus and method

Country Status (1)

Country Link
CN (1) CN106940877B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110517188B (en) * 2018-05-22 2024-02-23 杭州海康威视数字技术股份有限公司 Method and device for determining aerial view image
CN109523495B (en) * 2018-10-15 2022-04-01 北京东软医疗设备有限公司 Image processing method and device, equipment and storage medium
CN112419349B (en) * 2020-11-19 2022-11-22 安阳师范学院 Artificial intelligent object fragment image splicing method
CN113077387B (en) * 2021-04-14 2023-06-27 杭州海康威视数字技术股份有限公司 Image processing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060072041A1 (en) * 2004-10-01 2006-04-06 Sharp Kabushiki Kaisha Image synthesis apparatus, electrical apparatus, image synthesis method, control program and computer-readable recording medium
CN102819824A (en) * 2011-06-10 2012-12-12 三星电子株式会社 Apparatus and method for image processing
CN103258321A (en) * 2013-05-14 2013-08-21 杭州海康希牧智能科技有限公司 Image stitching method
CN104240211A (en) * 2014-08-06 2014-12-24 中国船舶重工集团公司第七0九研究所 Image brightness and color balancing method and system for video stitching

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ye Yongkai et al., "Gradient Texture Image Synthesis Algorithm Based on Feature Matching," Computer Technology and Development, 2009, Vol. 19, No. 11, pp. 42-44, 48. *
"Gradient Texture Image Synthesis Algorithm Based on Feature Matching"; Ye Yongkai et al.; Computer Technology and Development; 2009-11-30; Vol. 19, No. 11; Section 2.3, paragraph 2 *

Also Published As

Publication number Publication date
CN106940877A (en) 2017-07-11

Similar Documents

Publication Publication Date Title
US20210004962A1 (en) Generating effects on images using disparity guided salient object detection
CN106940877B (en) Image processing apparatus and method
US20170294000A1 (en) Sky editing based on image composition
US9292911B2 (en) Automatic image adjustment parameter correction
CN112862685B (en) Image stitching processing method, device and electronic system
US8666148B2 (en) Image adjustment
JP5818552B2 (en) Image processing apparatus, image processing method, and program
US20160307306A1 (en) Method and apparatus for image colorization
JP2013025650A (en) Image processing apparatus, image processing method, and program
KR20190030028A (en) Image processing method and device for auto white balance
JP2009169925A (en) Image retrieval device and image retrieval method
Liu et al. Texture filtering based physically plausible image dehazing
JP5864936B2 (en) Image processing apparatus, image processing method, and program
CN112396610A (en) Image processing method, computer equipment and storage medium
US11615507B2 (en) Automatic content-aware collage
WO2015189369A1 (en) Methods and systems for color processing of digital images
EP2966613A1 (en) Method and apparatus for generating a super-resolved image from an input image
JP4756436B2 (en) Pattern recognition apparatus, pattern recognition method, and pattern recognition program
JP2018206260A (en) Image processing system, evaluation model construction method, image processing method, and program
GB2548088A (en) Augmenting object features in images
TW202303518A (en) Method of processing circuit, system comprising processing circuit and system comprising means for processing
US9225876B2 (en) Method and apparatus for using an enlargement operation to reduce visually detected defects in an image
CN114764839A (en) Dynamic video generation method and device, readable storage medium and terminal equipment
US10614560B2 (en) Apparatus and method for image processing
Popowicz et al. Isoline based image colorization

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant