WO2009055913A1 - System and method for image stitching

System and method for image stitching

Info

Publication number
WO2009055913A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
images
avg
registering
pixels
Prior art date
Application number
PCT/CA2008/001906
Other languages
English (en)
Inventor
Vittorio Accomazzi
Songyang Yu
Elizabeth Klimczak
Original Assignee
Cedara Software Corp.
Priority date
Filing date
Publication date
Application filed by Cedara Software Corp. filed Critical Cedara Software Corp.
Publication of WO2009055913A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/32 Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30008 Bone

Definitions

  • the invention relates to systems and methods for image stitching.
  • the relative location of the images is generally not known.
  • the relative position is known with reasonable accuracy, but this is often not accurate enough for the purpose of stitching the images together. In both cases, an accurate relative position of the images is needed for stitching.
  • Image stitching generally comprises a registration procedure followed by a fusion procedure.
  • Image stitching is widely used in medical fields, for instance in X-ray imaging and microscopy. The need to stitch multiple images together also extends beyond the medical field. For example, software has been developed that can use a single digital camera to take several images of a landscape and then stitch them together as one image.
  • a method for combining a first image and a second image to produce a stitched output image comprising registering the first image and the second image and performing a fusion of the images to produce the output image by applying a heuristic blending operation, weighted according to the distance of pixels in the images to the image borders, to compensate for noise.
  • a method for combining a first image and a second image to produce a stitched output image comprising: registering the first image and the second image by down sampling each image to generate a plurality of image levels, wherein an exhaustive search is performed at an image level having the lowest resolution to determine a transformation matrix that maximizes a normalized correlation between the images and promoting a resultant transformation matrix to each other image level until a final matrix is determined at an input level; and performing a fusion of the images to produce the output image.
  • a method for combining a first image and a second image to produce a stitched output image comprising: registering the first image and the second image by down sampling each image to generate a plurality of image levels; performing an initial registration at a lowest resolution of the image levels; fine tuning the registration by promoting resultant registrations through the image levels; and performing a fusion of the images to produce the output image by applying a linear intensity normalization, merging the images to generate an intermediate image; and applying a non-linear intensity normalization to the intermediate image to produce the output image.
  • Computer readable media, image processing devices and other apparatus are also provided that are configured to perform the above methods.
  • Figure 1 is a schematic diagram of an x-ray imaging system.
  • Figure 2 is a flow diagram illustrating primary steps in an image stitching procedure.
  • Figure 3 is a flow diagram showing the combination of two input x-ray images into a stitched image.
  • Figure 4 is a flow diagram showing the combination of three input microscopic images into another stitched image.
  • Figure 5 is a flow diagram showing the combination of two digital camera input images into yet another stitched image.
  • Figure 6 is a flow diagram showing sub-steps performed during the registration and fusion phases shown in Figure 2.
  • Figure 7 shows a stitched image having a poor registration between the two input images.
  • Figure 8 shows another stitched image having a poor registration between the two input images.
  • Figure 9 shows an image pyramid resulting from the down sampling of one input image.
  • Figure 10 shows an image pyramid resulting from the down sampling of another input image.
  • Figure 11 is a flow diagram illustrating the combination of two input images into a stitched image according to the process shown in Figure 6.
  • Figure 12 is a flow diagram illustrating the combination of two input images having collimators visible in each input image and a resultant stitched image having removed the collimators.
  • Figure 13 graphically illustrates the fusion phase shown in Figure 6.
  • Figure 14(a) shows a stitched output image using only the merge step shown in Figure 6.
  • Figure 14(b) shows a stitched output image using both the linear intensity normalization and merge steps shown in Figure 6.
  • Figure 14(c) shows a stitched output image using the linear intensity normalization, merge and non-linear intensity normalization steps shown in Figure 6.
  • Figure 15 is an intensity remapping graph for an input image and transformation of the input image.
  • Figure 16 graphically illustrates a merge process.
  • Figure 17(a) shows a stitched image using a merge process where the maximum value of the intensities of all images is output.
  • Figure 17(b) shows a stitched image using a merge process where the average value of the intensities of all images is output.
  • Figure 17(c) shows a stitched image using a merge process where the minimum value of the intensities of all images is output.
  • Figure 17(d) shows a stitched image using a merge process where the output value of the intensities is a blend based on heuristics.
  • Figure 18(a) shows a stitched image including a linear intensity normalization and blend.
  • Figure 18(b) shows a stitched image showing the effects of a non-linear intensity normalization.
  • Figure 18(c) shows a stitched image showing the effects of a different non-linear intensity normalization.
  • Figure 18(d) shows a stitched image showing the effects of another different nonlinear intensity normalization.
  • an x-ray system is generally denoted by numeral 10.
  • the x-ray system 10 comprises an x-ray apparatus 12 having an x-ray source 14, which, when excited by a power supply (not shown), emits an x-ray beam 15.
  • the x-ray beam 15 is directed towards a patient 22 and, in this example, passes through a shutter 16 disposed between the patient 22 and the source 14.
  • the shutter 16 includes an aperture 18 for limiting the amount of beam 15 that passes through the patient 22.
  • the x-rays of the beam 15 that pass through the patient 22 impinge on a photographic detection plate 20 in area 24, which captures an x-ray image of a portion of the patient 22.
  • the shutter 16 shown in Figure 1 is for illustrative purposes only and that the following principles apply to any other arrangement used for obtaining a shuttered image, e.g., a lead apron.
  • the x-ray system 10 also comprises an x-ray machine 26, which powers the x-ray source 14, acquires the x-rays impinging the detection plate 20, displays a two dimensional image 38 on a display 28 and is operated by a technician using an operator control 30.
  • the machine 26 comprises an image processing program 32 that is responsible for digitizing and processing the acquired x-ray area 24 according to standard image processing techniques to generate the image 38, a stitching program 34 for automatically registering and fusing multiple images 38 together, and a system controller 36, which is responsible for operating the machine according to technician input at the operator controls 30.
  • in some applications, an image of a large portion of the patient 22 is desired, e.g. the spine or a leg. In such cases, a plurality of images 38 are acquired by the x-ray system 10 and the stitching program 34 is used to perform an image stitching procedure.
  • the image stitching procedure comprises obtaining the images to be stitched together at 40, performing a registration phase at 42 to register the images 38 with each other, performing a fusion phase at 44 to combine the images 38 into a stitched image, and outputting the combined image at 46 so that it may be viewed on the display 28, transferred, uploaded, stored, etc.
  • stage 40 may include initiating the x-ray system 10 to acquire the images at that time, i.e. substantially in real time, or may simply obtain previously acquired images from a data storage device (not shown).
  • Figures 3 to 5 illustrate the results of the image stitching procedure in different applications.
  • two x-ray images 48a and 48b of a spine are stitched together to produce a full-spine image 48ab, e.g. using the stitching program 34.
  • each of the input x-ray images 48a, 48b includes a region of overlap 49, which is used to register the images and perform the combination. The overlap is therefore fused together in the output image 48ab.
  • three microscopic images 50a, 50b and 50c are stitched together to produce a wider view of the cellular body 50abc.
  • the registration phase 42 is based on the following assumptions:
  • according to the first assumption, the images overlap: an overlap of about 5% is usually enough for the image stitching procedure to work.
  • according to the second assumption, the transformation between the images is composed of translation, rotation and scaling: even if there are other transformations across the images (e.g. it is very common that there will be a perspective distortion, or that objects have moved between exposures), the method can still operate.
  • the image stitching procedure only recovers the translation, rotation and scaling difference across the images. It will be appreciated that if appropriate, other transformations can be accounted for such as perspective distortion.
  • the "fitness" or acceptability of a registration performed in the registration phase 42 is measured according to the similarity of the pixels in the overlapping region 49 of the images.
  • the metric used to make this determination is known as "Normalized Correlation".
  • the proposed registration phase 42 is formalized as a maximization problem, in which the value of the normalized correlation is progressively optimized at various image resolutions.
  • the optimization procedure used in the registration phase 42 can be separated conceptually into two primary stages, namely initial registration 56 and fine tuning 58.
  • Figure 6 shows a flow diagram of the sub-steps performed in both the registration phase 42 and the fusion phase 44.
  • the initial registration 56 is preceded by a down sampling stage 54 that, as will be explained below, enhances the performance of the registration phase 42 by operating at multiple image-resolution levels during the initial registration 56 and fine tuning 58 stages.
  • the registration seeks a transformation matrix R combining translation, rotation and scaling:

    $$R = \begin{bmatrix} s\cos\theta & -s\sin\theta & T_x \\ s\sin\theta & s\cos\theta & T_y \\ 0 & 0 & 1 \end{bmatrix}$$

    where T_x and T_y are translations along the x axis and y axis respectively, θ is the angle of rotation between the images 38, and s is a zooming or magnification factor.
  • the normalized correlation is used.
  • the normalized correlation for two images F and G, when mapping image G to image F using the transformation matrix R, is defined as:

    $$NC(F, G_R) = \frac{\sum (F - F_{avg})(G_R - G_{R,avg})}{\sqrt{\sum (F - F_{avg})^2}\,\sqrt{\sum (G_R - G_{R,avg})^2}} \qquad \text{(Equation 1)}$$

    where G_R is the transformed image G, F_{avg} and G_{R,avg} are the average pixel values of the overlapped region between the two images, and the summation is over the overlapped region 49.
  • Equation 1 above has some important characteristics.
  • the value of the correlation between two images does not depend on their contrast or brightness: given a linear remapping G' = αG + β, the normalized correlation value of F and G' is the same as that of F and G.
  • the normalized correlation formula is thus tolerant of contrast and brightness variation across the images, and the 'Linear Intensity Remapping' section below shows how the fusion phase 44 can be made independent of such variation as well. A minimal sketch of Equation 1 follows.
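  • By way of illustration only, a minimal NumPy sketch of Equation 1 (not the patent's implementation) might look as follows, assuming the overlapping pixels of F and of the transformed image G_R have already been extracted into two equally-shaped arrays:

```python
import numpy as np

def normalized_correlation(f, g):
    """Normalized correlation (Equation 1) over the overlapping region 49.

    f holds the overlap pixels of image F and g those of the transformed
    image G_R, as equally-shaped 2D arrays. The result is invariant to
    linear changes of contrast and brightness in either image.
    """
    f = f.astype(np.float64) - f.mean()
    g = g.astype(np.float64) - g.mean()
    denom = np.sqrt((f * f).sum()) * np.sqrt((g * g).sum())
    if denom == 0.0:  # perfectly flat overlap: correlation undefined
        return 0.0
    return float((f * g).sum() / denom)
```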
  • a low final normalized correlation value suggests that the registration is unreliable. This information can be used by the application deploying the stitching program 34 to detect possible user or system mistakes, for instance: 1) the application is trying to stitch images which are not meant to be stitched together; or 2) the application is constraining the registration function too much.
  • the program 34 may, for example be forcing the registration phase 42 to have a large overlapping area 49 while the actual overlapping region 49 is much smaller.
  • the normalized correlation formula can readily be extended for handling color images.
  • each image can be given a vector at each pixel location, where the vector represents the color.
  • One common way to represent color is to use red, green and blue components in the vector, and therefore represent a color as a three dimensional vector. In such cases, the normalized correlation equation becomes:
    $$NC(\vec F, \vec G_R) = \frac{\sum (\vec F - \vec F_{avg}) \cdot (\vec G_R - \vec G_{R,avg})}{\sqrt{\sum \|\vec F - \vec F_{avg}\|^2}\,\sqrt{\sum \|\vec G_R - \vec G_{R,avg}\|^2}} \qquad \text{(Equation 2)}$$

  • Any representation of color as a vector can be used with Equation 2 above. The most common representation is RGB (Red, Green and Blue). However, it will be appreciated that other representations may also be useful, such as Lightness-A-B (LAB) and hue, saturation, value (HSV). A sketch of this vector form follows.
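  • Under the same assumptions as the sketch above, a sketch of the vector form of Equation 2 differs only in that each pixel contributes a dot product of colour vectors:

```python
import numpy as np

def normalized_correlation_color(f, g):
    """Vector form of the normalized correlation (Equation 2).

    f and g have shape (H, W, C) over the overlapping region, where each
    pixel is a colour vector (e.g. C=3 for RGB, LAB or HSV).
    """
    f = f.astype(np.float64) - f.reshape(-1, f.shape[-1]).mean(axis=0)
    g = g.astype(np.float64) - g.reshape(-1, g.shape[-1]).mean(axis=0)
    denom = np.sqrt((f * f).sum()) * np.sqrt((g * g).sum())
    return float((f * g).sum() / denom) if denom else 0.0
```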
  • the original input image is first down sampled in stage 54 by a predefined down sample factor; a default value of 4 has been found suitable. This down-sampled image will be referred to as the input image hereafter.
  • the input image is preferably further down sampled to reduce the size of the image and thus enable an increase in processing speed. It will be appreciated that any amount of down sampling can be performed according to the speed requirements and the processing capabilities of the processing program 32. As such, depending on the processing capabilities of the system 10, a particular image size can be chosen and a suitable number of stages between the input image and that particular size generated. For example, as shown in Figures 9 and 10, the input image (level 0) is first down sampled to the size of the nearest power of 2 that is smaller than the input image to produce level 1 of the pyramid. Preferably, the image at level 1 is further down sampled until it reaches a desired size.
  • a 128x128 pixel image provides adequate resolution with a decreased processing time when compared to the input image.
  • in addition to level 0 and level 1, there is also level 2, where the image at level 2 is, in this example, the chosen fully down sampled size of 128x128 pixels.
  • the images together create the multi-resolution pyramid representation of the input image as shown in Figure 9.
  • Figure 10 shows an image pyramid for a second input image that is to be stitched to the input image shown in Figure 9.
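  • As an illustrative sketch of the down sampling stage 54 (a simple 2x2 box filter stands in for a proper resampler, and the power-of-two resize of level 1 is omitted for brevity):

```python
import numpy as np

def build_pyramid(image, smallest=128):
    """Multi-resolution pyramid as in Figures 9 and 10.

    levels[0] is the input image; each further level halves the
    resolution until the coarsest level is about `smallest` pixels.
    """
    levels = [image.astype(np.float64)]
    while min(levels[-1].shape) >= 2 * smallest:
        a = levels[-1]
        h, w = (a.shape[0] // 2) * 2, (a.shape[1] // 2) * 2  # crop to even size
        a = a[:h, :w]
        levels.append(0.25 * (a[0::2, 0::2] + a[0::2, 1::2] +
                              a[1::2, 0::2] + a[1::2, 1::2]))
    return levels
```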
  • the initial registration 56 is performed as follows. Starting from the lowest resolution image in the pyramid (e.g. at level 2 in Figures 9 and 10), the method shown in Figure 6 performs an exhaustive search of all the possible transforms within predefined translation, rotation and zoom searching ranges to find out the transform that maximizes the normalized correlation. The resultant transform is called the initial registration transform. It has been found that suitable default searching ranges can be 5% of the image size for translation, 10% change for zoom, and 5 degrees for rotation.
  • the initial registration transform is then promoted up to the next level in the image pyramid until it reaches the input image level.
  • in the fine tuning stage 58, the stitching program 34 continues to maximize the normalized correlation, compensating for the resolution loss caused by the down sampling process.
  • the transform is scaled up to the next resolution level, and an exhaustive search near the current transform is performed to determine whether there is a better transform under the normalized correlation measure; if so, that transform replaces the current one and is promoted to the next resolution level, until the top of the multi-resolution pyramid is reached.
  • a suitable translation searching range has been found to be the down sample factor between the two resolution levels. For example, if the two resolutions are 128x128 and 256x256, when promoted, the translation searching range would be 2 pixels. It has also been found that a suitable rotation searching range is from about -1 degree to about 1 degree, with a 1 degree step. It is typically not required to perform any scaling or zooming when the transform is promoted.
  • the transforming matrix obtained after the fine tuning step 58 is the transform that maximizes the normalized correlation between the overlapped areas in the two images and will be used for stitching the two images together.
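  • The sketch below illustrates the initial registration 56 and fine tuning 58 for the translation component only (the full method also searches rotation and zoom, as described above); it reuses normalized_correlation and build_pyramid from the earlier sketches and assumes the two images have the same dimensions:

```python
import numpy as np

def overlap(f, g, dy, dx):
    """Overlapping pixels of f and of g when g is offset by (dy, dx)."""
    h = min(f.shape[0], g.shape[0] + dy) - max(0, dy)
    w = min(f.shape[1], g.shape[1] + dx) - max(0, dx)
    if h <= 0 or w <= 0:
        return np.empty(0), np.empty(0)
    fy, fx, gy, gx = max(0, dy), max(0, dx), max(0, -dy), max(0, -dx)
    return f[fy:fy + h, fx:fx + w], g[gy:gy + h, gx:gx + w]

def best_translation(f, g, center=(0, 0), radius=2):
    """Exhaustive search maximizing the normalized correlation over all
    integer translations within `radius` pixels of `center`."""
    best, best_nc = center, -2.0
    for dy in range(center[0] - radius, center[0] + radius + 1):
        for dx in range(center[1] - radius, center[1] + radius + 1):
            of, og = overlap(f, g, dy, dx)
            if of.size < 2:
                continue
            nc = normalized_correlation(of, og)
            if nc > best_nc:
                best, best_nc = (dy, dx), nc
    return best

def register(f, g):
    """Initial registration at the coarsest pyramid level, then promotion
    with a small re-search at each finer level (steps 56 and 58)."""
    pf, pg = build_pyramid(f), build_pyramid(g)
    # default searching range: about 5% of the image size at the coarsest level
    t = best_translation(pf[-1], pg[-1], radius=max(1, pf[-1].shape[0] // 20))
    for level in range(len(pf) - 2, -1, -1):
        t = (t[0] * 2, t[1] * 2)  # promote the transform to the next level
        t = best_translation(pf[level], pg[level], center=t, radius=2)
    return t  # translation of g relative to f, in input-image pixels
```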
  • the stitching result is shown in Figure 11.
  • when a series of images is stitched, the pairwise transforms can be composed; the transform between image m and image m+p is:

    $$R_{m,m+p} = R_{m,m+1} \cdot R_{m+1,m+2} \cdots R_{m+p-1,m+p}$$

    It may be noted that the process described above does not require any user intervention. As long as the images have a reasonable amount of overlapping content, the method can locate the relative position. As noted above, about 5 percent overlap has been found to be suitable to enable the algorithms described herein to produce adequate results.
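  • Expressed as 3x3 homogeneous matrices, this composition is a plain matrix product; the translations below are purely illustrative values:

```python
import numpy as np

R01 = np.array([[1.0, 0.0,  12.0],   # hypothetical registration of image 0 to image 1
                [0.0, 1.0, 480.0],
                [0.0, 0.0,   1.0]])
R12 = np.array([[1.0, 0.0,  -7.0],   # hypothetical registration of image 1 to image 2
                [0.0, 1.0, 465.0],
                [0.0, 0.0,   1.0]])
R02 = R01 @ R12                      # R_{0,2} = R_{0,1} * R_{1,2}
```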
  • in some cases, two or more corresponding points may be known in the images with some approximation. This could be the case where the images are taken from a fixed camera location, where an algorithm has identified corresponding features, or where the user has identified an object in the two images.
  • This information can be used by the stitching program 34 to restrict or limit the search space in which the normalized correlation is to be maximized.
  • the algorithm used by the stitching program 34 can be limited to search only in an area that maps the corresponding point to within a predetermined tolerance.
  • the transforming matrix can be calculated between the two corresponding points. The image stitching procedure would then use this calculated transforming matrix as the starting point for searching, and only search near this region.
  • the corresponding point could be set by the user or by the acquisition device if the relative location of the images is known.
  • the stitching program 34 can limit the search of the registration to where the overlapping region 49 of the image covers a certain area or it is within a specified range. The benefit of such modifications is to speed up the computation of the registration as well as to ensure a registration that is consistent with the expectations for the specific application.
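  • In terms of the translation-only sketch above, such a restriction amounts to seeding the search with the prior estimate and narrowing its range; the numbers here are hypothetical:

```python
# Suppose the acquisition device reports that image g sits roughly
# 500 pixels below image f; search only near that estimate.
t = best_translation(f, g, center=(500, 0), radius=10)
```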
  • as illustrated in Figure 1, it is very common in medical X-ray imaging to protect the patient with a metal shield so that only the anatomy of interest is irradiated.
  • the X-ray beam is effectively collimated on the anatomy using the shutter 16.
  • the metal shield will show up on the image as a large bright structure. Prior to the registration, this structure should be determined (manually or automatically) and the registration method should ignore the pixels in this area. Pixels in this area of the image will then be treated as if they were outside the image's boundaries.
  • Figure 12 shows a pair of input images with collimators which have been stitched together.
  • any manual or automatic shutter detection and removal routine can be performed.
  • a particularly suitable automatic shutter detection routine is described in U.S. Patent Application No. 11/857,546 filed on September 19, 2007, the contents of which are incorporated herein by reference.
  • the fusion phase 44 will now be described. As can be seen in Figure 6, the fusion phase 44 occurs after the registration steps, when the relative location of the images is known. Turning now to Figure 13, the inputs of the fusion steps are the input images, depicted as F and G in Figure 13, and a transformation matrix which defines their relative location. The transformation matrix is, as discussed above, determined during the registration phase 42. The output of the fusion phase 44 is an output image O, which contains all the information in the input images.
  • if a collimator (e.g. shutter 16) is present, the pixels in the collimator area should not be counted as image pixels during fusion.
  • Linear intensity normalization step 60: in this step, the pixels of one image are linearly remapped to match the pixels of the other image.
  • Merge step 62: in this step, the pixels of the two images are merged together to generate the output image.
  • Non-linear intensity normalization step 64: in this step, the intensities in the output image are changed non-linearly to normalize the brightness throughout the entire image.
  • steps 60 and 64 are entirely optional and thus can be skipped. Typically, if step 64 is applied, step 60 is also applied; however, this is not strictly necessary.
  • Figure 14 shows the fusion result when applying the steps outlined above.
  • Figure 14(a) illustrates the fusion of 4 input images using only the merge step 62.
  • Figure 14(b) illustrates the fusion of the same 4 input images using both the linear intensity normalization step 60 and the merge step 62.
  • Figure 14(c) illustrates the fusion of the 4 input images using all three steps 60-64.
  • the intensity of image G is changed to match more closely the intensity of image F.
  • the remapping takes the form G' = αG + β, where α determines the contrast in image G' and β the brightness; α and β are chosen such that the intensities of the pixels in G' are as close as possible to the intensities of the pixels in F.
  • to determine α and β, we consider all the pixels in the overlapping region (region 2 in Figure 13). For each pixel (x,y) in this region, there is a value from image F and a value from image G, where the value corresponds to a brightness for the pixel. Considering these values as pairs and plotting them on a graph, an output such as that shown in Figure 15 would be obtained. From this graph, α and β can be determined as the coefficients of the linear regression line, which is optimal in the least-square sense in remapping the intensities of G into intensities of F.
  • the linear regression line can be computed as follows:

    $$\alpha = \frac{\sum (F - F_{avg})(G - G_{avg})}{\sum (G - G_{avg})^2}, \qquad \beta = F_{avg} - \alpha\,G_{avg}$$

    where the sums run over the pixels of the overlapping region.
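  • Equivalently, a sketch of this fit in code, over the overlap pixels of F and G:

```python
import numpy as np

def linear_intensity_fit(f_overlap, g_overlap):
    """Least-squares alpha and beta for the remapping G' = alpha*G + beta
    over the overlapping region (step 60)."""
    f = f_overlap.astype(np.float64).ravel()
    g = g_overlap.astype(np.float64).ravel()
    alpha = ((f - f.mean()) * (g - g.mean())).sum() / ((g - g.mean()) ** 2).sum()
    beta = f.mean() - alpha * g.mean()
    return alpha, beta
```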
  • in the merge step 62, the input images are merged together to generate a single output image. If the linear intensity normalization step 60 was first used, the merge step 62 uses the intensity of image G after the remapping. In this step, the pixels of the output image are set based on the area to which they belong. For illustration, reference may be made to Figure 13:
  • Output pixels in area 1 are set as input pixels of image F.
  • Output pixels in area 3 are set as input pixels of image G_R.
  • Output pixels of area 4 are set to a fixed value, typically defined by the application since it will depend on the type of images under consideration.
  • Output pixels in area 2 (the overlapping region) can be set according to one of the following rules: Maximum (see Figure 17(a)), Average (see Figure 17(b)), Minimum (see Figure 17(c)), or Blend (see Figure 17(d)).
  • the blend method is a heuristic method based on the observation that pixels close to the image border are usually affected by noise. The blend method therefore weights each pixel in each image in proportion to its distance from the closest border, as sketched below.
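  • A minimal sketch of the blend rule, assuming the weight of each pixel is its distance to the closest border of its own image and that the weight maps have been cropped to the overlapping region:

```python
import numpy as np

def border_distance_weights(shape):
    """Per-pixel distance to the closest image border (plus one, so that
    border pixels keep a small non-zero weight)."""
    h, w = shape
    rows = np.minimum(np.arange(h), np.arange(h)[::-1])[:, None]
    cols = np.minimum(np.arange(w), np.arange(w)[::-1])[None, :]
    return np.minimum(rows, cols).astype(np.float64) + 1.0

def blend(f_overlap, g_overlap, wf, wg):
    """Blend rule for area 2: average weighted by distance to border."""
    return (wf * f_overlap + wg * g_overlap) / (wf + wg)
```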
  • the non-linear intensity normalization step 64 operates to change the intensities across the image such that similar tissues have a constant intensity over the entire image 38. It should be noted that this step can be a very valuable feature from an image quality standpoint. In the image shown in Figure 14(c), all the relevant anatomy is visible with no discontinuity. A doctor or radiologist can easily work with such an image, whereas the images shown in Figures 14(a) and 14(b) lack some of the contrast and brightness in at least part of the anatomy that has been imaged. This may force the doctor or radiologist to change the brightness and contrast depending on the part of the image being analyzed, which can be very inconvenient.
  • stitching is often used in orthopaedics, where images like the ones displayed in Figure 14 are used to measure leg length, which involves locating points on the hip (top of the image), knee (middle of the image) and feet (bottom of the image).
  • the non-linear intensity normalization step 64 is accomplished by subtracting from the output image generated in the merge step 62 a weighted, low pass filtered copy of the same image:

    O' = O - γ·L(O), where L is a generic low pass filter.

  • one suitable filter is the Gaussian filter, whose kernel is defined in terms of its standard deviation σ, so the formula above can be rewritten as:

    $$O' = O - \gamma \cdot G_{\sigma}(O) \qquad \text{(Equation 3)}$$

  • the value γ determines the strength of the operation.
  • the value σ determines the level of detail (frequency) which will be affected by the reduction. Typically this value is extremely large, around 10% of the image size.
  • Figure 18 shows the effects of varying γ and σ.
  • Figure 18(a) shows the original image after the linear intensity normalization step 60 and the blend merge step 62 have been applied.
  • the image in Figure 18(b) is preferable to radiologists because this effect normalizes the intensities without "flattening" them to a narrow range (as can be seen in Figure 18(c), for example). It can be seen that the image in Figure 18(d) does not provide much of an improvement over the image in Figure 18(a).
  • an efficient way to implement Equation 3 is to apply a Gaussian filter (or, in general, a low pass filter) to a down sampled copy of the image O, multiply the result by γ, up sample it to the original size, and subtract it from O. A sketch of this implementation follows.
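  • A sketch of this efficient implementation (the default γ, σ and down sample factor are illustrative tuning assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def nonlinear_normalize(o, gamma=0.5, sigma_frac=0.10, ds=4):
    """Equation 3, O' = O - gamma * G_sigma(O), computed on a down sampled
    copy for speed (step 64); sigma is about 10% of the image size."""
    o = o.astype(np.float64)
    small = zoom(o, 1.0 / ds, order=1)                    # down sample
    low = gaussian_filter(small, sigma_frac * max(small.shape))
    factors = (o.shape[0] / small.shape[0], o.shape[1] / small.shape[1])
    return o - gamma * zoom(low, factors, order=1)        # up sample and subtract
```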
  • the fusion methods described above can be readily extended to multiple images. Given n images I_0, ..., I_n, steps 60 and 62 are first applied to images I_0 and I_1. The resulting image O_0 is then fused with I_2, the result of that fusion with I_3, and so forth. Once the complete output image has been obtained, step 64 can be applied.
  • the registration of two images to be stitched together can be optimized by down sampling the input images and maximizing the normalized correlation until the best transformation matrix is found.
  • the result can then be promoted to the next level in the down sampled image pyramid until the best transformation is found in the original input image level.
  • the image pyramid enables an exhaustive search to be performed at lower resolutions and more concentrated searches at higher resolutions as the results are promoted.
  • a linear intensity normalization step 60 can be used to linearly remap pixels from one image to the other.
  • the merge step 62 can then be applied, in one of four ways, to produce the output image that includes the two input images fused together.
  • a heuristic blend method is used that is based on the recognition that pixels close to the image border are usually affected by noise.
  • the output image may then be subjected to a non-linear intensity normalization step 64 where the intensities in the output image are modified non-linearly to normalize the brightness throughout the entire image, in particular across different anatomy.

Abstract

The present invention relates to an image stitching method in which the registration of two images to be stitched together is optimized by down sampling the input images and maximizing the normalized correlation until the best transformation matrix is found. The result can then be promoted to the next level in the down sampled image pyramid until the best transformation is found at the original input image level. Once registration is complete, a linear intensity normalization step can be used to linearly remap the pixels of one image to the other. A merge step can then be applied, in one of four different ways, to produce the output image comprising the two input images fused together. The output image can then be subjected to a non-linear intensity normalization step in which the intensities in the output image are modified non-linearly to normalize the brightness over the entire image.
PCT/CA2008/001906 2007-10-30 2008-10-30 System and method for image stitching WO2009055913A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US98372407P 2007-10-30 2007-10-30
US60/983,724 2007-10-30

Publications (1)

Publication Number Publication Date
WO2009055913A1 (fr) 2009-05-07

Family

ID=40590485

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2008/001906 WO2009055913A1 (fr) 2007-10-30 2008-10-30 System and method for image stitching

Country Status (1)

Country Link
WO (1) WO2009055913A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6303921B1 (en) * 1999-11-23 2001-10-16 Hewlett-Packard Company Method and system for capturing large format documents using a portable hand-held scanner
US6483893B1 (en) * 1998-11-27 2002-11-19 Wuestec Medical, Inc. Digital high resolution X-ray imaging
CA2541798A1 (fr) * 2002-11-05 2004-05-21 Diagnostic Ultrasound Corporation Instrument a ultrasons 3d pour la mesure non invasive du volume de liquide amniotique
US20070133736A1 (en) * 2005-10-17 2007-06-14 Siemens Corporate Research Inc Devices, systems, and methods for imaging


Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9030490B2 (en) 2009-09-29 2015-05-12 Koninklijke Philips N.V. Generating composite medical images
US8547389B2 (en) 2010-04-05 2013-10-01 Microsoft Corporation Capturing image structure detail from a first image and color from a second image
US8340415B2 (en) 2010-04-05 2012-12-25 Microsoft Corporation Generation of multi-resolution image pyramids
CN103295209A (zh) * 2012-02-24 2013-09-11 深圳市蓝韵实业有限公司 DR image stitching method and system
EP2648157A1 (fr) * 2012-04-04 2013-10-09 Telefonaktiebolaget LM Ericsson (PUBL) Method and device for transforming an image
CN103729833A (zh) * 2013-11-27 2014-04-16 乐视致新电子科技(天津)有限公司 Image stitching method and apparatus
JP2015229033A (ja) * 2014-06-05 2015-12-21 株式会社東芝 Medical image processing apparatus
WO2016106034A1 (fr) * 2014-12-22 2016-06-30 Medical Metrics, Inc. Methods for determining spine instability and for eliminating the impact of patient effort on stability determinations
US9870601B2 (en) 2015-04-03 2018-01-16 Electronics And Telecommunications Research Institute System and method for displaying panoramic image using single look-up table
CN106447612A (zh) * 2016-09-21 2017-02-22 湖南子午天地科技文化发展有限公司 Image stitching method and generation apparatus
US10521883B1 (en) 2018-07-26 2019-12-31 Raytheon Company Image turbulence correction using tile approach
CN109377447A (zh) * 2018-09-18 2019-02-22 湖北工业大学 Contourlet transform image fusion method based on the cuckoo search algorithm
CN109377447B (zh) * 2018-09-18 2022-11-15 湖北工业大学 Contourlet transform image fusion method based on the cuckoo search algorithm
CN111626936A (zh) * 2020-05-22 2020-09-04 湖南国科智瞳科技有限公司 Rapid panoramic stitching method and system for microscopic images
CN111626936B (zh) * 2020-05-22 2023-05-12 湖南国科智瞳科技有限公司 Rapid panoramic stitching method and system for microscopic images
CN114827482A (zh) * 2021-01-28 2022-07-29 北京字节跳动网络技术有限公司 Image brightness adjustment method and apparatus, electronic device and medium
CN114827482B (zh) * 2021-01-28 2023-11-03 抖音视界有限公司 Image brightness adjustment method and apparatus, electronic device and medium

Similar Documents

Publication Publication Date Title
WO2009055913A1 (fr) System and method for image stitching
US7092581B2 (en) Balancing areas of varying density in a digital image
US8401258B2 (en) Method to provide automated quality feedback to imaging devices to achieve standardized imaging data
JP5026939B2 (ja) Image processing apparatus and program therefor
JP4821611B2 (ja) Image processing apparatus, image processing method and image processing program
US8131051B2 (en) Advanced automatic digital radiographic hot light method and apparatus
US6292596B1 (en) Method for automatic image dependent digitization and processing of small format films
US20100128063A1 (en) Rendering for improved diagnostic image consistency
JPWO2018003503A1 (ja) Image processing apparatus, image processing method, and medical imaging system
US6195474B1 (en) Pathology dependent viewing of processed dental radiographic film having authentication data
US20230306657A1 (en) Noise suppression using deep convolutional networks
JP2002162705A (ja) Phase contrast radiation image processing apparatus
WO2021012520A1 (fr) Three-dimensional MRA medical image stitching method and apparatus, electronic device and computer-readable storage medium
CN107622475A (zh) Grayscale correction method for image stitching
JP2002085392A (ja) Radiation image processing method and radiation image processing apparatus
Piccinini et al. Colour vignetting correction for microscopy image mosaics used for quantitative analyses
US20080187194A1 (en) Cad image normalization
JP6642048B2 (ja) Medical image display system, medical image display program and medical image display method
Yin et al. Scalable edge enhancement with automatic optimization for digital radiographic images
JP4169954B2 (ja) Method for detecting abnormal shadow candidates
CN106023078A (zh) DR image stitching method
Raja Rajeswari Chandni et al. Fundus image enhancement using EAL-CLAHE technique
JPH11332858A (ja) Apparatus for blackening processing outside the irradiation field
JP2022137664A (ja) Information processing apparatus, information processing method and computer program
EP1446699B1 (fr) Method for optimizing and enhancing radiological films by a cross process of photographic image processing and digitization

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 08844867; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 08844867; Country of ref document: EP; Kind code of ref document: A1)