CN101615289B - Three-dimensional acquisition of biopsy tissues and fusion method of multilayer images - Google Patents

Three-dimensional acquisition of biopsy tissues and fusion method of multilayer images

Info

Publication number
CN101615289B
CN101615289B CN200910089131A
Authority
CN
China
Prior art keywords
image
images
biopsy tissues
row
biopsy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200910089131A
Other languages
Chinese (zh)
Other versions
CN101615289A (en)
Inventor
郑众喜
刘明星
韩隽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Unic Tech Co ltd
Original Assignee
UNIC TECHNOLOGIES Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by UNIC TECHNOLOGIES Inc filed Critical UNIC TECHNOLOGIES Inc
Priority to CN200910089131A priority Critical patent/CN101615289B/en
Publication of CN101615289A publication Critical patent/CN101615289A/en
Application granted granted Critical
Publication of CN101615289B publication Critical patent/CN101615289B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for three-dimensional acquisition of biopsy tissue and for fusing the multilayer images obtained. During three-dimensional acquisition, multilayer images are captured at successive positions within the thickness range of the slice, centred on the position of the initial focal plane, and the image of each layer is stored. The multilayer images are fused region by region: based on the sharpness of the pixels in the images, the regions of relatively high sharpness in the images to be fused are selected and the template image is updated, so that the sharpness of the image of the whole slice is finally optimised. At the same time, the pixels in the boundary regions are smoothed by a weight-assignment method, which effectively suppresses false edges. The method improves the fidelity with which the digital image reproduces the features of the biopsy tissue, effectively reduces the workload of doctors, and provides a basis for remote diagnosis.

Description

Three-dimensional acquisition of biopsy tissue and fusion of multilayer images
Technical field
The invention belongs to the field of multilayer acquisition and fusion of medical and biological slice images. More particularly, the invention relates to a multilayer image fusion processing method for biopsy tissue.
Background technology
Image fusion is the fusion of visual information within multi-sensor information fusion. It exploits the different imaging modes of various imaging sensors to provide complementary information between images, increasing the information content while reducing the amount of raw image data, and improving adaptability to the environment, so that more reliable and more accurate information is obtained for observation or further processing. It is an emerging technology that combines sensors, signal processing, image processing, artificial intelligence and so on. In recent years, image fusion has become a very important and useful technique in image analysis and computer vision, with wide applications in fields such as automatic target recognition, computer vision, remote sensing, robotics, medical image processing and military applications.
In fields such as pathological diagnosis, genetic testing, pharmaceutical analysis and blood analysis, tissue information is preserved in the form of slices. However, slices are difficult to prepare, cannot be duplicated, and fade easily, so they are hard to preserve permanently. Now, with the development of science and technology, digitisation of biopsy slices can save the complete slice image in a computer, solving the problems of duplicating and preserving slices.
However, during slice acquisition the slice itself has a certain thickness, and that thickness is unevenly distributed across the whole slice. Existing slice acquisition techniques are limited to region-by-region focused acquisition and plain three-dimensional acquisition, and cannot faithfully and clearly reproduce the slice features that lie on different focal planes within a region. The present invention combines image fusion with three-dimensional acquisition to solve this problem.
Summary of the invention
To solve the problem that a clear tissue image cannot be obtained because the thickness of the biopsy tissue is uneven, three-dimensional layered acquisition and multilayer image fusion must be added on top of block-wise focused acquisition. That is, under identical imaging conditions, the lens is focused on several layers of the same tissue region, and image fusion then yields an image in which all targets in the region are in focus. In the present invention, a "layer" is defined as follows: although the tissue in a slice is very thin, it is still three-dimensional, and the images on different horizontal planes (focal planes) differ; each focal plane is one layer.
The invention discloses a three-dimensional layered acquisition and image fusion method for biopsy tissue, comprising the following steps:
(1) For an arbitrarily selected area of the biopsy tissue (an area the doctor or operator may choose freely as required, known in image processing as a "region of interest", ROI), the slice-image acquisition device captures and stores the biopsy image data of the current layer at the position of the initial focal plane;
(2) Taking the above initial focal plane position as the reference, the slice is moved up or down within the thickness range of the biopsy tissue; each time it moves a set distance, the image data of that layer is captured and stored; when the preset movement limit is reached, slice-image acquisition for this region stops. The biopsy tissue is three-dimensional, so its image can be decomposed, layer by layer, into images on different focal planes.
(3) Multilayer image fusion is performed: the sharpest parts of the images captured on the above different focal planes are fused into a single image, forming the image of optimal sharpness. The fusion may also be performed while acquiring, without storing every layer. The image data captured at the initial focal plane position is taken to be the sharpest of all the captured images. When fusing, the image data captured at the initial focal plane position is first used as the template image of the fusion; then one stored biopsy image from another focal plane is read in, and the sharpness of each of its regions is computed; the read-in image is aligned with the template image, and the regions in which the current image is sharper than the template are fused with the corresponding parts of the template; the template is then updated, i.e. the fused image becomes the new template. The image of another slice layer is read in and fused in the same way, until all captured and stored layer images have been fused.
The multilayer fusion may also be carried out by fusing while acquiring, without saving the image of every layer.
During multilayer fusion, the multilayer slice images are divided into several regions, and the local contrast of each region is computed to reflect the difference between in-focus (clear) regions and blurred regions. The local contrast of each region divides the whole image into clear regions and blurred regions; the blocks where a clear region adjoins a blurred region are classed as boundary regions. In the present invention, the boundary region is defined as the zone within three pixels, in every direction, of an edge point produced by image fusion. This yields a partition of the image into three kinds of region. For clear and blurred regions, the clear block is taken directly as the corresponding block of the fused image; for boundary regions, interpolation is applied. After fusion, the image quality is greatly improved, which is of important practical value for subsequent pathology examination.
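The three-way partition into clear, blurred and boundary regions can be sketched as follows. This is an illustrative NumPy implementation, not the patented code: the hypothetical `partition_regions` helper labels each pixel by whichever source image is sharper there, and flags a band of `border` pixels around every label change as boundary (for brevity, wrap-around at the image border is treated as adjacency).

```python
import numpy as np

def partition_regions(c_a, c_b, border=3):
    """Label each pixel 1 where image A is sharper, 0 where B is sharper,
    and mark a band of `border` pixels around label changes as boundary."""
    label = (c_a >= c_b).astype(np.uint8)
    h, w = label.shape
    boundary = np.zeros((h, w), dtype=bool)
    # a pixel is boundary if its (2*border+1)-neighbourhood mixes both labels
    # (np.roll wraps around the image edge; acceptable for this sketch)
    for di in range(-border, border + 1):
        for dj in range(-border, border + 1):
            shifted = np.roll(np.roll(label, di, axis=0), dj, axis=1)
            boundary |= shifted != label
    return label, boundary
```

Clear and blurred blocks are then copied directly from the sharper source, while pixels where `boundary` is true receive the interpolation treatment described later.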
Biopsy image acquisition demands very high accuracy. Even for a designated region, the images captured on different focal planes differ, just as in traditional microscopy the doctor must adjust the position of the objective to see a key area of the slice clearly. Automated slice-image acquisition must simulate and reproduce this process, so that the features of the biopsy tissue are faithfully reproduced in the digital image. The doctor can then diagnose the patient by browsing the digital image on a computer, which saves time and labour and is efficient; it also reduces careless omissions.
Description of drawings
Fig. 1 is the flow chart of slice-image acquisition defined by the invention;
Fig. 2 is the flow chart of image fusion defined by the invention;
Fig. 3 shows a sample fusion of two images;
Fig. 4 illustrates the handling of the edge region of the fused image;
Fig. 5 is a slice image acquired without three-dimensional acquisition and image fusion;
Fig. 6 is the slice image after three-dimensional acquisition and image fusion;
Fig. 7 compares the image fusion results.
Specific embodiments
The technical scheme of the invention is further described below with reference to the drawings and in conjunction with the preferred embodiment.
The computer used in this example is an ordinary PC running Windows XP Home. Those skilled in the art will appreciate, however, that the spirit and scope of the invention are not limited to any computer type, operating system or particular communication protocol.
The slice-image acquisition device used in this example is the biopsy acquisition system described in utility model patent ZL200620139251.9. This equipment transfers the tissue image visible under the microscope to a PC through a high-resolution CCD industrial camera, where it can be viewed or saved as a file in an image format.
The embodiments and examples given here are provided to explain the invention and its practical application, and thereby to enable those skilled in the art to make and use it. They are only simple examples; in practice the invention can be adapted to the actual conditions of specific applications.
The implementation of this example comprises two parts: three-dimensional layered acquisition of the biopsy image and image fusion. Fig. 1 shows the flow of biopsy image acquisition and processing, which is briefly described below.
For a specific region of the biopsy tissue, the slice image of the current layer is first captured at the position of the initial focal plane. Then, taking this position as the reference, the slice is moved up and down within the slice-thickness range; after each move of one fixed step, the image of that layer is captured. The step is normally about one tenth of the slice thickness. The invention is applicable to slices of any thickness; in this embodiment the thickness of the biopsy tissue is of the order of 2–3 μm, the distance between slice and objective is changed by a motor-driven mechanical transmission, and one step, i.e. the distance corresponding to one motor pulse, is of the order of 0.2–0.3 μm, about one tenth of the slice thickness. Starting from the reference position (the initial focal plane, position A), the step count is incremented after each capture; when it exceeds the preset movement limit, acquisition stops and image acquisition is complete. Preferably, the preset upper and lower movement limits are half of the maximum or minimum slice thickness. In practical operation the slice moves the distance of one pulse at a time: it is first moved up from the initial focal plane, one pulse at a time, capturing the image of each focal plane, five times; it then returns to the initial focal plane and moves down likewise, one pulse at a time, capturing the image of each focal plane, five times.
Move the slice to the specified position (101):
Before the system of this example works, the user manually places the slice to be captured on the slice holder. The holder can be taken out, which makes loading convenient. The slice types supported by the system are standard HE, immunohistochemistry, ISH and frozen sections.
The slice is moved under motor control; the positioning accuracy of the system is very high, within 1 micron. As the motion system is not the focus of the present invention, it is not described in detail. Every "move to a new position" step later in the flow chart refers to movement by this motor control system.
Compute the initial focal plane position A (102):
The CCD camera continuously captures multilayer slice images; by comparing the sharpness of each layer, the sharpest layer is selected as the initial focal plane position, called position A in the present invention.
This is in fact an image sharpness measurement problem. An image looks sharp if the object edges in it (in the present invention, the edges of the biopsy tissue) are well defined; if the object edge gradient (the gradient at the tissue edges — the larger the value, the clearer the edge, and conversely the blurrier) is small, the image looks blurred. The edge gradient can therefore be used to estimate the sharpness of an image. Many edge-gradient measures exist; here we preferably use the sum of squares (S) of the per-pixel sharpness to measure the image:
S = Σ_{(i,j)∈I} C(i,j)²
where C(i, j) is the sharpness of the pixel in row i, column j of the image.
Here C(i,j) = (I(i,j−1) − I(i,j+1))² + (I(i−1,j) − I(i+1,j))², where I(i, j) is the gray value of the pixel in row i, column j, and i and j range over all pixel positions of the image.
In this way, the sharpness of every captured image is computed with the formula above, and the image with the largest S corresponds to the sharpest focal plane.
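As a rough illustration of this sharpness measure — an assumed NumPy sketch, not the patent's implementation — the per-pixel value C(i, j), the global score S and the selection of the sharpest layer can be computed as:

```python
import numpy as np

def pixel_sharpness(img):
    """Per-pixel C(i, j): squared horizontal plus squared vertical
    central differences of the gray-scale image."""
    img = img.astype(np.float64)
    c = np.zeros_like(img)
    # interior pixels only; the one-pixel border is left at zero
    c[1:-1, 1:-1] = (
        (img[1:-1, :-2] - img[1:-1, 2:]) ** 2    # I(i,j-1) - I(i,j+1)
        + (img[:-2, 1:-1] - img[2:, 1:-1]) ** 2  # I(i-1,j) - I(i+1,j)
    )
    return c

def image_sharpness(img):
    """Global score S: sum of the squared per-pixel sharpness values."""
    return float(np.sum(pixel_sharpness(img) ** 2))

def pick_focal_plane(stack):
    """Return the index of the sharpest layer in a list of gray images."""
    return max(range(len(stack)), key=lambda k: image_sharpness(stack[k]))
```

A uniform (defocused) layer scores S = 0, while a layer with strong tissue edges scores high, so `pick_focal_plane` selects position A.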
Save the image at this position (103):
Since image fusion follows, the image data of these intermediate steps must be saved. There are many ways to save the captured images, which are not detailed here. To save time, image fusion can in fact be carried out while the intermediate images are being acquired.
Move the slice up and capture images (104):
The slice must be moved up 5 times in succession, so a counter is first reset to zero.
One pulse signal is sent to the motor, which drives the mechanical transmission to move the slice up a fixed distance (abbreviated here as the distance of one pulse). This distance depends on the precision of the equipment itself; in the biopsy image acquisition system used in this example, the distance corresponding to one pulse is of the order of 0.2–0.3 μm, about one tenth of the slice thickness.
The slice image of this layer is saved, as in 103.
The counter is incremented by 1. When its value reaches 5, further upward movement is considered to go beyond the upper surface of the slice and is stopped; while the value is below 5, the slice keeps moving up and images are captured.
Move the slice back to focal plane A (105):
Five pulse signals are sent to the motor, moving the slice down by the distance of 5 pulses, back to focal plane position A.
Move the slice down and capture images (106):
The slice must be moved down 5 times in succession, so the counter is again reset to zero.
One pulse signal is sent to the motor, which drives the mechanical transmission to move the slice down the distance of one pulse.
The slice image of this layer is saved, as in 103.
The counter is incremented by 1. When its value reaches 5, further downward movement is considered to go beyond the lower surface of the slice and is stopped; while the value is below 5, the slice keeps moving down and images are captured. After the biopsy images have been acquired and saved as above, the captured layer images must be fused. Image fusion means merging two or more images of the same scene or target taken under different conditions, in order to improve the sharpness and recognisability of the image and obtain feature information that no single image can provide. Fusion measures the sharpness of each pixel from the strength and number of edges within a neighbourhood of a certain size around it in each image, and combines the sharpest pixels of the two or more images into one clear image.
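The up/down acquisition loop of steps 104–106 can be sketched as follows, with `capture` and `move_steps` standing in as hypothetical camera and motor-control callbacks; this only illustrates the control flow, not an actual device driver:

```python
def acquire_stack(capture, move_steps, steps_each_way=5):
    """Collect 2*steps_each_way + 1 layers centred on focal plane A.

    `capture()` returns the image of the current layer; `move_steps(n)`
    moves the slice n pulses (positive = up). Both are assumed stand-ins
    for the camera and motor interfaces.
    """
    layers = [capture()]                 # layer at the initial focal plane A
    for _ in range(steps_each_way):      # move up one pulse at a time
        move_steps(1)
        layers.append(capture())
    move_steps(-steps_each_way)          # return to focal plane A
    for _ in range(steps_each_way):      # move down one pulse at a time
        move_steps(-1)
        layers.append(capture())
    move_steps(steps_each_way)           # leave the stage back at A
    return layers
```

With `steps_each_way=5` this yields the eleven layers described above: position A, the five layers above it, then the five layers below it.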
Two key problems must be solved in this process. First, there may be more or less offset between the images to be fused: the pixels representing the same slice feature in two images are not necessarily at corresponding positions. Second, the noise produced by fusion — false edges that do not exist in the original — must be suppressed. These are described below.
The flow of image fusion is shown in Fig. 2.
Image fusion comprises the following steps:
Template image (201)
As the name implies, image fusion merges the sharpest parts of the multilayer images into one image; it is essentially a process of compressing three-dimensional information onto a two-dimensional plane. The sharpest image, i.e. the image obtained at the initial focal plane, is therefore saved first as the template image. Finding the sharpest image is the same process as computing the current sharpest focal plane (102); see the specific description above.
Read in an image (202)
One layer's slice image is read in as the current image for the subsequent fusion with the template image.
In practical operation, for efficiency, as soon as one layer has been captured the fusion of that layer with the template image begins. Three-dimensional acquisition and image fusion are described separately here only to explain the implementation of the invention more clearly.
Image alignment (203)
This step solves the problem of offsets between the images to be fused.
During medical image acquisition, mechanical and other causes introduce a certain offset in the horizontal and/or vertical direction between the content of the images. If the sharpest parts of the images were fused directly, the content of the final result would be misaligned, seriously degrading image quality.
To avoid this problem, we choose the template image as the reference before fusion; every image to be fused takes the template image as its reference object, and the relative displacement is computed from the correlation of the two images:
Corr = Σ_{i,j} I₁(i,j)·I₂(m,n) / √( Σ_{i,j} I₁²(i,j) · Σ_{m,n} I₂²(m,n) )
where Corr is the correlation of image I₁ and image I₂, I₁(i, j) is the gray value of the pixel in row i, column j of I₁, and I₂(m, n) is the gray value of the pixel in row m, column n of I₂. The image to be fused is shifted within a certain range (in the horizontal and/or vertical direction) and its correlation with the template image is computed; when the correlation reaches its maximum, the best offset position is considered found, and the image is then adjusted back to the normal state according to this offset. The horizontal and vertical offsets are the differences i − m and j − n at which Corr attains its maximum.
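A brute-force search over small trial offsets, scoring each shift with a normalised correlation of the overlapping parts, might look like the following illustrative sketch; `best_offset`, its search range and the slicing scheme are assumptions for illustration, not the patent's code:

```python
import numpy as np

def correlation(a, b):
    # normalised correlation of two equal-size arrays (1.0 for identical content)
    den = np.sqrt(np.sum(a * a) * np.sum(b * b))
    return float(np.sum(a * b) / den) if den else 0.0

def best_offset(template, image, max_shift=3):
    """Try every (dy, dx) within +/-max_shift; image[i - dy, j - dx] is
    compared against template[i, j], and the shift with the highest
    correlation of the overlapping parts wins."""
    h, w = template.shape
    best, best_corr = (0, 0), -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            t = template[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            m = image[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            c = correlation(t.astype(float), m.astype(float))
            if c > best_corr:
                best_corr, best = c, (dy, dx)
    return best
```

When the image content sits, say, 2 rows below and 1 column right of the template, the overlap is exact at that trial shift, the correlation peaks, and the returned offset can be used to slide the image back to the normal state.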
Image fusion (204)
The sharpness of each region of the current image to be fused is computed; after the current image and the template image have been aligned, the regions in which the current image is sharper than the template are substituted into the template image.
The template image is then updated: the fused image becomes the new template image.
If the offset between the two images were not compensated and they were fused directly, obvious false edges would certainly appear in the fused image and spoil the fusion result. In the present invention, a false edge is noise produced during image acquisition and processing that does not exist in the original slice. False edges arise because the values of neighbouring pixels come from different images and differ greatly.
The approach taken in the present invention is first to compute the image sharpness C(i, j) when fusing. This is also the preprocessing step of the fusion, preparing for noise suppression.
The sum of the squares of the horizontal and vertical components of the gradient at the current pixel characterises the sharpness of that pixel efficiently and accurately:
C(i,j) = (I(i,j−1) − I(i,j+1))² + (I(i−1,j) − I(i+1,j))²
where C(i, j) is the sharpness and I(i, j) the gray value of the pixel in row i, column j of the image. The C(i, j) values of corresponding points in the images to be fused are compared, and the sharper point (the one with the larger value) is selected as the fusion result. An example follows:
Figs. 3-1 and 3-2 are two images to be fused. Let the gray value of each pixel in Fig. 3-1 be I_a(i, j) and in Fig. 3-2 I_b(i, j). The sharpness of each pixel, C_a(i, j) and C_b(i, j), is computed for each figure; the gray value of each pixel of the fused image 3-4 is then
I_R(i, j) = I_a(i, j) if C_a(i, j) ≥ C_b(i, j), and I_R(i, j) = I_b(i, j) if C_a(i, j) < C_b(i, j).
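This pixel-selection rule is one NumPy expression; in this small sketch (function name assumed), `c_a` and `c_b` are the precomputed sharpness maps of the two images:

```python
import numpy as np

def fuse_pair(img_a, img_b, c_a, c_b):
    """Take each pixel from whichever image is sharper there (A wins ties)."""
    return np.where(c_a >= c_b, img_a, img_b)
```

Applying the same rule to the template and each newly read-in layer, with the fused result becoming the new template, implements the layer-by-layer fusion of step 204.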
In the fused image 3-4 there are several boundary regions containing false edges (in the present invention, the boundary region is the zone within three pixels, in every direction, of an edge point produced by image fusion); this is because the pixel values in these regions come from two or more images that differ greatly. Fig. 3-3 illustrates the problem with a label image: each pixel of the fused image 3-4 is marked black if it came from image 3-1, and white if it came from image 3-2. The positions of the edges can be seen clearly in Fig. 3-3.
The method of suppressing false edges is explained below: this embodiment smooths the pixels of the boundary region by assigning weights.
In Fig. 4, let the curve be an edge present in the fused image. Choose any point O on the curve, and let its gray value in Fig. 3-1 be I_a(i_o, j_o) and in Fig. 3-2 I_b(i_o, j_o). After fusion, the gray value of O is I_R(i_o, j_o):
I_R(i_o, j_o) = I_a(i_o, j_o)·N_a + I_b(i_o, j_o)·N_b
N_a and N_b are the weights of the corresponding pixels of Figs. 3-1 and 3-2 in the gray value of O. The weights are determined by how the 8 neighbourhood points of O (the other 8 pixels of the 3×3 matrix centred on O) are distributed between the two images 3-1 and 3-2. Taking Fig. 4 as an example, 4 of the 8 neighbourhood points lie in Fig. 3-1 and the other 4 in Fig. 3-2, so N_a = 4/8 and N_b = 4/8.
Substituting into I_R(i_o, j_o) = I_a(i_o, j_o)·N_a + I_b(i_o, j_o)·N_b then gives
I_R(i_o, j_o) = I_a(i_o, j_o)·4/8 + I_b(i_o, j_o)·4/8.
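The boundary re-weighting can be sketched as follows. This is an illustrative implementation with assumed names: `source` plays the role of the label image of Fig. 3-3 (1 where the fused pixel came from image A, 0 where from image B), and `boundary` marks the three-pixel band around the fusion edges:

```python
import numpy as np

def smooth_boundary(img_a, img_b, source, boundary):
    """Re-blend each boundary pixel: the weight of image A is the fraction
    of the pixel's 8-neighbourhood whose fused value came from A."""
    fused = np.where(source == 1, img_a, img_b).astype(float)
    h, w = source.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if not boundary[i, j]:
                continue
            nbhd = source[i - 1:i + 2, j - 1:j + 2]
            n_a = (nbhd.sum() - source[i, j]) / 8.0   # N_a from the 8 neighbours
            fused[i, j] = img_a[i, j] * n_a + img_b[i, j] * (1.0 - n_a)
    return fused
```

A boundary pixel with, say, 5 of its 8 neighbours drawn from image A receives N_a = 5/8 and N_b = 3/8, so its gray value is pulled smoothly between the two sources instead of jumping across the false edge.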
Every point in the edge region (within three pixels of an edge point in any direction) is treated with this method, correcting the gray values obtained from the selection rule I_R(i, j) = I_a(i, j) if C_a(i, j) ≥ C_b(i, j), else I_b(i, j); this yields the final fused image, shown in Fig. 3-4.
Update the template image (205)
After the computation of step (204), the fused image serves as the new template image for the next fusion operation, until all captured images have been fused.
The effect of the image fusion is shown in Figs. 5–7.
Figs. 5 and 6 contrast one image before and after fusion, and Fig. 7 contrasts the whole tissue before and after fusion; the images after fusion are visibly smoother, clearer, truer and more natural.

Claims (7)

1. A three-dimensional image acquisition and multilayer image fusion method for biopsy tissue, characterised in that the method comprises the following steps:
(1) for an arbitrarily selected area of the biopsy tissue, a slice-image acquisition device captures and stores the slice image data of the current layer at the initial focal plane;
(2) taking the above initial focal plane as the reference, the focal plane is changed by moving the slice or the lens up or down within the thickness range of the biopsy tissue; each time a set distance is moved, the image data of that slice layer is captured and stored; when the preset upward or downward movement limit is reached, the three-dimensional image acquisition of the selected area stops;
(3) multilayer image fusion is carried out: according to the sharpness of the images captured on the above different focal planes, the parts of high sharpness are fused into one image, forming the optimal fused image; the image data captured at the initial focal plane position is taken to be the sharpest of all captured images; when fusing, the image data captured at the initial focal plane position is first used as the template image of the fusion; then one stored biopsy image from another focal plane is read in, and the sharpness of each of its regions is computed; the read-in image is aligned with the template image, and the regions in which the current image is sharper than the template are fused with the corresponding parts of the template; the template is updated, i.e. the fused image becomes the new template; the image of another slice layer is read in and fused in the same way, until all captured and stored slice-layer images have been fused.
2. The method according to claim 1, characterised in that the set distance is specified by the user.
3. The method according to claim 2, characterised in that the set distance is 1/10 of the thickness of the biopsy tissue.
4. The method according to claim 1, characterised in that:
before two images are fused, the image to be fused is shifted in the horizontal or vertical direction, and the correlation between the fused image (i.e. the template image) and the image to be fused is computed according to the following formula; when the correlation reaches its maximum, the optimal offset position is considered found, and the image to be fused is then adjusted to the normal state according to the offset:
Corr = Σ_{i,j} I₁(i,j)·I₂(m,n) / √( Σ_{i,j} I₁²(i,j) · Σ_{m,n} I₂²(m,n) )
where Corr is the correlation between the template image and the image to be fused, I₁(i, j) is the gray value of the pixel in row i, column j of image I₁, and I₂(m, n) is the gray value of the pixel in row m, column n of image I₂; the horizontal and vertical offsets are the differences i − m and j − n at which Corr attains its maximum; i and j range from 1 to the numbers of rows and columns of image I₁, and m and n range from 1 to the numbers of rows and columns of image I₂.
5. The method according to claim 1 or 4, characterised in that the false edges in the boundary regions of the fused image are suppressed, removing the noise produced by direct data fusion.
6. The method according to claim 5, characterised in that the false edges in the boundary regions of the fused image are suppressed as follows: any point in a boundary region of the fused image is chosen; let its gray value in the image to be fused be I_a(i_o, j_o) and in the template image (i.e. the fused image) I_b(i_o, j_o); after image fusion, its gray value I_R(i_o, j_o) is
IR(i0, j0) = Ia(i0, j0)·Na + Ib(i0, j0)·Nb
where Na and Nb are the weights assigned to the corresponding pixels of the image to be fused and of the image being fused, and i0 and j0 are the row and column of the point in the image; the weights are determined by the distribution of the point's 8 neighbourhood points across the two images before fusion.
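A minimal sketch of the boundary smoothing of claim 6. The claim leaves open exactly how the "distribution of the 8 neighbourhood points" determines the weights; here we assume a label map `src` recording which source each fused pixel was taken from, and weight each image by its share of the 8 neighbours — an illustrative interpretation, not the patent's definitive rule.

```python
import numpy as np

def blend_boundary(img_a, img_b, src):
    """Blend boundary pixels; src[i,j] == 1 where img_a was selected, 0 for img_b."""
    h, w = src.shape
    out = np.where(src == 1, img_a, img_b).astype(float)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            # Number of the 8 neighbours that were taken from img_a.
            n_a = src[i - 1:i + 2, j - 1:j + 2].sum() - src[i, j]
            if n_a in (0, 8):          # all neighbours from one source: interior
                continue
            wa = n_a / 8.0             # Na, weight of the image to be fused
            wb = 1.0 - wa              # Nb, weight of the image being fused
            out[i, j] = img_a[i, j] * wa + img_b[i, j] * wb
    return out
```

Pixels deep inside a region keep their selected value; only pixels on the fusion boundary are softened, which is what suppresses the false edges the claim targets.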
7. The method according to any one of claims 1-3, characterized in that the multilayer image fusion can also be carried out without storing every layer image, by fusing the images on the fly as they are acquired.
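Claim 7's storage-free variant — folding each newly acquired layer into the template immediately instead of keeping the whole stack — can be sketched as below. The per-pixel "definition" measure (discrete Laplacian magnitude) is an illustrative assumption; the claims do not fix a particular sharpness metric.

```python
import numpy as np

def local_definition(img):
    """Per-pixel sharpness proxy: magnitude of the discrete Laplacian."""
    f = img.astype(float)
    return np.abs(4 * f
                  - np.roll(f, 1, axis=0) - np.roll(f, -1, axis=0)
                  - np.roll(f, 1, axis=1) - np.roll(f, -1, axis=1))

def fuse_incremental(layers):
    """Fold an iterable of layers into one image, keeping each pixel's sharpest sample."""
    it = iter(layers)
    template = next(it).astype(float)      # first layer initializes the template
    best_def = local_definition(template)
    for layer in it:                       # each newly acquired layer updates the template
        d = local_definition(layer)
        sharper = d > best_def
        template[sharper] = layer[sharper]
        best_def[sharper] = d[sharper]
    return template
```

Only the running template and its definition map are held in memory, so the full layer stack never needs to be saved — the trade-off claim 7 describes.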
CN200910089131A 2009-08-05 2009-08-05 Three-dimensional acquisition of biopsy tissues and fusion method of multilayer images Active CN101615289B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910089131A CN101615289B (en) 2009-08-05 2009-08-05 Three-dimensional acquisition of biopsy tissues and fusion method of multilayer images


Publications (2)

Publication Number Publication Date
CN101615289A CN101615289A (en) 2009-12-30
CN101615289B true CN101615289B (en) 2012-10-03

Family

ID=41494913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910089131A Active CN101615289B (en) 2009-08-05 2009-08-05 Three-dimensional acquisition of biopsy tissues and fusion method of multilayer images

Country Status (1)

Country Link
CN (1) CN101615289B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106998459A * 2017-03-15 2017-08-01 河南师范大学 Single-camera stereoscopic image generation method based on continuous zoom

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101794441A * 2010-04-08 2010-08-04 西安交通大学 Method for reconstructing multiple high-resolution face images at different angles from a single low-resolution face image
US8396269B2 (en) * 2010-04-08 2013-03-12 Digital Pathco LLC Image quality assessment including comparison of overlapped margins
CN101996397B (en) * 2010-11-04 2012-05-30 山东易创电子有限公司 Method for making digital slices
CN102944517A (en) * 2012-11-28 2013-02-27 南昌百特生物高新技术股份有限公司 Hierarchical diagnosis method for liquid-based cytology
CN103308452B (en) * 2013-05-27 2015-05-06 中国科学院自动化研究所 Optical projection tomography image capturing method based on depth-of-field fusion
CN105004723A (en) * 2015-06-25 2015-10-28 宁波江丰生物信息技术有限公司 Pathological section scanning 3D imaging and fusion device and method
CN105118088A (en) * 2015-08-06 2015-12-02 曲阜裕隆生物科技有限公司 3D imaging and fusion method based on pathological slice scanning device
CN105241811B * 2015-09-30 2018-10-23 爱威科技股份有限公司 Automatic multi-level focusing image acquisition method and system
CN105578045A (en) 2015-12-23 2016-05-11 努比亚技术有限公司 Terminal and shooting method of terminal
CN107346537A * 2016-05-05 2017-11-14 福耀集团(上海)汽车玻璃有限公司 Intelligent glass counting method, device and mobile phone
CN106780488B (en) * 2017-01-16 2021-05-11 宁波江丰生物信息技术有限公司 System and method for detecting definition of digital pathological section
CN107360412A * 2017-08-21 2017-11-17 广州视源电子科技股份有限公司 3D image creation method, capture device and readable storage medium
CN107481213A * 2017-08-28 2017-12-15 湖南友哲科技有限公司 Multilayer focus fusion method for microscope images
CN107395993B (en) * 2017-09-08 2023-06-30 北京睿智奥恒视觉科技有限公司 Full-automatic focusing method and system
CN107635093A * 2017-09-18 2018-01-26 维沃移动通信有限公司 Image processing method, mobile terminal and computer-readable storage medium
CN109856015B (en) * 2018-11-26 2021-08-17 深圳辉煌耀强科技有限公司 Rapid processing method and system for automatic diagnosis of cancer cells
CN110264310B (en) * 2019-05-30 2021-09-03 肖伯祥 Clothing pattern making method based on human body big data
CN110276408B (en) * 2019-06-27 2022-11-22 腾讯科技(深圳)有限公司 3D image classification method, device, equipment and storage medium
CN110533772B (en) * 2019-08-23 2021-02-02 中国科学院自动化研究所 Three-dimensional image library obtaining method based on biological tissue sequence slice etching thinning
CN110675354B (en) * 2019-09-11 2022-03-22 北京大学 Image processing method, system and storage medium for developmental biology
CN112986239B (en) * 2021-02-05 2024-06-07 爱威科技股份有限公司 Hierarchical image acquisition method, hierarchical image acquisition device, computer equipment and storage medium
CN113487508B (en) * 2021-07-08 2024-03-26 山东志盈医学科技有限公司 Method and device for dynamically adjusting picture definition of digital slice scanner

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1353305A (en) * 2000-11-10 2002-06-12 北京理工大学 Ultrahigh-sensitivity fluorescence microprobing and processing system
CN101093280A * 2006-06-22 2007-12-26 北京普利生仪器有限公司 Method for preparing microscopic images of holographic digital slides



Also Published As

Publication number Publication date
CN101615289A (en) 2009-12-30

Similar Documents

Publication Publication Date Title
CN101615289B (en) Three-dimensional acquisition of biopsy tissues and fusion method of multilayer images
US8396269B2 (en) Image quality assessment including comparison of overlapped margins
EP1016031B1 (en) Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
US6567682B1 (en) Apparatus and method for lesion feature identification and characterization
US7194118B1 (en) System for optically sectioning and mapping surgically excised tissue
US6101265A (en) Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
CN103402435B (en) Medical image processing device and medical image processing method
US20010050999A1 (en) Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
DE102005037806A1 (en) Method and device for enlarging the field of view in ultrasound imaging
WO1998044446A9 (en) Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
CN103592754B Real-time scanning auto-focus tracking method for digital slides
US9253449B2 (en) Mosaic picture generation
CN102053356A (en) System and method for imaging with enhanced depth of field
CN102053357A (en) System and method for imaging with enhanced depth of field
CN101930606A (en) Field depth extending method for image edge detection
CN106327451A Image restoration method for ancient animal fossils
CN101436313A Method for preparing anti-interference three-dimensional virtual slides
CN111665617A (en) Focusing method and system
CN108470585A Interactive virtual slide remote annotation method and system
CN101996397B (en) Method for making digital slices
CN103955941A (en) Corneal endothelial cell analysis meter and analysis method
CN112363309B (en) Automatic focusing method and system for pathological image under microscope
CN101385640A (en) Anatomy image forming method and system
CN114219702A (en) Different-staining pathological section image matching and displaying method
CN103728304A (en) Focusing method for pathological section scanner

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: SHANGHAI UNINANO ADVANCED MATERIALS CO., LTD.

Free format text: FORMER OWNER: BEIJING UNITECH TECHNOLOGIES INC.

Effective date: 20150310

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 100085 HAIDIAN, BEIJING TO: 200030 XUHUI, SHANGHAI

TR01 Transfer of patent right

Effective date of registration: 20150310

Address after: 200030 room 19B1, 789 Jia Bang Road, Shanghai, Xuhui District

Patentee after: Shanghai youna Science & Technology Co.,Ltd.

Address before: 100085, Beijing, Haidian District on the East Road, No. 9, building on the first floor, No. 5 North District

Patentee before: BEIJING UNIC TECH CO.,LTD.

TR01 Transfer of patent right

Effective date of registration: 20180921

Address after: 100085 402, four floor 7, five street, Haidian District, Beijing.

Patentee after: BEIJING UNIC TECH CO.,LTD.

Address before: 200030 19B1 room 789, zhaojiaxin Road, Xuhui District, Shanghai.

Patentee before: Shanghai youna Science & Technology Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20181025

Address after: 100085 room 402, four floor, Hao Hai building, 7 five street, Haidian District, Beijing.

Patentee after: BEIJING UNIC TECH CO.,LTD.

Address before: 200030 19B1 room 789, zhaojiahong Road, Xuhui District, Shanghai.

Patentee before: Shanghai youna Science & Technology Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20190619

Address after: 215010 West of No. 27, Taishan Road, Suzhou High-tech Zone, Jiangsu Province

Patentee after: SUZHOU YOUNA MEDICAL EQUIPMENT Co.,Ltd.

Address before: 100085 room 402, four floor, Hao Hai building, 7 five street, Haidian District, Beijing.

Patentee before: BEIJING UNIC TECH CO.,LTD.

TR01 Transfer of patent right

Effective date of registration: 20230907

Address after: 100085 402, four floor 7, five street, Haidian District, Beijing.

Patentee after: BEIJING UNIC TECH CO.,LTD.

Address before: 215010 West of No. 27, Taishan Road, Suzhou High-tech Zone, Jiangsu Province

Patentee before: SUZHOU YOUNA MEDICAL EQUIPMENT Co.,Ltd.