CN102567988B - Image processing apparatus and image processing method - Google Patents

Image processing apparatus and image processing method


Publication number
CN102567988B
Authority
CN
China
Prior art keywords
evaluation
mentioned
image processing
texture component
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201110386605.5A
Other languages
Chinese (zh)
Other versions
CN102567988A (en)
Inventor
Takashi Kono (河野隆志)
Yamato Kanda (神田大和)
Makoto Kitamura (北村诚)
Masashi Hirota (弘田昌士)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2010265795A external-priority patent/JP5576775B2/en
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of CN102567988A publication Critical patent/CN102567988A/en
Application granted granted Critical
Publication of CN102567988B publication Critical patent/CN102567988B/en


Abstract

The present invention provides an image processing apparatus and an image processing method. The image processing apparatus includes: an evaluation region setting unit that sets, in an in-vivo image, an evaluation region whose category is to be discriminated; a texture component acquiring unit that acquires a texture component from the in-vivo image within the evaluation region; an evaluation value calculating unit that calculates an evaluation value representing the homogeneity of the texture component; and a discriminating unit that discriminates the category of the evaluation region according to the evaluation value.

Description

Image processing apparatus and image processing method
Technical field
The present invention relates to an image processing apparatus and an image processing method for processing in-vivo images obtained by imaging the inside of a subject's body.
Background technology
Endoscopes have become widespread as medical observation devices that are introduced into the body of a subject such as a patient to observe the inside of a lumen. In recent years, swallowable capsule endoscopes have also been developed; they house an imaging device, a communication device, and so on inside a capsule-shaped casing, and wirelessly transmit the image data captured by the imaging device to outside the body. A series of in-vivo images (intra-gastrointestinal images) captured by such a medical observation device is enormous (tens of thousands of images or more), and observing and diagnosing each image requires considerable experience. A medical diagnosis support function that assists a doctor's diagnosis is therefore desirable. As one image recognition technique for realizing such a function, a technique has been proposed that automatically detects abnormal parts such as lesions from intra-gastrointestinal images and indicates the images that should be diagnosed with priority.
However, an intra-gastrointestinal image often captures not only the mucosal region, which is the object of observation in diagnosis, but also contents such as residue that are unnecessary for observation. As a technique for discriminating such mucosal regions and content regions (i.e., discriminating the category of a region), Japanese Unexamined Patent Publication No. 2010-115413, for example, discloses the following image processing method: a plurality of images are selected from a series of intra-gastrointestinal images, a color feature value is calculated for each pixel or each small region of the selected images, and the mucosal region in each image constituting the series is discriminated according to these color feature values.
In addition, as a technique for discriminating mutually different regions captured in an image, a method based on texture feature values using a gray-level co-occurrence matrix is also known ("Digital Image Processing", CG-ARTS Society, pp. 194-195). In a gray-level co-occurrence matrix, given a pixel i and a pixel j offset from pixel i by a relative position δ = (d, θ) (d: distance, θ: angle), with pixel values Li and Lj respectively, the probability P_δ(Li, Lj) of the pixel-value pair (Li, Lj) occurring is taken as a matrix element. By using this matrix, feature values representing properties of the pixel values such as uniformity, directionality, and contrast can be obtained.
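For concreteness, the co-occurrence-matrix computation described above can be sketched in Python as follows. The function names, the toy 4-level image, and the choice of the angular second moment as the uniformity feature are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Gray-level co-occurrence matrix for offset (dx, dy): P[i, j] is the
    probability that a pixel with value i has a neighbour at that offset
    with value j."""
    h, w = img.shape
    P = np.zeros((levels, levels), dtype=float)
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def uniformity(P):
    """Angular second moment: large for homogeneous textures."""
    return float((P ** 2).sum())

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)
P = glcm(img, dx=1, dy=0, levels=4)   # horizontal offset δ = (1, 0°)
```

As the background section notes, such features require offset parameters (d, θ) chosen for the object at hand, which is part of what makes a fast general-purpose algorithm difficult.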
However, when color feature values are used to discriminate the category of each region captured in an in-vivo image, the result may be affected by the imaging conditions (illumination conditions, etc.). When texture feature values based on the gray-level co-occurrence matrix are applied, parameters must be chosen according to the object, which makes it difficult to produce an algorithm with a short processing time. Consequently, when regions are discriminated for in-vivo images as numerous as intra-gastrointestinal images, the processing time becomes long.
Summary of the invention
The present invention has been made in view of the above, and its object is to provide an image processing apparatus and an image processing method that can appropriately discriminate the category of each region included in an in-vivo image, without being affected by the imaging conditions, by an algorithm faster than the prior art.
An image processing apparatus according to one aspect of the present invention includes: an evaluation region setting unit that sets, in an in-vivo image, an evaluation region whose category is to be discriminated; a texture component acquiring unit that acquires a texture component from the in-vivo image within the evaluation region; an evaluation value calculating unit that calculates an evaluation value representing the homogeneity of the texture component; and a discriminating unit that discriminates the category of the evaluation region according to the evaluation value. The evaluation value calculating unit includes a bias evaluation value calculating unit that calculates an evaluation value representing the homogeneity of the texture component on the coordinate space, i.e., the space of coordinates representing pixel positions.
An image processing method according to another aspect of the present invention includes: an evaluation region setting step of setting, in an in-vivo image, an evaluation region whose category is to be discriminated; a texture component acquisition step of acquiring a texture component from the in-vivo image within the evaluation region; an evaluation value calculation step of calculating an evaluation value representing the homogeneity of the texture component; and a discrimination step of discriminating the category of the evaluation region according to the evaluation value. In the evaluation value calculation step, an evaluation value representing the homogeneity of the texture component on the coordinate space, i.e., the space of coordinates representing pixel positions, is calculated.
Other objects, features, and advantages of the present invention, as well as its technical and industrial significance, will be better understood by reading the following detailed description of the invention in conjunction with the accompanying drawings.
Accompanying drawing explanation
Fig. 1 is a block diagram showing the configuration of the image processing apparatus according to the first embodiment of the present invention.
Fig. 2 is a diagram explaining the texture component acquisition method based on the morphological opening process.
Fig. 3 is a diagram showing a simulation result of the texture component acquisition process for an evaluation region in a mucosal region.
Fig. 4 is a diagram showing a simulation result of the texture component acquisition process for an evaluation region in a residue region.
Fig. 5 is a flowchart showing the operation of the image processing apparatus shown in Fig. 1.
Fig. 6 is a schematic diagram showing an example of an intra-gastrointestinal image processed by the image processing apparatus of Fig. 1.
Fig. 7 is a schematic diagram showing an example of evaluation regions set on an intra-gastrointestinal image.
Fig. 8 is a flowchart explaining the process of calculating the evaluation value of the texture component.
Fig. 9 is a block diagram showing the configuration of the image processing apparatus according to the second embodiment of the present invention.
Fig. 10 is a flowchart showing the operation of the image processing apparatus shown in Fig. 9.
Fig. 11 is a diagram explaining a method of generating discrete distribution data from the continuously distributed texture component of a mucosal region.
Fig. 12 is a diagram explaining a method of generating discrete distribution data from the continuously distributed texture component of a residue region.
Fig. 13 is a diagram explaining a method of calculating the average nearest-neighbor distance by the nearest-neighbor method.
Fig. 14 is a diagram showing the relation between the dispersion of discrete points in an evaluation region and the evaluation value.
Fig. 15 is a diagram explaining a method of calculating an evaluation value based on the K-function method.
Fig. 16 is a diagram explaining a method of calculating an evaluation value based on the χ² test.
Fig. 17 is a block diagram showing the configuration of the image processing apparatus according to the third embodiment of the present invention.
Fig. 18 is a diagram showing an example of the texture component of a mucosal region and its histogram.
Fig. 19 is a diagram showing an example of the texture component of a residue region and its histogram.
Fig. 20 is a block diagram showing the configuration of an image processing apparatus according to the third embodiment of the present invention.
Fig. 21 is a system configuration diagram showing an example in which the image processing apparatuses of the first to third embodiments are applied to a computer system.
Fig. 22 is a block diagram showing the configuration of the main part shown in Fig. 21.
Detailed description of the invention
Image processing apparatuses according to embodiments of the present invention will be described below with reference to the drawings. The invention is not limited to these embodiments. In the description of the drawings, the same parts are denoted by the same reference signs.
The image processing apparatus according to the embodiments described below processes in-vivo images (intra-gastrointestinal images) of a subject captured by a medical observation device such as an endoscope or a capsule endoscope. Specifically, it discriminates the category of a region captured in an intra-gastrointestinal image (i.e., it distinguishes the mucosal region, which is the object of observation in diagnosis, from the residue region, which is unnecessary for observation). In the following embodiments, the intra-gastrointestinal image captured by the medical observation device is a color image in which each pixel has a pixel level (pixel value) for each of the R (red), G (green), and B (blue) color components.
1st embodiment
Fig. 1 is a block diagram showing the configuration of the image processing apparatus according to the first embodiment of the present invention. As shown in Fig. 1, the image processing apparatus 1 includes an image acquiring unit 11, an input unit 12, a display unit 13, a recording unit 14, a computing unit 15, and a control unit 10 that controls the overall operation of the image processing apparatus 1.
The image acquiring unit 11 acquires the image data of the intra-gastrointestinal images captured by the medical observation device. The image acquiring unit 11 can be configured appropriately according to the specification of the system including the medical observation device. For example, when the medical observation device is a capsule endoscope and a portable recording medium is used to exchange image data with the medical observation device, the image acquiring unit 11 is realized by a reader device to which the recording medium is detachably attached and which reads out the stored image data of the intra-gastrointestinal images. When a server that stores in advance the image data captured by the medical observation device is provided, the image acquiring unit 11 is realized by a communication device or the like connected to the server, and acquires the image data of the intra-gastrointestinal images through data communication with the server. Alternatively, the image acquiring unit 11 may be constituted by an interface device or the like that receives image signals from a medical observation device such as an endoscope via a cable.
The input unit 12 is realized by input devices such as a keyboard, a mouse, a touch panel, and various switches, and outputs to the control unit 10 the input signals produced when the user operates these input devices.
The display unit 13 is realized by a display device such as an LCD or an EL display, and displays various screens, including the intra-gastrointestinal images, under the control of the control unit 10.
The recording unit 14 is realized by an IC memory capable of update recording such as a flash memory, a ROM, or a RAM, by a hard disk that is built in or connected via a data communication terminal, or by an information recording medium such as a CD-ROM together with its reading device. In addition to the image data of the intra-gastrointestinal images acquired by the image acquiring unit 11, the recording unit 14 stores programs for operating the image processing apparatus 1 and causing it to execute various functions, as well as data used during execution of these programs. Specifically, the recording unit 14 stores an image processing program 141 for discriminating the mucosal region and the residue region included in an intra-gastrointestinal image.
The computing unit 15 is realized by hardware such as a CPU together with the image processing program 141 read into that hardware; it processes the image data of the intra-gastrointestinal images and performs the various computations for discriminating the mucosal region and the residue region. The computing unit 15 includes an evaluation region setting unit 16, a texture component acquiring unit 17, a bias evaluation value calculating unit 18 serving as an evaluation value calculating unit that calculates an evaluation value representing the homogeneity of the texture component in the evaluation region, and a discriminating unit 19.
The evaluation region setting unit 16 sets, in the intra-gastrointestinal image, an evaluation region as the object for discriminating the mucosal region and the residue region.
The texture component acquiring unit 17 removes the structure component from the intra-gastrointestinal image within the evaluation region, thereby acquiring the texture component. As a method of acquiring the texture component, for example, the morphological opening process can be used ("Morphology", Corona Publishing, pp. 82-85).
Fig. 2 is a diagram explaining the acquisition of the texture component by the morphological opening process. In the morphological opening process, a reference figure G_E called a structuring element is moved, while remaining circumscribed from the smaller-pixel-value side, along an image G (Fig. 2(a)) regarded as a surface in a three-dimensional space whose height (z-axis) is the pixel value of each pixel on the xy plane constituting the two-dimensional image, and the trajectory SC (Fig. 2(b)) traced by the maximum of the periphery of the reference figure G_E is obtained. This trajectory SC corresponds to the structure component included in the original image G. Then, by subtracting the trajectory (structure component) SC from the image G, the texture component TC is obtained (Fig. 2(c)).
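A minimal NumPy sketch of this opening-based decomposition, under the assumption of a flat k x k structuring element (the patent's reference figure G_E is not restricted to this shape); the toy image and function names are illustrative:

```python
import numpy as np

def grey_open(img, k):
    """Grayscale opening with a flat k x k structuring element:
    erosion (local minimum) followed by dilation (local maximum)."""
    pad = k // 2
    def local(op, a):
        p = np.pad(a, pad, mode="edge")
        out = np.empty_like(a, dtype=float)
        for y in range(a.shape[0]):
            for x in range(a.shape[1]):
                out[y, x] = op(p[y:y + k, x:x + k])
        return out
    return local(np.max, local(np.min, img))

# Structure component SC = opening(G); texture component TC = G - SC.
G = np.array([[1, 1, 1, 1, 1],
              [1, 1, 5, 1, 1],   # isolated bright spot = texture
              [1, 1, 1, 1, 1],
              [1, 1, 1, 1, 1],
              [1, 1, 1, 1, 1]], dtype=float)
SC = grey_open(G, 3)
TC = G - SC
```

Here the isolated bright pixel survives in TC while the flat background is absorbed into SC, mirroring the decomposition of Fig. 2.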
Fig. 3 shows a simulation result of acquiring the texture component from a mucosal region. As shown in Fig. 3, the structure component SC_M (Fig. 3(b)) obtained by the opening process is subtracted from the image G_M of the mucosal region (Fig. 3(a)), yielding the texture component TC_M of the mucosal region (Fig. 3(c)). As shown in Fig. 3(c), the texture component TC_M of the mucosal region is relatively homogeneous.
On the other hand, Fig. 4 shows a simulation result of acquiring the texture component from a residue region. As shown in Fig. 4, the structure component SC_R (Fig. 4(b)) obtained by the opening process is subtracted from the image G_R of the residue region (Fig. 4(a)), yielding the texture component TC_R of the residue region (Fig. 4(c)). As shown in Fig. 4(c), the texture component TC_R of the residue region is uneven and heterogeneous.
Various other known methods can also be used to acquire the texture component. For example, a Fourier transform may be applied to the intra-gastrointestinal image, followed by high-pass filtering to cut off the low-frequency components. An inverse Fourier transform of the resulting image then yields the texture component.
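This Fourier-domain alternative can be sketched as follows; the radial cutoff in normalized frequency units is an assumed parameterization that the text does not specify.

```python
import numpy as np

def highpass_texture(img, cutoff):
    """Texture via the frequency domain: FFT, zero out components whose
    normalized radial frequency is below `cutoff`, then inverse FFT."""
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    keep = np.sqrt(fx ** 2 + fy ** 2) >= cutoff   # high-pass mask
    return np.real(np.fft.ifft2(F * keep))

# A flat (structure-only) image has no high-frequency content,
# so its extracted texture component is essentially zero.
flat = np.full((8, 8), 3.0)
tex = highpass_texture(flat, 0.1)
```

By contrast, a rapidly alternating pattern passes through unchanged except for the removal of its mean, which is the behavior expected of a texture extractor.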
The bias evaluation value calculating unit 18 calculates an evaluation value representing the homogeneity of the texture component on the coordinate space. Specifically, the bias evaluation value calculating unit 18 includes a coordinate centroid distance calculating unit 181. The coordinate centroid distance calculating unit 181 calculates, as the evaluation value representing homogeneity, the distance between the intrinsic coordinate centroid of the evaluation region and the coordinate centroid weighted by the pixel values (intensities) of the texture component.
The discriminating unit 19 discriminates the category of the evaluation region according to the evaluation value calculated by the bias evaluation value calculating unit 18. Specifically, when the evaluation value indicates that the texture component is homogeneous, the discriminating unit 19 determines that the evaluation region is a mucosal region (see Fig. 3(c)). On the other hand, when the evaluation value does not indicate that the texture component is homogeneous, the discriminating unit 19 determines that the evaluation region is a residue region (see Fig. 4(c)). Whether the evaluation value indicates homogeneity of the texture component is judged according to whether the evaluation value falls within a predetermined range.
The control unit 10 is realized by hardware such as a CPU. It reads the various programs stored in the recording unit 14 and, according to the image data input from the image acquiring unit 11 and the operation signals input from the input unit 12, issues instructions and transfers data to the units constituting the image processing apparatus 1, thereby centrally controlling the operation of the image processing apparatus 1.
The operation of the image processing apparatus 1 will next be described with reference to Fig. 5. Fig. 5 is a flowchart showing the image processing operation in which the image processing apparatus 1 discriminates the mucosal region and the residue region. In the following, image processing of the intra-gastrointestinal image shown in Fig. 6 is used as an example. As shown in Fig. 6, in the image group constituting a series of intra-gastrointestinal images, an intra-gastrointestinal image 100 mainly captures the mucosal region 101, and sometimes also captures a residue region 102, a lesion region 103, and the like.
When the image data of the intra-gastrointestinal image group taken into the image processing apparatus 1 is stored in the recording unit 14, the computing unit 15 reads out the intra-gastrointestinal image 100 to be processed from the recording unit 14 (step S1).
In step S2, the evaluation region setting unit 16 sets an evaluation region in the intra-gastrointestinal image 100. Specifically, the evaluation region setting unit 16 divides the intra-gastrointestinal image 100 into a plurality of rectangular regions as shown in Fig. 7, and sets each region in turn as the evaluation region 105. In Fig. 7, the intra-gastrointestinal image 100 is divided into 16 regions, but the number of divisions and the size and shape of each region are not limited to this and can be set as desired. For example, a region consisting of one or more pixels may be set as one evaluation region, or the whole (undivided) intra-gastrointestinal image 100 may be set as one evaluation region.
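The rectangular division of step S2 can be sketched as follows; the 4 x 4 grid mirrors Fig. 7, while the function name and the integer-arithmetic handling of uneven divisions are my own choices.

```python
import numpy as np

def split_regions(img, rows, cols):
    """Split an image into rows x cols rectangular evaluation regions,
    each returned with its top-left (y, x) coordinate."""
    h, w = img.shape[:2]
    regions = []
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            regions.append(((y0, x0), img[y0:y1, x0:x1]))
    return regions

img = np.arange(64).reshape(8, 8)
regs = split_regions(img, 4, 4)   # 16 evaluation regions, as in Fig. 7
```

Each region would then be processed in turn by steps S3 to S6.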
In step S3, the texture component acquiring unit 17 receives the information representing the intra-gastrointestinal image 100 and the evaluation region 105 (e.g., coordinate information), and removes the structure component 106 from the intra-gastrointestinal image within the evaluation region 105, thereby acquiring the texture component.
In step S4, the coordinate centroid distance calculating unit 181 calculates the distance between the coordinate centroid within the evaluation region 105 and the coordinate centroid weighted by the pixel values of the texture component within the evaluation region 105, i.e., the coordinate centroid distance, as the evaluation value representing the homogeneity of the texture component of the evaluation region 105 (hereinafter simply "evaluation value").
Fig. 8 is a flowchart explaining the process of calculating the evaluation value representing the homogeneity of the texture component. First, in step S11, the coordinate centroid distance calculating unit 181 obtains the information representing the evaluation region 105 (e.g., coordinate information) from the evaluation region setting unit 16, and obtains the texture component of the evaluation region 105 from the texture component acquiring unit 17.
In step S12, the coordinate centroid distance calculating unit 181 calculates the coordinate centroid (g_x, g_y) of the pixels included in the evaluation region 105 using formula (1).
$g_x = \frac{1}{N}\sum_{i=1}^{N} X_i, \qquad g_y = \frac{1}{N}\sum_{i=1}^{N} Y_i$   (1)
In formula (1), X_i is the x coordinate of the i-th pixel in the evaluation region 105 and Y_i is its y coordinate. N is the total number of pixels included in the evaluation region 105.
In step S13, the coordinate centroid distance calculating unit 181 uses formula (2) to calculate the coordinate centroid (G_x, G_y) of the pixels included in the evaluation region 105 weighted by the pixel values of the texture component.
$G_x = \frac{\sum_{i=1}^{N} X_i T_i}{\sum_{i=1}^{N} T_i}, \qquad G_y = \frac{\sum_{i=1}^{N} Y_i T_i}{\sum_{i=1}^{N} T_i}$   (2)
In formula (2), T_i is the pixel value of the texture component of the i-th pixel in the evaluation region 105.
Then, in step S14, the coordinate centroid distance D_{G-g} is calculated using the calculation results of steps S12 and S13 and formula (3).
$D_{G\text{-}g} = \sqrt{(G_x - g_x)^2 + (G_y - g_y)^2}$   (3)
The operation then returns to the main flow.
When the texture component is homogeneous, the weighted coordinate centroid (G_x, G_y) substantially coincides with the coordinate centroid (g_x, g_y), so the coordinate centroid distance D_{G-g} is small. When the texture component is heterogeneous, the weighted coordinate centroid (G_x, G_y) moves away from the coordinate centroid (g_x, g_y), so the coordinate centroid distance D_{G-g} becomes large.
The coordinate centroid distance calculating unit 181 outputs the coordinate centroid distance D_{G-g} calculated by the processing of steps S11 to S14 to the discriminating unit 19 as the evaluation value representing the homogeneity of the texture component.
In step S5, the discriminating unit 19 uses the coordinate centroid distance D_{G-g} calculated by the coordinate centroid distance calculating unit 181 as the evaluation value and discriminates the category of the evaluation region 105. More specifically, the discriminating unit 19 first compares the evaluation value with a predetermined threshold value set in advance. When the evaluation value is smaller than the predetermined threshold value, it determines that the texture component is homogeneous and that the evaluation region 105 is a mucosal region. On the other hand, when the evaluation value is equal to or larger than the predetermined threshold value, the discriminating unit 19 determines that the texture component is heterogeneous and that the evaluation region 105 is a residue region. The discriminating unit 19 then outputs the discrimination result of the evaluation region 105, causes the display unit 13 to display it, and causes the recording unit 14 to record it (step S6).
When an evaluation region 105 whose category has not yet been discriminated remains (step S7: No), the operation moves to step S2. When the discrimination process has been performed for all evaluation regions 105 (step S7: Yes), the image processing operation for the intra-gastrointestinal image 100 ends.
As described above, according to the first embodiment, attention is paid to the homogeneity of the texture component of the evaluation region to discriminate whether the evaluation region is a mucosal region or a residue region. It is therefore possible, without being affected by the imaging conditions, to appropriately discriminate the category of a discrimination object region (evaluation region) of any size and shape with a fast algorithm. Moreover, according to the first embodiment, the coordinate centroid distance is calculated as the evaluation value representing the homogeneity of the texture component, so the homogeneity of the texture component can be judged by a simple computation.
Variation 1-1
The setting of the evaluation region (step S2) can also be performed by various other methods. For example, the intra-gastrointestinal image may be segmented according to color feature values (see Japanese Unexamined Patent Publication No. 2010-115413), and each segmented region used as an evaluation region. Alternatively, the inside of a closed curve extracted by the active contour method (snakes) may be used as one evaluation region. The active contour method deforms the shape of a closed curve given as an initial value, and extracts the closed curve at which an energy based on the continuity and smoothness of the closed curve and the edge strength on the closed curve is most stable (see "Digital Image Processing", CG-ARTS Society, pp. 197-199).
Variation 1-2
When calculating the coordinate centroid weighted by the pixel values of the texture component (step S13), instead of weighting all pixels by their pixel values, only the pixels satisfying a predetermined condition may be weighted. Specifically, only the pixels having a pixel value equal to or larger than a predetermined ratio (e.g., 50%) of the maximum pixel value in the evaluation region are weighted. Alternatively, only the pixels at which the pixel value is a peak in the continuous distribution of the texture component (i.e., pixels where the first derivative is zero and the second derivative is negative) may be weighted. In this way, when the texture component is heterogeneous, the ordinary coordinate centroid (g_x, g_y) and the weighted coordinate centroid (G_x, G_y) deviate more markedly, which makes it easier to discriminate the mucosal region and the residue region.
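The first option of this variation (weighting only pixels at or above a given ratio of the region maximum) can be sketched as a masking step applied before formula (2); the function name and toy array are illustrative.

```python
import numpy as np

def thresholded_weights(texture, ratio=0.5):
    """Keep only pixels at or above `ratio` of the region maximum as
    weights for the centroid of formula (2); all others contribute zero."""
    return np.where(texture >= ratio * texture.max(), texture, 0.0)

tex = np.array([[0.1, 0.2],
                [0.9, 1.0]])
w = thresholded_weights(tex)   # only the bottom row survives
```

Feeding `w` into the weighted-centroid computation in place of the raw texture accentuates the shift of (G_x, G_y) for heterogeneous regions.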
2nd embodiment
An image processing apparatus according to the second embodiment of the present invention will be described below.
Fig. 9 is a block diagram showing the configuration of the image processing apparatus according to the second embodiment. The image processing apparatus 2 shown in Fig. 9 includes a computing unit 20. The computing unit 20 has a bias evaluation value calculating unit 21 serving as an evaluation value calculating unit, and a discriminating unit 22 that determines the category of the evaluation region according to the calculation result of the bias evaluation value calculating unit 21. The computing unit 20 of the second embodiment converts the texture component of the evaluation region from a continuous distribution to a discrete distribution, and calculates an evaluation value representing the homogeneity of the texture component from this discrete distribution. The remaining configuration is the same as shown in Fig. 1.
The bias evaluation value calculating unit 21 includes a discrete distribution calculating unit 211 and a homogeneity evaluation value calculating unit 212. The discrete distribution calculating unit 211 generates discrete distribution data composed of a plurality of discrete points from the texture component expressed as a continuous distribution. The homogeneity evaluation value calculating unit 212 calculates the evaluation value representing the homogeneity of the texture component from the discrete distribution data. The discriminating unit 22 judges the homogeneity of the texture component according to this evaluation value and, according to the result, determines whether the evaluation region is a mucosal region or a residue region.
The operation of the image processing apparatus 2 will next be described with reference to Figs. 10 to 12. Fig. 10 is a flowchart showing the operation of the image processing apparatus 2. Steps S1 to S3, S6, and S7 are the same as described in the first embodiment.
In step S21, the discrete distribution calculating unit 211 generates discrete distribution data from the texture component expressed as a continuous distribution. Fig. 11 is a diagram explaining the process of generating the discrete distribution data of the texture component of a mucosal region. Fig. 12 is a diagram explaining the process of generating the discrete distribution data of the texture component of a residue region. Specifically, the discrete distribution calculating unit 211 collects the information representing the evaluation region (coordinate information) from the evaluation region setting unit 16 and the texture component expressed as a continuous distribution from the texture component acquiring unit 17, and obtains the maximum pixel value in the evaluation region (Fig. 11(a), Fig. 12(a)).
Next, the discrete distribution calculating unit 211 obtains the value equal to a predetermined ratio of the maximum pixel value (hereinafter referred to as the sampling value). In Fig. 11(a) and Fig. 12(a), as an example, the sampling value is set to 50% of the maximum value. The discrete distribution calculating unit 211 then extracts the pixels (coordinates) having the same pixel value as the sampling value, and generates discrete distribution data using these pixels as the points constituting the discrete distribution (discrete points P) (Fig. 11(b), Fig. 12(b)).
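Step S21 can be sketched as follows. Since in practice few pixels equal the sampling value exactly, a small tolerance band is assumed here; the patent itself does not specify one.

```python
import numpy as np

def discrete_points(texture, ratio=0.5, tol=0.05):
    """Discrete-distribution data: (x, y) coordinates of pixels whose
    texture value is (approximately) the sampling value = ratio * max."""
    s = ratio * texture.max()
    ys, xs = np.where(np.abs(texture - s) <= tol * texture.max())
    return np.column_stack([xs, ys])

tex = np.array([[0.0, 0.5, 1.0],
                [0.5, 0.0, 0.5],
                [1.0, 0.5, 0.0]])
pts = discrete_points(tex)   # sampling value = 0.5
```

The resulting point set is what the spatial analysis of step S22 operates on.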
In step S22, the homogeneity evaluation value calculation unit 212 calculates an evaluation value representing the homogeneity of the texture component from the discrete distribution data generated by the discrete distribution calculation unit 211. For this purpose, the homogeneity evaluation value calculation unit 212 uses so-called spatial analysis, a family of methods for classifying discrete distribution data as "dispersed" or "concentrated". In the second embodiment, as an example of spatial analysis, the nearest-neighbor distance method, which analyzes the distribution of the discrete points P from the distances between them, is described.
First, as shown in FIG. 13, the distance d_i from each discrete point P_i (i = 1 to n) constituting the discrete distribution data to the nearest other discrete point is obtained. The average nearest-neighbor distance W, the mean of these distances d_i, is then calculated by formula (4).
W = (1/n) · Σ_{i=1}^{n} d_i    …(4)
In formula (4), n is the total number of discrete points P_i included in the discrete distribution data. The average nearest-neighbor distance W calculated in this way can be used as an evaluation value representing the homogeneity of the texture component.
FIG. 14 is a diagram showing the relation between the deviation (dispersion or concentration) of the discrete points P in the evaluation region A1 and the resulting average nearest-neighbor distance W. When the discrete points P are concentrated at roughly one position in the evaluation region A1 (FIG. 14(a)), the average nearest-neighbor distance W takes a small value (W = W1). When the number of positions at which the discrete points P concentrate increases (FIG. 14(b)), the average nearest-neighbor distance W becomes larger (W = W2 > W1); in FIG. 14(b), for example, the discrete points P concentrate at two positions in the evaluation region A1. The values W1 and W2 of the average nearest-neighbor distance in these cases indicate, as in FIG. 12(b), that the texture component is unevenly distributed. On the other hand, when the discrete points P are spread over the whole of the evaluation region A1 (FIG. 14(c)), the average nearest-neighbor distance W becomes even larger (W = W3 > W2). The value W3 of the average nearest-neighbor distance in this case indicates, as in FIG. 11(b), that the texture component is uniform.
In the following step S23, the discriminator 22 discriminates the homogeneity of the texture component and the class of the evaluation region based on the evaluation value (the average nearest-neighbor distance W) calculated by the homogeneity evaluation value calculation unit 212. More specifically, the discriminator 22 uses the expected value E[W] calculated by formula (5) as the threshold for judging homogeneity. The expected value E[W] of formula (5) is the expected value under the assumption that the discrete points P are randomly distributed in the discrete distribution data.
E[W] = k / (2·√(n/S))    …(5)
In formula (5), k is a coefficient (constant) determined experimentally, and S is the area of the evaluation region A1.
The homogeneity of the texture component is then judged as follows:
W ≥ E[W]: homogeneous
W < E[W]: inhomogeneous
Then, when the texture component is homogeneous, the discriminator 22 determines that the evaluation region is a mucosal region; when the texture component is inhomogeneous, it determines that the evaluation region is a residue region.
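Steps S22 and S23 can be sketched as follows, under illustrative assumptions: the coefficient k of formula (5) is set to 1 (the patent determines it experimentally), the point sets and area S are synthetic, and points spread over the region are taken as homogeneous, following the relation of FIG. 14 (spread-out points give a large W).

```python
import math

def average_nn_distance(points):
    """Average nearest-neighbour distance W of formula (4)."""
    n = len(points)
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        total += min(math.hypot(xi - xj, yi - yj)
                     for j, (xj, yj) in enumerate(points) if j != i)
    return total / n

def is_homogeneous(points, area, k=1.0):
    """Compare W with the random-distribution expectation
    E[W] = k / (2 * sqrt(n / S)) of formula (5).
    Spread-out points give W >= E[W] -> homogeneous (cf. FIG. 14)."""
    w = average_nn_distance(points)
    e_w = k / (2.0 * math.sqrt(len(points) / area))
    return w >= e_w

# Clustered points (residue-like) vs. a regular grid (mucosa-like),
# both inside a 30 x 30 evaluation region.
clustered = [(10 + dx, 10 + dy) for dx in range(3) for dy in range(3)]
regular = [(10 * i, 10 * j) for i in range(3) for j in range(3)]
print(is_homogeneous(clustered, area=900))  # False: W = 1 < E[W] = 5
print(is_homogeneous(regular, area=900))    # True:  W = 10 >= E[W] = 5
```

The O(n²) nearest-neighbor search is the simplest possible choice; a k-d tree would be used for large point sets.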
As described above, according to the second embodiment, the homogeneity of the texture component and the class of the evaluation region are discriminated based on the discrete distribution data representing the texture component; the overall amount of computation can therefore be reduced, and a discrimination result can be obtained in a shorter time.
The calculation of the evaluation value based on the discrete distribution data (step S22) may also be performed by various methods other than the one described above. Other methods of calculating the evaluation value based on the discrete distribution data are described below as modifications 2-1 to 2-3.
Modification 2-1: Calculation of the evaluation value based on the K-function method
The K-function method was developed from the nearest-neighbor distance method in order to identify distributions that the latter has difficulty distinguishing. The evaluation value K(h) based on the K-function method is calculated by formula (6).
K(h) = (1/n) · Σ_{i=1}^{n} k_i(h) / λ    …(6)
In formula (6), k_i(h) is the number of other discrete points existing within distance h from each discrete point P_i of the discrete distribution data. For example, for the discrete point P_i shown in the evaluation region A2 of FIG. 15, k_i(h) = 4. Also, λ is the density of the discrete points P in the evaluation region A2 (λ = n/S).
When this evaluation value K(h) is used, the homogeneity of the texture distribution is judged and the class of the evaluation region is discriminated as follows.
First, the expected value E[K(h)] of the K-function is calculated by formula (7).
E[K(h)] = π·h²    …(7)
where π is the circular constant.
The homogeneity of the texture distribution can be judged using this expected value E[K(h)] as the threshold, as follows:
K(h) < E[K(h)]: homogeneous
K(h) ≥ E[K(h)]: inhomogeneous
Accordingly, when the texture component is homogeneous, the evaluation region is determined to be a mucosal region; when the texture component is inhomogeneous, the evaluation region is determined to be a residue region.
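A brief sketch of the K-function judgment of formulas (6) and (7); the brute-force neighbor counting, the omission of edge correction, and the sample point sets are simplifying assumptions, not the patent's implementation.

```python
import math

def ripley_k(points, h, area):
    """Evaluation value K(h) of formula (6): for each point, count other
    points within distance h, average, and divide by the density
    lambda = n / S.  Edge correction is omitted for brevity."""
    n = len(points)
    lam = n / area
    total = 0
    for i, (xi, yi) in enumerate(points):
        total += sum(1 for j, (xj, yj) in enumerate(points)
                     if j != i and math.hypot(xi - xj, yi - yj) <= h)
    return (total / n) / lam

def is_homogeneous_k(points, h, area):
    """K(h) < E[K(h)] = pi * h**2  ->  homogeneous (formula (7))."""
    return ripley_k(points, h, area) < math.pi * h * h

clustered = [(10 + dx, 10 + dy) for dx in range(3) for dy in range(3)]
regular = [(10 * i, 10 * j) for i in range(3) for j in range(3)]
print(is_homogeneous_k(clustered, h=3.0, area=900))  # False: clustered
print(is_homogeneous_k(regular, h=3.0, area=900))    # True: dispersed
```

Clustered points have many neighbors within h, so K(h) exceeds the random expectation πh² and the region is judged inhomogeneous.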
Modification 2-2: Calculation of the evaluation value based on the χ² test
When the evaluation region A3 shown in FIG. 16 is evaluated, first imagine dividing the evaluation region A3 into a plurality of regions B (for example, rectangular regions) equal to each other in shape and area. If the discrete points P are assumed to be uniformly distributed over the evaluation region A3, the occurrence probability of the discrete points P in each region B is constant. In FIG. 16(a), for example, one discrete point P exists in each region B with equal probability. On the other hand, as shown in FIG. 16(b), when the distribution of the discrete points P is biased, the occurrence probability of the discrete points P differs from region B to region B. Therefore, if the degree of deviation between the occurrence probability of the discrete points P under the uniformity assumption and the actual occurrence probability of the discrete points P is obtained, this degree of deviation can be used as an evaluation value representing the homogeneity of the texture component.
The χ² test is a method of testing, given the probability PR_j that a discrete point P falls into the j-th region B_j, whether the theoretical values based on these probabilities agree with the actual distribution of the n discrete points P. When the χ² test is used to judge the homogeneity of the texture component, the occurrence probability PR of every region B is set to PR = 1/M (M being the number of regions B), and χ² is calculated by formula (8).
χ² = Σ_{j=1}^{M} (C_j − n/M)² / (n/M)    …(8)
In formula (8), C_j is the number of discrete points P contained in the j-th region B_j, and n is the total number of discrete points P contained in the evaluation region A3.
The χ² calculated by formula (8) can be used as an evaluation value representing the homogeneity of the texture distribution. That is, when χ² is less than a predetermined threshold (i.e., when the above degree of deviation is small), the texture distribution is judged to be uniform; in this case the evaluation region is determined to be a mucosal region. When χ² is equal to or greater than the predetermined threshold (i.e., when the above degree of deviation is large), the texture distribution is judged to be non-uniform; in this case the evaluation region is determined to be a residue region.
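The χ² statistic of formula (8) can be sketched over a grid of M = 3 × 3 equal rectangular cells; the grid size, the region dimensions, and the point sets are illustrative assumptions (no threshold from the patent is reproduced here).

```python
def chi_square(points, width, height, grid=3):
    """Chi-square statistic of formula (8) over an equal-area grid:
    chi2 = sum_j (C_j - n/M)**2 / (n/M), with M = grid*grid cells."""
    m = grid * grid
    counts = [0] * m
    for x, y in points:
        cx = min(int(x * grid / width), grid - 1)
        cy = min(int(y * grid / height), grid - 1)
        counts[cy * grid + cx] += 1
    n = len(points)
    expected = n / m
    return sum((c - expected) ** 2 / expected for c in counts)

# One point per cell (uniform) vs. all points in one cell (biased).
uniform = [(5 + 10 * i, 5 + 10 * j) for i in range(3) for j in range(3)]
biased = [(5 + dx, 5 + dy) for dx in range(3) for dy in range(3)]
print(chi_square(uniform, 30, 30))  # 0.0 -> texture judged uniform
print(chi_square(biased, 30, 30))   # 72.0 -> texture judged non-uniform
```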
Modification 2-3: Calculation of the evaluation value based on a diversity index
As shown in FIG. 16(a), when the discrete points P are uniformly distributed over the evaluation region A3, the number of regions B containing discrete points P is large. Conversely, as shown in FIG. 16(b), when the distribution of the discrete points P is biased, the number of regions B containing no discrete point P increases. Therefore, if the degree to which the discrete points P are distributed over the plurality of regions B is obtained, this degree can be used as an evaluation value representing the homogeneity of the texture component.
As a means of calculating the degree to which the discrete points P are distributed over the plurality of regions B, a diversity index can be used, for example. A diversity index evaluates how rich the variety of species (M) contained in a population (N) is. Specifically, treating the regions B as the species (M), the Simpson diversity index D is calculated by formula (9).
D = 1 − Σ_{j=1}^{M} (C_j / n)²    …(9)
In formula (9), C_j is the number of discrete points P contained in the j-th region B_j, and n is the total number of discrete points P contained in the evaluation region A3.
When this diversity index D is used, the homogeneity of the texture distribution is judged and the class of the evaluation region is discriminated as follows. When the diversity index D is equal to or greater than a predetermined threshold (i.e., when the discrete points P are distributed diversely), the texture distribution is uniform, and the evaluation region is determined to be a mucosal region. When the diversity index D is less than the predetermined threshold (i.e., when the distribution of the discrete points P is biased), the texture distribution is non-uniform, and the evaluation region is determined to be a residue region.
Alternatively, the Shannon index H′ of formula (10) may be calculated as the degree to which the discrete points P are distributed over the plurality of regions B.
H′ = −Σ_{j=1}^{M} (C_j / n) · ln(C_j / n)    …(10)
In this case, when the Shannon index H′ is equal to or greater than a predetermined threshold, the texture distribution is uniform, and the evaluation region is determined to be a mucosal region. When the Shannon index H′ is less than the predetermined threshold, the distribution of the texture component is uneven, and the evaluation region is determined to be a residue region.
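Formulas (9) and (10) operate directly on the per-cell counts C_j, which can be sketched as follows; the count vectors are hypothetical, and empty cells are skipped in the Shannon index under the usual convention 0·ln 0 = 0.

```python
import math

def simpson_d(counts):
    """Simpson diversity index D of formula (9)."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

def shannon_h(counts):
    """Shannon index H' of formula (10); empty cells (C_j = 0)
    contribute nothing, following the convention 0 * ln(0) = 0."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

uniform = [1] * 9         # one discrete point in each of 9 cells
biased = [9] + [0] * 8    # all points concentrated in a single cell
print(simpson_d(uniform), simpson_d(biased))  # ~0.889 vs. 0.0
print(shannon_h(uniform), shannon_h(biased))  # ln(9) vs. 0.0
```

Both indices are maximal for the uniform counts and fall to zero when every point lands in one cell, matching the mucosa/residue decision rule above.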
3rd embodiment
Next, an image processing apparatus according to a third embodiment of the present invention will be described.
FIG. 17 is a block diagram showing the configuration of the image processing apparatus of the third embodiment. The image processing apparatus 3 shown in FIG. 17 has a computation unit 30. The computation unit 30 has a deviation evaluation value calculator 31 serving as the evaluation value calculation unit, and a discriminator 32 that discriminates the class of the evaluation region from the calculation result of the deviation evaluation value calculator 31. The other components are the same as those shown in FIG. 1.
The deviation evaluation value calculator 31 obtains the frequency distribution of the pixel values (intensities) of the texture component of the evaluation region, and calculates an evaluation value representing the homogeneity of the texture component from the skew of the shape of this frequency distribution. The discriminator 32 judges the homogeneity of the texture component from this evaluation value and, based on the judgment result, discriminates whether the evaluation region is a mucosal region or a residue region.
The principle of class discrimination in the third embodiment is described here with reference to FIGS. 18 and 19. In FIG. 18(a) and FIG. 19(a), the horizontal axis represents the x-coordinate of the pixels contained in the evaluation region, and the vertical axis represents the difference I′ between the pixel value I and the mean pixel value μ of the evaluation region. In FIG. 18(b) and FIG. 19(b), the horizontal axis represents the difference I′ between the pixel value I and the mean μ, and the vertical axis represents the frequency of the difference I′.
FIG. 18(a) shows the characteristics of the texture component of a mucosal region. In a mucosal region, the pixel values of the texture component are relatively uniform. When a histogram is made of this texture component, the histogram takes a symmetric shape with small skew centered on the mean (I′ = 0) (FIG. 18(b)). FIG. 19(a) shows the characteristics of the texture component of a residue region. In a residue region, a large variation is seen in the pixel values of the texture component. When a histogram is made of this texture component, skew appears in the histogram (FIG. 19(b)). Therefore, by obtaining the symmetry (skew) of the histogram of the pixel values of the texture component, the homogeneity of the texture component can be judged, and whether the evaluation region is a mucosal region or a residue region can be discriminated from this homogeneity.
The operation of the image processing apparatus 3 is described below. FIG. 20 is a flowchart showing the operation of the image processing apparatus 3. The operations of steps S1 to S3, S6, and S7 are the same as those described in the first embodiment.
First, in step S31, the deviation evaluation value calculator 31 acquires the texture component from the texture component acquisition unit 17 and calculates the frequency distribution of the pixel values.
In the following step S32, the deviation evaluation value calculator 31 calculates the skewness Sk using formula (11), as an evaluation value representing the symmetry of the frequency distribution obtained in step S31.
Sk = Σ_i (I_i − μ)³ / (n·σ³)    …(11)
In formula (11), I_i is the pixel value of the i-th pixel, μ is the mean of the pixel values, σ is the standard deviation of the pixel values, and n is the total frequency of the pixel values (i.e., the number of pixels).
The skewness Sk is a value representing the degree to which the data (the pixel values I_i) are distributed asymmetrically with respect to the mean (the mean value μ). When the frequency distribution has no skew (i.e., it is a normal distribution), the skewness Sk is zero; the larger the skew, the farther Sk is from zero. When the frequency distribution is biased toward the I′ < 0 side (e.g., the case shown in FIG. 19(a)), Sk < 0; when the frequency distribution is biased toward the I′ > 0 side, Sk > 0.
In step S33, the discriminator 32 discriminates the homogeneity of the texture component and the class of the evaluation region based on the evaluation value (skewness) calculated by the deviation evaluation value calculator 31. More specifically, the discriminator 32 compares the skewness Sk calculated by formula (11) with predetermined thresholds Thresh1 and Thresh2 obtained in advance, and judges the homogeneity of the texture component as follows:
Thresh1 < Sk < Thresh2: homogeneous
Sk ≤ Thresh1 or Sk ≥ Thresh2: inhomogeneous
Then, when the texture component is homogeneous, the discriminator 32 determines that the evaluation region is a mucosal region; when the texture component is inhomogeneous, the discriminator 32 determines that the evaluation region is a residue region.
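Steps S31 to S33 can be sketched as follows; the thresholds Thresh1 = −0.5 and Thresh2 = 0.5 and the sample pixel values are illustrative assumptions, not values given in the patent.

```python
import math

def skewness(values):
    """Skewness Sk of formula (11): sum_i (I_i - mu)**3 / (n * sigma**3),
    with sigma the population standard deviation of the pixel values."""
    n = len(values)
    mu = sum(values) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in values) / n)
    return sum((v - mu) ** 3 for v in values) / (n * sigma ** 3)

def classify(values, thresh1=-0.5, thresh2=0.5):
    """Thresh1 < Sk < Thresh2 -> 'mucosa', otherwise 'residue'.
    The thresholds are illustrative; the patent obtains them in advance."""
    sk = skewness(values)
    return 'mucosa' if thresh1 < sk < thresh2 else 'residue'

symmetric = [3, 4, 5, 5, 5, 6, 7]   # symmetric histogram, Sk = 0
skewed = [1, 1, 1, 1, 1, 1, 10]     # long right tail, Sk well above 0.5
print(classify(symmetric))  # mucosa
print(classify(skewed))     # residue
```

With this rule, an evaluation region whose pixel-value histogram is symmetric about the mean is classified as mucosa, while a strongly skewed one is classified as residue.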
As described above, according to the third embodiment, the homogeneity of the texture component and the class of the evaluation region are discriminated based on the frequency distribution of the pixel values, so that a discrimination result can be obtained by an algorithm with a short processing time.
Each of the image processing apparatuses 1 to 3 described in the first to third embodiments can be realized by executing a prepared program on a computer system such as a personal computer or a workstation. Described below is a computer system that has the same functions as the image processing apparatuses 1 to 3 and executes the image processing program 141 stored in the recording unit 14.
FIG. 21 is a system configuration diagram of the computer system 4, and FIG. 22 is a block diagram showing the configuration of the main body 41 shown in FIG. 21. As shown in FIG. 21, the computer system 4 has a main body 41; a display 42 for displaying information such as images on a display screen 421 in response to instructions from the main body 41; a keyboard 43 for inputting various information to the computer system 4; and a mouse 44 for designating an arbitrary position on the display screen 421 of the display 42.
The main body 41 has a CPU 411, a RAM 412, a ROM 413, a hard disk drive (HDD) 414, a CD-ROM drive 415 that accepts a CD-ROM 46, a USB port 416 to which a USB memory 47 can be removably connected, an I/O interface 417 to which the display 42, the keyboard 43, and the mouse 44 are connected, and a LAN interface 418 for connecting to a local area network or wide area network (LAN/WAN) N1.
Furthermore, the computer system 4 is connected to a modem 45 for connecting to a public line N3 such as the Internet, and is connected, via the LAN interface 418 and the local area network or wide area network N1, to a personal computer (PC) 5 as another computer system, a server 6, a printer 7, and the like.
The computer system 4 reads and executes an image processing program (for example, the image processing program 141 shown in FIG. 1) recorded on a recording medium, thereby realizing the image processing apparatuses 1 to 3 described in the first to third embodiments. Here, the recording medium includes, in addition to the CD-ROM 46 and the USB memory 47, "portable physical media" such as MO disks, DVD disks, flexible disks (FD), magneto-optical disks, and IC cards; "fixed physical media" such as the HDD 414, the RAM 412, and the ROM 413 provided inside or outside the computer system 4; and all other recording media that record an image processing program readable by the computer system 4. In this sense, "communication media" that hold the program for a short time during transmission, such as the public line N3 connected via the modem 45 and the local area network or wide area network N1 to which the PC 5 and the server 6 as other computer systems are connected, also fall under the recording medium. Furthermore, the image processing program is not limited to being executed by the computer system 4; the present invention is equally applicable to the case where the PC 5 or the server 6 as another computer system executes the image processing program, and to the case where these devices cooperate to execute the image processing program.
Furthermore, the present invention is not limited to the first to third embodiments and the modifications; various inventions can be formed by appropriately combining the plurality of elements disclosed in the embodiments and modifications. For example, some elements may be removed from all the elements shown in an embodiment or modification, or elements shown in different embodiments and modifications may be combined as appropriate.
According to one aspect of the present invention, the class of an evaluation region is discriminated from the homogeneity of the texture component of the in-vivo image within the evaluation region; an appropriate discrimination result can therefore be obtained by an algorithm with a short processing time, without being affected by imaging conditions.

Claims (14)

1. An image processing apparatus, characterized by comprising:
an evaluation region setting unit that sets, in an in-vivo image, an evaluation region as an object of class discrimination;
a texture component acquisition unit that acquires a texture component of the in-vivo image within the evaluation region;
an evaluation value calculation unit that calculates an evaluation value representing the homogeneity of the texture component in a coordinate space, i.e., a space of coordinates representing pixel positions; and
a discrimination unit that discriminates the class of the evaluation region based on the evaluation value.
2. The image processing apparatus according to claim 1, characterized in that
the in-vivo image is an image of a digestive tract, and
when the evaluation value falls within a predetermined range indicating that the texture component is homogeneous, the discrimination unit discriminates that the class of the evaluation region is a mucosal region.
3. The image processing apparatus according to claim 1, characterized in that
the evaluation value calculation unit calculates the distance between the coordinate centroid of the evaluation region and a weighted coordinate centroid obtained by weighting the pixels contained in the evaluation region by the pixel values of the texture component.
4. The image processing apparatus according to claim 3, characterized in that
the evaluation value calculation unit calculates the weighted coordinate centroid by weighting a subset of the pixels of the evaluation region by the pixel values of the texture component.
5. The image processing apparatus according to claim 4, characterized in that
the evaluation value calculation unit calculates the weighted coordinate centroid by weighting the pixels whose pixel value is equal to or greater than a predetermined ratio of the maximum pixel value of the texture component.
6. The image processing apparatus according to claim 1, characterized in that
the evaluation value calculation unit comprises:
a discrete distribution calculation unit that generates, from the texture component expressed as a continuous distribution, discrete distribution data composed of a plurality of discrete points; and
a homogeneity evaluation value calculation unit that calculates the evaluation value from the coordinate information of the plurality of discrete points.
7. The image processing apparatus according to claim 6, characterized in that
the discrete distribution calculation unit extracts, as the plurality of discrete points, the pixels whose pixel value is a predetermined ratio of the maximum pixel value of the texture component.
8. The image processing apparatus according to claim 6, characterized in that
the homogeneity evaluation value calculation unit calculates the evaluation value by analyzing the distribution of the plurality of discrete points from the distances between the plurality of discrete points.
9. The image processing apparatus according to claim 8, characterized in that
the homogeneity evaluation value calculation unit calculates the evaluation value using the distance between each discrete point and the other discrete point closest to it.
10. The image processing apparatus according to claim 8, characterized in that
the homogeneity evaluation value calculation unit calculates the evaluation value using each discrete point and the number of discrete points contained within a predetermined distance from it.
11. The image processing apparatus according to claim 6, characterized in that
the homogeneity evaluation value calculation unit divides the evaluation region into a plurality of regions equal to each other in shape and area, and calculates the evaluation value from the number of discrete points contained in each of the plurality of regions.
12. The image processing apparatus according to claim 11, characterized in that
the homogeneity evaluation value calculation unit calculates the evaluation value using a χ² test.
13. The image processing apparatus according to claim 11, characterized in that
the homogeneity evaluation value calculation unit calculates the evaluation value using a diversity index, the diversity index representing the degree of variety of the species contained in a population.
14. An image processing method, characterized by comprising:
an evaluation region setting step of setting, in an in-vivo image, an evaluation region as an object of class discrimination;
a texture component acquisition step of acquiring a texture component of the in-vivo image within the evaluation region;
an evaluation value calculation step of calculating an evaluation value representing the homogeneity of the texture component in a coordinate space, i.e., a space of coordinates representing pixel positions; and
a discrimination step of discriminating the class of the evaluation region based on the evaluation value.
CN201110386605.5A 2010-11-29 2011-11-29 Image processing apparatus and image processing method Expired - Fee Related CN102567988B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2010265795A JP5576775B2 (en) 2010-11-29 2010-11-29 Image processing apparatus, image processing method, and image processing program
JP2010-265795 2010-11-29

Publications (2)

Publication Number Publication Date
CN102567988A CN102567988A (en) 2012-07-11
CN102567988B true CN102567988B (en) 2016-12-14


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1277241C (en) * 1999-03-18 2006-09-27 纽约州立大学研究基金会 System and method for performing a three-dimensional virtual examination, navigation and visualization
EP1870020A1 (en) * 2005-04-13 2007-12-26 Olympus Medical Systems Corp. Image processing device and method
CN101794434A (en) * 2008-11-07 2010-08-04 奥林巴斯株式会社 Image display device and image processing method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Hua-Rong Jin et al.; Extraction of Microcalcifications from Mammograms Using Morphological Filter with Multiple Structuring Elements; Systems and Computers in Japan; 1993-01-01; vol. 24, no. 11; pp. 66-74 *
Bo Hua et al.; Analysis of the Computation of Gray-Level Co-occurrence Matrices for Image Texture (in Chinese); Acta Electronica Sinica (《电子学报》); January 2006; vol. 34, no. 1; pp. 155-158 *
Jin Zhendong et al.; Texture Analysis of Endoscopic Ultrasonographic Images of Gastric Submucosal Tumors (in Chinese); Chinese Journal of Medical Imaging Technology (《中国医学影像技术》); July 1998; vol. 14, no. 7; pp. 483-485 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20161214

Termination date: 20181129