CN102243710B - Image processing apparatus and image processing method - Google Patents


Info

Publication number
CN102243710B
CN102243710B (application CN201110122736.2A)
Authority
CN
China
Prior art keywords
image
reliability
initial value
specific region
image processing
Prior art date
Legal status
Active
Application number
CN201110122736.2A
Other languages
Chinese (zh)
Other versions
CN102243710A (en)
Inventor
北村诚
神田大和
河野隆志
弘田昌士
松田岳博
Current Assignee
Olympus Corp
Original Assignee
Olympus Corp
Priority date
Filing date
Publication date
Priority claimed from JP2010112541A (JP5576711B2)
Application filed by Olympus Corp
Publication of CN102243710A
Application granted
Publication of CN102243710B
Legal status: Active


Abstract

The present invention provides an image processing apparatus and an image processing method. The image processing apparatus includes: a discrimination standard creating unit that creates a discrimination standard for discriminating a specific region in a processing target image, the processing target image being selected in chronological order from the images constituting a time-series image; a feature value calculating unit that calculates a feature value for each divided region of the processing target image; and a specific region discriminating unit that discriminates the specific region in the processing target image by using the discrimination standard according to the feature value of each divided region. The discrimination standard creating unit creates the discrimination standard according to the feature values of the specific regions discriminated in images for which processing has already been completed.

Description

Image processing apparatus and image processing method
Technical field
The present invention relates to an image processing apparatus and an image processing method for processing a series of time-series images obtained by chronologically imaging an object.
Background Art
A diagnosis support apparatus is known that discriminates a lesion region from an image obtained by imaging the inside of a subject with an endoscope apparatus and displays it on a display unit (see Japanese Unexamined Patent Publication No. 2002-165757). In the technique disclosed in this publication, teacher data serving as a criterion for discriminating a lesion region is prepared in advance, and feature values such as the color ratio of each pixel are classified by using the teacher data, thereby discriminating the lesion region.
However, when the imaged object is a living body, the color of the object varies with individual differences and the like. Taking mucosa as an example, the color of the mucosa varies with individual differences between subjects. In addition, the color of the mucosa differs depending on the type of digestive tract, so even for the same subject it changes depending on the position in the living body at which the image is captured. Therefore, in a method that uses teacher data prepared in advance, as in Japanese Unexamined Patent Publication No. 2002-165757, the criterion is fixed, and there is a problem in that it is difficult to discriminate the specific region in the image with high accuracy for various in-vivo images.
Summary of the invention
The present invention has been made in view of the above, and an object thereof is to provide an image processing apparatus and an image processing method capable of accurately discriminating a specific region in an image, the image constituting a series of time-series images obtained by chronologically imaging an object.
The image processing apparatus according to the present invention is an image processing apparatus that processes a series of time-series images obtained by chronologically imaging an object, and includes: an image selecting unit that selects a processing target image in chronological order from the images constituting the time-series images; a discrimination standard creating unit that creates a discrimination standard for discriminating a specific region in the processing target image; a feature value calculating unit that calculates a feature value for each pixel or each small region of the processing target image; and a specific region discriminating unit that discriminates the specific region in the processing target image by using the discrimination standard according to the feature value of each pixel or each small region. The discrimination standard creating unit creates the discrimination standard according to the feature values of the specific regions discriminated by the specific region discriminating unit in images that have already been selected as the processing target image by the image selecting unit.
The image processing method according to the present invention includes the steps of: selecting a processing target image in chronological order from images constituting a series of time-series images obtained by chronologically imaging an object; creating a discrimination standard for discriminating a specific region in the processing target image; calculating a feature value for each pixel or each small region of the processing target image; and discriminating the specific region in the processing target image by using the discrimination standard according to the feature value of each pixel or each small region. In creating the discrimination standard, the discrimination standard is created according to the feature values of the specific regions discriminated in images that have already been selected as the processing target image and for which the discrimination has been completed.
The above and other objects, features, advantages, and technical and industrial significance of the present invention will be better understood by reading the following detailed description of the invention in conjunction with the accompanying drawings.
Brief Description of the Drawings
Fig. 1 is a block diagram explaining the functional configuration of the image processing apparatus according to embodiment 1.
Fig. 2 is an overall flowchart showing the processing procedure performed by the image processing apparatus of embodiment 1.
Fig. 3 is a flowchart showing the detailed processing procedure of the feature value calculation process of embodiment 1.
Fig. 4 is a flowchart showing the detailed processing procedure of the specific region discrimination process of embodiment 1.
Fig. 5 is a flowchart showing the detailed processing procedure of the reliability calculation process of embodiment 1.
Fig. 6 is a flowchart showing the detailed processing procedure of the discrimination standard creation process of embodiment 1.
Fig. 7 is a block diagram showing the functional configuration of the image processing apparatus of embodiment 2.
Fig. 8 is an overall flowchart showing the processing procedure performed by the image processing apparatus of embodiment 2.
Fig. 9 is a flowchart showing the detailed processing procedure of the specific region discrimination process of embodiment 2.
Fig. 10 is a flowchart showing the detailed processing procedure of the discrimination standard creation process of embodiment 2.
Fig. 11 is a block diagram explaining the functional configuration of the image processing apparatus according to embodiment 3.
Fig. 12 is an explanatory diagram showing an outline of the processing performed by the image processing apparatus of embodiment 3.
Fig. 13 is an overall flowchart showing the processing procedure performed by the image processing apparatus of embodiment 3.
Fig. 14 is an overall flowchart showing the processing procedure following Fig. 13.
Fig. 15 is a block diagram explaining the functional configuration of the image processing apparatus according to embodiment 4.
Fig. 16 is an overall flowchart showing the processing procedure performed by the image processing apparatus of embodiment 4.
Fig. 17 is a flowchart showing the detailed processing procedure of the initial value setting process of embodiment 4.
Fig. 18 is a block diagram explaining the functional configuration of the image processing apparatus according to embodiment 5.
Fig. 19 is an overall flowchart showing the processing procedure performed by the image processing apparatus of embodiment 5.
Fig. 20 is a flowchart showing the detailed processing procedure of the neighboring region discrimination process of embodiment 5.
Fig. 21 is a block diagram explaining the functional configuration of the image processing apparatus according to embodiment 6.
Fig. 22 is an overall flowchart showing the processing procedure performed by the image processing apparatus of embodiment 6.
Fig. 23 is a flowchart showing the detailed processing procedure of the reliability calculation standard setting process of embodiment 6.
Fig. 24 is a system configuration diagram showing the configuration of a computer system to which the present invention is applied.
Fig. 25 is a block diagram showing the configuration of the main part of the computer system constituting Fig. 24.
Detailed description of the invention
Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings. Note that the present invention is not limited to these embodiments. In the description of the drawings, the same parts are denoted by the same reference signs.
The image processing apparatus according to the present embodiment processes a series of images (time-series images) obtained by a medical observation device, such as an endoscope or a capsule endoscope, that is introduced into the body of a subject and chronologically images an in-vivo lumen such as the digestive tract. In the following, description is given taking as an example the case where a region of normal mucosa (normal mucosa region) in the images constituting the time-series images (hereinafter referred to as "in-vivo lumen images") is discriminated as the specific region. Here, each in-vivo lumen image constituting the time-series images processed by the image processing apparatus of the present embodiment is, for example, a color image having, at each pixel, pixel values of 256 gradations for each of the R (red), G (green), and B (blue) color components. Note that the specific region is not limited to the normal mucosa region, and any region desired to be discriminated in the in-vivo lumen image may be set as the specific region. Furthermore, the present invention is not limited to images in which the imaged object is an in-vivo lumen, and can also be applied to images showing other objects.
Embodiment 1
First, the image processing apparatus of embodiment 1 will be described. Fig. 1 is a block diagram showing the functional configuration of the image processing apparatus 1 of embodiment 1. As shown in Fig. 1, the image processing apparatus 1 of embodiment 1 includes an image acquiring unit 11, an input unit 12, a display unit 13, a storage unit 14, an arithmetic unit 20, and a control unit 15 that controls the overall operation of the image processing apparatus 1.
The image acquiring unit 11 acquires the image data of the time-series images captured by the medical observation device. The image data of the time-series images acquired by the image acquiring unit 11 (specifically, the image data of each in-vivo lumen image constituting the time-series images) is stored in the storage unit 14, processed by the arithmetic unit 20, and then displayed on the display unit 13 as appropriate. When a portable storage medium is used for exchanging image data with the medical observation device, for example, the image acquiring unit 11 is constituted by a reading device that detachably mounts this storage medium and reads the image data of the stored time-series images. When a server that stores the image data of the time-series images captured by the medical observation device is installed at an appropriate location and the image data is acquired from this server, the image acquiring unit 11 is constituted by a communication device or the like for connecting to the server, and data communication with the server is performed via the image acquiring unit 11 to acquire the image data of the time-series images. Alternatively, when the image data is acquired from the medical observation device via a cable, the image acquiring unit 11 may be constituted by an interface device or the like that inputs the image data.
The input unit 12 is realized by, for example, a keyboard, a mouse, a touch panel, various switches, and the like, and outputs an input signal to the control unit 15. The display unit 13 is realized by a display device such as an LCD or an EL display, and displays various screens including the in-vivo lumen images under the control of the control unit 15.
The storage unit 14 is realized by various IC memories such as ROM and RAM, e.g., an updatable flash memory, a hard disk that is built in or connected via a data communication terminal, an information storage medium such as a CD-ROM and a reading device therefor, and the like. The storage unit 14 stores in advance, or temporarily stores at each processing, programs for operating the image processing apparatus 1 and realizing the various functions of the image processing apparatus 1, and data used during execution of these programs. The storage unit 14 stores the image data of the time-series images acquired by the image acquiring unit 11. The storage unit 14 further stores an image processing program 141 for discriminating the normal mucosa region in each in-vivo lumen image constituting the time-series images.
The arithmetic unit 20 is realized by hardware such as a CPU. The arithmetic unit 20 sequentially processes, in chronological order, the in-vivo lumen images constituting the time-series images, and performs various arithmetic operations for discriminating the normal mucosa region shown in each in-vivo lumen image. In embodiment 1, the in-vivo lumen images are processed one by one from the front of the sequence toward the rear (i.e., from the in-vivo lumen image whose chronological order is at the front end (the first image) to the in-vivo lumen image at the end (for example, the N-th image)). In embodiment 1, for the in-vivo lumen image whose chronological order is at the front end, the normal mucosa region is discriminated by using an initial value created in advance as the discrimination standard. The initial value of the discrimination standard created in advance is stored in the storage unit 14. Thereafter, the discrimination standard for the normal mucosa region is created as appropriate by using the feature values of the normal mucosa regions discriminated in the in-vivo lumen images for which processing has been completed, and the normal mucosa region is discriminated by using the created discrimination standard.
The arithmetic unit 20 includes a feature value calculating unit 21, a discrimination standard creating unit 22, and a specific region discriminating unit 25. The feature value calculating unit 21 divides the in-vivo lumen image to be processed into divided regions, for example small rectangular regions, and calculates, for each divided region, the average value of the G/R values and the average value of the B/G values described later as the feature values.
The discrimination standard creating unit 22 is a functional unit that creates the discrimination standard for discriminating the normal mucosa region. Before the in-vivo lumen image whose chronological order is at the front end is processed, the discrimination standard is set to the above-described initial value. On the other hand, after the first in-vivo lumen image constituting the time-series images has been processed, the discrimination standard for processing the next in-vivo lumen image in chronological order is created as appropriate each time. The discrimination standard creating unit 22 includes an initial value setting unit 221 and a weighted average calculating unit 222. The initial value setting unit 221 sets the discrimination standard to the initial value. The weighted average calculating unit 222 is a functional unit that calculates the weighted average of the feature values of the normal mucosa regions discriminated in the in-vivo lumen images for which processing has been completed, and includes a time-series distance weight setting unit 223 and a reliability weight setting unit 224.
The time-series distance weight setting unit 223 sets, for the feature values of the normal mucosa region in an in-vivo lumen image for which processing has been completed, a weight value corresponding to the distance in time series (time-series distance) (hereinafter referred to as "weight value based on time-series distance"). Here, the time-series distance corresponds to the difference between the chronological order of the in-vivo lumen image containing the normal mucosa region and the chronological order of the in-vivo lumen image being processed.
The reliability weight setting unit 224 includes a reliability calculating unit 225. The reliability calculating unit 225 calculates the reliability of the discrimination result for a divided region in the processing target in-vivo lumen image that has been discriminated as a normal mucosa region by the specific region discriminating unit 25 (the reliability that the divided region is a normal mucosa region), and sets, according to this reliability, a weight value for the feature values of the normal mucosa region (hereinafter referred to as "weight value based on reliability"). The reliability calculating unit 225 includes a reliability calculation standard setting unit 226 and a non-specific region determining unit 227. The reliability calculation standard setting unit 226 sets a reliability calculation standard for calculating the reliability. The non-specific region determining unit 227 determines whether there is a region in the processing target in-vivo lumen image that has been discriminated as a non-normal mucosa region by the specific region discriminating unit 25.
The specific region discriminating unit 25 discriminates the normal mucosa region in the processing target in-vivo lumen image by using the discrimination standard created by the discrimination standard creating unit 22 according to the feature values calculated by the feature value calculating unit 21.
The control unit 15 is realized by hardware such as a CPU. The control unit 15 issues instructions to and transfers data to the units constituting the image processing apparatus 1 according to the image data acquired by the image acquiring unit 11, the input signals input from the input unit 12, the programs and data stored in the storage unit 14, and the like, and comprehensively controls the overall operation of the image processing apparatus 1.
Fig. 2 is an overall flowchart showing the processing procedure performed by the image processing apparatus 1 of embodiment 1. The processing described here is realized by the units of the image processing apparatus 1 operating according to the image processing program 141 stored in the storage unit 14.
As shown in Fig. 2, first, the image acquiring unit 11 acquires the image data of the time-series images (step a1). The image data of each in-vivo lumen image constituting the acquired time-series images is stored in the storage unit 14 together with an image number indicating its chronological order, so that the image data of the in-vivo lumen image labeled with an arbitrary image number can be read out.
Then, with each in-vivo lumen image constituting the time-series images being taken as the processing target in turn, the process of discriminating the normal mucosa region in each in-vivo lumen image is performed. First, in the arithmetic unit 20, the initial value setting unit 221 of the discrimination standard creating unit 22 reads out the initial value created in advance and stored in the storage unit 14, and sets the read-out initial value as the discrimination standard (step a3). This initial value is used to discriminate the normal mucosa region in the in-vivo lumen image whose chronological order is at the front end, which is read out as the processing target image in step a5 of the subsequent stage. The initial value is created, for example, by calculating in advance the feature value distribution of the normal mucosa region.
Here, the calculation procedure of the feature value distribution of the normal mucosa region used as the initial value will be described. Before calculating the initial value, a plurality of in-vivo lumen images in which a normal mucosa region is captured are prepared. First, each of the prepared in-vivo lumen images showing a normal mucosa region is divided into rectangular blocks of a predetermined size (for example, 8 × 8 pixels) serving as the divided regions.
Then, feature values are calculated for each of the divided regions. Here, the average value of the G/R values (the color ratio of the G component to the R component) and the average value of the B/G values (the color ratio of the B component to the G component) of the pixels in the divided region are calculated as the feature values. If the total of the values of the R component (R values) in the divided region is denoted Rsum, the total of the values of the G component (G values) in the divided region is denoted Gsum, and the total of the values of the B component (B values) in the divided region is denoted Bsum, the average value GR of the G/R values is given by the following formula (1), and the average value BG of the B/G values is given by the following formula (2).
GR = Gsum / Rsum … (1)
BG = Bsum / Gsum … (2)
Then, the process of calculating the average value of the G/R values and the average value of the B/G values for each divided region in this manner is performed on each of the prepared in-vivo lumen images, and a frequency distribution of the obtained average G/R values and average B/G values is created on a two-dimensional feature plane. The created frequency distribution is then normalized so that the total of the frequencies becomes 100, and is used as the feature value distribution of the normal mucosa region. The feature value distribution of the normal mucosa region calculated in this way is used as the initial value of the discrimination standard and stored in the storage unit 14.
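As a rough sketch of this initial value computation, the following Python code (NumPy-based; the function names, the bin layout, and the feature-plane range are assumptions for illustration and are not taken from the patent) divides an RGB image into 8 × 8 blocks, computes the per-block averages of the G/R and B/G values according to formulas (1) and (2), and accumulates them into a frequency distribution normalized to a total of 100.

```python
import numpy as np

def block_features(img, block=8):
    """Per-block average G/R and B/G ratios (formulas (1) and (2)).

    img: H x W x 3 array with R, G, B channels; returns an (n_blocks, 2)
    array of [G_sum/R_sum, B_sum/G_sum], one row per divided region.
    """
    h, w, _ = img.shape
    feats = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            r = img[y:y + block, x:x + block, 0].astype(np.float64).sum()
            g = img[y:y + block, x:x + block, 1].astype(np.float64).sum()
            b = img[y:y + block, x:x + block, 2].astype(np.float64).sum()
            feats.append((g / max(r, 1e-9), b / max(g, 1e-9)))
    return np.asarray(feats)

def feature_distribution(feats, bins=32, rng=((0.0, 2.0), (0.0, 2.0)), total=100.0):
    """Frequency distribution on the two-dimensional (G/R, B/G) feature plane,
    normalized so that the frequencies sum to `total` (100 in the patent)."""
    hist, _, _ = np.histogram2d(feats[:, 0], feats[:, 1], bins=bins, range=rng)
    s = hist.sum()
    return hist * (total / s) if s > 0 else hist

def make_initial_value(normal_mucosa_images):
    """Initial discrimination standard from images known to show normal mucosa."""
    all_feats = np.vstack([block_features(im) for im in normal_mucosa_images])
    return feature_distribution(all_feats)
```

In this sketch, make_initial_value would be run once, offline, over images known to show only normal mucosa, and its result stored as the initial value.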
Returning to Fig. 2, the arithmetic unit 20 then performs the process of reading out, from the time-series images stored in the storage unit 14 in step a1, the in-vivo lumen image whose chronological order is at the front end, as the processing target image (step a5); that is, it selects the processing target image as the image selecting unit. Then, the process proceeds to step a7, and the feature value calculating unit 21 executes the feature value calculation process. Fig. 3 is a flowchart showing the detailed processing procedure of the feature value calculation process.
In the feature value calculation process, the feature value calculating unit 21 first divides the processing target image into rectangular blocks of, for example, 8 × 8 pixels, i.e., the divided regions (step b1). At this time, each divided region is associated with each pixel position in the in-vivo lumen image so that the divided region to which each pixel of the processing target image belongs can be identified in subsequent processing. Specifically, a unique region number such as a serial number is assigned to each divided region. Then, a region number image is created in which the pixel value of each pixel is set to the region number of the divided region to which that pixel belongs.
Next, the feature value calculating unit 21 calculates, for each of the divided regions, the average value of the G/R values and the average value of the B/G values as the feature values (step b3). Specifically, for each divided region, the average value of the G/R values of the pixels in the divided region is calculated according to the above formula (1), and the average value of the B/G values of the pixels in the divided region is calculated according to the above formula (2). The feature values of the processing target image calculated in this way for each divided region, i.e., the average G/R value and the average B/G value, are stored in the storage unit 14 in association with the image number. Thereafter, the process returns to step a7 of Fig. 2 and proceeds to step a9.
In step a9, the specific region discriminating unit 25 executes the specific region discrimination process. Fig. 4 is a flowchart showing the detailed processing procedure of the specific region discrimination process. In this specific region discrimination process, the normal mucosa region in the processing target image is discriminated for each divided region according to the feature values calculated for each divided region in step a7 of Fig. 2. At this time, the discrimination standard set for discriminating the normal mucosa region in the processing target image, i.e., the discrimination standard that is the latest at the current time, is used. In embodiment 1, the discrimination standard is the frequency distribution of the feature values of the normal mucosa region (the average G/R value and the average B/G value) on the two-dimensional feature plane. When the in-vivo lumen image whose chronological order is at the front end is the processing target image, the initial value set in step a3 of Fig. 2 corresponds to the discrimination standard that is the latest at the current time. When an in-vivo lumen image whose chronological order is second or later is the processing target image, the discrimination standard last updated in step e11 of Fig. 6 described later corresponds to the discrimination standard that is the latest at the current time. Specifically, whether each divided region is a normal mucosa region is discriminated by comparing the feature values of each divided region with the discrimination standard.
For example, in embodiment 1, the normal mucosa region is discriminated by calculating the Mahalanobis distance between the feature values of each divided region and the discrimination standard (reference: CG-ARTS Society, Digital Image Processing, pp. 222-223). That is, as shown in Fig. 4, the specific region discriminating unit 25 first calculates, for each divided region, the Mahalanobis distance between its feature values, i.e., the calculated average G/R value and average B/G value, and the discrimination standard that is the latest at the current time (step c1). Then, the specific region discriminating unit 25 discriminates divided regions whose calculated Mahalanobis distance is within a predetermined normal range as normal mucosa regions (step c3). At this time, divided regions whose Mahalanobis distance is outside the predetermined normal range are discriminated as non-normal mucosa regions. The predetermined normal range may be a fixed value, or may be configured so that it can be changed by user operation or the like. The discrimination result of the normal mucosa region thus determined in the processing target image (the discrimination result of whether each divided region is a normal mucosa region or a non-normal mucosa region) is stored in the storage unit 14. Thereafter, the process returns to step a9 of Fig. 2 and proceeds to step a11.
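One possible way to realize the comparison of steps c1 and c3 is sketched below: the discrimination standard (a frequency distribution on the feature plane) is treated as an empirical distribution, its mean and covariance are derived from the weighted bin centres, and the Mahalanobis distance of each divided region's feature values is thresholded. The bin range, the default threshold, and the helper names are assumptions; the patent only specifies that regions whose Mahalanobis distance is within a predetermined normal range are judged as normal mucosa.

```python
import numpy as np

def hist_mean_cov(hist, rng=((0.0, 2.0), (0.0, 2.0))):
    """Mean vector and covariance matrix of a 2-D frequency distribution,
    using bin centres weighted by the bin frequencies."""
    nx, ny = hist.shape
    xs = np.linspace(rng[0][0], rng[0][1], nx, endpoint=False) + (rng[0][1] - rng[0][0]) / (2 * nx)
    ys = np.linspace(rng[1][0], rng[1][1], ny, endpoint=False) + (rng[1][1] - rng[1][0]) / (2 * ny)
    gx, gy = np.meshgrid(xs, ys, indexing="ij")
    pts = np.stack([gx.ravel(), gy.ravel()], axis=1)
    w = hist.ravel() / hist.sum()
    mean = (pts * w[:, None]).sum(axis=0)
    diff = pts - mean
    cov = (w[:, None, None] * diff[:, :, None] * diff[:, None, :]).sum(axis=0)
    return mean, cov

def discriminate_regions(feats, standard_hist, normal_range=2.0):
    """Steps c1-c3: Mahalanobis distance of each region's [G/R, B/G] feature
    to the discrimination standard; True marks a normal mucosa region."""
    mean, cov = hist_mean_cov(standard_hist)
    inv = np.linalg.pinv(cov)
    diff = feats - mean
    d = np.sqrt(np.einsum("ij,jk,ik->i", diff, inv, diff))
    return d <= normal_range, d
```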
In step a11, the reliability calculation standard setting unit 226 of the reliability calculating unit 225 sets the reliability calculation standard used in the reliability calculation process of step a13. For example, when there are divided regions discriminated as non-normal mucosa regions in step c3 of Fig. 4, the reliability calculation standard setting unit 226 calculates the frequency distribution, on the two-dimensional feature plane, of the feature values of the divided regions discriminated as non-normal mucosa regions, i.e., their average G/R values and average B/G values, and sets the obtained frequency distribution as the reliability calculation standard. The set reliability calculation standard is stored in the storage unit 14.
Note that the procedure for setting the reliability calculation standard is not limited to this. For example, as a first modification, the discrimination standard that is the latest at the current time, i.e., the discrimination standard for discriminating the normal mucosa region in the processing target image, may be set as the reliability calculation standard.
As a second modification, before setting the reliability calculation standard, in-vivo lumen images in which a normal mucosa region is captured and in-vivo lumen images in which a non-normal mucosa region is captured are prepared. First, the in-vivo lumen images showing the normal mucosa region are divided into divided regions, the average G/R value and the average B/G value are calculated for each divided region, and these are collected as the feature values of the normal mucosa region. Similarly, the in-vivo lumen images showing the non-normal mucosa region are divided into divided regions, the average G/R value and the average B/G value are calculated for each divided region, and these are collected as the feature values of the non-normal mucosa region. Then, a feature value range of the normal mucosa region is determined from the collected feature values of the normal mucosa region and the feature values of the non-normal mucosa region, and the determined feature value range of the normal mucosa region may be used as the reliability calculation standard. In this case, the reliability calculation standard can be created and stored in the storage unit 14 in advance, so that in step a11 it is sufficient to read out the reliability calculation standard from the storage unit 14.
When the reliability calculation standard has been set as described above, the reliability calculating unit 225 then executes the reliability calculation process and calculates the reliability for each divided region in the processing target image that has been discriminated as a normal mucosa region (step a13). Fig. 5 is a flowchart showing the detailed processing procedure of the reliability calculation process. In this reliability calculation process, the reliability is calculated for the discrimination result of normal mucosa region / non-normal mucosa region made for each divided region in step c3 of Fig. 4.
In the reliability calculation process, the non-specific region determining unit 227 first determines whether there are divided regions discriminated as non-normal mucosa regions in step c3 of Fig. 4 (step d1). If there are divided regions discriminated as non-normal mucosa regions (step d3: Yes), the reliability calculating unit 225 calculates the reliability of the normal mucosa region for each divided region. That is, the reliability calculating unit 225 sequentially processes the divided regions discriminated as normal mucosa regions, and calculates the Mahalanobis distance between the feature values of each such divided region and the reliability calculation standard set in step a11 of Fig. 2 (the frequency distribution of the feature values of the divided regions discriminated as non-normal mucosa regions in the processing target image) (step d4). Then, among the divided regions discriminated as normal mucosa regions, the reliability calculating unit 225 sets the reliability of divided regions whose calculated Mahalanobis distance is equal to or greater than a predetermined threshold value to "1", and sets the reliability of divided regions whose Mahalanobis distance is less than the threshold value to "0" (step d5). The reliability values thus set for the divided regions discriminated as normal mucosa regions are stored in the storage unit 14. The predetermined threshold value may be a fixed value, or may be configured so that it can be changed by user operation.
Here, as described above, the reliability calculation standard is the frequency distribution of the feature values of the non-normal mucosa regions in the processing target image. Therefore, intermediate regions between the normal mucosa region and regions other than the normal mucosa region, such as non-normal mucosa regions, are determined according to the Mahalanobis distance calculated in step d4, and among the divided regions discriminated as normal mucosa regions, a low reliability is calculated for divided regions determined to be such intermediate regions. Thereafter, the process returns to step a13 of Fig. 2 and proceeds to step a15.
On the other hand, if there are no divided regions discriminated as non-normal mucosa regions (step d3: No), the reliability calculating unit 225 sets the reliability of all divided regions discriminated as normal mucosa regions to "1" (step d7). The set reliability values are stored in the storage unit 14. Thereafter, the process returns to step a13 of Fig. 2 and proceeds to step a15.
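The branch structure of steps d1 to d7 could be expressed as in the following sketch, where the reliability calculation standard is represented by the mean and covariance of the features of the regions judged non-normal rather than by an explicit frequency distribution; this simplification, the threshold value, and all names are assumptions made for illustration.

```python
import numpy as np

def region_reliability(feats, is_normal, dist_threshold=2.0):
    """Steps d1-d7: reliability (1 or 0) for every region judged as normal mucosa.

    feats: (n, 2) per-region [G/R, B/G] features; is_normal: boolean labels from
    the specific region discrimination step. Assumes at least two non-normal
    regions whenever any exist.
    """
    rel = np.zeros(len(feats))
    nonnormal = feats[~is_normal]
    if len(nonnormal) == 0:
        rel[is_normal] = 1.0   # step d7: no non-normal region, so reliability 1 everywhere
        return rel
    # Reliability calculation standard: distribution of the non-normal features,
    # summarized here by its mean and (regularized) covariance.
    mu = nonnormal.mean(axis=0)
    cov = np.cov(nonnormal, rowvar=False) + 1e-9 * np.eye(2)
    inv = np.linalg.inv(cov)
    diff = feats - mu
    d = np.sqrt(np.einsum("ij,jk,ik->i", diff, inv, diff))
    # Step d5: far from the non-normal distribution -> reliable normal mucosa.
    rel[is_normal & (d >= dist_threshold)] = 1.0
    return rel
```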
Note that the procedure for calculating the reliability is not limited to this. For example, when the discrimination standard for discriminating the normal mucosa region in the processing target image is set as the reliability calculation standard, as in the first modification of the reliability calculation standard setting procedure described above, the reliability calculating unit 225 sequentially processes the divided regions discriminated as normal mucosa regions and calculates the Mahalanobis distance between the feature values of each such divided region and the reliability calculation standard, i.e., the discrimination standard used for discriminating the normal mucosa region in the processing target image. Then, among the divided regions discriminated as normal mucosa regions, the reliability calculating unit 225 sets the reliability of divided regions whose calculated Mahalanobis distance is equal to or greater than a predetermined threshold value to "0", and sets the reliability of divided regions whose Mahalanobis distance is less than the threshold value to "1". In this modification as well, intermediate regions between the normal mucosa region and regions other than the normal mucosa region are determined, and a low reliability is calculated for the divided regions determined to be intermediate regions.
In addition, when the feature value range of the normal mucosa region is set as the reliability calculation standard, as in the second modification of the reliability calculation standard setting procedure described above, the reliability calculating unit 225 sets, among the divided regions discriminated as normal mucosa regions, the reliability of divided regions whose feature values are within the reliability calculation standard, i.e., within the feature value range of the normal mucosa region, to "1", and sets the reliability of divided regions whose feature values are outside the range to "0". In this case as well, intermediate regions between the normal mucosa region and regions other than the normal mucosa region are determined, and a low reliability is calculated for the divided regions determined to be intermediate regions.
Furthermore, in step a11 of Fig. 2, all three reliability calculation standards described above may be set: the frequency distribution of the feature values of the non-normal mucosa regions in the processing target image, the discrimination standard for discriminating the normal mucosa region in the processing target image, and the feature value range of the normal mucosa region. In the reliability calculation process, the reliabilities using each of these reliability calculation standards may then be calculated separately, and a final reliability may be obtained from the obtained values. Specifically, a reliability T1 is calculated using the frequency distribution of the feature values of the non-normal mucosa regions in the processing target image as the reliability calculation standard, a reliability T2 is calculated using the discrimination standard for discriminating the normal mucosa region in the processing target image as the reliability calculation standard, and a reliability T3 is calculated using the feature value range of the normal mucosa region as the reliability calculation standard. Then, the final reliability T is calculated from the values of T1, T2, and T3 according to the following formula (3).
T = (T1 + T2 + T3) / 3 … (3)
Next, in step a15 of Fig. 2, the discrimination standard creating unit 22 executes the discrimination standard creation process. Fig. 6 is a flowchart showing the detailed processing procedure of the discrimination standard creation process.
In the discrimination standard creation process, the reliability weight setting unit 224 first sets, for each divided region discriminated as a normal mucosa region, a weight value based on its reliability (step e1). Then, the weighted average calculating unit 222 weights the feature values of each divided region of the normal mucosa region according to the reliability by multiplying the feature values of the corresponding divided region, i.e., the average G/R value and the average B/G value, by the weight value based on reliability set for each divided region in step e1, and creates a frequency distribution, on the two-dimensional feature plane, of the weighted average G/R values and average B/G values of the divided regions (step e3). The weighted average calculating unit 222 then normalizes the created frequency distribution so that the total of the frequencies becomes 100 (step e5).
In embodiment 1, for example, the value of the reliability itself is set as the weight value based on reliability. Therefore, among the divided regions discriminated as normal mucosa regions, the feature values of divided regions whose reliability has been set to "1" in step d5 or step d7 of Fig. 5 are multiplied by "1". In this case, in step e3, such a divided region is counted with a weight of 1.0 when the frequency distribution is totaled. On the other hand, the feature values of divided regions whose reliability has been set to "0" in step d5 of Fig. 5 are multiplied by "0". Accordingly, in step e3, the feature values of divided regions whose reliability is "0", i.e., divided regions determined to be intermediate regions between the normal mucosa region and regions other than the normal mucosa region, are not totaled into the frequency distribution.
In a modification of the reliability calculation process performed in step a13 of Fig. 2, as described above, a plurality of reliability calculation standards are set and the average of the reliabilities obtained using the respective reliability calculation standards (for example, T1, T2, T3) is taken as the final reliability T. In such a case, the reliability T takes a value between "0" and "1", and the feature values of a divided region whose reliability is, for example, "0.5" are multiplied by "0.5". In this case, in step e3 of Fig. 6, such a divided region is counted with a weight of 0.5 when the frequency distribution is totaled. Therefore, the feature values of divided regions with low reliability enter the frequency distribution with correspondingly small weights, so that regions of low reliability lying between the normal mucosa region and regions other than the normal mucosa region are less likely to be totaled into the frequency distribution. Note that the weight value based on reliability is not limited to the value of the reliability itself, and may be set to a value calculated secondarily from the value of the reliability.
Next, the time-series distance weight setting unit 223 sets weight values based on time-series distance for the frequency distribution normalized in step e5 and for the discrimination standard created last time in step e11, which is the latest at the current time (step e7). Then, the weighted average calculating unit 222 weights the frequency distribution normalized in step e5 and the discrimination standard created last time by multiplying them by the weight values based on time-series distance set in step e7, totals the weighted values, and creates a new discrimination standard (step e9). The weighted average calculating unit 222 then stores the created discrimination standard in the storage unit 14 and updates it (step e11).
For example, as the processing of steps e7 and e9, a new discrimination standard Snew(GR, BG) is created according to the following formula (4).

Snew(GR, BG) = S(GR, BG) × k1 + H(GR, BG) × (1 − k1) … (4)
In the above formula (4), H(GR, BG) is the frequency distribution normalized in step e5, i.e., the frequency distribution created by weighting, according to the reliability of the corresponding divided region, the feature values (the average G/R value and the average B/G value) of each divided region discriminated as a normal mucosa region in the processing target image. S(GR, BG) is the discrimination standard created last time, i.e., the discrimination standard used for discriminating the normal mucosa region in the processing target image processed this time.
Here, S(GR, BG) was itself created according to formula (4) the previous time. Therefore, S(GR, BG) is determined according to the feature values of the divided regions discriminated as normal mucosa regions in the in-vivo lumen images processed up to the previous time. On the other hand, H(GR, BG) is determined according to the feature values of the normal mucosa region in the processing target image processed this time. Thus, the discrimination standard is determined according to both the feature values of the normal mucosa regions in the in-vivo lumen images processed up to the previous time and the feature values of the normal mucosa region in the processing target image processed this time. At this time, the values of H(GR, BG) and S(GR, BG) are weighted according to the value of the update coefficient k1.
k1 is the update coefficient of the discrimination standard; the value k1 multiplied by S(GR, BG) and the value 1 − k1 multiplied by H(GR, BG) correspond to the weight values based on time-series distance. The update coefficient k1 is set to an appropriate value within the range 0.0 < k1 < 1.0, for example. Therefore, the smaller the value of the update coefficient k1, the more strongly the discrimination result of the normal mucosa region in the processing target image processed this time is reflected in the created discrimination standard. In addition, among the discrimination results of the normal mucosa regions in the processing target images processed up to the previous time, those whose time-series distance to the current processing target image is small are reflected more strongly than those whose time-series distance is large.
As described above, the discrimination standard is created as appropriate by accumulating the feature values of the normal mucosa regions in the in-vivo lumen images processed up to the previous time and using the feature values of the normal mucosa region in the processing target image processed this time. At this time, the feature values of the normal mucosa region in the processing target image processed this time are weighted according to the reliability, and the weighted feature values of the normal mucosa region in the processing target image and the feature values of the normal mucosa regions in the in-vivo lumen images processed up to the previous time are weighted according to the time-series distance.
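A compact sketch of the discrimination standard creation process (steps e1 to e11 and formula (4)) might look as follows; the histogram parameters and the choice k1 = 0.5 are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

def update_standard(prev_standard, feats, reliability, k1=0.5,
                    bins=32, rng=((0.0, 2.0), (0.0, 2.0))):
    """Discrimination standard update along formula (4).

    prev_standard: S(GR, BG), the normalized frequency distribution created last time.
    feats, reliability: features and reliabilities of the regions judged as
    normal mucosa in the current processing target image.
    """
    # Steps e1-e5: H(GR, BG), the frequency distribution weighted by reliability.
    hist, _, _ = np.histogram2d(feats[:, 0], feats[:, 1], bins=bins,
                                range=rng, weights=reliability)
    if hist.sum() > 0:
        hist *= 100.0 / hist.sum()            # normalize total frequency to 100
    # Steps e7-e9, formula (4): k1 and (1 - k1) act as the time-series distance weights.
    return prev_standard * k1 + hist * (1.0 - k1)
```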
Next, the arithmetic unit 20 determines whether there is an unprocessed in-vivo lumen image. If there is an unprocessed in-vivo lumen image, i.e., if the processes of steps a7 to a15 have not yet been executed with all of the in-vivo lumen images constituting the time-series images as the processing target (step a17: No), the arithmetic unit performs the process of reading out from the storage unit 14 the in-vivo lumen image whose chronological order is next to (immediately after) the processing target image processed this time, as the processing target image (step a19); that is, it selects the processing target image as the image selecting unit. The process then returns to step a7, and the processes of steps a7 to a15 are executed on this processing target image.
On the other hand, if the processes of steps a7 to a15 have been executed with all of the in-vivo lumen images constituting the time-series images as the processing target (step a17: Yes), the present processing ends.
As described above, in embodiment 1, the average G/R value and the average B/G value of each divided region are calculated as the feature values of the processing target image, and whether each divided region is a normal mucosa region is discriminated by comparing the feature values of each divided region with the discrimination standard. Then, a reliability is calculated for each divided region discriminated as a normal mucosa region, and among the divided regions discriminated as normal mucosa regions, a low reliability is set for intermediate regions between the normal mucosa region and regions other than the normal mucosa region. After the normal mucosa region in the processing target image has been discriminated, the discrimination standard is created according to the feature values of the normal mucosa regions discriminated in the in-vivo lumen images for which processing has been completed, including the processing target image. More specifically, the discrimination standard is created after the feature values of the normal mucosa regions discriminated in the in-vivo lumen images for which processing has been completed are weighted according to the time-series distance and the reliability. Then, when the in-vivo lumen image whose chronological order is at the front end is processed as the processing target image, the initial value created in advance is used as the discrimination standard, and when an in-vivo lumen image whose chronological order is second or later is processed as the processing target image, the discrimination standard created as described above according to the feature values of the normal mucosa regions discriminated in the in-vivo lumen images for which processing has been completed is used.
According to embodiment 1, while the in-vivo lumen images constituting the time-series images are processed one by one from the front end in chronological order, the discrimination standard for the normal mucosa region suited to the in-vivo lumen image to be processed next can be created as appropriate by using the feature values of the normal mucosa regions discriminated in the in-vivo lumen images for which processing has been completed. The normal mucosa region in each in-vivo lumen image can then be discriminated in turn by using the discrimination standard created as appropriate in this way. Therefore, a specific region such as a normal mucosa region shown in each in-vivo lumen image can be discriminated with high accuracy, each in-vivo lumen image constituting a series of time-series images obtained by chronologically imaging the inside of a living body.
An in-vivo lumen image from which the normal mucosa region has been extracted as described above is subjected to processing such as extraction of an abnormal region, for example a lesion region or a bleeding region, and is displayed on the display unit 13 as appropriate to be presented to a user such as a doctor. Specifically, the in-vivo lumen image is displayed on the display unit 13 as an image in which, for example, the abnormal region can be recognized separately from the other regions. Alternatively, an in-vivo lumen image containing an abnormal region is displayed on the display unit 13 as an image to be diagnosed. At this time, by applying embodiment 1, the abnormal region can be extracted after removing the discriminated normal mucosa region, so that highly accurate abnormal region detection can be realized.
Embodiment 2
Next, the image processing apparatus of embodiment 2 will be described. Fig. 7 is a block diagram showing the functional configuration of the image processing apparatus 1a of embodiment 2. The same components as those described in embodiment 1 are denoted by the same reference signs. As shown in Fig. 7, the image processing apparatus 1a of embodiment 2 includes an image acquiring unit 11, an input unit 12, a display unit 13, a storage unit 14a, an arithmetic unit 20a, and a control unit 15 that controls the overall operation of the image processing apparatus 1a.
The storage unit 14a stores an image processing program 141a for discriminating the normal mucosa region in each in-vivo lumen image constituting the time-series images.
The arithmetic unit 20a includes a feature value calculating unit 21, a discrimination standard creating unit 22a, and a specific region discriminating unit 25a. The discrimination standard creating unit 22a includes an initial value setting unit 221 and a weighted average calculating unit 222a, and the weighted average calculating unit 222a includes a time-series distance weight setting unit 223 and a reliability weight setting unit 224a. The reliability weight setting unit 224a includes a reliability calculating unit 225a, and the reliability calculating unit 225a includes a statistic calculating unit 228a. The statistic calculating unit 228a calculates a statistic of the feature values of the divided regions discriminated as normal mucosa regions. In addition, the specific region discriminating unit 25a includes a clustering unit 251a that clusters the frequency distribution of the feature values of the divided regions.
Fig. 8 is an overall flowchart showing the processing procedure performed by the image processing apparatus 1a of embodiment 2. The processing described here is realized by the units of the image processing apparatus 1a operating according to the image processing program 141a stored in the storage unit 14a. In Fig. 8, the same processing steps as in embodiment 1 are denoted by the same reference signs.
As shown in Fig. 8, in embodiment 2, after the feature value calculating unit 21 executes the feature value calculation process in step a7 and calculates the feature values of the processing target for each divided region, the specific region discriminating unit 25a executes the specific region discrimination process (step f9). Fig. 9 is a flowchart showing the detailed processing procedure of the specific region discrimination process.
In the specific region discrimination process, the specific region discriminating unit 25a first creates a frequency distribution, on the two-dimensional feature plane, of the feature values of each divided region calculated in step a7 of Fig. 8, i.e., the average G/R values and the average B/G values (step g1).
Next, the clustering unit 251a clusters the created frequency distribution (step g3). Clustering is a method of dividing the distribution of data in a feature space into blocks called clusters according to the similarity between the data. At this time, a list indicating the cluster to which each divided region has been classified is created and stored in the storage unit 14a so that the divided regions belonging to each cluster can be identified in subsequent processing.
For example, the data of the average G/R values and average B/G values of the divided regions on the two-dimensional feature plane are clustered by using a known method such as the K-means method (reference: CG-ARTS Society, Digital Image Processing, p. 232). Here, the distance between data points on the two-dimensional feature plane corresponds to the similarity. In the K-means method, the number of clusters K into which the data is divided must be specified in advance as a parameter, and the accuracy of the clustering varies greatly depending on the specified number of clusters K. Therefore, in order to obtain highly accurate clustering results, the optimal number of clusters K must be determined for each image. Here, as a method of determining the optimal number of clusters K, an algorithm that determines the optimal number of clusters K from a cluster-number evaluation value is used (reference: Chong-Wah Ngo et al., "On Clustering and Retrieval of Video Shots Through Temporal Slices Analysis," IEEE Trans. Multimedia, Vol. 4, No. 4, pp. 446-458, 2002). However, the applicable clustering method is not limited to the K-means method, and other clustering methods may be used.
Next, the clustering unit 251a calculates the centroid of each cluster from the clustering result (step g5). Then, the specific region discriminating unit 25a calculates, for each cluster, the Mahalanobis distance between the centroid of the cluster and the discrimination standard that is the latest at the current time (step g7). Here, as in embodiment 1, the discrimination standard is the frequency distribution of the feature values of the normal mucosa region (the average G/R value and the average B/G value) on the two-dimensional feature plane. The specific region discriminating unit 25a then sets clusters whose calculated Mahalanobis distance is within a predetermined normal range as normal mucosa clusters, and discriminates the divided regions belonging to these normal mucosa clusters as normal mucosa regions (step g9). At this time, divided regions belonging to clusters whose Mahalanobis distance is outside the predetermined normal range are discriminated as non-normal mucosa regions. Thereafter, the process returns to step f9 of Fig. 8 and proceeds to step f11.
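As an illustration of this embodiment-2 discrimination flow (steps g1 to g9), the sketch below clusters the per-region features with scikit-learn's K-means and judges each cluster by the Mahalanobis distance of its centroid to the discrimination standard, here summarized by a mean vector and an inverse covariance matrix. The fixed number of clusters, the threshold, and the use of scikit-learn are assumptions; the patent determines the optimal number of clusters per image from an evaluation value.

```python
import numpy as np
from sklearn.cluster import KMeans

def discriminate_by_clusters(feats, standard_mean, standard_inv_cov,
                             n_clusters=4, normal_range=2.0):
    """Steps g1-g9 of embodiment 2: cluster the per-region features, then judge
    each cluster by the Mahalanobis distance of its centroid to the standard.

    standard_mean / standard_inv_cov summarize the current discrimination
    standard; n_clusters is fixed here, whereas the patent determines the
    optimal number of clusters per image.
    """
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(feats)
    is_normal = np.zeros(len(feats), dtype=bool)
    normal_clusters = []
    for c in range(n_clusters):
        centroid = feats[labels == c].mean(axis=0)        # step g5: cluster centroid
        diff = centroid - standard_mean
        d = np.sqrt(diff @ standard_inv_cov @ diff)       # step g7
        if d <= normal_range:                             # step g9: normal mucosa cluster
            normal_clusters.append(c)
            is_normal[labels == c] = True
    return is_normal, labels, normal_clusters
```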
In step f11, the statistic calculating unit 228a of the reliability calculating unit 225a calculates, for each normal mucosa cluster, a statistic according to the feature values of the divided regions belonging to the clusters set as normal mucosa clusters in step g9 of Fig. 9. For example, the statistic calculating unit 228a calculates, for each normal mucosa cluster, the variance of the feature values of all divided regions belonging to that normal mucosa cluster.
Then, the reliability calculating unit 225a calculates a reliability for each normal mucosa cluster by using the statistic, i.e., the variance, calculated in step f11 (step f13). Specifically, among the normal mucosa clusters, the reliability calculating unit 225a sets the reliability of normal mucosa clusters whose calculated variance is equal to or greater than a predetermined threshold value to "0", and sets the reliability of normal mucosa clusters whose variance is less than the threshold value to "1". The threshold value may be a fixed value, or may be configured so that it can be changed by user operation or the like. Thus, in embodiment 2, the intermediate regions between the normal mucosa region and regions other than the normal mucosa region are determined in units of normal mucosa clusters according to the magnitude of the variance, and a low reliability is calculated for normal mucosa clusters belonging to intermediate regions.
Note that the procedure for calculating the reliability is not limited to this. For example, the Mahalanobis distance between each normal mucosa cluster and the discrimination standard for discriminating the normal mucosa region in the processing target image may be calculated. Then, among the normal mucosa clusters, the reliability of normal mucosa clusters whose calculated Mahalanobis distance is equal to or greater than a predetermined threshold value may be set to "0", and the reliability of normal mucosa clusters whose Mahalanobis distance is less than the threshold value may be set to "1". In general, in in-vivo lumen images captured chronologically by a capsule endoscope or the like, the normal mucosa region occupies most of the image, so most of the frequency distribution corresponds to the normal mucosa region. By this processing, the reliability of normal mucosa clusters similar to the majority of normal mucosa clusters is calculated to be high. On the other hand, for normal mucosa clusters that are not similar to the majority of normal mucosa clusters, a low reliability is calculated, as these are normal mucosa clusters belonging to intermediate regions between the normal mucosa region and regions other than the normal mucosa region.
Next, the discrimination standard creating unit 22a executes the discrimination standard creation process (step f15). Fig. 10 is a flowchart showing the detailed processing procedure of the discrimination standard creation process. In Fig. 10, the same processing steps as in embodiment 1 are denoted by the same reference signs.
In the discrimination standard making process, the reliability weight setting portion 224a first sets, for each normal mucosa class, a weighted value based on its reliability (step h1). Then, the weighted mean calculating part 222a multiplies the characteristic quantities (the average value of the G/R value and the average value of the B/G value) of the cut zones belonging to each normal mucosa class by the reliability-based weighted value set for that class, thereby weighting the characteristic quantity of each cut zone of the normal mucosa region according to the reliability, and makes a frequency distribution, in the two-dimensional feature plane, of the weighted average values of the G/R value and the B/G value of the cut zones (step h3). Then, the processes of steps e5 to e11 are performed in the same manner as in the above-described embodiment 1 and the discrimination standard is updated. Then, the processing returns to step f15 of Fig. 8 and transfers to step a17.
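The reliability-weighted frequency distribution of step h3 could be sketched as follows; the bin count, feature range and function name are assumptions for illustration.

```python
import numpy as np

def weighted_feature_histogram(zone_features, zone_class, class_weight,
                               bins=64, feature_range=((0.0, 2.0), (0.0, 2.0))):
    """Reliability-weighted frequency distribution in the (G/R, B/G) plane.

    zone_features: (n, 2) array of per-cut-zone (G/R mean, B/G mean).
    zone_class:    (n,) array giving the normal mucosa class of each zone.
    class_weight:  dict class id -> reliability-based weight (e.g. 0 or 1).
    """
    weights = np.array([class_weight.get(c, 0.0) for c in zone_class])
    hist, _, _ = np.histogram2d(zone_features[:, 0], zone_features[:, 1],
                                bins=bins, range=feature_range, weights=weights)
    total = hist.sum()
    if total > 0:
        hist = hist * (100.0 / total)  # normalize so the frequencies sum to 100
    return hist
```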
As described above, in embodiment 2, the characteristic quantities (the average value of the G/R value and the average value of the B/G value) calculated for each cut zone of the process object image are first classified, and the normal mucosa region is differentiated for each class obtained as the classification result. Specifically, the normal mucosa region is differentiated by comparing the center of gravity of each class with the discrimination standard. Furthermore, the reliability is calculated by computing, for each class determined to be a normal mucosa class, the variance value of its characteristic quantities, and the discrimination standard is made appropriately in consideration of the calculated reliability. According to embodiment 2, compared with the case of embodiment 1 in which the normal mucosa region is differentiated and the discrimination standard is made/updated for each cut zone, there is an effect that the differentiation result more easily follows changes in the mucosa color.
Embodiment 3
The image processing apparatus of embodiment 3 will now be described. Figure 11 is a block diagram illustrating the functional structure of an image processing apparatus 1b of embodiment 3. Structures identical to those described in embodiment 1 are given the same reference marks. As shown in Fig. 11, the image processing apparatus 1b of embodiment 3 possesses an image acquiring section 11, an input unit 12, a display part 13, a storage part 14b, an operational part 20b, and a control portion 15 that controls the operation of the whole image processing apparatus 1b.
The storage part 14b stores an image processing program 141b for differentiating the normal mucosa region in each in vivo lumen image constituting the sequential image.
Additionally, the operational part 20b includes a feature value calculation unit 21, a discrimination standard preparing department 22b, and a specific region judegment part 25b. The discrimination standard preparing department 22b possesses an initial value configuration part 221b and a weighted mean calculating part 222. In embodiment 3, the initial value configuration part 221b possesses an initial value weighted mean calculating part 229b and a process stop control unit 230b, and sets the initial value of the discrimination standard. The initial value weighted mean calculating part 229b calculates a weighted mean of the characteristic quantities of the normal mucosa regions in the in vivo lumen images constituting the initial value setting object interval, setting a higher weight the closer an image is in temporal order to the setting object image at the end of the initial value setting interval. The process stop control unit 230b controls the stopping of the weighted mean calculation process of the initial value weighted mean calculating part 229b. Additionally, the weighted mean calculating part 222 possesses a sequential distance weighting configuration part 223 and a reliability weight setting portion 224, and the reliability weight setting portion 224 possesses a reliability calculating portion 225. The reliability calculating portion 225 possesses a reliability calculating benchmark configuration part 226 and a nonspecific regional determination portion 227.
In the above-described embodiment 1 and the like, an in vivo lumen image in which the normal mucosa region is captured is prepared in advance, and the initial value of the discrimination standard is made according to the characteristic quantities of this prepared in vivo lumen image. As this initial value of the discrimination standard, an initial value with as little noise as possible should be used in order to differentiate the normal mucosa region accurately. However, if the prepared in vivo lumen image supposed to show the normal mucosa region mistakenly includes improper mucosal areas, its characteristic quantities will include the characteristic quantities of those improper mucosal areas; these characteristic quantities become noise, and the precision of the made initial value declines. On the other hand, as described above, in in vivo lumen images shot chronologically by a capsule endoscope or the like, normal mucosa regions account for the majority, so improper mucosal areas that become noise are few compared with normal mucosa regions. Therefore, if a plurality of in vivo lumen images are processed and the making/updating of the discrimination standard is repeated, and the obtained discrimination standard is used as the initial value, the influence of noise can be reduced further.
In embodiment 3, therefore, the initial value of the discrimination standard is set by processing, for example, a predetermined number of the in vivo lumen images constituting the sequential image in temporal order. Figure 12 is an explanatory diagram of the outline of the processing performed by the image processing apparatus 1b of embodiment 3. Fig. 12 schematically shows a sequential image composed of the first to N-th in vivo lumen images in temporal order. For example, the above-mentioned predetermined number is set to M, between 1 and N, and the discrimination standard serving as the initial value is made by processing the in vivo lumen images whose temporal order is from the front end (the first image) to the M-th image. Specifically, the M-th in vivo lumen image is set as the setting object image It, and the time ordered interval from the first to the M-th image is set as the initial value setting interval. Then, in order to make the initial value of the discrimination standard, as shown by arrow A11 in Fig. 12, the in vivo lumen images from the first to the M-th are processed in temporal order starting from the first image (from the sequential front toward the sequential rear), the normal mucosa region in each in vivo lumen image is differentiated in the same manner as in embodiment 1, and the discrimination standard is made/updated. Then, the discrimination standard at the time immediately before the M-th setting object image is processed is set as the initial value.
After the initial value is set in this way, as shown by arrow A12 in Fig. 12, the in vivo lumen images from the (M-1)-th back to the first are processed in the reverse of the temporal order (from the sequential rear toward the sequential front), the discrimination standard is made/updated in the same manner as in embodiment 1, and the normal mucosa region in each in vivo lumen image is differentiated. Then, when the first in vivo lumen image has been processed, the discrimination standard is reset to the initial value. Then, as shown by arrow A13 in Fig. 12, the in vivo lumen images from the M-th (the setting object image) to the N-th are processed in temporal order (from the sequential front toward the sequential rear), the discrimination standard is made/updated in the same manner as in embodiment 1, and the normal mucosa region in each in vivo lumen image is differentiated.
Here, in vivo lumen images at temporally close positions have similar mucosa colors, whereas in vivo lumen images at temporally distant positions do not. As shown by arrow A12, the initial value is restored after the images are processed from the setting object image back to the first in vivo lumen image in the reverse of the temporal order; this is to prevent erroneous discrimination caused by such differences in mucosa color. That is, if the discrimination standard at the time when the processing reaches the first in vivo lumen image were applied to the setting object image, erroneous discrimination could occur due to the difference in mucosa color, because the sequential distance between the first in vivo lumen image and the M-th setting object image is large. Accordingly, as the initial value used when processing in temporal order from the setting object image to the in vivo lumen image at the end (arrow A13), the discrimination standard made by processing the in vivo lumen images from the first to the M-th (arrow A11) is used. Thus, in embodiment 3, the discrimination standard used as the initial value both when processing from the (M-1)-th in vivo lumen image back to the first in the reverse of the temporal order (arrow A12) and when processing in temporal order from the M-th in vivo lumen image (the setting object image) to the N-th (arrow A13) is, as described above, the one made by processing the in vivo lumen images constituting the initial value setting interval.
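The three-pass scan order summarized by arrows A11 to A13 might look like the following sketch, where run_one_image is an assumed callback that differentiates the normal mucosa region of one image with the current discrimination standard and returns the updated standard.

```python
def process_sequence(images, M, run_one_image, starting_standard):
    """Scan order of embodiment 3 (a sketch under the summary of Fig. 12).

    images: list of N in vivo lumen images in temporal order (index 0 = first).
    M:      1-based index of the setting object image (end of the initial
            value setting interval).
    """
    standard = starting_standard

    # arrow A11: forward pass over the initial value setting interval
    for img in images[:M]:
        standard = run_one_image(img, standard)
    initial_value = standard

    # arrow A12: backward pass over images M-1 .. 1, starting from the initial value
    standard = initial_value
    for img in reversed(images[:M - 1]):
        standard = run_one_image(img, standard)

    # reset, then arrow A13: forward pass over images M .. N
    standard = initial_value
    for img in images[M - 1:]:
        standard = run_one_image(img, standard)
```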
Figures 13 and 14 are overall flowcharts illustrating the processing sequence performed by the image processing apparatus 1b of embodiment 3. The processing described here is realized by the respective units of the image processing apparatus 1b operating in accordance with the image processing program 141b stored in the storage part 14b.
As shown in Fig. 13, first, the image acquiring section 11 acquires the image data of the sequential image (step i1). Then, the process stop control unit 230b sets the initial value setting object interval, and determines the in vivo lumen image at the end of this initial value setting object interval as the setting object image (step i3). Specifically, as shown in Fig. 12, the process stop control unit 230b sets the time ordered interval of the predetermined number of images from the front end as the initial value setting object interval, and determines the sequential image at the end of this initial value setting object interval as the setting object image. The number of images from the front end set as the initial value setting interval (for example, a time ordered interval of several tens of images is set as the initial value setting interval) may be a fixed value, or may be configured to be changeable by a user operation or the like.
Then, the operational part 20b reads, as the process object image, the in vivo lumen image whose temporal order is at the front end among the sequential images acquired in step i1 (step i5). Then, the initial value weighted mean calculating part 229b performs the characteristic quantity calculation process: the process object image is divided into regions, and the average value of the G/R value and the average value of the B/G value are calculated for each cut zone as the characteristic quantities (step i7). This process is carried out according to the same processing sequence as the characteristic quantity calculation process of Fig. 3.
Then, the initial value weighted mean calculating part 229b calculates the weighted mean of the characteristic quantities of the normal mucosa region in such a manner that the weight is set larger the closer the sequential distance between the process object image and the setting object image is (step i9). Specifically, a frequency distribution, in the two-dimensional feature plane, of the characteristic quantities (the average value of the G/R value and the average value of the B/G value) of the cut zones calculated in step i7 is made, and the made frequency distribution is normalized so that the total of the frequencies becomes 100. Then, in the same way as steps e7 to e9 of Fig. 6 according to the above formula (4), weights corresponding to the sequential distance are set for and applied to the normalized frequency distribution of the characteristic quantities of the process object image and to the frequency distribution obtained by the weighted mean calculation in the previous step i9, and the result is taken as the weighted mean of the characteristic quantities of the normal mucosa region.
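A rough sketch of the recursive update of step i9 is shown below; because formula (4) itself is not reproduced here, an exponential falloff with an assumed parameter alpha stands in for the sequential-distance weight.

```python
import numpy as np

def update_weighted_mean(prev_mean_hist, current_hist, seq_distance, alpha=0.05):
    """One step of the recursive weighted-mean update of step i9 (a sketch).

    prev_mean_hist: normalized histogram carried over from the previous image
                    (None for the first processed image).
    current_hist:   normalized (sums to 100) histogram of the current image.
    seq_distance:   temporal distance between the current process object image
                    and the setting object image.
    """
    # closer to the setting object image -> larger weight (assumed weight form)
    w = np.exp(-alpha * seq_distance)
    if prev_mean_hist is None:
        return current_hist
    mixed = w * current_hist + (1.0 - w) * prev_mean_hist
    total = mixed.sum()
    return mixed * (100.0 / total) if total > 0 else mixed
```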
Then, in the following step i10, the process stop control unit 230b determines whether to end the weighted mean calculation according to the temporal order of the process object image. Until the temporal order of the process object image coincides with the temporal order of the setting object image, the process stop control unit 230b determines that the calculation has not ended (step i11: No), and the processing returns to step i5. On the other hand, in the case where the temporal order of the process object image coincides with that of the setting object image and the processing has reached the setting object image, the process stop control unit 230b determines that the weighted mean calculation is to end (step i11: Yes), and the processing transfers to step i13.
Then, in step i13, the initial value configuration part 221b sets, as the initial value of the discrimination standard, the weighted mean at the end of the weighted mean calculation, that is, the weighted mean calculated in step i9 immediately before it is determined in steps i10 to i11 that the weighted mean calculation is to end.
Then, the operational part 20b reads, as the process object image, the in vivo lumen image whose temporal order is immediately before that of the setting object image (step i15). Then, for this process object image, the feature value calculation unit 21 performs the characteristic quantity calculation process (step i17), the specific region judegment part 25 performs the specific region differentiation process (step i19), the reliability calculating benchmark configuration part 226 sets the reliability calculating benchmark (step i21), the reliability calculating portion 225 performs the reliability calculation process (step i23), and the discrimination standard preparing department 22b performs the discrimination standard making process (step i25). Here, the process of step i17 is carried out according to the same processing sequence as the characteristic quantity calculation process of Fig. 3, the process of step i19 according to the same processing sequence as the specific region differentiation process of Fig. 4, the process of step i21 in the same manner as step a11 of Fig. 2, the process of step i23 according to the same processing sequence as the reliability calculation process of Fig. 5, and the process of step i25 according to the same processing sequence as the discrimination standard making process of Fig. 6.
Then, the operational part 20b determines whether the in vivo lumen image whose temporal order is at the front end has been processed. If it has not been processed (step i27: No), the operational part, acting as the image selecting section, selects the next process object image by reading from the storage part 14b the in vivo lumen image whose temporal order is immediately before that of the process object image just processed (step i29). Then, the processing returns to step i17, and the processes of steps i17 to i25 are performed on this process object image.
On the other hand, in the case where the in vivo lumen image whose temporal order is at the front end has been processed, that is, when all the in vivo lumen images ahead of the setting object image in temporal order have been processed as process object images by the processes of steps i17 to i25 (step i27: Yes), the processing transfers to step i31 of Fig. 14.
That is, in step i31, the initial value configuration part 221b resets the discrimination standard to the initial value set in step i13 of Fig. 13. Then, the operational part 20b reads the setting object image as the process object image (step i33). Then, for this process object image, the feature value calculation unit 21 performs the characteristic quantity calculation process (step i35), the specific region judegment part 25 performs the specific region differentiation process (step i37), the reliability calculating benchmark configuration part 226 sets the reliability calculating benchmark (step i39), the reliability calculating portion 225 performs the reliability calculation process (step i41), and the discrimination standard preparing department 22b performs the discrimination standard making process (step i43). Here, the process of step i35 is carried out according to the same processing sequence as the characteristic quantity calculation process of Fig. 3, the process of step i37 according to the same processing sequence as the specific region differentiation process of Fig. 4, the process of step i39 in the same manner as step a11 of Fig. 2, the process of step i41 according to the same processing sequence as the reliability calculation process of Fig. 5, and the process of step i43 according to the same processing sequence as the discrimination standard making process of Fig. 6.
Then, the operational part 20b determines whether the in vivo lumen image whose temporal order is at the end has been processed. If it has not been processed (step i45: No), the operational part, acting as the image selecting section, selects the next process object image by reading from the storage part 14b the in vivo lumen image whose temporal order is immediately after that of the process object image just processed (step i47). Then, the processing returns to step i35, and the processes of steps i35 to i43 are performed on this process object image.
On the other hand, in the case where the in vivo lumen image whose temporal order is at the end has been processed, that is, when all the in vivo lumen images behind the setting object image in temporal order have been processed as process object images by the processes of steps i35 to i43 (step i45: Yes), the present processing ends.
As described above, in embodiment 3, the initial value is not made from an in vivo lumen image of the normal mucosa region prepared in advance, but is made by successively processing the in vivo lumen images constituting the sequential image and making/updating the discrimination standard. Therefore, a decline in the precision of the initial value can be suppressed, the influence of noise can be reduced, and the discrimination precision of the normal mucosa region can be improved further.
In embodiment 3, the initial value of the discrimination standard is set by processing, in temporal order, a predetermined number of the in vivo lumen images constituting the sequential image. Alternatively, whether to end the weighted mean calculation may be determined according to the weighted mean of the characteristic quantities of the normal mucosa region calculated in step i9 of Fig. 13. Ordinary mucosa color is not uniform, so its variance value becomes large to some extent. Whether to end the weighted mean calculation can therefore be determined, for example, according to whether the variance value of the weighted mean of the characteristic quantities of the normal mucosa region has exceeded a preset threshold. In this case, instead of the process of the above step i10, the process stop control unit 230b calculates the variance value of the weighted mean calculated each time in the process of step i9, and determines whether to end the weighted mean calculation according to whether the calculated variance value exceeds the preset threshold. Then, instead of the process of the above step i11, the process stop control unit 230b determines that the weighted mean calculation is to end in the case where the variance value of the weighted mean calculated as described above exceeds the threshold, and determines that the calculation is not to end while the variance value of the weighted mean is at or below the threshold.
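The alternative stop condition described above could be sketched as follows, assuming the weighted mean is held as a two-dimensional histogram and the variance is evaluated over the bin centers; the threshold value is an assumption.

```python
import numpy as np

def should_stop_by_variance(weighted_mean_hist, centers_gr, centers_bg,
                            var_threshold=0.02):
    """Return True once the variance of the weighted-mean feature distribution
    exceeds the threshold (a sketch of the alternative stop control).

    weighted_mean_hist: 2-D normalized histogram over the (G/R, B/G) plane.
    centers_gr, centers_bg: 1-D arrays of the bin-center coordinates.
    """
    gr, bg = np.meshgrid(centers_gr, centers_bg, indexing="ij")
    total = weighted_mean_hist.sum()
    if total <= 0:
        return False
    p = weighted_mean_hist / total
    mean_gr = (p * gr).sum()
    mean_bg = (p * bg).sum()
    # total variance over both feature axes of the distribution
    var = (p * ((gr - mean_gr) ** 2 + (bg - mean_bg) ** 2)).sum()
    return var > var_threshold
```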
Embodiment 4
The image processing apparatus of embodiment 4 will now be described. Figure 15 is a block diagram illustrating the functional structure of an image processing apparatus 1c of embodiment 4. Structures identical to those described in embodiment 1 are given the same reference marks. As shown in Fig. 15, the image processing apparatus 1c of embodiment 4 possesses an image acquiring section 11, an input unit 12, a display part 13, a storage part 14c, an operational part 20c, and a control portion 15 that controls the overall operation of the image processing apparatus 1c.
The storage part 14c stores an image processing program 141c for differentiating the normal mucosa region in each in vivo lumen image constituting the sequential image.
Additionally, the operational part 20c includes a feature value calculation unit 21, a discrimination standard preparing department 22c, and a specific region judegment part 25. The discrimination standard preparing department 22c possesses an initial value configuration part 221c and a weighted mean calculating part 222. The initial value configuration part 221c is a functional unit that sets the initial value of the discrimination standard used when processing the in vivo lumen image whose temporal order is at the front end, and possesses an initial value image extracting portion 231c. The initial value image extracting portion 231c is a functional unit that extracts a plurality of in vivo lumen images from the in vivo lumen images constituting the sequential image, and possesses an initial value interval configuration part 232c. The initial value interval configuration part 232c is a functional unit that sets the time ordered interval from which the initial value image extracting portion 231c extracts images, and possesses an initial value internal organs judegment part 233c. The initial value internal organs judegment part 233c differentiates the organ kind appearing in each in vivo lumen image. Additionally, the weighted mean calculating part 222 possesses a sequential distance weighting configuration part 223 and a reliability weight setting portion 224, and the reliability weight setting portion 224 possesses a reliability calculating portion 225. The reliability calculating portion 225 possesses a reliability calculating benchmark configuration part 226 and a nonspecific regional determination portion 227.
Figure 16 is an overall flowchart illustrating the processing sequence performed by the image processing apparatus 1c of embodiment 4. The processing described here is realized by the respective units of the image processing apparatus 1c operating in accordance with the image processing program 141c stored in the storage part 14c. In Fig. 16, processing steps identical to those of embodiment 1 are given the same reference marks.
As shown in Fig. 16, in embodiment 4, after the image acquiring section 11 acquires the image data of the sequential image in step a1, the initial value internal organs judegment part 233c takes each in vivo lumen image constituting the sequential image in turn as a differentiation object, and differentiates the organ kind appearing in that in vivo lumen image (step j31).
Various known techniques can be used as the method for differentiating the organ kind. For example, using the technique disclosed in Japanese Unexamined Patent Publication No. 2006-288612, the organ kind is differentiated according to the average R value, G value and B value of the in vivo lumen image. Specifically, numerical ranges of the average R value, G value and B value are set in advance for each organ kind. For example, the organ kind is differentiated among the four kinds of esophagus, stomach, small intestine and large intestine. In this case, numerical ranges of the average R value, G value and B value are set in advance for the esophagus, the stomach, the small intestine and the large intestine. Then, the averages of the R values, G values and B values of the in vivo lumen image to be differentiated are calculated as its average R value, G value and B value. If these average R, G and B values fall within the numerical ranges of the average R, G and B values of the esophagus, the organ kind of the observed site appearing in that in vivo lumen image is determined to be the esophagus. If they fall within the numerical ranges of the stomach, the organ kind of the observed site is determined to be the stomach; if they fall within the numerical ranges of the small intestine, it is determined to be the small intestine; and if they fall within the numerical ranges of the large intestine, it is determined to be the large intestine. Any other method may be used as long as the organ kind appearing in the in vivo lumen image can be differentiated.
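A sketch of this organ discrimination by average R, G, B values is given below; the concrete numerical ranges are placeholders, since the text only states that such ranges are set in advance.

```python
import numpy as np

# Example numerical ranges of the average (R, G, B) per organ kind.
# The numbers are placeholders chosen for illustration only.
ORGAN_RGB_RANGES = {
    "esophagus":       ((120, 200), (60, 120), (50, 110)),
    "stomach":         ((140, 220), (70, 130), (40, 100)),
    "small_intestine": ((150, 230), (90, 160), (30, 90)),
    "large_intestine": ((130, 210), (80, 150), (40, 110)),
}

def discriminate_organ(image_rgb):
    """Determine the organ kind from the per-image average R, G, B values.

    image_rgb: (H, W, 3) array with channels in R, G, B order.
    Returns the first organ kind whose ranges contain the averages,
    or None when no range matches.
    """
    avg_r, avg_g, avg_b = image_rgb.reshape(-1, 3).mean(axis=0)
    for organ, ((r_lo, r_hi), (g_lo, g_hi), (b_lo, b_hi)) in ORGAN_RGB_RANGES.items():
        if r_lo <= avg_r <= r_hi and g_lo <= avg_g <= g_hi and b_lo <= avg_b <= b_hi:
            return organ
    return None
```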
Then, the initial value interval configuration part 232c sets the time ordered interval from which images are extracted, according to the organ kind of the initially processed in vivo lumen image (the one whose temporal order is at the front end) (step j33). Specifically, according to the organ kind differentiated for each in vivo lumen image in step j31, a time ordered interval showing the same organ kind as that of the in vivo lumen image at the front end of the temporal order is set. For example, in the case where the sequential image is one shot by a capsule endoscope, the esophagus appears in the initially processed in vivo lumen image at the front end of the temporal order. When the organ kind of the in vivo lumen image at the front end of the temporal order is the esophagus in this way, a time ordered interval showing the esophagus is set in step j33 according to the temporal order of the in vivo lumen images identified as showing the esophagus.
Then, the initial value image extracting portion 231c selects and extracts, for example at random, a plurality of in vivo lumen images from the in vivo lumen images constituting the time ordered interval set in step j33 (step j35). The number of images extracted may be set to a fixed number, or may be configured to be changeable according to a user operation or the like.
Then, the processing transfers to step j37, and the initial value configuration part 221c performs the initial value setting process. Figure 17 is a flowchart illustrating the detailed processing sequence of the initial value setting process. In the initial value setting process, the initial value configuration part 221c first divides each of the plural in vivo lumen images extracted in step j35 of Fig. 16 into rectangular blocks of, for example, 8 × 8 pixels, that is, cut zones (step k1). Then, the initial value configuration part 221c calculates the average value of the G/R value and the average value of the B/G value as the characteristic quantities of each of the obtained cut zones (step k2). Specifically, for each cut zone, the average value of the G/R values of the pixels in the cut zone is calculated according to the above formula (1), and the average value of the B/G values of the pixels in the cut zone is calculated according to the above formula (2).
Then, for the plural extracted in vivo lumen images, the initial value configuration part 221c makes a frequency distribution, in the two-dimensional feature plane, of the average values of the G/R value and the B/G value calculated for each cut zone (step k3). Then, the initial value configuration part 221c normalizes the frequency distribution so that the total of the frequencies becomes 100, and sets it as the initial value of the discrimination standard (step k5). Then, the processing returns to step j37 of Fig. 16 and transfers to step a5.
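Steps k1 to k5 could be sketched as follows; the block size of 8 × 8 pixels follows the text, while the bin count, feature range and the small epsilon guarding against division by zero are assumptions.

```python
import numpy as np

def block_features(image_rgb, block=8):
    """Per-cut-zone (G/R mean, B/G mean) features for one image (a sketch).

    image_rgb: (H, W, 3) array, channels R, G, B. Only whole blocks are kept.
    """
    eps = 1e-6
    h, w, _ = image_rgb.shape
    feats = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            r = image_rgb[y:y + block, x:x + block, 0].astype(float)
            g = image_rgb[y:y + block, x:x + block, 1].astype(float)
            b = image_rgb[y:y + block, x:x + block, 2].astype(float)
            feats.append(((g / (r + eps)).mean(), (b / (g + eps)).mean()))
    return np.array(feats)

def initial_value_histogram(images, bins=64, feature_range=((0.0, 2.0), (0.0, 2.0))):
    """Normalized (total = 100) frequency distribution over the (G/R, B/G)
    plane built from several extracted images, as in steps k1-k5."""
    all_feats = np.vstack([block_features(img) for img in images])
    hist, _, _ = np.histogram2d(all_feats[:, 0], all_feats[:, 1],
                                bins=bins, range=feature_range)
    return hist * (100.0 / hist.sum()) if hist.sum() > 0 else hist
```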
In the above-described embodiment 1 and the like, an in vivo lumen image showing the normal mucosa region is prepared in advance, and the initial value of the discrimination standard is made according to the characteristic quantities of the prepared in vivo lumen image. However, in captured in vivo lumen images the mucosa color that appears can differ depending on the organ kind. The initial value of the discrimination standard is used for differentiating the normal mucosa region in the initially processed in vivo lumen image; if the organ kind of this initially processed in vivo lumen image does not coincide with the organ kind of the prepared in vivo lumen image, the mucosa colors may differ, and the precision of the initial value declines.
In embodiment 4, the initial value of the discrimination standard is made using the characteristic quantities of in vivo lumen images showing the same organ as the in vivo lumen image at the front end of the temporal order, according to the organ kind of that front-end image. Specifically, a plurality of in vivo lumen images whose organ kind is the same as that of the front-end in vivo lumen image are extracted, and the initial value of the discrimination standard is made according to the characteristic quantities of each of the extracted in vivo lumen images. Here, the normal mucosa region accounts for the majority of an in vivo lumen image. Therefore, if the characteristic quantities are calculated for a plurality of in vivo lumen images and a frequency distribution is made, the neighborhood of the mode of the frequency distribution corresponds to the normal mucosa region.
Thus, according to embodiment 4, the initial value of the discrimination standard can be made according to the characteristic quantities of a plurality of in vivo lumen images showing the same organ as the initially processed in vivo lumen image at the front end of the temporal order, so the initial value can be made with high precision. Then, for the initially processed in vivo lumen image at the front end of the temporal order, the normal mucosa region can be differentiated using the made initial value. Therefore, erroneous discrimination of the normal mucosa region caused by variation in mucosa color can be prevented, and the normal mucosa region can be differentiated with high precision.
Embodiment 5
The image processing apparatus of embodiment 5 will now be described. Figure 18 is a block diagram illustrating the functional structure of an image processing apparatus 1d of embodiment 5. Structures identical to those described in embodiment 1 are given the same reference marks. As shown in Fig. 18, the image processing apparatus 1d of embodiment 5 possesses an image acquiring section 11, an input unit 12, a display part 13, a storage part 14d, an operational part 20d, and a control portion 15 that controls the overall operation of the image processing apparatus 1d.
The storage part 14d stores an image processing program 141d for differentiating the normal mucosa region in each in vivo lumen image constituting the sequential image.
Additionally, the operational part 20d includes a feature value calculation unit 21, a discrimination standard preparing department 22d, and a specific region judegment part 25. The discrimination standard preparing department 22d possesses an initial value configuration part 221 and a weighted mean calculating part 222d. The weighted mean calculating part 222d possesses a sequential distance weighting configuration part 223 and a reliability weight setting portion 224d, and the reliability weight setting portion 224d possesses a reliability calculating portion 225d. The reliability calculating portion 225d possesses a near zone judegment part 234d. The near zone judegment part 234d examines the neighborhood of each cut zone determined to be a normal mucosa region, and differentiates whether an improper mucosal area exists in the vicinity.
Figure 19 is an overall flowchart illustrating the processing sequence performed by the image processing apparatus 1d of embodiment 5. The processing described here is realized by the respective units of the image processing apparatus 1d operating in accordance with the image processing program 141d stored in the storage part 14d. In Fig. 19, processing steps identical to those of embodiment 1 are given the same reference marks.
As shown in Fig. 19, in embodiment 5, after the specific region judegment part 25 performs the specific region differentiation process in step a9 and differentiates the normal mucosa region in the process object image, the near zone judegment part 234d performs the near zone differentiation process (step l11). Figure 20 is a flowchart illustrating the detailed processing sequence of the near zone differentiation process.
In the near zone differentiation process, the near zone judegment part 234d first makes a binary image according to the zone number image, in which the zone number assigned to each cut zone is placed at each pixel, and the differentiation result of the normal mucosa region for each cut zone. In this binary image, the pixel value of pixels belonging to cut zones determined to be normal mucosa regions is set to "0", and the pixel value of pixels belonging to cut zones determined not to be normal mucosa regions is set to "1" (step m1).
Then, the near zone judegment part 234d applies known dilation processing (reference: CG-ARTS Society, Digital Image Processing, pp. 179-180) to the binary image made in step m1, thereby making a binary image in which the improper mucosal areas whose pixel value is "1" are dilated (step m3). Then, the near zone judegment part 234d calculates, pixel by pixel, the sum of the original binary image before the dilation processing (the binary image made in step m1) and the binary image made by dilating that original binary image (the binary image made in step m3) (step m5).
Then, the near zone judegment part 234d determines in turn, for each cut zone determined to be a normal mucosa region, whether it contains a pixel whose value calculated in step m5 is "1". That is, a cut zone containing a pixel whose value calculated in step m5 is "1" is determined to have an improper mucosal area in its vicinity, and a cut zone containing no pixel whose value calculated in step m5 is "1" is determined to have no improper mucosal area in its vicinity (step m7). Then, the processing returns to step l11 of Fig. 19 and transfers to step l13.
In step l13, the reliability calculating portion 225d calculates a reliability for each cut zone determined to be a normal mucosa region, using the differentiation result of whether an improper mucosal area exists in its vicinity. Specifically, the reliability of cut zones determined in step m7 of Fig. 20 to have an improper mucosal area in their vicinity is set to "0", and the reliability of cut zones determined to have no improper mucosal area in their vicinity is set to "1". Then, the processing transfers to the discrimination standard making process of step a15.
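A sketch of the near zone differentiation (steps m1 to m7) and the reliability assignment is given below; the structuring element and iteration count of the dilation are assumptions.

```python
import numpy as np
from scipy import ndimage

def near_zone_reliability(zone_number_image, normal_zone_ids, dilation_iter=1):
    """Near-zone differentiation and the resulting per-zone reliability (a sketch).

    zone_number_image: (H, W) int array, each pixel holding its cut-zone number.
    normal_zone_ids:   set of zone numbers determined to be normal mucosa regions.
    Returns a dict zone id -> reliability (1.0, or 0.0 when an improper mucosal
    area lies in the vicinity).
    """
    # step m1: 0 for normal mucosa zones, 1 for the others
    binary = np.where(np.isin(zone_number_image, list(normal_zone_ids)), 0, 1)
    # step m3: dilate the improper (value 1) areas
    dilated = ndimage.binary_dilation(binary, iterations=dilation_iter).astype(int)
    # step m5: pixel-wise sum; normal-zone pixels reached by the dilation become 1
    summed = binary + dilated
    # step m7: a normal zone containing a pixel of value 1 gets reliability 0
    reliability = {}
    for zid in normal_zone_ids:
        touches = bool(np.any(summed[zone_number_image == zid] == 1))
        reliability[zid] = 0.0 if touches else 1.0
    return reliability
```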
As described above, in embodiment 5, among the cut zones determined to be normal mucosa regions, those in whose vicinity a region determined not to be a normal mucosa region exists are identified. Then, among the cut zones determined to be normal mucosa regions, a low reliability is calculated for the cut zones determined to have an improper mucosal area in their vicinity, treating them as regions of an intermediate tone between normal mucosa regions and regions other than normal mucosa regions. The discrimination standard is therefore made in such a way that the characteristic quantities of regions with such intermediate tones are unlikely to be counted, so the discrimination precision of the normal mucosa region can be improved further.
Embodiment 6
The image processing apparatus of embodiment 6 will now be described. Figure 21 is a block diagram illustrating the functional structure of an image processing apparatus 1e of embodiment 6. Structures identical to those described in embodiment 1 are given the same reference marks. As shown in Fig. 21, the image processing apparatus 1e of embodiment 6 possesses an image acquiring section 11, an input unit 12, a display part 13, a storage part 14e, an operational part 20e, and a control portion 15 that controls the overall operation of the image processing apparatus 1e.
The storage part 14e stores an image processing program 141e for differentiating the normal mucosa region in each in vivo lumen image constituting the sequential image.
Additionally, the operational part 20e includes a feature value calculation unit 21, a discrimination standard preparing department 22e, and a specific region judegment part 25. The discrimination standard preparing department 22e possesses an initial value configuration part 221 and a weighted mean calculating part 222e, and the weighted mean calculating part 222e possesses a sequential distance weighting configuration part 223 and a reliability weight setting portion 224e. The reliability weight setting portion 224e possesses a reliability calculating portion 225e, and this reliability calculating portion 225e possesses a reliability calculating benchmark configuration part 226e. In embodiment 6, the reliability calculating benchmark configuration part 226e possesses a reliability calculating image extracting portion 235e. The reliability calculating image extracting portion 235e is a functional unit that extracts a plurality of in vivo lumen images from the in vivo lumen images constituting the sequential image, and possesses a reliability interval configuration part 236e. The reliability interval configuration part 236e is a functional unit that sets the time ordered interval from which the reliability calculating image extracting portion 235e extracts images, and possesses a reliability internal organs judegment part 237e. The reliability internal organs judegment part 237e differentiates the organ kind appearing in each in vivo lumen image.
Figure 22 is an overall flowchart illustrating the processing sequence performed by the image processing apparatus 1e of embodiment 6. The processing described here is realized by the respective units of the image processing apparatus 1e operating in accordance with the image processing program 141e stored in the storage part 14e. In Fig. 22, processing steps identical to those of embodiment 1 are given the same reference marks.
As shown in Fig. 22, in embodiment 6, after the image acquiring section 11 acquires the image data of the sequential image in step a1, the reliability calculating benchmark configuration part 226e of the operational part 20e performs the reliability calculating benchmark setting process (step n2). Figure 23 is a flowchart illustrating the detailed processing sequence of the reliability calculating benchmark setting process.
In the reliability calculating benchmark setting process, the reliability internal organs judegment part 237e first takes each in vivo lumen image constituting the sequential image in turn as a differentiation object, and differentiates the organ kind appearing in that in vivo lumen image (step o1). This process can be realized, for example, by the same method as the process described for step j31 of Fig. 16 in the above-described embodiment 4.
Then, the reliability interval configuration part 236e sets a time ordered interval for each organ kind according to the differentiation result of step o1 (step o3). For example, in the case where the esophagus, stomach, small intestine and large intestine are differentiated as organ kinds, step o3 sets a time ordered interval composed of the in vivo lumen images showing the esophagus, a time ordered interval composed of the in vivo lumen images showing the stomach, a time ordered interval composed of the in vivo lumen images showing the small intestine, and a time ordered interval composed of the in vivo lumen images showing the large intestine.
Then, the processing of loop A (step o5 to step o17) is performed for each organ kind for which a time ordered interval was set in step o3. That is, in loop A, the reliability calculating image extracting portion 235e first selects, for example at random, a plurality of in vivo lumen images from the in vivo lumen images constituting the time ordered interval set in step o3 for the organ kind being processed (step o7). The number of images extracted may be a fixed number set in advance, or may be configured to be changeable according to a user operation or the like.
Then, the reliability calculating benchmark configuration part 226e divides each of the plural in vivo lumen images extracted in step o7 into rectangular blocks of, for example, 8 × 8 pixels, that is, cut zones (step o9). Then, the reliability calculating benchmark configuration part 226e calculates the average value of the G/R value and the average value of the B/G value as the characteristic quantities of each of the obtained cut zones (step o11). Specifically, for each cut zone, the average value of the G/R values of the pixels in the cut zone is calculated according to the above formula (1), and the average value of the B/G values of the pixels in the cut zone is calculated according to the above formula (2).
Then, for the plural extracted in vivo lumen images, the reliability calculating benchmark configuration part 226e makes a frequency distribution, in the two-dimensional feature plane, of the average values of the G/R value and the B/G value calculated for each cut zone (step o13). Then, the reliability calculating benchmark configuration part 226e normalizes the frequency distribution so that the total of the frequencies becomes 100, and sets it as the reliability calculating benchmark for the organ kind being processed (step o15).
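Loop A (steps o5 to o17) could be sketched as follows; the sample size, bin count, feature range and the helper feature_fn are assumptions for illustration.

```python
import numpy as np

def per_organ_reliability_benchmarks(images_by_organ, feature_fn,
                                     sample_size=20, bins=64,
                                     feature_range=((0.0, 2.0), (0.0, 2.0)),
                                     seed=0):
    """One normalized (total = 100) feature histogram per organ kind (a sketch).

    images_by_organ: dict mapping an organ kind to its time ordered interval
                     of images.
    feature_fn:      assumed helper returning the (n, 2) array of per-cut-zone
                     (G/R mean, B/G mean) features of one image.
    """
    rng = np.random.default_rng(seed)
    benchmarks = {}
    for organ, interval_images in images_by_organ.items():
        n = min(sample_size, len(interval_images))
        picked = rng.choice(len(interval_images), size=n, replace=False)
        feats = np.vstack([feature_fn(interval_images[i]) for i in picked])
        hist, _, _ = np.histogram2d(feats[:, 0], feats[:, 1],
                                    bins=bins, range=feature_range)
        if hist.sum() > 0:
            hist *= 100.0 / hist.sum()
        benchmarks[organ] = hist
    return benchmarks
```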
By performing the processing of loop A for each organ kind as described above, the reliability calculating benchmark is set for each organ kind; the processing then returns to step n2 of Fig. 22 and transfers to step a3.
Additionally, in embodiment 6, after the specific region judegment part 25 performs the specific region differentiation process in step a9 and differentiates the normal mucosa region in the process object image, the reliability calculating portion 225e acquires the organ kind differentiated for the process object image in step o1 of Fig. 23 (step n11). Then, the reliability calculating portion 225e calculates a reliability for each cut zone in the process object image determined to be a normal mucosa region (step n13). After step n13, the processing transfers to step a15.
Specifically, the reliability calculating portion 225e calculates the reliability using the reliability calculating benchmark corresponding to the organ kind appearing in the process object image. That is, the reliability calculating portion 225e processes the cut zones determined to be normal mucosa regions in turn, and calculates the Mahalanobis distance between the characteristic quantities of the cut zone being processed (the average value of the G/R value and the average value of the B/G value) and the reliability calculating benchmark set, in the reliability calculating benchmark setting process of step n2, for the organ kind acquired in step n11 (the frequency distribution of the characteristic quantities of the cut zones in the in vivo lumen images showing the same organ as the process object image).
Then, the reliability calculating portion 225e calculates the reliability T from the calculated Mahalanobis distance according to the following formulas (5) and (6). Here, Maha denotes the calculated Mahalanobis distance. That is, the reliability calculating portion 225e distinguishes the two cases where the Mahalanobis distance Maha is at most 0.1 and where it exceeds 0.1, and calculates the reliability T of the cut zone being processed.
If Maha≤0.1, then T=1 ... (5)
If Maha > 0.1, then T = 1 / (|Maha| × 10) ... (6)
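Formulas (5) and (6) could be evaluated as in the sketch below; summarizing the reliability calculating benchmark by a mean vector and covariance in order to obtain Maha is an assumption about how the distance is computed.

```python
import numpy as np

def reliability_from_mahalanobis(feature, benchmark_mean, benchmark_cov):
    """Reliability T of formulas (5) and (6).

    feature: length-2 array (G/R mean, B/G mean) of the cut zone.
    benchmark_mean, benchmark_cov: assumed summary of the per-organ
    reliability calculating benchmark.
    """
    d = np.asarray(feature, dtype=float) - np.asarray(benchmark_mean, dtype=float)
    maha = float(np.sqrt(d @ np.linalg.inv(benchmark_cov) @ d))
    if maha <= 0.1:
        return 1.0                     # formula (5)
    return 1.0 / (abs(maha) * 10.0)    # formula (6)
```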
As described above, according to embodiment 6, the organ kind of each in vivo lumen image constituting the sequential image is differentiated, and a time ordered interval is set for each organ kind. Then, a plurality of in vivo lumen images are extracted from the time ordered interval corresponding to each organ kind, and the reliability calculating benchmark of each organ kind is set according to the characteristic quantities of the extracted in vivo lumen images. As described in embodiment 4, the mucosa color appearing in captured in vivo lumen images differs depending on the organ kind. According to embodiment 6, the reliability calculating benchmark can be made for each organ kind using the characteristic quantities of the in vivo lumen images showing that organ. Then, the reliability of the normal mucosa region in the process object image can be calculated using the reliability calculating benchmark corresponding to its organ kind, and the discrimination standard can be made in consideration of the calculated reliability. Therefore, the reliability of the normal mucosa region can be calculated more accurately, and since the discrimination standard is made in such a way that characteristic quantities for which a low reliability is calculated are unlikely to be counted, the discrimination precision of the normal mucosa region can be improved.
In the above-described embodiments, the average value of the G/R value and the average value of the B/G value have been exemplified as the characteristic quantities, but the characteristic quantities usable in the present invention are not limited to these; other values may be used. For example, a*b* values may be obtained from the RGB values of each pixel by L*a*b* conversion (reference: CG-ARTS Society, Digital Image Processing, pp. 62-63), and their average values may be calculated for each cut zone and used as the characteristic quantities. Alternatively, the RGB values may be converted into hue H and saturation S values by HSI conversion (reference: CG-ARTS Society, Digital Image Processing, pp. 64-68), and the average values of hue H and saturation S may be calculated for each cut zone and used as the characteristic quantities. When such other values are used as the characteristic quantities, it suffices to carry out, using those values, the same processes as those described in the above embodiments using the average value of the G/R value and the average value of the B/G value as the characteristic quantities.
Additionally, in the above-described embodiments, each in vivo lumen image constituting the sequential image is taken in turn as the process object image and is divided into cut zones, but this division into cut zones is carried out to reduce the processing load. Therefore, the region division need not necessarily be carried out, and the characteristic quantities may be calculated pixel by pixel. In that case, it suffices to calculate, for example, the G/R value and B/G value, the a*b* values, or the hue H and saturation S values of each pixel as the characteristic quantities of that pixel, and to carry out per pixel the processes carried out per cut zone in the above embodiments.
Additionally, the image processing apparatus 1 of the above-described embodiment 1, the image processing apparatus 1a of embodiment 2, the image processing apparatus 1b of embodiment 3, the image processing apparatus 1c of embodiment 4, the image processing apparatus 1d of embodiment 5, and the image processing apparatus 1e of embodiment 6 can each be realized by executing a prepared program on a computer system such as a personal computer or a workstation. In the following, a computer system that has the same functions as the image processing apparatuses 1, 1a, 1b, 1c, 1d, 1e described in embodiments 1 to 6 and that executes the image processing programs 141, 141a, 141b, 141c, 141d, 141e will be described.
Figure 24 is a system configuration diagram showing the configuration of a computer system 400 in this modification, and Fig. 25 is a block diagram showing the configuration of a main part 410 constituting this computer system 400. As shown in Fig. 24, the computer system 400 possesses the main part 410, a display 420 for displaying information such as images on a display screen 421 in accordance with instructions from the main part 410, a keyboard 430 for inputting various information to this computer system 400, and a mouse 440 for designating an arbitrary position on the display screen 421 of the display 420.
Additionally, as shown in Figs. 24 and 25, the main part 410 of this computer system 400 possesses a CPU 411, a RAM 412, a ROM 413, a hard disk drive (HDD) 414, a CD-ROM drive 415 that accepts a CD-ROM 460, a USB port 416 to which a USB memory 470 is removably connected, an I/O interface 417 connecting the display 420, the keyboard 430 and the mouse 440, and a LAN interface 418 for connecting to a local area network or wide area network (LAN/WAN) N1.
Furthermore, this computer system 400 is connected to a modem 450 for connecting to a public line N3 such as the Internet, and is connected, via the LAN interface 418 and the local area network or wide area network N1, to a personal computer (PC) 481 as another computer system, a server 482, a printer 483 and the like.
This computer system 400 realizes an image processing apparatus (for example, the image processing apparatus 1 of embodiment 1, the image processing apparatus 1a of embodiment 2, the image processing apparatus 1b of embodiment 3, the image processing apparatus 1c of embodiment 4, the image processing apparatus 1d of embodiment 5, or the image processing apparatus 1e of embodiment 6) by reading out and executing an image processing program stored in a storage medium (for example, the image processing program 141 of embodiment 1, the image processing program 141a of embodiment 2, the image processing program 141b of embodiment 3, the image processing program 141c of embodiment 4, the image processing program 141d of embodiment 5, or the image processing program 141e of embodiment 6). Here, the storage medium includes, in addition to the CD-ROM 460 and the USB memory 470, "portable physical media" such as an MO disk, DVD disk, floppy disk (FD), magneto-optical disk and IC card, "fixed physical media" such as the HDD 414, RAM 412 and ROM 413 provided inside and outside the computer system 400, and "communication media" that hold the program for a short time when it is transmitted, such as the public line N3 connected via the modem 450 and the local area network or wide area network N1 to which the PC 481 as another computer system and the server 482 are connected; that is, any storage medium that stores the image processing program so as to be readable by the computer system 400.
That is, the image processing program is stored in a storage medium such as a "portable physical medium", a "fixed physical medium" or a "communication medium" in a computer-readable manner, and the computer system 400 realizes the image processing apparatus by reading out and executing the image processing program from such a storage medium. The image processing program is not limited to being executed by the computer system 400; the present invention can equally be applied to the case where the PC 481 or the server 482 as another computer system executes the image processing program, and to the case where these devices execute it in cooperation.
Additionally, the present invention is not limited as it is to the above-described embodiments 1 to 6 and the modification; various inventions can be formed by appropriately combining the plural elements disclosed in the embodiments and the modification. For example, some elements may be removed from all the elements shown in each embodiment and modification. Alternatively, the elements shown in different embodiments and modifications may be combined as appropriate.
According to the present invention described above, the specific region in the images constituting a series of sequential images obtained by shooting a reference object chronologically can be differentiated with high precision.

Claims (23)

1. An image processing apparatus that processes a series of sequential images obtained by shooting a reference object chronologically, the image processing apparatus possessing:
an image selecting section that selects a process object image, according to temporal order, from the images constituting said sequential images;
a discrimination standard preparing department that makes a discrimination standard for differentiating a specific region in said process object image;
a feature value calculation unit that calculates a characteristic quantity of each pixel or each small region of said process object image; and a specific region judegment part that differentiates the specific region in said process object image using said discrimination standard, according to said characteristic quantity of each said pixel or each said small region,
wherein said discrimination standard preparing department makes said discrimination standard according to the characteristic quantity of said specific region in an image that has been selected as said process object image by said image selecting section and differentiated by said specific region judegment part,
said discrimination standard preparing department possesses:
an initial value configuration part that sets an initial value of said discrimination standard; and
a weighted mean calculating part that calculates a weighted mean of said initial value and the characteristic quantity of said specific region in said image for which the differentiation has been carried out,
and said discrimination standard preparing department makes said discrimination standard according to said weighted mean.
2. The image processing apparatus according to claim 1, wherein
said specific region judegment part possesses a division that classifies the distribution of said characteristic quantities of each said pixel or each said small region, and differentiates said specific region for each class obtained as a result of said classification.
3. The image processing apparatus according to claim 1, wherein
said initial value configuration part possesses an initial value image extracting portion that extracts a plurality of images from said sequential images, and sets said initial value according to the characteristic quantities of said plurality of images.
4. The image processing apparatus according to claim 3, wherein
said initial value image extracting portion possesses an initial value interval configuration part that sets a time ordered interval from which said plurality of images are extracted, and extracts said plurality of images from said time ordered interval.
5. The image processing apparatus according to claim 4, wherein
said reference object is an in vivo lumen,
said initial value interval configuration part possesses an initial value internal organs judegment part that differentiates the organ kind appearing in the images constituting said sequential images, and said initial value interval configuration part sets said time ordered interval according to the differentiation result of said organ kind.
6. The image processing apparatus according to claim 1, wherein
the initial value setting unit includes:
an initial value weighted mean calculating unit that successively processes the images constituting an initial value setting interval of the time-series images and calculates a weighted mean of the feature values of the specific regions in those images, giving a larger weight to a processed image whose time-series distance from the image currently being processed is smaller; and
a stop control unit that stops the calculation at the point in time when all the images constituting the initial value setting interval have been processed, and
the weighted mean obtained at the point in time when the calculation is stopped is set as the initial value.
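One possible reading of claims 3 to 6 is sketched below: the images of the initial value setting interval are processed in temporal order, their specific-region feature values are folded into a recency-weighted running mean, and the value present when the whole interval has been processed becomes the initial value. The exponential-style recency weighting and the decay parameter are assumptions, not details stated in the claims.

import numpy as np

def set_initial_value(interval_region_features, decay=0.5):
    # interval_region_features: specific-region feature values of the images
    # in the initial value setting interval, given in temporal order.
    weighted_mean = None
    for feature in interval_region_features:
        feature = np.asarray(feature, dtype=float)
        if weighted_mean is None:
            weighted_mean = feature
        else:
            # The image closest in time to the one being processed
            # receives the largest weight.
            weighted_mean = (1.0 - decay) * weighted_mean + decay * feature
    # Stop control: the loop ends once every image in the interval has been
    # processed; the weighted mean at that moment is the initial value.
    return weighted_mean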
7. The image processing apparatus according to claim 6, wherein
the image selecting unit selects the processing target images by going back toward the front of the time series with reference to the temporal order of the image at the end of the initial value setting interval, and, after the image at the front end of the temporal order has been selected as the processing target image, selects the processing target images toward the rear of the time series starting from the image at the end of the initial value setting interval, and
when the image selecting unit selects the image at the end of the initial value setting interval as the processing target image, the initial value setting unit resets the discrimination standard to the weighted mean obtained at the point in time when the calculation was stopped, as the initial value.
8. The image processing apparatus according to claim 1, wherein
the initial value setting unit includes:
an initial value weighted mean calculating unit that, within an initial value setting interval of the time-series images, calculates a weighted mean of the feature values of the specific regions in the images, setting a larger weight for an image whose time-series distance from the image located at the end of the time series is smaller; and
a stop control unit that controls the stopping of the calculation of the weighted mean based on the weighted mean, and
the initial value setting unit sets, as the initial value, the weighted mean obtained at the point in time when the calculation of the weighted mean is stopped.
9. The image processing apparatus according to claim 1, wherein
the weighted mean calculating unit includes a time-series distance weight setting unit that sets, for the feature value of the specific region in the image for which the discrimination has already been performed, a weight corresponding to the time-series distance between that image and the processing target image, and
the weighted mean is calculated according to the weight corresponding to the time-series distance.
10. The image processing apparatus according to claim 9, wherein
the time-series distance weight setting unit sets a larger weight corresponding to the time-series distance as the time-series distance becomes smaller.
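Claims 9 and 10 only require that a past image's contribution shrink as its time-series distance from the processing target image grows; the sketch below assumes an exponential decay for that weight, which is one choice among many, and all names are hypothetical.

import numpy as np

def time_series_distance_weight(time_distance, scale=10.0):
    # The smaller the time-series distance, the larger the weight.
    return np.exp(-time_distance / scale)

def weighted_mean_with_distance(initial_value, past_features, past_distances, w_init=1.0):
    # Combine the initial value with the specific-region feature values of
    # already-discriminated images, weighted by their time-series distance.
    weights = [w_init]
    values = [np.asarray(initial_value, dtype=float)]
    for feature, distance in zip(past_features, past_distances):
        weights.append(time_series_distance_weight(distance))
        values.append(np.asarray(feature, dtype=float))
    return np.average(np.stack(values), axis=0, weights=np.asarray(weights))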
11. The image processing apparatus according to claim 1, wherein
the weighted mean calculating unit includes:
a reliability calculating unit that calculates a reliability of the specific region in the processing target image; and
a reliability weight setting unit that sets, for the feature value of the specific region in the processing target image, a weight corresponding to the reliability, and
the weighted mean calculating unit calculates the weighted mean of the feature value of the specific region in the processing target image according to the weight corresponding to the reliability.
12. The image processing apparatus according to claim 11, wherein
the reliability calculating unit calculates the reliability based on the feature value of the specific region.
13. The image processing apparatus according to claim 12, wherein
the reliability calculating unit includes a statistic calculating unit that calculates a statistic of the feature values of the pixels or small regions determined by the specific region discriminating unit to belong to the specific region in the processing target image, and
calculates the reliability of the specific region based on the statistic.
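Claims 11 to 13 can be pictured as follows: a statistic of the feature values inside the discriminated specific region yields a reliability, and that reliability becomes the weight of the image's contribution to the weighted mean. The use of the variance as the statistic and the exponential mapping to a reliability in (0, 1] are illustrative assumptions, not details fixed by the claims.

import numpy as np

def region_reliability(region_features, sigma=1.0):
    # A tight (low-variance) feature distribution inside the discriminated
    # region is treated as more reliable than a scattered one.
    region_features = np.asarray(region_features, dtype=float)
    statistic = float(np.mean(np.var(region_features, axis=0)))
    return float(np.exp(-statistic / sigma))

def reliability_weighted_update(standard, region_features, sigma=1.0):
    region_features = np.asarray(region_features, dtype=float)
    reliability = region_reliability(region_features, sigma)
    region_mean = region_features.mean(axis=0)
    # The more reliable the discriminated region, the more it pulls the
    # discrimination standard toward its mean feature value.
    return (1.0 - reliability) * np.asarray(standard, dtype=float) + reliability * region_mean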
14. The image processing apparatus according to claim 12, wherein
the reliability calculating unit includes a near-region discriminating unit that discriminates a region near the specific region, and calculates the reliability of the specific region based on the discrimination result of the near region.
15. The image processing apparatus according to claim 12, wherein
the reliability calculating unit includes a reliability calculation standard setting unit that sets a reliability calculation standard used for calculating the reliability, and the reliability calculating unit calculates the reliability of the specific region based on the feature value of the specific region by using the reliability calculation standard.
16. The image processing apparatus according to claim 15, wherein
the reliability calculation standard setting unit sets, as the reliability calculation standard, the discrimination standard used by the specific region discriminating unit for discriminating the specific region in the processing target image.
17. The image processing apparatus according to claim 15, wherein
the reliability calculation standard setting unit sets the reliability calculation standard based on the distribution of the feature values of the pixels or small regions that were not determined by the specific region discriminating unit to belong to the specific region in the processing target image.
18. The image processing apparatus according to claim 15, wherein
the reliability calculation standard setting unit includes a reliability calculation image extracting unit that extracts a plurality of images from the time-series images, and sets the reliability calculation standard based on the feature values of the plurality of images.
19. The image processing apparatus according to claim 18, wherein
the reliability calculation image extracting unit includes a reliability interval setting unit that sets a time-series interval from which the plurality of images are extracted, and extracts the plurality of images from the time-series interval.
20. The image processing apparatus according to claim 19, wherein
the reliability interval setting unit includes a reliability organ discriminating unit that discriminates the type of organ captured in each image constituting the time-series images, and sets the time-series interval based on the discrimination result of the organ type.
21. The image processing apparatus according to claim 11, wherein
the reliability calculating unit includes a non-specific region determining unit that determines whether or not the processing target image contains a pixel or small region that was not determined by the specific region discriminating unit to belong to the specific region, and
the reliability calculating unit calculates the reliability based on the presence or absence of a pixel or small region not determined to belong to the specific region.
22. The image processing apparatus according to claim 1, wherein
the imaging target is an in-vivo lumen, and
the specific region is a region of normal mucosa on the wall of the in-vivo lumen.
23. An image processing method comprising the steps of:
selecting, in temporal order, a processing target image from among images constituting a series of time-series images obtained by chronologically imaging an imaging target;
preparing a discrimination standard for discriminating a specific region in the processing target image;
calculating a feature value of each pixel or each small region of the processing target image; and
discriminating the specific region in the processing target image by using the discrimination standard, based on the feature value of each pixel or each small region,
wherein in the step of preparing the discrimination standard, the discrimination standard is prepared based on the feature value of the specific region in an image that was previously selected as the processing target image and for which the discrimination has already been performed,
the step of preparing the discrimination standard further comprises the steps of:
setting an initial value of the discrimination standard; and
calculating a weighted mean of the initial value and the feature value of the specific region in the image for which the discrimination has already been performed, and
in the step of preparing the discrimination standard, the discrimination standard is prepared based on the weighted mean.
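Read end to end, the method of claim 23 amounts to the loop sketched below, following the same pattern as the earlier examples; extract_features and the fixed blending weights are hypothetical assumptions introduced only for illustration.

import numpy as np

def process_time_series(images, initial_value, extract_features, threshold):
    # images: the series of time-series images in temporal order.
    # extract_features: hypothetical helper returning per-pixel feature values.
    standard = np.asarray(initial_value, dtype=float)
    results = []
    for image in images:                      # select processing target images in order
        features = extract_features(image)    # feature value of each pixel / small region
        mask = np.linalg.norm(features - standard, axis=1) < threshold
        results.append(mask)                  # discriminated specific region
        if mask.any():
            # Prepare the discrimination standard for the next image from the
            # specific regions discriminated in the images processed so far.
            standard = 0.8 * standard + 0.2 * features[mask].mean(axis=0)
    return results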
CN201110122736.2A 2010-05-14 2011-05-12 Image processing apparatus and image processing method Active CN102243710B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010112541A JP5576711B2 (en) 2010-05-14 2010-05-14 Image processing apparatus, image processing method, and image processing program
JP2010-112541 2010-05-14

Publications (2)

Publication Number Publication Date
CN102243710A CN102243710A (en) 2011-11-16
CN102243710B true CN102243710B (en) 2016-12-14

Family

ID=

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101048800A (en) * 2004-08-31 2007-10-03 美国西门子医疗解决公司 Method and system for motion correction in a sequence of images
CN101084528A (en) * 2004-12-20 2007-12-05 皇家飞利浦电子股份有限公司 A method, a system and a computer program for integration of medical diagnostic information and a geometric model of a movable body
CN101136066A (en) * 2006-07-25 2008-03-05 富士胶片株式会社 System for and method of taking image and computer program
EP2052674A1 (en) * 2006-08-08 2009-04-29 Olympus Medical Systems Corp. Medical image processing device and medical image processing method
EP2149326A1 (en) * 2007-05-08 2010-02-03 Olympus Corporation Image processing device and image processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant