CN101166456A - Image processing apparatus, image processing method and image processing program - Google Patents

Image processing apparatus, image processing method and image processing program

Info

Publication number
CN101166456A
Authority
CN
China
Prior art keywords
mentioned
hemorrhagic
candidate region
zone
edge
Prior art date
Legal status
Granted
Application number
CNA2006800140335A
Other languages
Chinese (zh)
Other versions
CN101166456B (en)
Inventor
井上凉子 (Ryoko Inoue)
野波徹绪 (Tetsuo Nonami)
Current Assignee
Olympus Corp
Original Assignee
Olympus Medical Systems Corp
Priority date
Filing date
Publication date
Priority claimed from JP2005130231A (granted as JP4855709B2)
Application filed by Olympus Medical Systems Corp
Priority claimed from PCT/JP2006/305024 (published as WO 2006/117932 A1)
Publication of CN101166456A
Application granted
Publication of CN101166456B
Legal status: Expired - Fee Related


Abstract

The invention provides an image processing apparatus, an image processing method, and an image processing program. A hemorrhagic edge candidate region extracting section extracts candidate regions for the contour portion of a hemorrhagic region from the image signal of a medical image composed of color signals obtained by imaging a living body. A feature value calculating section divides the medical image into small regions and calculates a feature value characteristic of hemorrhagic regions from the amount of variation of the image signal in the small regions that contain a candidate region. A hemorrhagic edge judging section then judges from the feature value whether the candidate region is the contour portion of a hemorrhagic region.

Description

Image processing apparatus, image processing method and image processing program
Technical field
The present invention relates to an image processing apparatus, an image processing method, and an image processing program for extracting hemorrhagic edges from medical images obtained by an endoscope apparatus or similar equipment.
Background art
In the medical field, observation and diagnosis of organs within a body cavity using medical equipment with imaging functions, such as X-ray, CT, MRI, ultrasound observation apparatus, and endoscope apparatus, are widely practiced.
An endoscope apparatus, for example, inserts an elongated insertion section into the body cavity, and an image pickup unit such as a solid-state image sensor captures the image of the intra-cavity organs taken in by an objective optical system arranged at the distal end of the insertion section. From this image pickup signal, an endoscopic image of the organs is displayed on a monitor screen, and the operator observes and makes a diagnosis from the displayed image.
Because an endoscope apparatus can image the gastrointestinal mucosa directly, the operator can comprehensively observe the color tone of the mucosa, the shape of lesions, the fine structure of the mucosal surface, and other findings.
In recent years, capsule endoscope devices have been developed as a new class of medical equipment whose image pickup function promises usefulness comparable to that of such endoscope apparatuses. From the time an examinee swallows the capsule endoscope until it is excreted, the capsule endoscope device images the interior of the body cavity and transmits the image pickup signals to a receiver placed outside the body. Passing through each digestive organ — esophagus, stomach, duodenum, small intestine, and large intestine — takes the capsule several hours.
Suppose the capsule endoscope captures, say, 2 frames per second and sends them to the external receiver, and that 6 hours elapse between swallowing and excretion; the number of images captured while the capsule travels through the body cavity is then 43,200, an enormous quantity.
When this huge number of images is displayed on an observation device for diagnosis, displaying them all would take 72 minutes even at, say, 10 images per second, which is a long time. Having the operator observe the captured images for such a long period places a heavy burden on the operator and is a serious problem.
Moreover, the final diagnosis in an endoscopic examination using a capsule endoscope or an ordinary endoscope apparatus depends largely on the doctor's subjective judgment, so the quality of diagnosis also varies. Computer-aided diagnosis (CAD) that automatically detects the presence of lesions such as hemorrhage, redness, abnormal blood vessels, and polyps from endoscopic images is therefore desired, with the aim of improving the quality of image-based diagnosis and shortening the reading time for endoscopic images.
Computer-aided diagnosis (CAD) is realized with an endoscopic diagnosis support apparatus. The apparatus presents the doctor with various feature values calculated from a region of interest (ROI) in the image and classifies the image under diagnosis by its findings, using threshold processing or a statistical or non-statistical classifier, thereby providing objective, numerical diagnostic support. By selecting and presenting images suspected to contain lesions, it also lightens the doctor's image-reading burden.
Hemorrhage, on the other hand, can arise from various pathological causes, and several methods for detecting it have been used. As one such method, PCT WO 02/073507 A2 proposes using the endoscopic diagnosis support apparatus described above to detect hemorrhage automatically from the hue, saturation, and brightness of an observation-target region in an endoscopic image.
In the method of that publication, however, the endoscopic diagnosis support apparatus compares the hue, saturation, and brightness values of the observation-target region with preset sample values of hue, saturation, and brightness for normal mucosa and for hemorrhagic areas, and decides from the distances to those sample values whether the observation-target region is normal mucosa or a hemorrhagic area.
The discrimination result therefore depends on the sample values set in the endoscopic diagnosis support apparatus.
Summary of the invention
The present invention therefore aims to provide an image processing apparatus, an image processing method, and an image processing program that detect hemorrhagic regions using the amount of variation of the image signal or of the color signals at the contour portion of the hemorrhagic region.
An image processing apparatus according to a first aspect of the present invention comprises:
a hemorrhagic edge candidate region extracting section, which extracts candidate regions for the contour portion of a hemorrhagic region from the image signal of a medical image composed of a plurality of color signals obtained by imaging a living body;
a feature value calculating section, which divides the medical image into a plurality of small regions and calculates a feature value characteristic of hemorrhagic regions from the amount of variation of the image signal within the small regions that contain a candidate region; and
a hemorrhagic edge judging section, which judges from the feature value whether the candidate region is the contour portion of a hemorrhagic region.
An image processing apparatus according to a second aspect of the present invention comprises:
an evaluation region setting section, which divides a medical image corresponding to a medical image signal composed of a plurality of color signals obtained by imaging a living body into a plurality of small regions, extracts from those small regions, using at least one of the color signals, the small regions that contain the contour portion of a hemorrhagic region, sets each extracted small region as a hemorrhagic evaluation region, and sets an evaluation-target region consisting of a plurality of the small regions around the hemorrhagic evaluation region;
a hemorrhagic candidate region judging section, which extracts hemorrhagic edge candidate regions from the evaluation-target region according to the amount of variation of the color signal within it, and judges from the proportion of hemorrhagic edge candidate regions within the evaluation-target region whether the hemorrhagic evaluation region is a hemorrhagic candidate region; and
a hemorrhagic region judging section, which extracts the contour portion of the hemorrhagic region from the hemorrhagic edge candidate regions according to the variation of two or more of the color signals there, and judges from the proportion of contour-portion regions among the hemorrhagic edge candidate regions whether the hemorrhagic candidate region is a hemorrhagic region.
Description of drawings
Fig. 1 schematically shows the network configuration of the image processing apparatus of a first embodiment of the present invention and related systems.
Fig. 2 schematically shows the overall configuration of the image processing apparatus.
Fig. 3 is a flowchart describing the image analysis processing procedure of the image processing program.
Fig. 4 is a flowchart describing the hemorrhage edge candidate extraction processing procedure.
Fig. 5 gives schematic views of the analysis data: Fig. 5(a) illustrates the original image, Fig. 5(b) the hemorrhage edge candidate image, Fig. 5(c) the shape edge image, and Fig. 5(d) the hemorrhage edge image.
Fig. 6 is a flowchart describing the shape edge extraction processing procedure.
Fig. 7 illustrates the method of computing the R variation.
Fig. 8 is a flowchart describing the hemorrhage edge decision processing procedure.
Fig. 9 is a flowchart describing the image analysis processing procedure of the image processing program in a second embodiment of the present invention.
Fig. 10 is a flowchart describing the procedure for computing the color edge feature value.
Fig. 11 illustrates the method of computing the R variation and the G variation.
Fig. 12 is a flowchart describing the image analysis processing procedure of the image processing program in a third embodiment of the present invention.
Fig. 13 is a schematic view describing the relation between a hemorrhage edge candidate region and its background region.
Fig. 14 schematically shows the values of the R signal of the pixels on line C-C' of Fig. 13: Fig. 14(a) shows the change of the R-signal values when the hemorrhage edge candidate region is a hemorrhage edge, and Fig. 14(b) the change when the candidate region is an edge formed by something other than hemorrhage.
Fig. 15 is a flowchart describing the image analysis processing procedure of the image processing program in a fourth embodiment of the present invention.
Fig. 16 is a schematic view describing the relation between a hemorrhage edge candidate region and its interior region.
Fig. 17 schematically shows the overall configuration of the image processing apparatus of a fifth embodiment of the present invention.
Fig. 18 is a flowchart describing the image analysis processing procedure of the image processing program.
Fig. 19 is a flowchart describing the edge extraction processing procedure.
Fig. 20 illustrates the method of computing the G variation.
Fig. 21 is a flowchart describing the hemorrhage candidate extraction processing procedure.
Fig. 22 describes the hemorrhage candidate extraction processing.
Fig. 23 describes the method of judging hemorrhage candidates.
Fig. 24 is a flowchart describing the hemorrhage decision processing procedure.
Fig. 25 illustrates the method of computing the R variation and the G variation.
Fig. 26 is a flowchart describing the hemorrhage edge candidate judgment processing procedure in the second embodiment.
Fig. 27 describes the placement of the peripheral region in the fifth embodiment.
Fig. 28 describes the hemorrhage candidate judgment processing in the fifth embodiment.
Fig. 29 describes the arrangement of the peripheral regions in a sixth embodiment.
Specific embodiments
Embodiments of the present invention are described below with reference to the accompanying drawings.
(1st embodiment)
The first embodiment of the present invention is described below with reference to Figs. 1 through 8. The purpose of this embodiment is to provide an image processing apparatus, an image processing method, and an image processing program that use the amount of variation of the color signals at the contour portion of a hemorrhagic region (hereinafter, the hemorrhagic edge) to judge comparatively whether a region is a hemorrhagic edge and to detect it.
Hemorrhagic regions include hemorrhage regions, where bleeding actually occurs from the mucosa, and redness regions, where the mucosal surface is reddened by hyperemia or the like. This embodiment describes the case of detecting, for example, the contour portion of a hemorrhage region (hereinafter, the hemorrhage edge).
First, the network configuration of the image processing apparatus 1 of the first embodiment of the present invention and related systems is described with reference to Fig. 1, which shows it schematically.
As shown in Fig. 1, the image processing apparatus 1, which performs various kinds of image processing and information processing, is connected to a LAN 2 that uses TCP/IP as its communication protocol. An endoscopic observation device 3, which images living tissue such as mucosa and outputs the image signal of a medical image, is also connected to the LAN 2 via an endoscope filing device 4.
The endoscope filing device 4 receives or takes in the image signal from the endoscopic observation device 3, generates image data, and accumulates the generated image data. The image processing apparatus 1 then obtains the image data accumulated in the endoscope filing device 4 via the LAN 2.
The overall configuration of the image processing apparatus 1 is described next. The image processing apparatus 1 is built around a general-purpose personal computer 11 and, as shown in Fig. 2, includes an operating device 12 consisting of a keyboard and mouse, a storage device 13 consisting of a hard disk, and a display device 14 consisting of a CRT.
Fig. 2 schematically shows the overall configuration of the image processing apparatus 1. The operating device 12, storage device 13, and display device 14 are each electrically connected to the personal computer 11. The designation of the image data to be processed, the acquisition and display of the designated image data, and instructions to execute processing are input from the operating device 12, and the results of the various processes performed by the image processing apparatus 1 are displayed on the display device 14.
The personal computer 11 comprises: a CPU 21, which executes and controls the various programs; a memory 22, which stores the various processing programs and data; an external storage I/F 23, which reads from and writes to the storage device 13; a network card 24, which communicates with external equipment; an operation I/F 25, which receives operation signals input from the operating device 12 and performs the necessary data handling; and a graphics board 26, which outputs video signals to the display device 14. The CPU 21, memory 22, external storage I/F 23, network card 24, operation I/F 25, and graphics board 26 that make up the personal computer 11 are each electrically connected via a bus 27, so these elements can exchange information with one another over the bus 27.
The network card 24 is electrically connected to the LAN 2 and exchanges information with the endoscope filing device 4 connected to the same LAN 2.
The external storage I/F 23 reads the image processing program 28 stored in the storage device 13 and stores it in the memory 22. The image processing program 28 performs image analysis processing and consists of multiple executable files, dynamic link library files, and setting files. The CPU 21 operates by executing the image processing program 28 stored in the memory 22.
The CPU 21 performs image analysis processing — the detection and judgment of the contour portion (edge portion) of hemorrhagic regions, described later — on the image data obtained from the endoscope filing device 4. The analysis data 29 obtained and generated by the individual processes in the CPU 21 are stored in the memory 22. This analysis data 29 includes the original image 31, which is the image data obtained from the endoscope filing device 4.
The analysis data 29 further includes a hemorrhage edge candidate image 32, a shape edge image 33, and a hemorrhage edge image 34 generated by the various processes described later; these images 32-34 are detailed below. The CPU 21 provides the functions of Fig. 3 explained below: a hemorrhage edge candidate extraction function, a feature value computation function, and a hemorrhage edge decision function.
The operation of the image processing apparatus 1 configured as above is described next. In this embodiment, the case of detecting, for example, the contour portion of a hemorrhage region (hereinafter, the hemorrhage edge) is explained using the flowchart of Fig. 3.
Fig. 3 is a flowchart describing the image analysis processing procedure of the image processing program 28. First, in the original-image acquisition step S1, the CPU 21 obtains the image data designated from the operating device 12 from the image filing device 4 and stores it in the memory 22 as the original image 31. The original image 31 is a color image composed of the three primaries red (R), green (G), and blue (B), and the gray level of the pixels of each primary is assumed to take 8-bit values, i.e. 0-255.
Next, in the image analysis step S2, the CPU 21 applies various processes to the original image 31 obtained in step S1 and generates the hemorrhage edge candidate image 32, the shape edge image 33, and the hemorrhage edge image 34.
This image analysis step (step S2) consists of the following processes: hemorrhage edge candidate extraction (step S10), which generates the hemorrhage edge candidate image 32 from the original image 31; shape edge extraction (step S20), which computes feature values from the hemorrhage edge candidate image 32 and generates the shape edge image 33; and hemorrhage edge decision (step S40), which generates the hemorrhage edge image 34 from the hemorrhage edge candidate image 32 and the shape edge image 33.
Note that the feature computation above may yield shape edges and hemorrhage edge candidates in a mixed state; the hemorrhage edge decision processing then separates them to generate the hemorrhage edge image 34.
The processes are executed in the order of steps S10, S20, S40, and each process of the image analysis step (step S2) is described below in that order.
First, the hemorrhage edge candidate extraction of step S10 is described using Fig. 4, a flowchart of that procedure. In step S11, the CPU 21 divides the original image 31 into N x N small regions; in this embodiment, N = 36, for example.
As preprocessing for step S11, inverse gamma correction or shading correction of the original image 31 may be added; in that case the corrected image is divided into the small regions and the subsequent processing is applied to it.
Next, in step S12, the CPU 21 initializes to 1 the number i that designates the divided small region (hereinafter, divided region) under analysis. The number i designating a divided region 43 is an integer from 1 to N x N.
In step S13, the CPU 21 obtains the values (luminance values) of the R, G, and B signals of the pixels contained in the i-th divided region 43 and calculates the mean values Ra_i, Ga_i, Ba_i of each color signal in the i-th divided region 43.
Next, in step S14, the CPU 21 compares the mean values Ra_i, Ga_i, Ba_i calculated in step S13 with a preset hemorrhage-region tone space.
The hemorrhage-region tone space is the interior volume obtained by bounding, in the three-dimensional space whose x, y, and z axes are the gray levels of the R, G, and B signals, the gray-level ranges of the R signal (Rmin <= x <= Rmax), G signal (Gmin <= y <= Gmax), and B signal (Bmin <= z <= Bmax) found in hemorrhagic areas.
In step S14, the CPU 21 judges whether the mean values Ra_i, Ga_i, Ba_i calculated in step S13 lie inside the hemorrhage-region tone space, and moves to step S15 when the judgment is affirmative.
That is, when the calculated mean values Ra_i, Ga_i, Ba_i lie inside the hemorrhage-region tone space — when Rmin <= Ra_i <= Rmax and Gmin <= Ga_i <= Gmax and Bmin <= Ba_i <= Bmax — processing moves to step S15, in which the CPU 21 judges the i-th divided region 43 to be a hemorrhage edge candidate.
On the other hand, when the mean values Ra_i, Ga_i, Ba_i calculated in step S13 do not lie inside the hemorrhage-region tone space — when any of Rmin > Ra_i, Ra_i > Rmax, Gmin > Ga_i, Ga_i > Gmax, Bmin > Ba_i, or Ba_i > Bmax holds — processing moves to step S16, in which the CPU 21 judges the i-th divided region 43 not to be a hemorrhage edge candidate.
A tone space of G/R and B/G may also be used for the hemorrhage edge candidate judgment: when (G/R)min <= Ga_i/Ra_i <= (G/R)max and (B/G)min <= Ba_i/Ga_i <= (B/G)max, the CPU 21 judges the i-th divided region 43 to be a hemorrhage edge candidate.
Here (G/R)min and (G/R)max denote the minimum and maximum of G/R, and (B/G)min and (B/G)max the minimum and maximum of B/G.
When step S15 or step S16 finishes, the CPU 21 judges in step S17 whether the hemorrhage edge candidate judgment has been performed for all divided regions 43.
Specifically, if i < N x N, the CPU 21 adds 1 to the region number i (i = i + 1) in step S18, returns to step S13, and applies the candidate judgment to the remaining divided regions. If i = N x N, the CPU 21 ends the processing of Fig. 4 (that is, the hemorrhage edge candidate extraction of step S10 of Fig. 3) and moves on to the shape edge extraction of step S20 of Fig. 3.
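As a concrete illustration, steps S11-S18 reduce to a per-block test of the mean color against a box in RGB space. The following is a minimal sketch in Python/NumPy; the tone-space bounds are placeholders, since the patent does not give numeric values for Rmin..Bmax:

import numpy as np

# Placeholder tone-space bounds (assumptions); in practice Rmin..Bmax
# would be tuned on real endoscopic image data.
R_MIN, R_MAX = 40, 230
G_MIN, G_MAX = 5, 80
B_MIN, B_MAX = 5, 80

def extract_edge_candidates(image, n=36):
    """Steps S11-S18: divide an H x W x 3 RGB image into n x n divided
    regions and mark each one whose mean color (Ra_i, Ga_i, Ba_i) lies
    inside the hemorrhage-region tone space.  Returns an n x n boolean
    array of candidate flags."""
    h, w, _ = image.shape
    candidates = np.zeros((n, n), dtype=bool)
    for iy in range(n):
        for ix in range(n):
            block = image[iy * h // n:(iy + 1) * h // n,
                          ix * w // n:(ix + 1) * w // n]
            ra, ga, ba = block.reshape(-1, 3).mean(axis=0)  # Ra_i, Ga_i, Ba_i
            candidates[iy, ix] = (R_MIN <= ra <= R_MAX and
                                  G_MIN <= ga <= G_MAX and
                                  B_MIN <= ba <= B_MAX)
    return candidates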
When the hemorrhage edge candidate extraction finishes, the hemorrhage edge candidate image 32 shown in Fig. 5(b) is generated from the original image 31 shown in Fig. 5(a). Fig. 5 gives schematic views of the analysis data 29.
That is, Fig. 5(a) illustrates the original image 31, Fig. 5(b) the hemorrhage edge candidate image 32, Fig. 5(c) the shape edge image 33, and Fig. 5(d) the hemorrhage edge image 34.
The original image 31 of Fig. 5(a) contains a mucosa shape region 41, such as a groove formed on the mucosal surface, and a hemorrhage region 42. In the hemorrhage edge candidate image 32 of Fig. 5(b) obtained from this original image 31, every divided region 43 is classified either as a hemorrhage edge candidate region 44 (hatched) or as a non-candidate region 45; the hemorrhage edge candidate regions 44 include both the mucosa shape region 41 and the hemorrhage region 42.
The shape edge extraction is described next using Fig. 6, a flowchart of the shape edge extraction procedure of step S20 of Fig. 3.
In this process the CPU 21 extracts the contour portion of the mucosa shape region 41 (hereinafter, the shape edge) as a shape edge region — a large region formed by many connected divided regions 43. First, in step S21, the CPU 21 initializes to 1 the number i designating the divided region 43 under analysis; i is an integer from 1 to N x N.
In step S22, the CPU 21 calculates the variation of the R-signal values in the i-th divided region 43 (hereinafter, the R variation). The R variation is computed from the R-signal value (R1) of a specific pixel in the i-th divided region 43 and the R-signal value (R2) of another specific pixel in the same divided region 43; specifically, R variation = log_e(R2) - log_e(R1).
In this embodiment, as shown in Fig. 7, the CPU 21 calculates the R variation in each of eight directions within the divided region 43: up, down, left, right, and the four diagonals. Figs. 7(a)-(h) illustrate the computation.
The 1st R variation, shown in Fig. 7(a), is the upward variation, computed with the R value of the bottom-center pixel as R1 and that of the top-center pixel as R2. The 2nd, shown in Fig. 7(b), is toward the upper-right diagonal, with the lower-left pixel as R1 and the upper-right pixel as R2. The 3rd, shown in Fig. 7(c), is toward the right, with the left-center pixel as R1 and the right-center pixel as R2.
The 4th, shown in Fig. 7(d), is toward the lower-right diagonal, with the upper-left pixel as R1 and the lower-right pixel as R2. The 5th, shown in Fig. 7(e), is the downward variation, with the top-center pixel as R1 and the bottom-center pixel as R2. The 6th, shown in Fig. 7(f), is toward the lower-left diagonal, with the upper-right pixel as R1 and the lower-left pixel as R2.
The 7th, shown in Fig. 7(g), is toward the left, with the right-center pixel as R1 and the left-center pixel as R2. The 8th, shown in Fig. 7(h), is toward the upper-left diagonal, with the lower-right pixel as R1 and the upper-left pixel as R2.
In step S23, the CPU 21 takes the maximum of the 1st-8th R variations calculated in step S22 as the edge feature value A_i. In step S24 it judges whether the i-th divided region 43 is an edge: specifically, the region is judged an edge when A_i > Th1.
Here Th1 is the threshold serving as the criterion for the edge judgment; in this embodiment, Th1 = 0.14, for example. If A_i > Th1, the CPU 21 judges in step S25 that the i-th divided region 43 is an edge and proceeds to step S27. If A_i <= Th1, it judges in step S26 that the i-th divided region 43 is not an edge and proceeds to step S27.
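Steps S22-S26 thus reduce to computing eight log-ratio variations per block and thresholding their maximum. A sketch under the same assumptions follows; the exact sampling pixels within a block are illustrative, since the text fixes only the eight directions and the log_e(R2) - log_e(R1) form:

import numpy as np

TH1 = 0.14  # edge-judgment threshold of this embodiment (step S24)

# Eight (R1, R2) samplings as fractional (row, col) positions in a block,
# following Fig. 7: e.g. the "up" variation takes R1 at the bottom-center
# pixel and R2 at the top-center pixel.
_PAIRS = [
    ((0.9, 0.5), (0.1, 0.5)),  # (a) up
    ((0.9, 0.1), (0.1, 0.9)),  # (b) upper-right diagonal
    ((0.5, 0.1), (0.5, 0.9)),  # (c) right
    ((0.1, 0.1), (0.9, 0.9)),  # (d) lower-right diagonal
    ((0.1, 0.5), (0.9, 0.5)),  # (e) down
    ((0.1, 0.9), (0.9, 0.1)),  # (f) lower-left diagonal
    ((0.5, 0.9), (0.5, 0.1)),  # (g) left
    ((0.9, 0.9), (0.1, 0.1)),  # (h) upper-left diagonal
]

def r_variations(block_r):
    """1st-8th R variations log_e(R2) - log_e(R1) of one divided region
    (step S22).  block_r: 2-D array of R-signal values."""
    h, w = block_r.shape
    def at(pos):
        # nearest pixel at the fractional position; clamped to >= 1 so the
        # logarithm stays defined for zero-valued pixels (an assumption)
        return max(float(block_r[int(pos[0] * (h - 1)),
                                 int(pos[1] * (w - 1))]), 1.0)
    return [np.log(at(r2)) - np.log(at(r1)) for r1, r2 in _PAIRS]

def is_edge(block_r):
    """Edge feature A_i = max of the eight R variations; the region is an
    edge when A_i > Th1 (steps S23-S26)."""
    return max(r_variations(block_r)) > TH1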
In step S27, the CPU 21 judges whether the edge judgment has been performed for all divided regions 43. Specifically, if i < N x N, it adds 1 to the region number i (i = i + 1) in step S28, returns to step S22, and applies the edge judgment of steps S22-S26 to the remaining divided regions 43. If i = N x N, the edge judgment ends and processing proceeds to step S29.
In step S29, the CPU 21 performs labeling on those of the N x N divided regions 43 judged in step S25 to be edges. The labeling in this embodiment proceeds as follows.
The CPU 21 scans the image data from upper left toward lower right and, on finding an unlabeled divided region 43 that has been judged an edge region, gives it a prescribed number as its label value.
At this point the CPU 21 does not reuse label values already given to other divided regions 43; it gives this divided region 43 the maximum of the label values already assigned, plus 1.
If no divided region 43 has yet been given a label value, the CPU 21 gives this divided region 43 the value 1, for example, as its label. Then, where this divided region 43 connects to divided regions 43 that already carry a label value, the CPU 21 gives the same label value to all connected divided regions 43 judged to be edge regions.
The CPU 21 repeats this scanning and label assignment until every divided region 43 judged to be an edge region has been given a label value. In other words, the labeling gives the same label value to divided regions 43 belonging to the same connected part, and each connected part receives a different label value.
In step S30, the CPU 21 takes as L the maximum of the label values given in step S29 to the N x N divided regions 43. In step S31 it initializes to 1 the label value j designating the connected divided regions 43 under analysis; j is an integer from 1 to L. In step S32 the CPU 21 counts the divided regions 43 carrying label value j and obtains the counted region number M_j.
In step S33, the CPU 21 judges whether label value j should be classified as a shape edge label or as a hemorrhage edge label. Specifically, label value j is judged a shape edge label when M_j > Th2. Here Th2 is the threshold serving as the criterion for discriminating shape edge labels from hemorrhage edge labels; in this embodiment, Th2 = 10, for example. If M_j > Th2, the CPU 21 judges in step S34 that label value j is a shape edge label and proceeds to step S36. If M_j <= Th2, it judges in step S35 that label value j is a hemorrhage edge label and proceeds to step S36.
In step S36, the CPU 21 judges whether the label classification has been performed for all divided regions 43 judged to be edges.
Specifically, if j < L, the CPU 21 adds 1 to the label value j (j = j + 1) in step S37, returns to step S32, and applies the label classification of steps S32-S35 to the remaining regions. If j = L, the processing of Fig. 6 ends and the hemorrhage edge decision of step S40 of Fig. 3 follows.
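Steps S29-S37 amount to connected-component labeling over the grid of edge blocks followed by a size test per label. The sketch below uses a flood fill with 4-connectivity; the patent describes a raster scan-and-propagate labeling and does not state the connectivity, so that choice is an assumption:

import numpy as np
from collections import deque

TH2 = 10  # labels covering more than Th2 divided regions are shape edges

def label_and_classify(edge_mask):
    """Steps S29-S35: give each connected part of the boolean edge mask its
    own label value, count the regions M_j per label j, and collect the
    labels judged to be shape-edge labels (M_j > Th2)."""
    ny, nx = edge_mask.shape
    labels = np.zeros((ny, nx), dtype=int)
    next_label = 0
    for y in range(ny):
        for x in range(nx):
            if edge_mask[y, x] and labels[y, x] == 0:
                next_label += 1            # new label: current maximum + 1
                labels[y, x] = next_label
                queue = deque([(y, x)])
                while queue:               # propagate over the connected part
                    cy, cx = queue.popleft()
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        qy, qx = cy + dy, cx + dx
                        if (0 <= qy < ny and 0 <= qx < nx
                                and edge_mask[qy, qx] and labels[qy, qx] == 0):
                            labels[qy, qx] = next_label
                            queue.append((qy, qx))
    counts = np.bincount(labels.ravel(), minlength=next_label + 1)  # M_j
    shape_labels = {j for j in range(1, next_label + 1) if counts[j] > TH2}
    return labels, shape_labels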
When the shape edge extraction of Fig. 6 finishes, the shape edge image 33 shown in Fig. 5(c) is generated. In it, the divided regions 43 given label values classified as shape edge labels are shown as the shape edge region 46 (hatched). This shape edge region 46 coincides with the divided regions 43 corresponding to the contour portion of the mucosa shape region 41 among the hemorrhage edge candidate regions 44 of the hemorrhage edge candidate image 32 of Fig. 5(b).
The hemorrhage edge decision of step S40 of Fig. 3 is described next using Fig. 8, a flowchart of that procedure.
In the hemorrhage edge decision, the CPU 21 determines the hemorrhage edge regions 47 from the hemorrhage edge candidate regions 44 extracted in the hemorrhage edge candidate extraction. First, in step S41, the CPU 21 initializes to 1 the number i designating the divided region 43 under analysis.
The number i designating a divided region 43 is an integer from 1 to N x N. In step S42 the CPU 21 judges whether the i-th divided region 43 is a hemorrhage edge candidate region 44.
The judgment of step S42 follows the result of the hemorrhage edge candidate extraction of step S10 of Fig. 3. If the i-th divided region 43 is a hemorrhage edge candidate region 44, the CPU 21 proceeds to step S43 and judges whether the label value given to the i-th divided region 43 is a hemorrhage edge label. If the i-th divided region 43 is not a candidate region 44, the CPU 21 judges in step S45 that this divided region is not a hemorrhage edge region 47 and proceeds to step S46.
The judgment of step S43 follows the result of the shape edge extraction of step S20 of Fig. 3. If the label value given to the i-th divided region 43 is a hemorrhage edge label, the CPU 21 judges in step S44 that the i-th divided region 43 is a hemorrhage edge region 47 and proceeds to step S46.
If the label value given to the i-th divided region 43 is not a hemorrhage edge label — that is, it is a shape edge label, or the i-th divided region 43 carries no label — the CPU 21 judges in step S45 that this divided region is not a hemorrhage edge region 47 and proceeds to step S46.
In step S46, the CPU 21 judges whether the hemorrhage edge region judgment has been performed for all divided regions 43. Specifically, if i < N x N, it adds 1 to the region number i (i = i + 1) in step S47, returns to step S42, and applies the judgment to the remaining divided regions 43. If i = N x N, the CPU 21 ends the hemorrhage edge decision of Fig. 8.
When the hemorrhage edge decision finishes, the hemorrhage edge image 34 shown in Fig. 5(d) is generated. In it, the contour portion of the hemorrhage region 42 of the original image 31 of Fig. 5(a) — the hemorrhage edge — is shown as the hemorrhage edge region 47 (hatched).
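The decision of steps S41-S47 then simply intersects the two earlier results. A sketch combining the functions from the preceding code (function names are from these sketches, not from the patent):

def decide_hemorrhage_edges(candidate_mask, labels, shape_labels):
    """Steps S41-S47: a divided region is a hemorrhage edge region exactly
    when it is a hemorrhage edge candidate region AND carries a label that
    was not classified as a shape-edge label."""
    result = candidate_mask & (labels > 0)
    for j in shape_labels:
        result &= (labels != j)
    return result

# Illustrative wiring of the whole Fig. 3 pipeline:
#   candidates = extract_edge_candidates(image)          # step S10
#   edges = per-block is_edge() over the same grid       # step S20 (part)
#   labels, shape_labels = label_and_classify(edges)     # step S20 (part)
#   edge_image = decide_hemorrhage_edges(candidates, labels, shape_labels)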
Through the above processing, the image processing apparatus 1 can obtain, via the image filing device 4, the original image 31 captured by the endoscopic observation device 3 and detect the hemorrhage edges in it.
Thus, by using the amount of variation of a particular color signal at the hemorrhage edge, the image processing apparatus 1 of this embodiment can judge comparatively whether a region is a hemorrhage edge and detect it. And since this embodiment distinguishes shape edges from hemorrhage edges by the size of the edge, it can extract hemorrhage edges with high accuracy. Furthermore, with the hemorrhage edge image 34 as the object of observation, the quality of the operator's image-based diagnosis improves and the reading time for endoscopic images can be shortened.
(2nd embodiment)
The second embodiment of the present invention is described next. The first embodiment extracted edge regions using the maximum of the R-signal variation as the edge feature value A_i and judged hemorrhage edges by the size of the edge region. By contrast, this embodiment calculates a color edge feature value B_i based on the variation of two or more of the R, G, and B signals and uses B_i to judge whether a region is a hemorrhage edge.
The hardware configuration of the image processing apparatus 1 of this embodiment is identical to that of Fig. 2. The storage device 13, however, stores an image processing program 51 different from the image processing program 28 of Fig. 2, and the analysis data stored in the memory 22 differs from the case of Fig. 2, consisting of only two kinds of image data: the original image 31 and the hemorrhage edge image 34.
That is, the image processing apparatus 1 of this embodiment is identical to the first embodiment except that the processing content of the image processing program 51 differs, and that the analysis data 52 obtained and generated by executing the image processing program 51 and stored in the memory 22 consists of the original image 31 and the hemorrhage edge image 34 only, without the hemorrhage edge candidate image 32 and shape edge image 33. Therefore only the characteristic operation is described here; identical elements carry the same reference numerals and their description is omitted.
In this embodiment, as in the first, the processing for detecting, for example, hemorrhage edges is described using the flowchart of Fig. 9, which shows the image analysis processing procedure of the image processing program 51.
First, in step S110, the CPU 21 obtains the image data designated from the operating device 12 from the image filing device 4 and stores it in the memory 22 as the original image 31. In step S120 it divides the original image 31 obtained in step S110 into N x N divided regions 43. In step S130 it initializes to 1 the number i designating the divided region 43 under analysis; i is an integer from 1 to N x N.
In step S140, the CPU 21 calculates the color edge feature value B_i of the i-th divided region 43; the computation is described with the flowchart of Fig. 10.
First, in step S141, the CPU 21 calculates the variation of the R-signal values (the R variation) and the variation of the G-signal values (the G variation) of the i-th divided region 43. As in the first embodiment, the R variation is computed from the R value (R1) of a specific pixel P1 in the divided region 43 and the R value (R2) of another specific pixel P2 in the same divided region 43; specifically, R variation = log_e(R2) - log_e(R1).
The G variation is computed from the G values (G1, G2) of the same pixels P1 and P2 used for the R variation; specifically, G variation = log_e(G2) - log_e(G1). In this embodiment, as shown in Fig. 11, the CPU 21 calculates the R and G variations in each of the eight directions — up, down, left, right, and the diagonals — within the divided region 43. Figs. 11(a)-(h) illustrate the computation; since the 1st-8th R variations of Figs. 11(a)-(h) are computed exactly as in Figs. 7(a)-(h), their description is omitted.
The 1st-8th G variations of Figs. 11(a)-(h) use the same pixels as the corresponding R variations — that is, they are computed identically with the R values R1, R2 replaced by the G values G1, G2 — so their description is likewise omitted.
In step S142, the CPU 21 divides each of the 1st-8th G variations by the corresponding R variation to obtain the 1st-8th variation ratios. In step S143 it takes the maximum of these ratios, obtained in step S142, as the color edge feature value B_i.
Near the boundary between the surrounding mucosa and a hemorrhagic area — near a hemorrhage edge — the G signal generally varies more than the R or B signal. For this reason, the maximum of G variation / R variation is adopted as the color edge feature value B_i in this embodiment. Alternatively, the CPU 21 may use the maximum of B variation / G variation as B_i.
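Steps S141-S143 reuse the eight pairings of Fig. 7 on two color planes. A sketch building on r_variations() from the first-embodiment code above; the guard against a near-zero R variation in the denominator is an added assumption:

TH3 = 0.1  # hemorrhage-edge threshold of this embodiment (step S150)

def color_edge_feature(block_r, block_g):
    """Color edge feature B_i = max over the eight directions of
    (G variation / R variation), computed on the same pixel pairs
    (steps S141-S143)."""
    rv = r_variations(block_r)
    gv = r_variations(block_g)   # same pairings, G-signal values
    ratios = [g / r for g, r in zip(gv, rv) if abs(r) > 1e-6]
    return max(ratios) if ratios else 0.0

# A divided region is judged a hemorrhage edge when
# color_edge_feature(block_r, block_g) > TH3 (steps S150-S170).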
In step S150 of Fig. 9, the CPU 21 judges whether the i-th divided region 43 is a hemorrhage edge: specifically, it is judged a hemorrhage edge when B_i > Th3.
Here Th3 is the threshold serving as the criterion for the hemorrhage edge judgment; in this embodiment, Th3 = 0.1, for example. If B_i > Th3, the CPU 21 judges in step S160 that the i-th divided region 43 is a hemorrhage edge and proceeds to step S180.
If B_i <= Th3, the CPU 21 judges in step S170 that the i-th divided region 43 is not a hemorrhage edge and proceeds to step S180.
In step S180, the CPU 21 judges whether the hemorrhage edge judgment has been performed for all divided regions 43. Specifically, if i < N x N, it adds 1 to the region number i (i = i + 1) in step S190, returns to step S140, and applies the judgment to the remaining divided regions. If i = N x N, the CPU 21 ends the processing of Fig. 9.
Thus the image processing apparatus 1 of this embodiment judges hemorrhage edges using the color edge feature value B_i computed from the variation of two or more of the R, G, and B signals, so it can judge comparatively whether a region is a hemorrhage edge and detect it. And because it can extract hemorrhage edges of various sizes — the edges of wide-area hemorrhagic regions as well as finely fragmented edges — it further improves the detection accuracy of hemorrhage edges.
(3rd embodiment)
The third embodiment of the present invention is described next. The first embodiment judged hemorrhage edges using an edge feature value that reflects the amount of variation of a color signal at the hemorrhage edge. This embodiment additionally calculates the edge feature value in the peripheral region and evaluates the continuity of the edge feature value from the peripheral region to the hemorrhage edge in order to judge whether a region is a hemorrhage edge.
The image processing apparatus 1 of this embodiment uses an image processing program 61 whose processing content differs from that of the image processing apparatus 1 of Fig. 2. It is otherwise identical to the first embodiment, except that the analysis data 62 obtained and generated by executing the image processing program 61 and stored in the memory 22 consists of the original image 31 and the hemorrhage edge image 34 only, without the hemorrhage edge candidate image 32 and shape edge image 33. Therefore only the characteristic operation is described here; identical elements carry the same reference numerals and their description is omitted.
In this embodiment, as in the first, the processing for detecting, for example, hemorrhage edges is described using the flowchart of Fig. 12, which shows the image analysis processing procedure of the image processing program 61.
First, in step S201, the CPU 21 obtains the image data designated from the operating device 12 from the image filing device 4 and stores it in the memory 22 as the original image 31. In the next step S202, it generates N x N divided regions 43 from the original image 31 obtained in step S201 and extracts the hemorrhage edge candidate regions 44. Since step S202 is the same hemorrhage edge candidate extraction described with Fig. 4 in the first embodiment, its description is omitted.
In the next step S203, the CPU 21 initializes to 1 the number i designating the hemorrhage edge candidate region 44 under analysis; i is an integer from 1 to M, where M is the number of divided regions 43 extracted as hemorrhage edge candidate regions 44 in step S202.
In step S204, the CPU 21 calculates the 1st-8th R variations within the i-th hemorrhage edge candidate region 44. Since the computation is the same as that described with Fig. 7 in the first embodiment, its description is omitted.
In step S205, the CPU 21 sets the maximum of the 1st-8th R variations calculated in step S204 as the edge feature value A1_i, and records as D_i the direction along which that maximal R variation was computed. For example, if the 4th R variation is largest, the direction D_i is the lower-right diagonal of Fig. 7(d). Together, the two feature values A1_i and D_i are called the candidate-region continuity feature.
Next, in step S206, the CPU 21 sets a background region 63 for the i-th hemorrhage edge candidate region 44 as shown in Fig. 13, a schematic view of the relation between the hemorrhage edge candidate region 44 and the background region 63.
The background region 63 is the region needed for evaluating the continuity of the edge feature value; it is placed adjacent to the candidate region, from the center of the i-th hemorrhage edge candidate region 44, in the direction opposite to the direction D_i obtained in step S205. The background region 63 is geometrically similar to the hemorrhage edge candidate region 44 at a ratio of 1:k; in this embodiment, k = 2, for example, as shown in Fig. 13.
In the next step S207, the CPU 21 calculates the R variation along direction D_i within the background region 63 as the background-region continuity feature A2_i. The R variation is computed just as within the hemorrhage edge candidate region 44, so its description is omitted.
In the next step S208, the CPU 21 judges whether the i-th hemorrhage edge candidate region 44 is a hemorrhage edge. Specifically, the value of the background-region continuity feature A2_i divided by the edge feature value A1_i is taken as the continuity evaluation feature C_i, and the candidate is judged a hemorrhage edge when C_i <= Th4. Here Th4 is the threshold serving as the criterion for the hemorrhage edge judgment; in this embodiment, Th4 = 0.2, for example.
As shown in Fig. 14(a), if C_i <= Th4, the CPU 21 judges in step S209 that the i-th hemorrhage edge candidate region 44 is a hemorrhage edge and proceeds to step S211. As shown in Fig. 14(b), if C_i > Th4, it judges in step S210 that the i-th hemorrhage edge candidate region 44 is an edge formed by something other than hemorrhage — not a hemorrhage edge — and proceeds to step S211.
Fig. 14 is a schematic explanation of the hemorrhage edge judgment using the continuity evaluation feature C_i, showing the R-signal values of the pixels on line C-C' of Fig. 13.
Fig. 14(a) shows the change of the R-signal values when the hemorrhage edge candidate region 44 is a hemorrhage edge; Fig. 14(b) shows the change when the candidate region 44 is an edge formed by something other than hemorrhage.
As Fig. 14(a) shows, when the hemorrhage edge candidate region 44 is a hemorrhage edge, the R-signal value is nearly constant within the background region 63 and changes sharply within the candidate region 44.
On the other hand, as Fig. 14(b) shows, when the candidate region 44 is an edge formed by something other than hemorrhage, the R-signal value changes gradually from the background region 63 through the candidate region 44.
This difference in the behavior of the R-signal values from the background region 63 to the hemorrhage edge candidate region 44 is captured as the ratio of the two R variations — the continuity evaluation feature C_i — and C_i can therefore be used to judge whether a region is a hemorrhage edge.
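In code, the continuity check of steps S204-S208 picks the maximal-variation direction D_i, samples the same direction on the opposite side of the candidate, and thresholds the ratio. The sketch below reuses r_variations() from the earlier code; taking a single neighboring grid block instead of the 1:k-scaled background region of Fig. 13 is a simplifying assumption made for brevity:

import numpy as np

TH4 = 0.2  # continuity threshold of this embodiment (step S208)

# Grid step opposite each of the eight Fig. 7 directions (rows grow downward)
_OPPOSITE = [(1, 0), (1, -1), (0, -1), (-1, -1),
             (-1, 0), (-1, 1), (0, 1), (1, 1)]

def continuity_feature(grid_r, iy, ix):
    """C_i = A2_i / A1_i for the candidate at grid cell (iy, ix), where
    grid_r[y][x] is the 2-D R-signal array of one divided region
    (steps S204-S208).  Returns None when the background block falls
    outside the image or the edge feature is degenerate."""
    variations = r_variations(grid_r[iy][ix])
    d = int(np.argmax(variations))          # direction D_i of max variation
    a1 = variations[d]                      # edge feature A1_i
    oy, ox = _OPPOSITE[d]
    by, bx = iy + oy, ix + ox
    if not (0 <= by < len(grid_r) and 0 <= bx < len(grid_r[0])):
        return None
    a2 = r_variations(grid_r[by][bx])[d]    # A2_i along the same direction
    return a2 / a1 if abs(a1) > 1e-6 else None

# The candidate is judged a hemorrhage edge when the returned C_i <= TH4.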
In step S211, the CPU 21 judges whether the edge judgment has been performed for all hemorrhage edge candidate regions 44. Specifically, if i < M, it adds 1 to the candidate-region number i (i = i + 1) in step S212, returns to step S204, and applies the hemorrhage edge judgment to the remaining candidate regions 44. If i = M, the CPU 21 ends the processing of Fig. 12.
Thus the image processing apparatus 1 of this embodiment can judge comparatively whether a region is a hemorrhage edge and detect it by using the amount of variation of the color signal at the hemorrhage edge, namely the edge feature value.
Moreover, this embodiment also calculates the edge feature value in the periphery of the hemorrhage edge and evaluates the continuity of the edge feature value from the peripheral region to the hemorrhage edge, which prevents false detection of, for instance, mucosal shape changes or intestinal fluid whose edge features resemble those of hemorrhage edges. The detection accuracy of hemorrhage edges is thereby further improved.
The hemorrhage edge candidate regions 44 may also be extracted using the color edge feature value described in the second embodiment instead of the edge feature value.
Specifically, instead of the extraction of the hemorrhage edge candidate regions 44 in step S202 of Fig. 12 (that is, the hemorrhage edge candidate extraction of Fig. 4 described in the first embodiment), the CPU 21 performs the same hemorrhage edge extraction described with Fig. 9 in the second embodiment. In that case, the regions extracted as hemorrhage edges in the second embodiment become the hemorrhage edge candidate regions 44 of this embodiment.
(the 4th embodiment)
Below, the 4th embodiment of the present invention is described.In the above-described 3rd embodiment, edge calculation characteristic quantity in the neighboring area of candidate region, hemorrhage edge, evaluation from the neighboring area to the seriality of the edge feature amount of candidate region, hemorrhage edge, thereby judge whether be hemorrhage edge, yet in the present embodiment, according to the liquid level tone of the interior zone of candidate region, hemorrhage edge, judge whether be hemorrhage edge.
The overall structure of image processing apparatus 1 is except the different this point of the contents processing of image processing program 71, other are identical with the 3rd embodiment, thereby only the effect as feature is described here, identical element is enclosed same numeral, omit its explanation.
In the present embodiment, the same with the 3rd embodiment, the flow chart of use Figure 15 describes the processing at test example such as hemorrhage edge.Figure 15 is the flow chart that the image analysis processing process to image processing program 71 describes.
At first, in step S301, CPU 21 obtains from the specified view data of operating means 12 from image filing apparatus 4, is stored in the memorizer 22 as original image 31.In next step S302, CPU 21 extracts candidate region, hemorrhage edge 44 according to generating N * N cut zone 43 at the obtained original image 31 of step S301.
In next step S303, CPU 21 is initialized as 1 with the i that expression is used to specify the numbering that becomes candidate region, hemorrhage edge of analytic target 44.In addition, the numbering i that is used to specify candidate region, hemorrhage edge 44 is more than or equal to 1 and smaller or equal to the integer value of M.Here, M is the quantity of the cut zone 43 that extracts as candidate region, hemorrhage edge 44 in step S302.
In the following step S304, the CPU 21 computes the 1st to 8th R variations in the i-th bleeding-edge candidate region 44. The processing of steps S301 to S304 is identical to that of steps S201 to S204 in Figure 12, respectively.
In the next step S305, the CPU 21 sets the direction in which the 1st to 8th R variations computed in step S304 take their maximum as the candidate-region feature D_i'. For example, when the 4th R variation is the maximum, the candidate-region feature D_i' is the diagonally down-right direction shown in Fig. 7(d).
In the following step S306, the CPU 21 sets the interior region 72 for the i-th bleeding-edge candidate region 44 as shown in Figure 16. Figure 16 is a schematic diagram describing the relation between the bleeding-edge candidate region 44 and the interior region 72.
The interior region 72 is a region for evaluating the color tone inside the edge; it is located adjacent to the i-th bleeding-edge candidate region 44, from its center, in the direction of the candidate-region feature D_i' obtained in step S305. The interior region 72 has the same shape and area as the bleeding-edge candidate region 44.
Then, in step S307, the CPU 21 acquires the R, G, and B signal values of the pixels contained in the interior region 72 and computes the mean values Ra_i', Ga_i', Ba_i' of the respective color signals in the interior region 72. In the following step S308, the CPU 21 compares the mean values Ra_i', Ga_i', Ba_i' of the color signals computed in step S307 against a predetermined bleeding-region tone space.
The processing of steps S307 and S308 is identical to that of steps S13 and S14 in Fig. 3. When the mean values Ra_i', Ga_i', Ba_i' computed in step S307 lie within the bleeding-region tone space, the CPU 21 judges in step S309 that the i-th bleeding-edge candidate region 44 is a bleeding edge and proceeds to step S311.
On the other hand, when the mean values Ra_i', Ga_i', Ba_i' computed in step S307 do not lie within the bleeding-region tone space, the CPU 21 judges in step S310 that the i-th bleeding-edge candidate region 44 is not a bleeding edge and proceeds to step S311.
A tone space of G/R and B/G may also be used for the bleeding-edge judgment. That is, when (G/R)min ≤ Ga_i'/Ra_i' ≤ (G/R)max and (B/G)min ≤ Ba_i'/Ga_i' ≤ (B/G)max hold, the CPU 21 judges that the i-th bleeding-edge candidate region 44 is a bleeding edge.
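To make the interior-region tone test concrete, the following is a minimal Python sketch of steps S306 to S310 using the G/R, B/G variant just described. It is not the patent's source code: the tone-space bounds G_R_RANGE and B_G_RANGE, the image layout (an H × W × 3 array in R, G, B order), and the direction-to-offset table are illustrative assumptions.

```python
# A hedged sketch of steps S306-S310 (interior-region tone test).
import numpy as np

DIR_STEP = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]  # index 0..7 -> (row, col) step

G_R_RANGE = (0.2, 0.6)   # assumed (G/R)min, (G/R)max; not values from the text
B_G_RANGE = (0.3, 0.9)   # assumed (B/G)min, (B/G)max; not values from the text

def is_bleeding_edge(rgb: np.ndarray, top_left, size: int, d_i: int) -> bool:
    """True when the mean tone of the interior region 72 lies in the
    bleeding-region tone space. top_left locates the i-th candidate
    region 44; d_i is the candidate-region feature D_i' (index 0..7)."""
    dr, dc = DIR_STEP[d_i]
    r0 = top_left[0] + dr * size        # interior region 72: same shape and
    c0 = top_left[1] + dc * size        # area, adjacent in the direction D_i'
    inner = rgb[r0:r0 + size, c0:c0 + size].astype(np.float64)
    ra, ga, ba = (inner[..., ch].mean() for ch in range(3))  # Ra', Ga', Ba'
    return (G_R_RANGE[0] <= ga / ra <= G_R_RANGE[1]
            and B_G_RANGE[0] <= ba / ga <= B_G_RANGE[1])
```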
In step S311, the CPU 21 determines whether the edge judgment has been performed for all bleeding-edge candidate regions 44. Specifically, when i < M, the CPU 21 adds 1 to the number i designating the bleeding-edge candidate region 44 (i = i + 1) in step S312, returns to step S304, and performs the bleeding-edge judgment on the remaining bleeding-edge candidate regions 44. On the other hand, when i = M, the CPU 21 ends the processing of Figure 15.
In this way, the image processing apparatus 1 of the present embodiment can comparatively judge and detect bleeding edges by using the edge feature, i.e., the variation of the color signals at a bleeding edge.
Furthermore, since the present embodiment evaluates the color tone in the interior region of a bleeding edge when judging whether a region is a bleeding edge, it prevents foreign matter or intestinal fluid, whose edge-feature variation from the peripheral mucosa resembles that of a bleeding edge, from being falsely detected as a bleeding edge. The present embodiment therefore further improves the detection accuracy of bleeding edges.
In the present embodiment, as in the 3rd embodiment, the bleeding-edge candidate regions 44 may be extracted using the colour-edge feature described in the 2nd embodiment instead of the edge feature.
As described above, the four embodiments above have been described taking the extraction of bleeding edges as an example; however, the present invention is not limited to these embodiments, and various modifications and alterations can be made without departing from the gist of the invention. For example, the invention is also applicable to extracting the contour portion of a reddened portion of a mucosal surface.
(the 5th embodiment)
The 5th embodiment of the present invention is described below with reference to Fig. 1 and Figures 17 to 25.
The present embodiment aims to provide an image processing apparatus, image processing method, and image processing program that can comparatively judge whether a region is the contour portion of a hemorrhagic region (hereinafter referred to as a hemorrhagic edge) using the variation of its color signals, and that can detect the hemorrhagic region surrounded by the hemorrhagic edge.
As described in the 1st embodiment, examples of hemorrhagic regions include a bleeding part, where bleeding actually occurs from a mucosal region, and redness, where the mucosal surface reddens owing to hyperemia or the like.
In the present embodiment, the case of detecting, for example, a bleeding part is described. The image processing apparatus 1 of the 5th embodiment of the present invention and the network structure of the interconnected system are identical to Fig. 1.
As shown in Figure 1, the image processing apparatus 1, which performs various kinds of image processing and information processing, is connected to a LAN 2 using TCP/IP as its communication protocol. On the other hand, the endoscope observation device 3, which images the interior of a living body and outputs an image signal, is also connected to the LAN 2 via the endoscope filing device 4.
The endoscope filing device 4 receives the image signal from the endoscope observation device 3, generates image data, and accumulates the generated image data. That is, the image processing apparatus 1 acquires the image data accumulated in the endoscope filing device 4 via the LAN 2.
The image processing apparatus 1 of the present embodiment has the structure shown in Figure 17. The hardware structure of this image processing apparatus 1 is identical to that of the image processing apparatus 1 of Fig. 2; therefore, elements identical to those of Fig. 2 are given the same reference numerals. The image processing apparatus 1 of the present embodiment uses an image processing program 128 different from the image processing program 28 of Fig. 2. Moreover, in the present embodiment, the analysis data 29 stored in the memory 22 holds images different from those of the analysis data 29 stored in the memory 22 of Fig. 2.
As shown in Figure 17, the image processing apparatus 1 of the present embodiment is built around a general-purpose personal computer 11 and has an operation device 12 composed of a keyboard and a mouse, a storage device 13 composed of a hard disk, and a display device 14 composed of a CRT.
The operation device 12, the storage device 13, and the display device 14 are each electrically connected to the personal computer 11. The designation of the image data to be processed, the acquisition and display of the designated image data, and the instruction to execute processing are input from the operation device 12, and the results of the various kinds of processing performed by the image processing apparatus 1 are displayed on the display device 14.
The personal computer 11 has: a CPU 21, which executes and controls various programs; a memory 22, which stores various processing programs and data; an external storage I/F 23, which reads and writes data between itself and the storage device 13; a network card 24, which communicates information with external equipment; an operation I/F 25, which receives the operation signals input from the operation device 12 and performs the necessary data processing; and a graphics board 26, which outputs video signals to the display device 14. These components are each electrically connected to a bus 27, so the elements of the personal computer 11 can exchange information with one another via the bus 27.
The network card 24 is electrically connected to the LAN 2 and exchanges information with the endoscope filing device 4, which is likewise connected to the LAN 2.
The external storage I/F 23 reads the image processing program 128 stored in the storage device 13 and stores it in the memory 22. The image processing program 128 is a program that executes the image analysis processing and is composed of multiple executable files, dynamic link library files, or setting files.
The CPU 21 operates by executing the image processing program 128 stored in the memory 22, and performs the image analysis processing on the image data acquired from the endoscope filing device 4. The analysis data 29 acquired and generated by each process in the CPU 21 is stored in the memory 22.
This analysis data 29 includes the original image 31, which is the image data acquired from the endoscope filing device 4, as well as the edge image 132, the bleeding candidate image 133, and the bleeding image 134 generated by the various kinds of processing described later.
In this case, the CPU 21 has an evaluation-region setting function: as described below, it divides the original image 31 into multiple small regions, extracts from the color signals of each small region a region containing the contour portion of a hemorrhagic region, and sets the extracted small region as a hemorrhagic evaluation region.
During the evaluation-region setting processing, the CPU 21 also sets multiple small regions near or around the hemorrhagic evaluation region as evaluation target regions. The CPU 21 further has a hemorrhagic candidate region judgment function: it computes the variation of the color signals in the multiple evaluation target regions, extracts hemorrhagic-edge candidate regions according to this variation, and judges whether the hemorrhagic evaluation region is a hemorrhagic candidate region from the ratio of the extracted hemorrhagic-edge candidate regions to the set evaluation target regions.
The CPU 21 also has a hemorrhagic region judgment function: it determines whether the hemorrhagic candidate region is a hemorrhagic region from the variations of two or more color signals of the hemorrhagic-edge candidate regions.
The operation of the image processing apparatus 1 configured as above is described. In the present embodiment, the case of detecting, for example, a bleeding region is described using the flowchart of Figure 18.
Figure 18 is a flowchart describing the image analysis processing procedure of the image processing program 128. First, in the original-image acquisition step of step S401, the CPU 21 acquires the image data designated from the operation device 12 from the image filing device 4 and stores it in the memory 22 as the original image 31.
The original image 31 is a color image composed of the three primary colors red (R), green (G), and blue (B), and the gray level of the pixels of each primary color is assumed to take 8 bits, i.e., a value from 0 to 255. Then, in the image analysis processing step of step S402, the CPU 21 applies various kinds of processing to the original image 31 acquired in step S401 and generates the bleeding candidate image 133, the edge image 132, and the bleeding image 134.
This image analysis processing step (step S402) consists of the following processes, executed in order: edge extraction processing (step S410), which generates the edge image 132 from the original image 31; bleeding-candidate extraction processing (step S420), which generates the bleeding candidate image 133 from the original image 31; and bleeding decision processing (step S440), which generates the bleeding image 134 from the edge image 132 and the bleeding candidate image 133. Each process in the image analysis processing step (step S402) is described below in processing order.
First, the edge extraction processing is described using Figure 19. Figure 19 is a flowchart describing the edge extraction processing procedure.
First, in step S411, the CPU 21 divides the original image 31 into N × N small regions; in the present embodiment, for example, N = 36. As preprocessing for step S411, inverse gamma correction or shading correction of the original image 31 may also be added. In that case, the corrected image obtained by these correction processes is divided into small regions, and the subsequent processing is performed on it.
Then, in step S412, the CPU 21 initializes to 1 the number i designating the divided small region (hereinafter referred to as a divided region) to be analyzed. The number i designating a divided region 141 is an integer from 1 to N × N.
Then, in step S413, the CPU 21 computes the variation of the G signal value (hereinafter referred to as the G variation) of the i-th divided region 141. The G variation is computed from the G signal value (G1) of a specific pixel in the divided region 141 and the G signal value (G2) of another specific pixel in the same divided region 141; specifically, the CPU 21 computes G variation = log_e(G2) − log_e(G1).
In the present embodiment, as shown in Figure 20, the CPU 21 computes the G variation for each of eight directions in the divided region 141: up, down, left, right, and the four diagonals. Figures 20(a) to 20(h) are diagrams describing the computation of the G variations.
The 1st G variation is the G variation in the upward direction shown in Fig. 20(a); it is computed with the G signal value of the bottom-center pixel as G1 and that of the top-center pixel as G2. The 2nd G variation is the G variation in the diagonally up-right direction shown in Fig. 20(b); it is computed with the G signal value of the lower-left pixel as G1 and that of the upper-right pixel as G2. The 3rd G variation is the G variation in the rightward direction shown in Fig. 20(c); it is computed with the G signal value of the left-center pixel as G1 and that of the right-center pixel as G2.
The 4th G variation is the G variation in the diagonally down-right direction shown in Fig. 20(d); it is computed with the G signal value of the upper-left pixel as G1 and that of the lower-right pixel as G2. The 5th G variation is the G variation in the downward direction shown in Fig. 20(e); it is computed with the G signal value of the top-center pixel as G1 and that of the bottom-center pixel as G2. The 6th G variation is the G variation in the diagonally down-left direction shown in Fig. 20(f); it is computed with the G signal value of the upper-right pixel as G1 and that of the lower-left pixel as G2.
The 7th G variation is the G variation in the leftward direction shown in Fig. 20(g); it is computed with the G signal value of the right-center pixel as G1 and that of the left-center pixel as G2. The 8th G variation is the G variation in the diagonally up-left direction shown in Fig. 20(h); it is computed with the G signal value of the lower-right pixel as G1 and that of the upper-left pixel as G2.
Then, in step S414, the CPU 21 sets the maximum of the 1st to 8th G variations computed in step S413 as E1_i, and the direction in which the G variation is maximal as D1_i. For example, when the 4th G variation is the maximum, the direction D1_i is the diagonally down-right direction shown in Fig. 20(d). The pair of the maximum G variation E1_i and the direction D1_i is called the edge feature.
Then, in step S415, the CPU 21 judges whether the i-th divided region 141 is an edge. Specifically, when the maximum G variation E1_i satisfies E1_i > Thr1, the CPU 21 judges that the i-th divided region 141 is an edge. Here, Thr1 is a threshold, set in the present embodiment to, for example, Thr1 = 0.3.
When E1_i > Thr1, the CPU 21 judges in step S416 that the i-th divided region 141 is an edge and proceeds to step S418. When E1_i ≤ Thr1, the CPU 21 judges in step S417 that the i-th divided region 141 is not an edge and proceeds to step S418.
In step S418, the CPU 21 determines whether the edge judgment has been performed for all divided regions 141. Specifically, when i < N × N, the CPU 21 adds 1 to the number i designating the divided region 141 (i = i + 1) in step S419, returns to step S413, and performs the edge judgment on the remaining divided regions 141. When i = N × N, the CPU 21 ends the processing of Figure 19 and moves on to the subsequent bleeding-candidate extraction processing.
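As a summary of steps S413 to S417, the following is a minimal Python sketch of the edge-feature computation for one divided region 141. It is not the patent's source code: it assumes each region is passed as an h × w array of 8-bit G values, that the "specific pixels" are the border-midpoint and corner pixels named above, and it adds 1 before taking logarithms to guard against log(0); none of these details are prescribed by the text.

```python
import numpy as np

THR1 = 0.3  # edge threshold exemplified in the embodiment

def g_variations(region: np.ndarray) -> np.ndarray:
    """1st-8th G variations log_e(G2) - log_e(G1) of one divided region."""
    g = region.astype(np.float64) + 1.0    # +1 guards log(0); an assumption
    h, w = g.shape
    top, bottom = g[0, w // 2], g[h - 1, w // 2]
    left, right = g[h // 2, 0], g[h // 2, w - 1]
    ul, ur = g[0, 0], g[0, w - 1]
    ll, lr = g[h - 1, 0], g[h - 1, w - 1]
    pairs = [(bottom, top), (ll, ur), (left, right), (ul, lr),   # Fig. 20(a)-(d)
             (top, bottom), (ur, ll), (right, left), (lr, ul)]   # Fig. 20(e)-(h)
    return np.array([np.log(g2) - np.log(g1) for g1, g2 in pairs])

def edge_feature(region: np.ndarray):
    """Edge feature (E1_i, D1_i) and the edge decision of step S415."""
    var = g_variations(region)
    d1 = int(np.argmax(var))     # direction D1_i (index 0..7)
    e1 = float(var[d1])          # maximum G variation E1_i
    return e1, d1, e1 > THR1
```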
Next, the bleeding-candidate extraction processing is described using Figure 21. Figure 21 is a flowchart describing the bleeding-candidate extraction processing procedure. In the bleeding-candidate extraction processing, the CPU 21 extracts bleeding candidates and bleeding-edge candidates according to the arrangement of the edges extracted in the edge extraction processing.
First, the CPU 21 performs the processing of steps S421 to S423, which serve as the evaluation-region setting unit, to set, for the i-th divided region 141 taken as a hemorrhagic evaluation region, neighboring regions 143 as the evaluation target regions.
Here, the neighboring regions 143 are set on a closed curve surrounding the periphery of the i-th divided region 141. In step S421, the CPU 21 initializes to 1 the number i designating the divided region 141 to be analyzed. The number i designating a divided region 141 is an integer from 1 to N × N.
Then, in step S422, the CPU 21 acquires, as shown in Fig. 22(a), the M × M divided regions 141 centered on the i-th divided region 141 as the configuration evaluation region 142 for the i-th divided region 141.
Figure 22 is a schematic diagram describing the bleeding-candidate extraction processing. In the present embodiment, the CPU 21 sets, for example, M = 5. Next, in step S423, the CPU 21 acquires the divided regions 141 in contact with the outline of the configuration evaluation region 142 acquired in step S422 as the neighboring regions 143 (hatched portions).
In the present embodiment, as shown in Fig. 22(a), the number C1_i of neighboring regions 143 of the i-th divided region 141 is 16.
Then, the CPU 21 performs the processing of steps S424 to S432, which serve as the hemorrhagic candidate region judgment unit. First, in step S424, the CPU 21 initializes to 1 the number j designating the neighboring region 143 to be analyzed, and initializes to 0 the counter Cnt1 that counts the bleeding-edge candidate regions among the neighboring regions 143.
The number j designating a neighboring region 143 and the counter Cnt1 take integer values from 1 to C1_i. Next, in step S425, the CPU 21 sets the direction from the j-th neighboring region 143 toward the i-th divided region 141 as D2_ij. Then, in step S426, the CPU 21 computes the G variations of the eight directions in the j-th neighboring region 143 and sets the direction in which the G variation is maximal as D3_ij. The G variations in step S426 are computed in the same way as in step S413.
Next, in step S427, the CPU 21 obtains the angle θ_ij formed by the direction D2_ij obtained in step S425 and the direction D3_ij obtained in step S426 (see Fig. 22(b)). Then, in step S428, the CPU 21 judges whether the j-th neighboring region 143 is a bleeding-edge candidate. Specifically, the CPU 21 judges whether the angle θ_ij computed in step S427 satisfies θ_ij ≤ Thr2. Here, Thr2 is a threshold, set in the present embodiment to, for example, Thr2 = 45 (degrees). The bleeding-edge candidate judgment in step S428 is a provisional judgment; its result is adopted only when the i-th divided region 141 is judged to be a bleeding candidate region by the processing described later.
When θ_ij ≤ Thr2, the CPU 21 judges that the j-th neighboring region 143 is a bleeding-edge candidate, adds 1 to Cnt1 in step S429, and proceeds to step S430. When θ_ij > Thr2, the CPU 21 judges in step S428 that the j-th neighboring region 143 is not a bleeding-edge candidate and proceeds to step S430.
In step S430, the CPU 21 determines whether the bleeding-edge candidate judgment has been performed for all neighboring regions 143 of the i-th divided region 141. Specifically, when j < C1_i, the CPU 21 adds 1 to the number j designating the neighboring region 143 (j = j + 1) in step S431, returns to step S425, and performs the bleeding-edge candidate judgment on the remaining neighboring regions 143. When j = C1_i, the CPU 21 proceeds to step S432.
In step S432, the CPU 21 judges whether the i-th divided region 141 is a bleeding candidate. Specifically, it computes the ratio of the bleeding-edge candidate region count Cnt1 to the number C1_i of neighboring regions 143 and judges whether Cnt1/C1_i > Thr3. Here, Thr3 is a threshold, set in the present embodiment to, for example, Thr3 = 0.7.
As shown in Figure 23, when Cnt1/C1_i > Thr3, the CPU 21 judges in step S433 that the i-th divided region 141 is a bleeding candidate. The neighboring regions 143 provisionally judged to be bleeding-edge candidates in step S428, that is, the neighboring regions 143 of the i-th divided region 141 satisfying θ_ij ≤ Thr2, are then formally judged to be bleeding-edge candidates. Figure 23 is a diagram describing the bleeding-candidate judgment method.
In Figure 23, for example, 14 of the 16 neighboring regions 143 of the i-th divided region 141 are judged to be edges; the direction D1 of the edge feature is indicated by an arrow in each such neighboring region 143. Since 12 of the 14 neighboring regions judged to be edges satisfy θ_ij ≤ Thr2, Cnt1/C1_i is 0.75.
Since the value of Cnt1/C1_i is greater than Thr3, which is set to 0.7, the i-th divided region 141 is judged to be a bleeding candidate, and the 12 neighboring regions 143 satisfying θ_ij ≤ Thr2 are judged to be bleeding-edge candidates. (In Figure 23, the hatched neighboring regions 143 are the regions judged to be bleeding-edge candidates.)
When step S433 ends, the CPU 21 proceeds to step S434. When Cnt1/C1_i ≤ Thr3, the CPU 21 judges that the i-th divided region 141 is not a bleeding candidate and proceeds to step S434.
In step S434, the CPU 21 determines whether the bleeding-candidate judgment has been performed for all divided regions 141. Specifically, when i < N × N, the CPU 21 adds 1 to the number i designating the divided region 141 (i = i + 1) in step S435, returns to step S422, and performs the bleeding-candidate judgment on the remaining divided regions 141. When i = N × N, the CPU 21 ends the processing of Figure 21 and moves on to the subsequent bleeding decision processing.
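The angle test of steps S424 to S433 can be summarized in the following hedged Python sketch. The direction-to-vector table, the representation of each neighboring region as a (grid offset, D3 index) pair, and the function names are assumptions of this sketch; the thresholds Thr2 = 45° and Thr3 = 0.7 are the values exemplified above.

```python
import numpy as np

THR2 = 45.0  # degrees
THR3 = 0.7

DIR_VEC = [(-1, 0), (-1, 1), (0, 1), (1, 1),
           (1, 0), (1, -1), (0, -1), (-1, -1)]  # index 0..7 -> (row, col) vector

def angle_deg(u, v) -> float:
    u, v = np.asarray(u, float), np.asarray(v, float)
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))

def judge_bleeding_candidate(neighbors):
    """neighbors: list of (offset, d3) for the C1_i neighboring regions 143,
    where offset is the grid offset from region i (so -offset is D2_ij) and
    d3 indexes the direction D3_ij of maximum G variation."""
    cnt1, provisional = 0, []
    for offset, d3 in neighbors:
        d2 = (-offset[0], -offset[1])             # direction toward region i
        if angle_deg(DIR_VEC[d3], d2) <= THR2:    # provisional test, step S428
            cnt1 += 1
            provisional.append(offset)
    if cnt1 / len(neighbors) > THR3:              # ratio test, step S432
        return True, provisional                  # formally adopted, step S433
    return False, []
```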
Next, the bleeding decision processing, which serves as the hemorrhagic region judgment unit, is described using Figure 24. Figure 24 is a flowchart describing the bleeding decision processing procedure. In the bleeding decision processing, for each neighboring region 143 extracted as a bleeding-edge candidate in the bleeding-candidate extraction processing, the CPU 21 computes a colour-edge feature based on the variation ratio of two or more of the R, G, and B signals, and judges from the colour-edge feature whether the region is a bleeding edge. The CPU 21 then determines the bleeding region from the extracted bleeding-edge regions.
First, in step S441, the CPU 21 acquires the divided regions 141 judged to be bleeding candidates in the bleeding-candidate extraction processing, and sets the number of bleeding-candidate divided regions 141 to H. Then, in step S442, the CPU 21 initializes to 1 the number k designating the divided region 141 to be analyzed. The divided regions 141 to be analyzed are the H bleeding-candidate divided regions 141 acquired in step S441; therefore, the number k designating a divided region 141 is an integer from 1 to H.
Next, in step S443, the CPU 21 acquires the neighboring regions 143 of the k-th divided region 141 that were judged to be bleeding-edge candidates in the bleeding-candidate extraction processing, and sets the number of bleeding-edge candidate neighboring regions 143 to C2_k. Then, in step S444, the CPU 21 initializes to 1 the number l designating the neighboring region 143 to be analyzed, and initializes to 0 the counter Cnt2 that counts the bleeding-edge regions among the neighboring regions 143. The number l designating a neighboring region 143 and the counter Cnt2 take integer values from 1 to C2_k.
Then, in step S445, the CPU 21 computes the variation of the R signal value (hereinafter referred to as the R variation) and the variation of the G signal value (hereinafter referred to as the G variation) in the l-th neighboring region 143. The R variation is computed from the R signal value (R1) of a specific pixel P1 in the neighboring region 143 and the R signal value (R2) of another specific pixel P2 in the same neighboring region 143; specifically, R variation = log_e(R2) − log_e(R1).
The G variation is computed from the G signal value (G1) of the pixel P1 and the G signal value (G2) of the pixel P2 used in computing the R variation; specifically, G variation = log_e(G2) − log_e(G1). In the present embodiment, as shown in Figure 25, the R variation and the G variation are computed for each of the eight directions in the neighboring region 143: up, down, left, right, and the four diagonals.
Figures 25(a) to 25(h) are diagrams describing the computation of the R variations and the G variations. The computation of the 1st to 8th G variations shown in Figs. 25(a) to 25(h) is identical to that of the 1st to 8th G variations shown in Figs. 20(a) to 20(h) computed in step S413, so a detailed description is omitted.
The 1st to 8th R variations shown in Figs. 25(a) to 25(h) use the same pixels as those used in computing the 1st to 8th G variations; their computation is identical to that of the 1st to 8th G variations with the G signal values G1 and G2 replaced by the R signal values R1 and R2, respectively, so a detailed description is omitted.
Then, in step S446, the CPU 21 divides the 1st to 8th G variations by the 1st to 8th R variations, respectively, to obtain the 1st to 8th variation ratios. The CPU 21 then sets, as the colour-edge feature B1_l, the ratio of the G variation to the R variation along the direction D3_kl, computed in step S426, in which the G variation of the l-th neighboring region 143 is maximal.
Near the boundary between the peripheral mucosa and the bleeding part, i.e., near the bleeding edge, the variation of the G signal is generally larger than that of the R or B signal. Therefore, the ratio of the G variation to the R variation along the direction of maximum G variation is set as the colour-edge feature. Alternatively, the ratio of the G variation to the B variation may be used as the colour-edge feature.
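A minimal sketch of step S446 and the provisional test of step S447 follows; g_var and r_var stand for the eight directional G and R variations of one neighboring region (computed as in the earlier sketch, with the R plane substituted for the G plane), and d3 is the index of the direction D3_kl of maximum G variation. The function names are assumptions.

```python
THR4 = 1.0  # colour-edge threshold exemplified in the embodiment

def colour_edge_feature(g_var, r_var, d3: int) -> float:
    """B1_l: (G variation)/(R variation) along the direction D3_kl of
    maximum G variation."""
    return g_var[d3] / r_var[d3]

def is_provisional_bleeding_edge(g_var, r_var, d3: int) -> bool:
    """Provisional bleeding-edge decision of step S447: B1_l > Thr4."""
    return colour_edge_feature(g_var, r_var, d3) > THR4
```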
Then, in step S447, the CPU 21 judges whether the l-th neighboring region 143 is a bleeding edge. Specifically, when the colour-edge feature B1_l satisfies B1_l > Thr4, the CPU 21 judges that the l-th neighboring region 143 is a bleeding edge. The bleeding-edge judgment in step S447 is a provisional judgment; its result is adopted only when the k-th divided region 141 is judged to be a bleeding region by the processing described later.
Here, Thr4 is a threshold, set in the present embodiment to, for example, Thr4 = 1.0. When B1_l > Thr4, the CPU 21 judges that the l-th neighboring region 143 is a bleeding edge, adds 1 to Cnt2 in step S448, and proceeds to step S449.
When B1_l ≤ Thr4, the CPU 21 judges that the l-th neighboring region 143 is not a bleeding edge and proceeds to step S449.
In step S449, the CPU 21 determines whether the bleeding-edge judgment has been performed for all neighboring regions 143 of the k-th divided region 141. Specifically, when l < C2_k, the CPU 21 adds 1 to the number l designating the neighboring region 143 (l = l + 1) in step S450, returns to step S445, and performs the bleeding-edge judgment on the remaining neighboring regions 143.
When l = C2_k, the CPU 21 ends the bleeding-edge judgment processing and proceeds to step S451.
In step S451, the CPU 21 judges whether the k-th divided region 141 is a bleeding part. Specifically, the CPU 21 computes the ratio of the bleeding-edge region count Cnt2 to the number C2_k of neighboring regions 143 and judges whether Cnt2/C2_k > Thr5.
Here, Thr5 is a threshold, set in the present embodiment to, for example, Thr5 = 0.7. When Cnt2/C2_k > Thr5, the CPU 21 judges in step S452 that the k-th divided region 141 is a bleeding part.
The neighboring regions 143 provisionally judged to be bleeding edges in step S447, that is, the neighboring regions 143 of the k-th divided region 141 satisfying B1_l > Thr4, are then formally judged to be bleeding edges.
When step S452 ends, the CPU 21 proceeds to step S453. When Cnt2/C2_k ≤ Thr5, the CPU 21 judges that the k-th divided region 141 is not a bleeding part and proceeds to step S453.
In step S453, the CPU 21 determines whether the bleeding judgment has been performed for all divided regions 141 that are bleeding candidates. Specifically, when k < H, the CPU 21 adds 1 to the number k designating the divided region 141 (k = k + 1) in step S454, returns to step S443, and performs the bleeding judgment on the remaining bleeding-candidate divided regions 141. When k = H, the CPU 21 ends the processing.
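Put together, the decision of steps S451 to S452 reduces to a ratio test, sketched below under the same assumptions; edge_flags collects the provisional decisions of step S447 for the C2_k neighboring regions of candidate k.

```python
THR5 = 0.7

def judge_bleeding_part(edge_flags) -> bool:
    """True when Cnt2/C2_k > Thr5, i.e. the k-th divided region 141 is
    judged to be a bleeding part (step S452)."""
    cnt2 = sum(edge_flags)    # bleeding-edge region count Cnt2
    return cnt2 / len(edge_flags) > THR5
```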
In this way, the image processing apparatus 1 of the present embodiment extracts bleeding candidates according to the arrangement of the bleeding-edge candidates, comparatively judges whether each candidate is a bleeding edge using the ratio of the variations of different color signals (the colour-edge feature) in the bleeding-edge candidates, and detects the region enclosed by the bleeding edges as a bleeding part; it can therefore accurately detect even bleeding parts of small area.
(the 6th embodiment)
The 6th embodiment of the present invention is described below. In the 5th embodiment described above, the divided regions 141 in contact with the outline of the configuration evaluation region 142 are set as the neighboring regions 143. In the present embodiment, by contrast, neighboring regions 143 are also provided inside the configuration evaluation region 142, so that the evaluation target regions for the bleeding-edge candidates have width. This makes it possible to detect bleeding parts of more varied shapes.
The overall structure of the image processing apparatus 1 is identical to that of the 5th embodiment except that the processing content of the image processing program 81 differs from that of the image processing program 128 of the 5th embodiment. Therefore, only the characteristic operation is described here, identical elements are given the same reference numerals, and their description is omitted. As in the 5th embodiment, the case of detecting, for example, a bleeding part is described in the present embodiment.
The image analysis processing in the present embodiment is identical to that of the 5th embodiment except for the processing content of steps S423 to S431 of the bleeding-candidate extraction processing, i.e., the bleeding-edge candidate judgment. The bleeding-edge candidate judgment processing in the present embodiment is described using the flowchart of Figure 26.
Figure 26 is a flowchart describing the bleeding-edge candidate judgment processing procedure in the 6th embodiment.
In step S401 of Figure 18, the CPU 21 acquires the original image 31 and finishes the edge extraction processing of step S410; it then acquires the configuration evaluation region 142 for the i-th divided region 141 in steps S421 to S422 of Figure 21. Next, the bleeding-edge candidate judgment processing shown in Figure 26 is performed.
In the bleeding-edge candidate judgment processing, first, in step S523, the CPU 21 sets the number C3_i of neighboring regions 143 from the number of divided regions 141 lying Z regions inside the outline of the configuration evaluation region 142 acquired in step S422. In the present embodiment, for example, 5 × 5 = 25 divided regions 141 are acquired as the configuration evaluation region 142, and the width Z of the neighboring regions 143 is set to Z = 2. In this case, as shown in Figure 27, the number C3_i of neighboring regions 143 is 8. Figure 27 is a diagram describing the set positions of the neighboring regions 143 in the 6th embodiment.
Then, in step S524, the CPU 21 initializes to 1 the number m designating the neighboring region 143 to be analyzed, and initializes to 0 the counter Cnt3 that counts the bleeding-edge candidate regions among the neighboring regions 143.
The number m designating a neighboring region 143 and the counter Cnt3 take integer values from 1 to C3_i. Next, in step S525, the CPU 21 acquires the divided regions 141 contained in the m-th neighboring region 143.
Specifically, a half line is drawn from the i-th divided region 141 located at the center of the configuration evaluation region 142, passing through the divided region 141 located Z regions inside the outline of the configuration evaluation region 142 and reaching the outline; the divided regions 141 on this half line are acquired as the m-th neighboring region 143 (see Figure 27).
In the present embodiment, since Z = 2, each neighboring region 143 contains two divided regions 141. That is, in Figure 27, the two divided regions 141 connected by a line segment belong to the same neighboring region 143.
Next, in step S526, the CPU 21 initializes to 1 the number n designating the divided region 141 in the m-th neighboring region 143. The number n designating a divided region 141 is an integer from 1 to Z.
Then, in step S527, the CPU 21 sets the direction from the n-th divided region 141 in the m-th neighboring region 143 toward the i-th divided region 141 as D4_imn. Next, in step S528, the CPU 21 obtains the angle θ_imn formed by the direction D1_imn of the edge feature of the n-th divided region 141 in the m-th neighboring region 143, obtained in step S414, and the direction D4_imn obtained in step S527.
Next, in step S529, the CPU 21 judges whether the m-th neighboring region 143 is a bleeding-edge candidate. Specifically, the CPU 21 judges whether the angle θ_imn computed in step S528 satisfies θ_imn ≤ Thr6.
Here, Thr6 is a threshold, set in the present embodiment to, for example, Thr6 = 45 (degrees). As with step S428 in the 5th embodiment, the bleeding-edge candidate judgment in step S529 is a provisional judgment; its result is adopted only when the i-th divided region 141 is judged to be a bleeding candidate region by the processing described later. When θ_imn ≤ Thr6, the CPU 21 judges that the m-th neighboring region 143 is a bleeding-edge candidate, adds 1 to Cnt3 in step S530, and proceeds to step S533.
When θ_imn > Thr6, the CPU 21 determines in step S531 whether the bleeding-edge candidate judgment has been performed for all divided regions 141 contained in the m-th neighboring region 143. Specifically, when n < Z, the CPU 21 adds 1 to the number n designating the divided region 141 in the m-th neighboring region 143 (n = n + 1) in step S532, returns to step S527, and performs the bleeding-edge candidate judgment on the remaining divided regions 141.
When n = Z, the CPU 21 judges that the m-th neighboring region 143 is not a bleeding-edge candidate and proceeds to step S533. That is, the CPU 21 judges that the m-th neighboring region 143 is a bleeding-edge candidate as long as at least one of the Z divided regions 141 contained in it satisfies the bleeding-edge candidate condition θ_imn ≤ Thr6.
In step S533, the CPU 21 determines whether the bleeding-edge candidate judgment has been performed for all neighboring regions 143 of the i-th divided region 141. Specifically, when m < C3_i, the CPU 21 adds 1 to the number m designating the neighboring region 143 (m = m + 1) in step S534, returns to step S525, and performs the bleeding-edge candidate judgment on the remaining neighboring regions 143.
When m = C3_i, the CPU 21 ends this processing and proceeds to step S432. Since the processing from step S432 onward is identical to that of the 5th embodiment, its description is omitted.
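The geometry of the width-Z neighboring regions can be sketched as follows for the M = 5, Z = 2 case exemplified above. The 0-based grid coordinates of the configuration evaluation region 142 are an assumption of this sketch.

```python
M, Z = 5, 2
C = M // 2  # grid coordinate of the center region i

def neighboring_regions():
    """The C3_i = 8 neighboring regions 143, each the list of Z divided
    regions 141 that the half line from the center passes through, from
    Z regions inside the outline out to the outline itself."""
    dirs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]
    return [[(C + s * dr, C + s * dc) for s in range(C - Z + 1, C + 1)]
            for dr, dc in dirs]
```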
When the bleeding-candidate judgment processing ends in step S434, bleeding candidates are extracted as shown, for example, in Figure 28. Figure 28 is a diagram describing the bleeding-candidate judgment processing. In Figure 28, for example, all 8 of the neighboring regions 143 of the i-th divided region 141 are judged to be edges; the direction D1_imn of the edge feature is indicated by an arrow in each divided region 141.
Since all 8 neighboring regions 143 judged to be edges satisfy θ_imn ≤ Thr6, Cnt3/C3_i is 1. Since the value of Cnt3/C3_i is greater than Thr3, which is set to 0.7, the i-th divided region 141 is judged to be a bleeding candidate, and the 8 neighboring regions 143 satisfying θ_imn ≤ Thr6 are judged to be bleeding-edge candidates.
For the bleeding candidates extracted as above, as in the 5th embodiment, the bleeding judgment is performed in the bleeding decision processing according to the colour-edge feature of the bleeding-edge candidates.
In this way, since the image processing apparatus 1 of the present embodiment gives width to the neighboring regions 143 serving as the evaluation targets for the bleeding-edge candidates, it can detect not only bleeding parts of small area but also bleeding parts of various shapes, such as elliptical or amoeba-like shapes, further improving the detection accuracy of bleeding parts.
(the 7th embodiment)
The 7th embodiment of the present invention is described below. In the 5th embodiment described above, the divided regions 141 in contact with the entire circumference of the outline of the configuration evaluation region 142 are set as the neighboring regions 143. In the present embodiment, by contrast, the divided regions 141 arranged on two opposed sides of the outline of the configuration evaluation region 142, or on two opposed straight lines sandwiching a diagonal, are set as the neighboring regions 143.
Thus, according to the present embodiment, a bleeding part can be detected even when it is band-shaped or when only part of it appears in the endoscopic image.
The overall structure of the image processing apparatus 1 is identical to that of the 5th embodiment except that the processing content of the image processing program 91 differs from that of the image processing program 128 of the 5th embodiment. Therefore, only the characteristic operation is described here, identical elements are given the same reference numerals, and their description is omitted. As in the 5th embodiment, the case of detecting, for example, a bleeding part is described in the present embodiment.
The image analysis processing in the present embodiment is identical to that of the 5th embodiment except for the following points: the positions of the neighboring regions 143 set in step S423 of the bleeding-candidate extraction processing differ, and multiple arrangement patterns of the neighboring regions 143, shown in Figs. 29(a) to 29(d), are provided for each divided region 141.
That is, in step S423, as shown in Fig. 29(a), the CPU 21 acquires as the neighboring regions 143 the divided regions 141 in contact with the two vertically opposed sides of the outline of the configuration evaluation region 142 acquired in step S422, and performs the bleeding-edge candidate judgment and the bleeding-candidate judgment processing of steps S424 to S433.
When step S433 ends, the CPU 21 returns to step S423 and, as shown in Fig. 29(b), acquires as the neighboring regions 143 the divided regions 141 in contact with the two horizontally opposed sides of the outline of the configuration evaluation region 142, and performs the bleeding-edge candidate judgment and the bleeding-candidate judgment processing.
Likewise, the CPU 21 performs the bleeding-edge candidate judgment and the bleeding-candidate judgment processing for the pattern shown in Fig. 29(c), in which the divided regions 141 on two straight lines sandwiching the diagonal from upper left to lower right are set as the neighboring regions 143, and for the pattern shown in Fig. 29(d), in which the divided regions 141 on two straight lines sandwiching the diagonal from upper right to lower left are set as the neighboring regions 143.
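The four arrangement patterns of Fig. 29 can be enumerated as grid cells, as in the sketch below. The 0-based M × M coordinates are an assumption, and the placement of the two diagonal-sandwiching lines in patterns (c) and (d) is one plausible reading of the figure, not a detail given in the text.

```python
M = 5  # the configuration evaluation region 142 is M x M divided regions

def neighboring_region_patterns():
    idx = range(M)
    return {
        "fig29a_vertical_pair":   [(0, c) for c in idx] + [(M - 1, c) for c in idx],
        "fig29b_horizontal_pair": [(r, 0) for r in idx] + [(r, M - 1) for r in idx],
        # lines on either side of the upper-left -> lower-right diagonal
        "fig29c_diagonal_pair":   [(r, r + 1) for r in range(M - 1)]
                                + [(r + 1, r) for r in range(M - 1)],
        # lines on either side of the upper-right -> lower-left diagonal
        "fig29d_antidiag_pair":   [(r, M - 2 - r) for r in range(M - 1)]
                                + [(r, M - r) for r in range(1, M)],
    }
```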
After performing the bleeding-edge candidate judgment and the bleeding-candidate judgment processing for all arrangement patterns of the neighboring regions 143, the CPU 21 proceeds to step S434 and performs the processing of step S434 and thereafter.
For the bleeding candidates extracted as above, as in the 5th embodiment, the bleeding judgment is performed in the bleeding decision processing according to the colour-edge feature of the bleeding-edge candidates.
In this way, since the image processing apparatus 1 of the present embodiment uses as the evaluation targets for the bleeding-edge candidates the divided regions 141 arranged on two opposed straight lines sandwiching the divided region 141 under evaluation as a bleeding candidate, a bleeding part can be detected even when it is band-shaped or when only part of it appears in the endoscopic image, further improving the detection accuracy of bleeding parts.
As described above, the 5th to 7th embodiments have been described taking the extraction of a bleeding part as a hemorrhagic region as an example; however, the present invention is not limited to these embodiments, and various modifications and alterations can be made without departing from the gist of the invention. For example, the invention can also be applied to extracting a reddened portion of a mucosal surface.
For medical images obtained by imaging a living body mucosa with an endoscope or the like, the image processing described above can use their multiple items of color information to detect hemorrhagic regions objectively, thereby assisting the physician's diagnosis.
This application claims priority on the basis of Japanese Patent Application No. 2005-130231 filed in Japan on April 27, 2005 and Japanese Patent Application No. 2005-130229 filed in Japan on April 27, 2005; the above disclosures are incorporated in the description, claims, and drawings of this application.

Claims (31)

1. An image processing apparatus, characterized in that the image processing apparatus has:
a hemorrhagic edge candidate region extracting section, which extracts a candidate region of the contour portion of a hemorrhagic region according to the image signal of a medical image composed of a plurality of color signals obtained by imaging an organism;
a feature value calculating section, which calculates the feature value which the above-mentioned hemorrhagic region has, according to calculation of the amount of variation of the above-mentioned image signal of the small region containing the above-mentioned candidate region among a plurality of small regions into which the above-mentioned medical image is divided; and
a hemorrhagic edge judging section, which judges from the above-mentioned feature value whether or not the above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region.
2. The image processing apparatus according to claim 1, characterized in that the above-mentioned hemorrhagic edge candidate region extracting section has a judging section which judges whether the mean values of a plurality of color signals are each within a prescribed range, or whether the ratio of the mean values of 2 color signals is within a prescribed range, the above-mentioned plurality of color signals constituting the above-mentioned image signal of the plurality of small regions into which the above-mentioned medical image is divided, and the above-mentioned hemorrhagic edge candidate region extracting section extracts the candidate region of the contour portion of the above-mentioned hemorrhagic region according to the result of the above-mentioned judging section.
3. The image processing apparatus according to claim 1, characterized in that the above-mentioned feature value calculating section calculates the above-mentioned feature value according to the value of the amount of variation in which the luminance value of at least 1 color signal in the above-mentioned small region varies maximally.
4. The image processing apparatus according to claim 1, characterized in that the above-mentioned hemorrhagic edge judging section comprises: a shape edge judging section, which judges from the above-mentioned feature value whether the above-mentioned candidate region is the contour portion of a shape region other than the above-mentioned hemorrhagic region; and an edge judging section, which judges from the above-mentioned feature value whether the above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region.
5. The image processing apparatus according to claim 1, characterized in that the above-mentioned hemorrhagic edge judging section further comprises an adjacent region feature value calculating section which calculates the above-mentioned feature value of an adjacent region adjacent to the above-mentioned candidate region, and judges whether the above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region according to the ratio of the above-mentioned feature value of the above-mentioned candidate region to the above-mentioned feature value of the above-mentioned adjacent region.
6. The image processing apparatus according to claim 1, characterized in that the above-mentioned hemorrhagic edge judging section judges whether the above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region according to the above-mentioned image signal of an adjacent region adjacent to the above-mentioned candidate region.
7. An image processing method, characterized in that the image processing method has:
a hemorrhagic edge candidate region extracting step of extracting a candidate region of the contour portion of a hemorrhagic region according to the image signal of a medical image composed of a plurality of color signals obtained by imaging an organism;
a feature value calculating step of calculating the feature value which the above-mentioned hemorrhagic region has, according to calculation of the amount of variation of the above-mentioned image signal of the small region containing the above-mentioned candidate region among a plurality of small regions into which the above-mentioned medical image is divided; and
a hemorrhagic edge judging step of judging from the above-mentioned feature value whether or not the above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region.
8. The image processing method according to claim 7, characterized in that the above-mentioned hemorrhagic edge candidate region extracting step has a judging step of judging whether the mean values of a plurality of color signals are each within a prescribed range, or whether the ratio of the mean values of 2 color signals is within a prescribed range, the above-mentioned plurality of color signals constituting the above-mentioned image signal of the plurality of small regions into which the above-mentioned medical image is divided, and the above-mentioned hemorrhagic edge candidate region extracting step extracts the candidate region of the contour portion of the above-mentioned hemorrhagic region according to the result of the above-mentioned judging step.
9. The image processing method according to claim 7, characterized in that the above-mentioned feature value calculating step calculates the above-mentioned feature value according to the amount of variation in which the luminance value of 1 color signal in the above-mentioned small region varies maximally.
10. The image processing method according to claim 7, characterized in that the above-mentioned feature value calculating step calculates the above-mentioned feature value according to the calculation result of the amount of variation in which the ratio of the amounts of variation of 2 color signals of the above-mentioned small region varies maximally.
11. The image processing method according to claim 10, characterized in that the amounts of variation of the 2 color signals used in the calculation of the above-mentioned amount of variation include at least the amount of variation of the G color signal.
12. The image processing method according to claim 7, characterized in that the above-mentioned hemorrhagic edge judging step has a shape edge judging step of judging the contour portion of a shape region other than the above-mentioned hemorrhagic region, and judges whether the above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region after excluding the contour portion of the above-mentioned shape region judged by the above-mentioned shape edge judging step.
13. The image processing method according to claim 7, characterized in that the above-mentioned hemorrhagic edge judging step performs the judgment of the contour portion of the above-mentioned hemorrhagic region using a size threshold for judging the contour portion of the above-mentioned hemorrhagic region.
14. The image processing method according to claim 7, characterized in that the above-mentioned hemorrhagic edge judging step judges whether the extracted above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region according to peripheral region information calculated in the peripheral region of the extracted above-mentioned candidate region.
15. The image processing method according to claim 7, characterized in that the above-mentioned hemorrhagic edge judging step judges whether the region is the contour portion of the above-mentioned hemorrhagic region according to the evaluation result of the continuity between the peripheral region feature value calculated in the peripheral region of the extracted above-mentioned candidate region and the feature value calculated in the above-mentioned candidate region.
16. The image processing method according to claim 7, characterized in that the above-mentioned hemorrhagic edge judging step judges whether the extracted above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region according to the tone evaluation result of an inner region calculated inside the extracted above-mentioned candidate region.
17. The image processing method according to claim 7, characterized in that the above-mentioned hemorrhagic edge includes the case of the edge of a reddened portion.
18. An image processing program, characterized in that the image processing program causes a computer to execute:
a hemorrhagic edge candidate region extracting procedure of extracting a candidate region of the contour portion of a hemorrhagic region according to the image signal of a medical image composed of a plurality of color signals obtained by imaging an organism;
a feature value calculating procedure of calculating the feature value which the above-mentioned hemorrhagic region has, according to calculation of the amount of variation of the above-mentioned image signal of the small region containing the above-mentioned candidate region among a plurality of small regions into which the above-mentioned medical image is divided; and
a hemorrhagic edge judging procedure of judging from the above-mentioned feature value whether or not the above-mentioned candidate region is the contour portion of the above-mentioned hemorrhagic region.
19. An image processing apparatus, characterized in that the image processing apparatus has:
an evaluation region setting section, which divides a medical image corresponding to a medical image signal composed of a plurality of color signals obtained by imaging an organism into a plurality of small regions, extracts, for the above-mentioned plurality of small regions, a small region containing the contour portion of a hemorrhagic region according to at least 1 of the above-mentioned color signals, sets this extracted small region as a hemorrhagic evaluation region, and sets evaluation target regions composed of a plurality of the above-mentioned small regions at the periphery of the above-mentioned hemorrhagic evaluation region;
a hemorrhagic candidate region judging section, which extracts hemorrhagic edge candidate regions from the above-mentioned evaluation target regions according to the amount of variation of the above-mentioned color signals in the above-mentioned evaluation target regions, and judges whether the above-mentioned hemorrhagic evaluation region is a hemorrhagic candidate region according to the ratio of the above-mentioned hemorrhagic edge candidate regions to the above-mentioned evaluation target regions; and
a hemorrhagic region judging section, which extracts the contour portion of the hemorrhagic region from the above-mentioned hemorrhagic edge candidate regions according to the variations of 2 or more of the above-mentioned color signals of the above-mentioned hemorrhagic edge candidate regions, and judges whether the above-mentioned hemorrhagic candidate region is the above-mentioned hemorrhagic region according to the ratio of the above-mentioned contour portion of the above-mentioned hemorrhagic region to the above-mentioned hemorrhagic edge candidate regions.
20. The image processing apparatus according to claim 19, characterized in that the above-mentioned evaluation region setting section calculates the maximum of the amount of variation of the luminance value of the above-mentioned color signal in the above-mentioned small region, and sets the small region as the above-mentioned hemorrhagic evaluation region when this maximum exceeds a prescribed value.
21. The image processing apparatus according to claim 19, characterized in that the above-mentioned evaluation region setting section sets the above-mentioned evaluation target regions on a closed curve surrounding the periphery of the above-mentioned hemorrhagic evaluation region.
22. The image processing apparatus according to claim 21, characterized in that the above-mentioned evaluation region setting section gives the above-mentioned closed curve surrounding the periphery of the above-mentioned hemorrhagic evaluation region a width of a plurality of the above-mentioned small regions.
23. The image processing apparatus according to claim 19, characterized in that the above-mentioned evaluation region setting section sets the above-mentioned evaluation target regions on 2 opposed straight lines so as to sandwich the above-mentioned hemorrhagic evaluation region.
24. The image processing apparatus according to claim 19, characterized in that the above-mentioned hemorrhagic candidate region judging section calculates a 1st direction in which the brightness value of the above-mentioned color signal in the above-mentioned evaluation object region changes maximally, and determines, as the above-mentioned hemorrhagic edge candidate region, the evaluation object region for which the angle formed between the 1st direction and a 2nd direction from this evaluation object region toward the above-mentioned hemorrhagic evaluation region is smaller than or equal to a predetermined value.
25. The image processing apparatus according to claim 24, characterized in that the above-mentioned hemorrhagic candidate region judging section judges whether the above-mentioned hemorrhagic evaluation region is a hemorrhagic candidate region according to whether the ratio of the number of small regions determined to be the above-mentioned hemorrhagic edge candidate regions to the number of small regions set in the above-mentioned evaluation object region exceeds a predetermined value.
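Claims 24 and 25 make the candidate test concrete: the gradient of an evaluation object region must point, within a tolerance, at the hemorrhagic evaluation region, and enough of the surrounding regions must pass. A sketch, with the 45-degree tolerance and the 0.7 ratio as assumed placeholder values:

import numpy as np

def is_edge_candidate(block: np.ndarray, block_xy, center_xy,
                      max_angle_deg: float = 45.0) -> bool:
    # 1st direction: direction of maximum brightness change in the
    # evaluation object region (gradient at the strongest pixel).
    gy, gx = np.gradient(block.astype(float))
    k = int(np.argmax(np.hypot(gx, gy)))
    d1 = np.array([gx.flat[k], gy.flat[k]])
    # 2nd direction: from this evaluation object region toward the
    # hemorrhagic evaluation region.
    d2 = np.asarray(center_xy, float) - np.asarray(block_xy, float)
    denom = np.linalg.norm(d1) * np.linalg.norm(d2) + 1e-9
    angle = np.degrees(np.arccos(np.clip(np.dot(d1, d2) / denom, -1.0, 1.0)))
    return angle <= max_angle_deg

def is_hemorrhagic_candidate(edge_flags) -> bool:
    # Claim 25: the ratio of edge-candidate regions to all evaluation
    # object regions must exceed a predetermined value (0.7 assumed).
    return np.mean(edge_flags) > 0.7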
26. The image processing apparatus according to claim 19, characterized in that the above-mentioned hemorrhagic region judging section calculates the amounts of change of the brightness values of two color signals, including at least the G color signal, of the above-mentioned hemorrhagic edge candidate region, calculates the maximum of the ratio of the amounts of change of the above-mentioned two color signals, and determines, according to whether the above-mentioned maximum exceeds a predetermined value, a tentative region used for judging whether the above-mentioned hemorrhagic candidate region is the above-mentioned hemorrhagic region.
27. The image processing apparatus according to claim 26, characterized in that the above-mentioned hemorrhagic region judging section judges whether the above-mentioned hemorrhagic candidate region is the above-mentioned hemorrhagic region according to whether the ratio of the number of the above-mentioned tentative regions to the number of the hemorrhagic edge candidate regions exceeds a predetermined value.
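Claims 26 and 27 add a color test: at a blood edge the G signal changes far more steeply than the other channels, so a large change-amount ratio marks a tentative region, and the candidate is confirmed when enough edge candidates are tentative. A sketch, in which pairing G with R and both threshold values are assumptions:

import numpy as np

def change_amount(channel: np.ndarray) -> float:
    # Amount of change of one color signal's brightness in a block.
    gy, gx = np.gradient(channel.astype(float))
    return float(np.hypot(gx, gy).max())

def is_tentative_region(g_block: np.ndarray, r_block: np.ndarray,
                        min_ratio: float = 1.5) -> bool:
    # Claim 26: ratio of the amounts of change of two color signals,
    # one of which is G; 1.5 is an illustrative threshold.
    return change_amount(g_block) / (change_amount(r_block) + 1e-9) > min_ratio

def is_hemorrhagic_region(tentative_flags) -> bool:
    # Claim 27: the fraction of tentative regions among the hemorrhagic
    # edge candidate regions must exceed a predetermined value (0.7 assumed).
    return np.mean(tentative_flags) > 0.7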
28. An image processing method, characterized in that the image processing method has:
an evaluation region setting step of dividing a medical image corresponding to a medical image signal composed of a plurality of color signals obtained by imaging an organism into a plurality of small regions, extracting, for the above-mentioned plurality of small regions and according to at least one of the above-mentioned color signals, a small region that comprises the contour portion of a hemorrhagic region, setting the extracted small region as a hemorrhagic evaluation region, and setting, at the periphery of the above-mentioned hemorrhagic evaluation region, an evaluation object region composed of a plurality of small regions;
a hemorrhagic candidate region judging step of extracting hemorrhagic edge candidate regions from the above-mentioned evaluation object region according to the amount of change of the above-mentioned color signal in the above-mentioned evaluation object region, and judging whether the above-mentioned hemorrhagic evaluation region is a hemorrhagic candidate region according to the ratio of the above-mentioned hemorrhagic edge candidate regions to the above-mentioned evaluation object region; and
a hemorrhagic region judging step of extracting the contour portion of the hemorrhagic region from the above-mentioned hemorrhagic edge candidate regions according to the change of two or more of the above-mentioned color signals in the above-mentioned hemorrhagic edge candidate regions, and judging whether the above-mentioned hemorrhagic candidate region is the above-mentioned hemorrhagic region according to the ratio of the above-mentioned contour portion of the above-mentioned hemorrhagic region to the above-mentioned hemorrhagic edge candidate regions.
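The method claim mirrors the apparatus claim, so chaining the three steps gives an end-to-end sketch. This driver reuses the helper sketches above (grid_cells, max_change, evaluation_ring, is_edge_candidate, is_tentative_region); every grid size and threshold remains an assumed placeholder, not a value taken from the patent:

import numpy as np

def detect_hemorrhagic_regions(img_rgb: np.ndarray, n: int = 8, radius: int = 2):
    g = img_rgb[..., 1]
    found = []
    for (r, c) in grid_cells(g, n):
        # Evaluation region setting step: keep only small regions whose
        # maximum brightness change exceeds the (assumed) threshold.
        if max_change(g, r, c, n) <= 30.0:
            continue
        ring = [(rr, cc) for (rr, cc) in evaluation_ring(r, c, n, radius)
                if 0 <= rr <= g.shape[0] - n and 0 <= cc <= g.shape[1] - n]
        # Hemorrhagic candidate region judging step (claims 24-25 analogue).
        flags = [is_edge_candidate(g[rr:rr+n, cc:cc+n], (cc, rr), (c, r))
                 for (rr, cc) in ring]
        if not flags or np.mean(flags) <= 0.7:
            continue
        # Hemorrhagic region judging step (claims 26-27 analogue).
        edges = [cell for cell, f in zip(ring, flags) if f]
        tent = [is_tentative_region(g[rr:rr+n, cc:cc+n],
                                    img_rgb[rr:rr+n, cc:cc+n, 0])
                for (rr, cc) in edges]
        if np.mean(tent) > 0.7:
            found.append((r, c))  # top-left corner of the hemorrhagic evaluation region
    return found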
29. The image processing method according to claim 28, characterized in that the above-mentioned evaluation region setting step calculates the maximum of the amount of change of the brightness value of the above-mentioned color signal in the above-mentioned small region, and sets the small region as the above-mentioned hemorrhagic evaluation region when this maximum exceeds a predetermined value.
30. The image processing method according to claim 28, characterized in that the above-mentioned evaluation region setting step sets the above-mentioned evaluation object region on a closed curve surrounding the periphery of the above-mentioned hemorrhagic evaluation region.
31. An image processing program, characterized in that the image processing program causes a computer to execute:
an evaluation region setting process of dividing a medical image corresponding to a medical image signal composed of a plurality of color signals obtained by imaging an organism into a plurality of small regions, extracting, for the above-mentioned plurality of small regions and according to at least one of the above-mentioned color signals, a small region that comprises the contour portion of a hemorrhagic region, setting the extracted small region as a hemorrhagic evaluation region, and setting, at the periphery of the above-mentioned hemorrhagic evaluation region, an evaluation object region composed of a plurality of the above-mentioned small regions;
a hemorrhagic candidate region decision process of extracting hemorrhagic edge candidate regions from the above-mentioned evaluation object region according to the amount of change of the above-mentioned color signal in the above-mentioned evaluation object region, and judging whether the above-mentioned hemorrhagic evaluation region is a hemorrhagic candidate region according to the ratio of the above-mentioned hemorrhagic edge candidate regions to the above-mentioned evaluation object region; and
a hemorrhagic region decision process of extracting the contour portion of the hemorrhagic region from the above-mentioned hemorrhagic edge candidate regions according to the change of two or more of the above-mentioned color signals in the above-mentioned hemorrhagic edge candidate regions, and judging whether the above-mentioned hemorrhagic candidate region is the above-mentioned hemorrhagic region according to the ratio of the above-mentioned contour portion of the above-mentioned hemorrhagic region to the above-mentioned hemorrhagic edge candidate regions.
CN2006800140335A 2005-04-27 2006-03-14 Image processing apparatus Expired - Fee Related CN101166456B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2005130231A JP4855709B2 (en) 2005-04-27 2005-04-27 Image processing apparatus, image processing method, and image processing program
JP130229/2005 2005-04-27
JP2005130229A JP4832794B2 (en) 2005-04-27 2005-04-27 Image processing apparatus and image processing program
JP130231/2005 2005-04-27
PCT/JP2006/305024 WO2006117932A1 (en) 2005-04-27 2006-03-14 Image processing device, image processing method, and image processing program

Publications (2)

Publication Number Publication Date
CN101166456A true CN101166456A (en) 2008-04-23
CN101166456B CN101166456B (en) 2011-01-19

Family

ID=37472575

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2006800140335A Expired - Fee Related CN101166456B (en) 2005-04-27 2006-03-14 Image processing apparatus

Country Status (2)

Country Link
JP (1) JP4832794B2 (en)
CN (1) CN101166456B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104411229A (en) * 2012-06-28 2015-03-11 奥林巴斯株式会社 Image processing device, image processing method, and image processing program
CN105530851A (en) * 2013-09-13 2016-04-27 奥林巴斯株式会社 Image processing device, method, and program
CN112274115A (en) * 2020-07-03 2021-01-29 母宗军 Integrated gastric environment detection platform and method
CN113038868A (en) * 2018-11-14 2021-06-25 富士胶片株式会社 Medical image processing system
CN116506995A (en) * 2023-04-24 2023-07-28 深圳市计量质量检测研究院(国家高新技术计量站、国家数字电子产品质量监督检验中心) Electronic endoscope test image acquisition method and device and intelligent terminal

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5121204B2 (en) 2006-10-11 2013-01-16 オリンパス株式会社 Image processing apparatus, image processing method, and image processing program
JP5658873B2 (en) 2009-11-13 2015-01-28 オリンパス株式会社 Image processing apparatus, electronic apparatus, endoscope system, and program
JP6410498B2 (en) * 2014-07-08 2018-10-24 キヤノン株式会社 Ophthalmic apparatus and control method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05210736A (en) * 1992-01-31 1993-08-20 Olympus Optical Co Ltd Method for extracting contour of image
JPH0660182A (en) * 1992-08-04 1994-03-04 Komatsu Ltd Method and device for area division using texture analysis
JPH0737056A (en) * 1993-07-19 1995-02-07 Toshiba Corp Medical diagnosis support device
JP4493386B2 (en) * 2003-04-25 2010-06-30 オリンパス株式会社 Image display device, image display method, and image display program

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104411229A (en) * 2012-06-28 2015-03-11 奥林巴斯株式会社 Image processing device, image processing method, and image processing program
CN105530851A (en) * 2013-09-13 2016-04-27 奥林巴斯株式会社 Image processing device, method, and program
CN113038868A (en) * 2018-11-14 2021-06-25 富士胶片株式会社 Medical image processing system
US11961228B2 (en) 2018-11-14 2024-04-16 Fujifilm Corporation Medical image processing system
CN112274115A (en) * 2020-07-03 2021-01-29 母宗军 Integrated gastric environment detection platform and method
CN116506995A (en) * 2023-04-24 2023-07-28 深圳市计量质量检测研究院(国家高新技术计量站、国家数字电子产品质量监督检验中心) Electronic endoscope test image acquisition method and device and intelligent terminal
CN116506995B (en) * 2023-04-24 2024-04-02 深圳市计量质量检测研究院(国家高新技术计量站、国家数字电子产品质量监督检验中心) Electronic endoscope test image acquisition method and device and intelligent terminal

Also Published As

Publication number Publication date
CN101166456B (en) 2011-01-19
JP2006304993A (en) 2006-11-09
JP4832794B2 (en) 2011-12-07

Similar Documents

Publication Publication Date Title
KR100943367B1 (en) Image processing device, image processing method, and recording medium for recording image processing program
CN101166456B (en) Image processing apparatus
US8837821B2 (en) Image processing apparatus, image processing method, and computer readable recording medium
US8743189B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium storing image processing program
EP1450287A2 (en) Method of extracting region of interest from tongue image and health monitoring method and apparatus using the tongue image
EP1994878B9 (en) Medical image processing device and medical image processing method
JP4767591B2 (en) Endoscope diagnosis support method, endoscope diagnosis support device, and endoscope diagnosis support program
US9468356B2 (en) Lesion evaluation information generator, and method and computer readable medium therefor
CN103327883B (en) Medical image-processing apparatus and medical image processing method
CN104540438A (en) Image processing device and endoscopic instrument
CN102697446B (en) Image processing apparatus and image processing method
CN104470416A (en) Image processing device and endoscopic instrument
WO2012153568A1 (en) Medical image processing device and medical image processing method
US9224199B2 (en) Pathological diagnosis assisting apparatus, pathological diagnosis assisting method and non-transitory computer readable medium storing pathological diagnosis assisting program
JP2006061472A (en) Breast image display device, and program therefor
CN109152517A (en) The control program of image processing apparatus, the control method of image processing apparatus and image processing apparatus
JP4855709B2 (en) Image processing apparatus, image processing method, and image processing program
KR101063343B1 (en) Color Information Segmentation Method Using Tongue Image
US8929629B1 (en) Method and system for image-based ulcer detection
JP2013240701A (en) Image processor, method for operating the same, and image processing program
Arnold et al. Indistinct frame detection in colonoscopy videos
Cui et al. Detection of lymphangiectasia disease from wireless capsule endoscopy images with adaptive threshold
Zhu et al. Convolutional neural network based anatomical site identification for laryngoscopy quality control: A multicenter study
Kamarudin et al. A fast and effective segmentation algorithm with automatic removal of ineffective features on tongue images
WO2019088259A1 (en) Processor for electronic endoscope, and electronic endoscope system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20151113

Address after: Tokyo, Japan

Patentee after: Olympus Corporation

Address before: Tokyo, Japan

Patentee before: Olympus Medical Systems Corp.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110119

Termination date: 20170314