WO2015133100A1 - Image processing apparatus and image processing method - Google Patents


Info

Publication number
WO2015133100A1
Authority
WO
WIPO (PCT)
Prior art keywords
sample
reference white
color
image
sample image
Application number
PCT/JP2015/001016
Other languages
French (fr)
Inventor
Toshiki Shiga
Toru Sasaki
Original Assignee
Canon Kabushiki Kaisha
Application filed by Canon Kabushiki Kaisha
Publication of WO2015133100A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/56 Processing of colour picture signals
    • H04N1/60 Colour correction or control
    • H04N1/6075 Corrections to the hue

Definitions

  • the present invention relates to a color change method for a sample image.
  • The digital image of the entire sample acquired by a WSI (whole slide image) system can be image-processed to support diagnosis.
  • The WSI system is also expected to provide various advantages, such as speeding up remote diagnosis, giving explanations to a patient using digital images, sharing rare cases, and making education and practical training more efficient.
  • Patent Literature 3 proposes that an image is converted into a color temperature image specified by the user, and observation is performed using a desired color temperature image. Since only the color temperature of the image is changed, the color balance of the image is maintained. However, it is unknown how to precisely set the color temperature so as to improve the identification level among the target segments, hence the user must try various color temperatures and search for the optimum one by trial and error. If this method were applied to pathological image diagnosis, the operational load on the pathologist (observer) would therefore increase. Moreover, if the pathologist must select the color temperature, the selected color temperature may vary from individual to individual, and the objectivity of a diagnosis (allowing anyone to diagnose under the same conditions) may be diminished. Therefore this method cannot be directly applied to pathological image diagnosis.
  • a first aspect of the present invention provides an image processing apparatus including: an image acquisition unit that acquires data of a sample image obtained by imaging a sample; a sample information acquisition unit that acquires sample information to specify a plurality of target colors to be identified in the sample image; a reference white determination unit that determines a reference white based on the sample information; and a color change unit that performs processing of changing a reference white of the sample image to the reference white determined by the reference white determination unit, on the data of the sample image, so as to generate data on a post color change sample image, wherein the reference white determination unit determines a reference white which is used for the color change of the sample image, so that an absolute value of the difference of the hue of the plurality of target colors in a predetermined color space for observation becomes greater in the post color change sample image than in the sample image before the color change.
  • a second aspect of the present invention provides a method for controlling an image processing apparatus including: an image acquisition step of acquiring data of a sample image obtained by imaging a sample; a sample information acquisition step of acquiring sample information to specify a plurality of target colors to be identified in the sample image; a reference white determination step of determining a reference white based on the sample information; and a color change step of performing processing of changing a reference white of the sample image to the reference white determined in the reference white determination step, on the data of the sample image, so as to generate data on a post color change sample image, wherein in the reference white determination step, a reference white which is used for the color change of the sample image is determined, so that an absolute value of the difference of the hue of the plurality of target colors in a predetermined color space for observation becomes greater in the post color change sample image than in the sample image before the color change.
  • a third aspect of the present invention provides a program causing a computer to execute each step of the method for controlling an image processing apparatus according to the present invention.
  • According to the present invention, an image in which the identification level among target segments has been automatically improved can be presented to the user, while suppressing the influence on diagnosis.
  • Fig. 1 is a diagram depicting a uniform color space and three attributes of color.
  • Figs. 2A to 2C show flow charts of the reference white change processing.
  • Fig. 3 is a flow chart depicting a hue difference deriving processing of two colors.
  • Fig. 4 is a diagram depicting a configuration of an image display system according to an embodiment of the present invention.
  • Fig. 5 is a block diagram depicting hardware of an image processing apparatus.
  • Fig. 6A is a functional block diagram of the image processing apparatus and
  • Fig. 6B is an example of the reference white data.
  • Fig. 7 is an example of a GUI according to Embodiment 1.
  • Fig. 8 is a flow chart depicting a color change method according to Embodiment 1.
  • Fig. 9 is an example of a correspondence table of a sample type, a staining type and a target color, which is used for Embodiment 2.
  • Fig. 10 is a flow chart depicting an LUT creation method according to Embodiment 2.
  • Fig. 11A shows an example of the LUT according to Embodiment 2 and
  • Fig. 11B is a flow chart depicting a color change method.
  • Fig. 12A is a flow chart depicting a color frequency peak coordinate value acquisition method according to Embodiment 3 and Fig. 12B shows an example of the correspondence table.
  • Fig. 13 is a flow chart depicting an LUT creation method according to Embodiment 3.
  • Fig. 14A shows an example of the LUT according to Embodiment 3 and Fig. 14B is a flow chart depicting a color change method.
  • Reference white is the white that serves as the reference of the color balance of an image, and can be freely selected within the color gamut of the display monitor. It is known that if the eyes of the observer become accustomed to the reference white of a target when observing images that have different reference whites, the images are perceived as identical.
  • "Uniform color space” is a color space developed such that the distance in a color space, which is a difference of colors, matches well with the difference of colors recognized by the human eye.
  • L*a*b* color space is a type of uniform color space, and is expressed three-dimensionally by lightness L* which indicates luminosity, and a* and b* which indicate colors (hereafter L*, a* and b* are simply expressed as L, a and b).
  • The lightness that indicates luminosity, the chroma that indicates saturation, and the hue that indicates the difference of color types are important indexes for comparing colors, and are called the three attributes of color.
  • the coordinate values of the color space and the three attributes of color will be described with reference to Fig. 1. It is assumed that the coordinates of a point P in the color space are (L, a, b). The lightness L is indicated by the coordinate values of a point 102.
  • The chroma C is the distance 103 from the origin on the ab plane, and is given by Expression 1: C = √(a² + b²).
  • The hue h is the angle 104 formed by the line from the origin to the point P and the a axis on the ab plane, and is given by Expression 2: h = tan⁻¹(b/a).
  • “Hue difference” is an index to express the absolute value of the difference of the hue values of two different points in a color space, and is also called the hue angle difference.
  • The hue difference between a point P and a point Q is given by Expression 3: Δh = |h_P − h_Q|, where h_P and h_Q denote the hue 104 of the point P and the hue 106 of the point Q, respectively.
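As a concrete illustration, the three attributes and the hue difference of Expressions 1 to 3 can be computed as in the minimal Python sketch below; the wrap-around handling of the hue circle is this sketch's choice, not stated in the text.

```python
import math

def chroma(a, b):
    # Expression 1: chroma C is the distance from the origin on the ab plane
    return math.hypot(a, b)

def hue(a, b):
    # Expression 2: hue h is the angle (in degrees) between the a axis and
    # the line from the origin to (a, b) on the ab plane
    return math.degrees(math.atan2(b, a)) % 360.0

def hue_difference(p, q):
    # Expression 3: absolute value of the difference of the hue values of
    # two Lab points p = (L, a, b) and q = (L, a, b)
    d = abs(hue(p[1], p[2]) - hue(q[1], q[2]))
    return min(d, 360.0 - d)  # take the shorter way around the hue circle
```

For example, the Lab points (50, 1, 0) and (50, 0, 1) have a hue difference of 90 degrees.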
  • Reference white change is a change in the reference white that is used for standardizing the uniform color space.
  • The reference white is changed when color temperature change processing is performed on an image. If the eyes of the observer have not become accustomed to the reference white after the change, the appearance does not perfectly match the expected image; however, image diagnosis is hardly affected.
  • Reference white for observation is a white which the observer becomes accustomed to during observation.
  • The reference white defined in a standard such as sRGB or Adobe RGB, the reference white set on the observation monitor, or the ambient light corresponds to the "reference white for observation".
  • Color space for observation is a uniform color space whose reference white is the reference white for observation. It is used as a common color space when the same observer compares a plurality of images developed with different reference whites.
  • Identification level of colors is the effectiveness of colors for distinguishing a plurality of different objects or segments as different objects or segments in the color space for observation. The greater the hue difference, the higher the identification level.
  • The reference white change processing shown in Fig. 2A is constituted by the RGB→Lab conversion S201 and the Lab→RGB conversion S202.
  • In the RGB→Lab conversion S201, the RGB values of the image are converted into the Lab values obtained when the eyes of the observer become accustomed to an arbitrary reference white.
  • Fig. 2B shows the flow of the processing.
  • In the RGB→XYZ conversion S211, the RGB values of the image are converted into tristimulus values in the XYZ color space. If the reference white setting used when the original image data was developed is the D65 light source color, this conversion is given by Expression 4 according to the conversion method of the sRGB standard. Here the subscript 0 denotes the brightness (cd/m²) of the monitor, and R, G and B denote linear RGB values normalized to the range 0 to 1. If an image has been developed with a white other than the D65 light source color, this conversion can be performed by determining in advance a conversion matrix or an approximate conversion expression between the RGB values and the XYZ values, or by creating and applying a lookup table (hereafter called LUT).
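Expression 4 is not reproduced in this excerpt, but for the sRGB standard with a D65 white the linear-RGB to XYZ conversion is the well-known matrix sketched below (relative values; multiply by the monitor brightness in cd/m² for absolute tristimulus values):

```python
def srgb_linear_to_xyz(r, g, b):
    # sRGB (D65) linear-RGB -> XYZ; r, g, b are linear values in [0, 1]
    # and the result is relative (Y = 1 for white)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z
```

Linear white (1, 1, 1) maps to (0.9505, 1.0000, 1.0890), the D65 white point.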
  • In the Lab→RGB conversion S202, the RGB values of the image after the change of the reference white are determined based on the Lab coordinate values acquired in the RGB→Lab conversion S201.
  • Fig. 2C shows the flow of the processing.
  • In the Lab→XYZ conversion S221, the Lab coordinate values are converted into the XYZ tristimulus values after the reference white is changed.
  • The Lab→XYZ conversion S221 is an inverse conversion of the XYZ→Lab conversion S212, and the XYZ tristimulus values are determined by optimization using Expression 5 to Expression 7.
  • The tristimulus values of the reference white after the change are used for the Xn, Yn and Zn values.
  • In the XYZ→RGB conversion S222, the XYZ tristimulus values are converted into the RGB values of the monitor.
  • The XYZ→RGB conversion S222 is an inverse conversion of the RGB→XYZ conversion S211, and the XYZ values are converted into the RGB values.
  • the RGB values are given by Expression 8.
  • The conversion may be performed by determining a conversion matrix or an approximate conversion expression between Lab and XYZ in advance for each reference white, or by creating an LUT.
  • The original RGB values and the RGB values after the reference white change may also be converted by creating a conversion matrix, an approximate conversion expression or an LUT in advance.
  • In this case, the RGB values after the reference white change can be directly determined from the original RGB values without performing the processing in Fig. 2A to Fig. 2C, so the processing can be simplified and made faster.
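The core of the reference white change (the XYZ→Lab conversion S212 under one white, followed by the Lab→XYZ conversion S221 under another) can be sketched with the standard CIELAB formulas; this is a hedged illustration that omits the gamma handling and matrix steps of S211/S222.

```python
def xyz_to_lab(x, y, z, wn):
    # XYZ -> L*a*b*, normalized by the reference white wn = (Xn, Yn, Zn)
    def f(t):
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d * d) + 4.0 / 29.0
    fx, fy, fz = f(x / wn[0]), f(y / wn[1]), f(z / wn[2])
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def lab_to_xyz(L, a, b, wn):
    # Inverse conversion: L*a*b* -> XYZ under the reference white wn
    def finv(t):
        d = 6.0 / 29.0
        return t ** 3 if t > d else 3 * d * d * (t - 4.0 / 29.0)
    fy = (L + 16) / 116.0
    return wn[0] * finv(fy + a / 500.0), wn[1] * finv(fy), wn[2] * finv(fy - b / 200.0)

def change_reference_white(xyz, wn_old, wn_new):
    # Fig. 2A in XYZ terms: interpret the color under the old reference
    # white, then realize the same Lab coordinates under the new one
    lab = xyz_to_lab(xyz[0], xyz[1], xyz[2], wn_old)
    return lab_to_xyz(lab[0], lab[1], lab[2], wn_new)
```

With this sketch, the old white point itself maps exactly onto the new white point, which is the intended behavior of the reference white change.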
  • In the RGB value acquisition step S301, the RGB values of two target segments (target colors) in the image are acquired. If a target segment is provided as a region (a plurality of pixels), a representative value (e.g. mean value, mode) in the region may be acquired as the target color.
  • Next, each RGB value of the two target segments is converted into a coordinate value in the color space for observation, using the reference white for observation.
  • The flow of this processing is the same as the RGB→Lab conversion shown in Fig. 2B.
  • For the reference white for observation, the reference white of the monitor or the ambient light is used.
  • In the hue difference calculation step S303, the hue difference between the two target segments in the color space for observation is determined by Expression 3, using the determined Lab color coordinate values. Thus the hue difference between the target segments in the color space for observation can be determined.
  • The image data for observation having a different color temperature is created by applying the reference white change processing shown in Fig. 2A to the sample image data acquired by imaging the sample.
  • The hue difference between the two target segments in the color space for observation can then be determined by applying the hue difference calculation processing shown in Fig. 3 to the image data for observation.
  • The image display system is composed of a display device 401, a keyboard 402, a mouse 403, an image processing unit 404, a storage device 405, a computer 406, an image server 407 and an imaging device 408.
  • the image processing unit 404 and the storage device 405 are integrated into the computer 406.
  • the display device 401, the keyboard 402, the mouse 403 and the imaging device 408 are connected to the computer 406 via a general purpose I/F cable.
  • the image server 407 and the computer 406 are connected via LAN.
  • the display device 401 is a display device using liquid crystals, EL (Electro-Luminescence), a CRT (Cathode Ray Tube) or the like.
  • Image processing software of this embodiment displays a GUI (Graphical User Interface) on the display device 401.
  • the GUI will be described in detail later.
  • An input device, such as the keyboard 402 or the mouse 403, is used for specifying a target segment in a sample image displayed on the GUI.
  • The image processing unit 404 is a device that includes a processor dedicated to image processing and a memory, and is used for executing the later-described reference white change processing and hue difference calculation processing at high speed.
  • The storage device 405 is an auxiliary storage device in which the OS (Operating System) executed by the CPU, the programs and various parameters of the image processing software, and the later-described reference white data, among others, are stored in a non-volatile format.
  • Any storage type, such as an HDD (Hard Disk Drive), an SSD (Solid State Drive) or flash memory, may be used.
  • the image server 407 is a computer that saves the image data and performs various types of processing outside the computer 406.
  • the imaging device 408 is a WSI, a digital microscope or the like, and is a device that photographs a slide on which a pathological sample is fixed, at high magnification, and acquires a digital image at high resolution.
  • The image data acquired by the imaging device 408 is called sample image data.
  • the computer 406 includes a CPU (Central Processing Unit) 501, a RAM (Random Access Memory) 502, a storage device 405, a data input/output I/F 504 and an internal bus 503 that interconnects these components.
  • the CPU 501 accesses the RAM 502 or the like when necessary, and comprehensively controls each block of the computer 406 while executing various types of arithmetic processing.
  • the RAM 502 is used as an operation area of the CPU 501, and temporarily stores the OS, various programs during execution, and various data, such as image data and reference white data, which become targets of processing.
  • the following components are connected to the data input/output I/F 504: the image processing unit 404; the image server 407 (via a LAN I/F 505); the display device 401 (via a graphics board 506); and the imaging device 408 (via an external device I/F 507).
  • the reference white data and the image data stored in the storage device 405 may be stored in the image server 407 or in the imaging device 408.
  • Input devices, such as the keyboard 402 and the mouse 403, are connected to the data input/output I/F 504 via an operation I/F 508.
  • the configuration depicted in Fig. 5 is used, but the image processing unit 404 may be incorporated in the image server 407 or in the imaging device 408, for example. All or a part of the functions of the image processing unit 404 may be performed by the computer 406 instead.
  • the display device 401 is assumed to be connected as an external device, but a computer integrated with the display device may be used. An example thereof is a notebook PC.
  • The keyboard 402 and the mouse 403 are used here as examples of input devices, but an input device such as a trackball, a touchpad or a controller may be used instead. It is also possible to perform input and operation on screen if a touch panel display is used as the display device 401.
  • the image processing apparatus and the image processing method according to the embodiment of the present invention are implemented by the computer 406 executing image processing software (programs) and controlling necessary hardware resources in the above configuration.
  • The image processing apparatus includes an image acquisition unit 601, a GUI unit 602, a sample information acquisition unit 603, a reference white determination unit 604, reference white data 605, and a color change unit 606.
  • the image acquisition unit 601 has a function to acquire sample image data used for observation (diagnosis), from any one of the storage device 405, the image server 407 and the imaging device 408.
  • the GUI unit 602 has a function to generate and display a GUI for displaying images, specifying a target segment, displaying a color change result, and setting various parameters.
  • the sample information acquisition unit 603 has a function to acquire sample information on the sample image data.
  • This sample information is information for specifying a target color of the sample image, and is used for processing of determining a reference white after the change.
  • The reference white determination unit 604 has a function to determine a reference white that is suitable for observing the sample image (that is, one with which the identification level of the target segments is high).
  • the color change unit 606 has a function to execute reference white change processing on the sample image data.
  • Fig. 6B shows an example of the reference white data 605 referred to by the reference white determination unit 604.
  • the reference white data 605 is a table (list) that defines the values of a plurality of types of reference white.
  • the reference white determination unit 604 can select a candidate of the reference white to be changed, out of the reference white data 605.
  • The candidates of the reference white can be the A light source, B light source, C light source, D55 light source, D65 light source, D75 light source or the like, which are standard light sources specified by the CIE (Commission Internationale de l'Eclairage).
  • Alternatively, the coordinate values of any color selected from the range within a 0.03 deviation from the blackbody locus in the uv chromaticity diagram can be selected.
  • the reference white data is defined as the XYZ tristimulus values, but may be defined as the coordinate values of the xy, uv or u'v' chromaticity diagram.
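For illustration, a table in the shape of the reference white data 605 of Fig. 6B might hold nominal CIE tristimulus values such as the following (2-degree observer, Y normalized to 100); the patent's actual table entries are not reproduced here.

```python
# Nominal XYZ tristimulus values of a few CIE standard illuminants
# (2-degree observer, Y normalized to 100)
REFERENCE_WHITES = {
    "A":   (109.85, 100.0, 35.58),
    "C":   (98.07, 100.0, 118.22),
    "D55": (95.68, 100.0, 92.15),
    "D65": (95.05, 100.0, 108.88),
    "D75": (94.97, 100.0, 122.64),
}
```

Keying the table by illuminant name lets the reference white determination unit iterate over candidates or narrow them down by simple filtering.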
  • The GUI which the GUI unit 602 outputs to the display device 401 will be described with reference to Fig. 7.
  • The GUI is displayed in a window 701, and the sample image 703 before the reference white change, selected by the user, is displayed in a region 702.
  • the user can specify a target segment in any position in the sample image displayed in the region 702.
  • the target segment is specified, for example, by moving a pointer 704 by the keyboard 402 or the mouse 403, and clicking a predetermined button.
  • Fig. 7 shows an example when two target segments 705 and 706 are specified.
  • Marks for the target segments 705 and 706 are displayed superposed on the sample image 703 so that the positions of the specified target segments can be recognized, and information 707 and 708 on the RGB values of each target segment is displayed in the window 701. If the pointer 704 is positioned on the button 709 and a specified button of the keyboard or mouse is clicked after the two target segments are specified, color conversion is performed such that the hue difference between the target segments increases in the color space for observation.
  • The flow of the color change processing of this embodiment will be described with reference to the flow chart in Fig. 8.
  • The processing shown in Fig. 8 is implemented by image processing software (a program) executed by the CPU 501 of the computer 406 or by the processor of the image processing unit 404.
  • In step S801, the image acquisition unit 601 acquires the sample image data from the storage device 405 or the like.
  • the target data to be read is specified by the user.
  • the read sample image data is displayed on the region 702 on the GUI window 701 by the GUI unit 602 (see Fig. 7).
  • The user (e.g. pathologist, technician) specifies a target segment on the displayed sample image.
  • When the color conversion button 709 is clicked after two target segments are specified for the sample image 703, processing advances to step S802.
  • In step S802, the sample information acquisition unit 603 acquires the RGB values (two target colors) of the two target segments specified by the user, as the sample information.
  • Here the RGB values of a point (one pixel) specified on the sample image 703 are acquired, but the user may instead specify a region (pixel group) of the target segment in the GUI window 701, and a representative value (e.g. mean value, mode) of the RGB values in this region may be acquired.
  • In step S803, the reference white determination unit 604 reads the XYZ tristimulus values of the candidate reference whites from the reference white data 605 stored in the storage device 405. At this time, all the reference whites registered in the reference white data 605 may be read, or the candidates may be narrowed down to a subset of the reference whites in order to shorten the processing time.
  • The method of narrowing down the candidates of the post-change reference white can be any method, such as eliminating the reference white candidates close to the reference white used for the development of the original sample image data.
  • the reference white determination unit 604 executes the processing in step S804 and the processing in step S805 for each of the reference white candidates which were read in step S803.
  • In step S804, the reference white determination unit 604 determines the RGB values after the reference white change for each of the RGB values of the two target segments acquired in step S802.
  • the content of the processing in step S804 is the same as the reference white change processing described in Fig. 2A to Fig. 2C.
  • In the XYZ→Lab conversion S212, the reference white used for the development of the sample image data (white balance) is used for the tristimulus values Xn, Yn and Zn of the reference white.
  • the information on the reference white used for the development of the sample image data may be acquired from the header of the sample image data or from meta data, or may be acquired as sample information which the user inputted (selected) in the GUI window 701.
  • the Xn, Yn and Zn tristimulus values may be registered in the reference white data 605 in advance.
  • In the Lab→XYZ conversion S221, the tristimulus values of the reference white candidates are used as the tristimulus values Xn, Yn and Zn of the reference white.
  • In step S805, the reference white determination unit 604 determines the hue difference between the two target segments using the RGB values after the reference white change, which were calculated in step S804.
  • the content of the processing in step S805 is the same as the hue difference calculation processing shown in Fig. 3.
  • Here the tristimulus values of the reference white for observation are used for the tristimulus values Xn, Yn and Zn of the reference white. This is for evaluating the hue difference in the color space for observation (that is, the difference of hue when the sample image after the color change is actually observed on the display device).
  • the information on the reference white for observation may be acquired by reading the set value of the color temperature from the storage device 405 or on the display device 401, or may be inputted (selected) by the user in the GUI window 701.
  • the ambient light existing when the image is observed may be measured using a sensor disposed in the display device 401 or the like, and the color space for observation may be determined based on this measurement value.
  • In step S806, the reference white determination unit 604 selects the reference white for which the hue difference between the two target segments is the greatest, out of all the reference white candidates. In the following description, the selected reference white is called the "optimum reference white".
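The candidate loop of steps S803 to S806 can be sketched as follows; `change_white` and `hue_difference_observed` are placeholders for the conversions of Fig. 2A and the calculation of Fig. 3, passed in as functions, and are assumptions of this sketch rather than names from the patent.

```python
def select_optimum_reference_white(target_rgb_pair, candidates,
                                   change_white, hue_difference_observed):
    best_white, best_diff = None, -1.0
    for white in candidates:
        # S804: RGB values of both target segments after the white change
        changed = [change_white(rgb, white) for rgb in target_rgb_pair]
        # S805: hue difference evaluated in the color space for observation
        diff = hue_difference_observed(changed[0], changed[1])
        # S806: keep the candidate giving the greatest hue difference
        if diff > best_diff:
            best_white, best_diff = white, diff
    return best_white
```

Because the loop only compares scalar hue differences, narrowing the candidate list (as described for step S803) shortens it proportionally.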
  • In step S807, the color change unit 606 changes the reference white of the whole sample image data (all pixels) to the optimum reference white.
  • the content of the processing in step S807 is the same as the reference white change processing shown in Fig. 2A to Fig. 2C.
  • In the XYZ→Lab conversion, the tristimulus values of the reference white used for the development of the sample image data are used as the tristimulus values Xn, Yn and Zn of the reference white.
  • In the Lab→XYZ conversion, the tristimulus values of the optimum reference white are used for the tristimulus values Xn, Yn and Zn of the reference white.
  • In step S808, the color change unit 606 sends the post color change sample image data to the GUI unit 602, and the sample image based on the post color change sample image data is displayed in the region 702 of the GUI window 701.
  • the sample image after the color change may be displayed instead of the sample image before the color change, or these two images may be displayed side by side.
  • As described above, a sample image in which the identification level among the target segments has been improved, compared with the original image, can be presented to the user.
  • The optimum color change can be automatically determined merely by specifying a target segment (target color), hence the operational load on the user is virtually nil, which is excellent in terms of convenience.
  • Since the identification level is improved by changing the reference white, the color balance of the entire image is maintained and artifacts are barely generated. As a consequence, the influence on diagnosis can be minimized.
  • The present inventors confirmed through experiments that, in the case of a sample image stained a reddish color by HE staining or the like, the hue difference between the target segments increases as the color temperature of the reference white becomes higher (the color becomes more bluish). In the case of a sample image stained a bluish color by Giemsa staining or the like, the hue difference between the target segments increases as the color temperature of the reference white becomes lower (the color becomes more reddish).
  • In this embodiment the user specifies two target segments, but the user may specify only one target segment, or three or more target segments.
  • If only one target segment is specified, the optimum reference white is determined so as to maximize the hue difference between the color of the specified target segment and the color of its peripheral segments.
  • If three or more target segments are specified, the total of the hue differences of all the pairs is determined, and the optimum reference white is determined so as to maximize this total, for example.
  • Next, an image display system including an image processing apparatus according to Embodiment 2 of the present invention will be described.
  • In Embodiment 1, the color information of the target segments specified by the user is used as the sample information; in Embodiment 2, data that can specify the sample type and the staining type, linked to the sample image data, is used as the sample information, and the optimum reference white is determined according to the sample type and the staining type.
  • the device configuration of the image display system is the same as the configuration of Embodiment 1, except that a "correspondence table of the sample type, staining type and target color" is stored in the storage device 405 in advance.
  • the sample type is information to specify a type of the sample (e.g. organ from which sample was extracted, purpose of pathological diagnosis), and the staining type is information to specify a staining method used for preparing the sample.
  • Data to specify the sample type and the staining type are provided by the header information of the sample image data, or by the meta data linked to the sample image data.
  • IDs indicating the sample type and the staining type may be embedded in the file name of the sample image data, or, if a label on which the sample type and the staining type are recorded is attached to the slide, the sample type and the staining type may be recognized from the label portion in the sample image data.
  • The type of tissue and the segment, such as a nucleus, to be observed are determined almost uniquely for each combination of the sample type and the staining type. Further, if the sample preparation method and the observation environment are specified, the degree of staining of the sample and the color thereof can also be specified for each combination of the sample type and the staining type.
  • Therefore the correspondence table of the sample type, the staining type and the target color can be statistically created by acquiring the RGB values of the target segments from sample images whose sample type and staining type are known.
  • Fig. 9 shows an example of the correspondence table.
  • the target color may include not only the color of the sample tissue, but also a color of the background where the sample tissue does not exist.
  • the target color may be defined not by the RGB values but by the reference white and the values that can be converted into RGB values, such as XYZ tristimulus values and Lab coordinate values.
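A correspondence table in the shape of Fig. 9 might be held as a dictionary keyed by the (sample type, staining type) pair; the entries and RGB values below are invented placeholders for illustration, not the patent's measured target colors.

```python
# Illustrative correspondence table in the shape of Fig. 9; the sample
# types, staining types and RGB values below are invented placeholders
CORRESPONDENCE_TABLE = {
    ("stomach", "HE"):   {"nucleus": (104, 56, 123), "cytoplasm": (228, 121, 154)},
    ("blood", "Giemsa"): {"nucleus": (88, 78, 150), "background": (235, 235, 240)},
}

def target_colors(sample_type, staining_type):
    # Look up the target colors linked to a (sample type, staining type) pair
    return CORRESPONDENCE_TABLE[(sample_type, staining_type)]
```

As the text notes, entries could equally store XYZ tristimulus values or Lab coordinates together with a reference white instead of RGB values.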
  • Accordingly, step S802 in Fig. 8 can be replaced with processing of acquiring the target colors linked to the sample type and the staining type of the sample image data from the correspondence table; then the optimum reference white can be determined and the post color change sample image data can be generated in the same manner as in Embodiment 1.
  • This processing is acceptable, but in this embodiment an LUT linking the sample information (sample type and staining type) to the optimum reference white is created in advance based on the correspondence table in Fig. 9, and the color conversion processing is performed using this LUT, so as to further simplify the processing.
  • The processing in Fig. 10 is executed by an LUT creation unit (not illustrated), which is a function of the image processing software.
  • First, the LUT creation unit reads the correspondence table of the sample type, the staining type and the target color which is stored in the storage device 405. Then the following processing is performed for each combination of the sample type and the staining type listed in the correspondence table.
  • In step S1002, the LUT creation unit specifies the two target colors (RGB values) linked to the target sample type and staining type from the correspondence table which was read.
  • The subsequent processing of steps S1003 to S1005 is the same as the processing of steps S803 to S805 in Fig. 8, hence description thereof is omitted.
  • In step S1006, the LUT creation unit selects, out of all the reference white candidates, the reference white for which the hue difference between the two target colors is greatest in the color space for observation, and determines this reference white as the optimum reference white corresponding to the target sample type and staining type.
  • When the processing in steps S1002 to S1006 has been applied to all combinations of the sample types and staining types, the processing advances to step S1007.
  • In step S1007, the LUT creation unit creates an LUT of the sample type, the staining type and the optimum reference white by combining the correspondence table of the sample type, the staining type and the target color, which was read in step S1001, and the optimum reference whites acquired in step S1006, and stores the LUT in the storage device 405.
  • Fig. 11A shows an example of the created LUT.
  • In this embodiment, the LUT is created within the image display system, but the LUT may be created outside the image display system in advance and stored in the storage device 405.
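As a minimal sketch, the LUT creation of Fig. 10 amounts to joining a correspondence table like Fig. 9 with a per-combination reference white selection. All table contents below are hypothetical, and `choose_white_stub` is only a stand-in for the hue-difference maximization of steps S1002 to S1006:

```python
# Hypothetical correspondence table (cf. Fig. 9): each (sample type, staining
# type) combination is linked to two target colors given as RGB tuples.
correspondence = {
    ("stomach", "HE"):  ((128, 64, 160), (220, 120, 140)),
    ("liver",   "HE"):  ((110, 70, 150), (230, 130, 150)),
    ("stomach", "IHC"): ((200, 180, 160), (90, 120, 200)),
}

def build_white_lut(table, choose_white):
    """Steps S1001-S1007: for every combination in the correspondence table,
    determine the optimum reference white for its pair of target colors and
    store the result as an LUT (cf. Fig. 11A)."""
    return {key: choose_white(c1, c2) for key, (c1, c2) in table.items()}

def choose_white_stub(color1, color2):
    # Stand-in for the hue-difference-based selection of steps S1002-S1006.
    return "D50" if sum(color1) > sum(color2) else "D65"

lut = build_white_lut(correspondence, choose_white_stub)
print(lut[("stomach", "HE")])  # "D65" with this stub
```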
  • In step S1101, the image acquisition unit 601 acquires the sample image data from the storage device 405 or the like.
  • The target data to be read is specified by the user.
  • In step S1102, the sample information acquisition unit 603 acquires data to specify the sample type and the staining type corresponding to this sample image data, and specifies the sample type and the staining type of the sample image.
  • In step S1103, the reference white determination unit 604 reads the LUT linking the sample type, the staining type and the optimum reference white (see Fig. 11A). Then in step S1104, the reference white determination unit 604 specifies the corresponding optimum reference white based on the sample type and the staining type acquired in step S1102 and the LUT read in step S1103.
  • In step S1105, the color change unit 606 changes the reference white of the sample image data to the optimum reference white, and in step S1106, the sample image after the color change is displayed on the display device 401.
  • The content of the processing is the same as steps S807 and S808 in Fig. 8.
  • Thus the function to improve the identification level among the target segments can be implemented at a faster speed by selecting the optimum reference white from the LUT.
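The lookup of steps S1103 and S1104 reduces to indexing the LUT by the (sample type, staining type) pair. A sketch with assumed table contents; the default fallback for unknown combinations is an added safeguard, not something the text specifies:

```python
def optimum_white_for_sample(lut, sample_type, staining_type, default="D65"):
    """Steps S1103-S1104: look up the optimum reference white linked to the
    sample type and staining type (cf. Fig. 11A). Falls back to a default
    reference white when the combination is not listed (added safeguard)."""
    return lut.get((sample_type, staining_type), default)

# Hypothetical LUT contents for illustration.
lut = {("stomach", "HE"): "D50", ("liver", "HE"): "A"}
print(optimum_white_for_sample(lut, "stomach", "HE"))  # "D50"
print(optimum_white_for_sample(lut, "kidney", "HE"))   # "D65"
```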
  • In this embodiment, the optimum reference white is specified using the correspondence table created for a specific sample preparation method and observation environment. If the sample preparation method and the observation environment are different, a correspondence table of the sample type, the staining type and the target color for that case can be created. Instead of the correspondence table of the sample type, the staining type and the target color, a correspondence table linking only the staining type and the target color may be used. In this case, the correspondence can be determined by making the target color a color unique to the staining pigment, or by setting the color of the background as the color of a target segment. The correspondence of the staining type and the target color may also be determined by estimating the sample type to be observed based on the frequency with which each sample type is observed.
  • <Embodiment 3> An image display system including an image processing apparatus according to Embodiment 3 will be described.
  • To specify the optimum reference white, it is necessary for the user to specify the target segments in Embodiment 1, and information to specify the sample type and the staining type must be provided in Embodiment 2.
  • Embodiment 3 is characterized in that the target color is automatically detected based on the color frequency distribution in the sample image data, and the optimum reference white is specified based on this detected target color. Thereby an image of which the identification level among the target segments has been improved can be presented without requiring the user to take the trouble of selecting target segments.
  • The device configuration of the image display system is the same as the configuration of Embodiment 2, except that a correspondence table of the sample type, the staining type and the target color (peak coordinate value of the color frequency distribution in the color space for observation) is additionally stored in the storage device in advance.
  • First, the color frequency distribution of an image will be described.
  • If the method and the environment used when the sample was prepared and observed are specified, the degree of staining and the color of the sample are determined for each combination of the sample type and the staining type. Therefore a color of which the frequency of appearance is high (e.g. the color of a cell or a nucleus which occupies a large area in the image) can be specified in advance for each sample type and staining type.
  • The color coordinate values in the color space where the frequency of appearance is high are hereafter called "color frequency peak coordinate values".
  • Thus the correspondence table of the sample type, the staining type and the color frequency peak coordinate value can be created.
  • The processing of acquiring the color frequency peak coordinate values is performed within the image display system in this embodiment, but may be performed in a device that is different from the image display system.
  • A method for acquiring the color frequency peak coordinate values of the image will be described with reference to the flow chart in Fig. 12A.
  • The processing in Fig. 12A is executed by a peak coordinate value acquisition unit (not illustrated), which is a function of the image processing software.
  • In step S1301, the peak coordinate value acquisition unit converts the RGB values of all or a part of the pixels of the original image data into color coordinate values (Lab values) in the color space for observation.
  • The content of the processing is the same as the RGB→Lab conversion shown in Fig. 2B.
  • In step S1302, the peak coordinate value acquisition unit creates a frequency distribution of the color coordinate values of the plurality of pixels determined in step S1301.
  • In step S1303, the peak coordinate value acquisition unit acquires a plurality of color frequency peak coordinate values from the frequency distribution created in step S1302. Thus the peak coordinate values of the color frequency distribution of the sample image data can be acquired.
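Under the assumption that the frequency distribution is built over quantized Lab bins (the text does not specify how the distribution is discretized, so the bin size below is hypothetical), steps S1301 to S1303 can be sketched as:

```python
from collections import Counter

def color_frequency_peaks(lab_pixels, n_peaks=2, bin_size=5.0):
    """Steps S1302-S1303: quantize the Lab coordinates of the pixels into
    bins, build a frequency distribution, and return the n most frequent bin
    centers as color frequency peak coordinate values."""
    def quantize(v):
        return round(v / bin_size) * bin_size
    hist = Counter((quantize(L), quantize(a), quantize(b)) for L, a, b in lab_pixels)
    return [coord for coord, _ in hist.most_common(n_peaks)]

# Toy image: many background-like pixels, fewer tissue-like pixels.
pixels = [(90.0, 1.0, 2.0)] * 50 + [(45.0, 30.0, -20.0)] * 30 + [(60.0, -10.0, 5.0)] * 5
print(color_frequency_peaks(pixels))  # [(90.0, 0.0, 0.0), (45.0, 30.0, -20.0)]
```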
  • The color frequency peak coordinate values can be acquired for each combination of sample type and staining type by preparing a plurality of sample images for samples having different combinations of sample type and staining type, and applying the processing in Fig. 12A to each sample image.
  • Fig. 12B is an example of a correspondence table linking a combination of the sample type and the staining type with a plurality of color frequency peak coordinate values. In the example of Fig. 12B, three color frequency peak coordinate values are listed, but it is sufficient if at least two color frequency peak coordinate values are linked to each combination of the sample type and the staining type. The number of color frequency peak coordinate values to link may be different for each combination of the sample type and the staining type.
  • In this embodiment, the color having a high frequency is specified based on the frequency distribution of the color coordinate values (Lab values) in the color space for observation of the image, but a frequency distribution of the RGB values or the XYZ tristimulus values may be used instead.
  • Instead of the peak value in the frequency distribution, an average coordinate value, a center-of-gravity value or the like near the peak may be used, and instead of the frequency peak value in the Lab space coordinates, a color frequency peak coordinate value on the ab plane coordinates or the like may be used.
  • The method for creating an LUT linking the color frequency peak coordinate value and the optimum reference white will be described next with reference to the flow chart in Fig. 13.
  • The processing in Fig. 13 is executed by an LUT creation unit (not illustrated), which is a function of the image processing software.
  • In step S1501, the LUT creation unit creates an LUT of the sample type, the staining type and the optimum reference white (see Fig. 11A).
  • The content of the processing is the same as the LUT creation processing shown in Fig. 10.
  • In step S1502, the LUT creation unit reads the correspondence table of the sample type, the staining type and the color frequency peak coordinate value (see Fig. 12B), which was created in advance and stored in the storage device 405.
  • In step S1503, the LUT creation unit creates an "LUT of the color frequency peak coordinate values and the optimum reference white" by combining the "LUT of the sample type, the staining type and the optimum reference white" and the "correspondence table of the sample type, the staining type and the color frequency peak coordinate value".
  • Fig. 14A shows an example of the created LUT.
  • In step S1701, the image acquisition unit 601 acquires the sample image data from the storage device 405 or the like.
  • The target data to be read is specified by the user.
  • In step S1702, the sample information acquisition unit 603 reads the "LUT of the color frequency peak coordinate value and the optimum reference white" (see Fig. 14A) from the storage device 405 or the like.
  • In step S1703, the sample information acquisition unit 603 analyzes the sample image data that was read in step S1701, and acquires a plurality of color frequency peak coordinate values from the frequency distribution of the color coordinate values in the color space for observation.
  • The color frequency peak coordinate value corresponds to the target color, and the processing of detecting the color frequency peak coordinate values from the sample image data corresponds to acquisition of the sample information.
  • The content of the processing in step S1703 is the same as the color frequency peak coordinate value acquisition processing shown in Fig. 12A.
  • In step S1704, the reference white determination unit 604 acquires the optimum reference white, which corresponds to the color frequency peak coordinate values acquired in step S1703, from the "LUT of the color frequency peak coordinate value and the optimum reference white". If the same combination as the color frequency peak coordinate values acquired in step S1703 does not exist in the LUT, then the optimum reference white linked to the most similar combination of peak coordinate values is acquired.
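The "most similar combination" fallback of step S1704 can be sketched as a nearest-neighbor search over the LUT keys. The similarity measure below (summed Euclidean distance between corresponding peaks) is an assumption; the text does not define how similarity is measured:

```python
def nearest_lut_entry(lut, peaks):
    """Step S1704 fallback: when the exact combination of peak coordinate
    values is not in the LUT, return the optimum reference white of the entry
    whose peak coordinates are most similar, measured by the summed Euclidean
    distance between corresponding peaks (assumed similarity measure)."""
    def dist(combo):
        return sum(
            sum((p - q) ** 2 for p, q in zip(pk, ck)) ** 0.5
            for pk, ck in zip(peaks, combo)
        )
    return lut[min(lut, key=dist)]

# Hypothetical LUT: pairs of Lab peak coordinates -> optimum reference white.
lut = {
    ((90.0, 0.0, 0.0), (45.0, 30.0, -20.0)): "D50",
    ((85.0, 5.0, 5.0), (50.0, -25.0, 10.0)): "A",
}
print(nearest_lut_entry(lut, ((89.0, 1.0, 1.0), (46.0, 28.0, -18.0))))  # "D50"
```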
  • In step S1705, the color change unit 606 changes the reference white of the sample image data to the optimum reference white, and in step S1706, the sample image after the color change is displayed on the display device 401.
  • The content of the processing is the same as steps S807 and S808 in Fig. 8.
  • Thus the optimum reference white can be selected from the LUT based on the color information which is most frequently detected from the sample image, even if no data is linked to the sample image, and the function to improve the identification level among target segments can be implemented at high speed.
  • In this embodiment, the color frequency peak coordinate values of the sample image are acquired, and the optimum reference white corresponding to the peak coordinate values is determined using the LUT.
  • Alternatively, the candidate of which the hue difference between the peak coordinate values after the color change becomes the greatest, among a plurality of reference white candidates, may be selected as the optimum reference white, just like Fig. 8 of Embodiment 1.
  • For the target colors, the background color (a color of a region where tissue does not exist) and the tissue color (a color of a tissue of which the frequency of appearance is high) may be used, and a reference white which increases the hue difference between the background color and the tissue color, or the hue difference among a plurality of tissue colors, may be selected.
  • Any method may be used for acquiring the background color.
  • For example, the color of which the frequency of appearance is highest among the pixels of which the lightness is within the top 10% in the sample image, or the color of which the frequency of appearance is highest among the pixels of which the lightness is within the bottom 10% in the sample image, may be defined as the background color.
  • "Lightness is within the top 10%" refers to the 10% range from the maximum value in the value range of the lightness in the sample image, and "lightness is within the bottom 10%" refers to the 10% range from the minimum value in the value range.
  • The pixels used to evaluate whether a color is the background color or not may be acquired from the entire sample image, but preferably should be acquired from a range that does not include a label attached to the slide, a marker written on the slide, or dust adhering to the slide or optical system.
  • The tissue color can be defined as the color of a pixel at one of the peaks of the distribution of the frequency of appearance, among the pixels of which the lightness differs from that of the background by a predetermined value or more (e.g. lightness outside a -5% to +5% range of the lightness of the background).
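A minimal sketch of the background/tissue color definitions above. The absolute lightness margin used for the tissue color is a simplification of the -5% to +5% range described in the text, and all pixel data are hypothetical:

```python
from collections import Counter

def background_color(lab_pixels, top_fraction=0.10):
    """Most frequent color among the pixels whose lightness L is within the
    top 10% of the lightness range of the image (one of the two definitions
    in the text; the bottom-10% variant is symmetric)."""
    Ls = [p[0] for p in lab_pixels]
    lo, hi = min(Ls), max(Ls)
    threshold = hi - (hi - lo) * top_fraction
    bright = [p for p in lab_pixels if p[0] >= threshold]
    return Counter(bright).most_common(1)[0][0]

def tissue_color(lab_pixels, bg, margin=5.0):
    """Most frequent color among the pixels whose lightness differs from the
    background lightness by more than the margin (an absolute margin is used
    here for simplicity instead of the text's percentage range)."""
    candidates = [p for p in lab_pixels if abs(p[0] - bg[0]) > margin]
    return Counter(candidates).most_common(1)[0][0]

# Toy Lab pixels: a bright background and a darker stained tissue region.
pixels = [(95.0, 0.0, 0.0)] * 40 + [(93.0, 1.0, 1.0)] * 10 + [(50.0, 25.0, -15.0)] * 25
bg = background_color(pixels)
print(bg, tissue_color(pixels, bg))
```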
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD) TM ), a flash memory device, a memory card, and the like.

Abstract

Provided is an image processing apparatus having: an image acquisition unit that acquires data of a sample image obtained by imaging a sample; a sample information acquisition unit that acquires sample information to specify a plurality of target colors to be identified in the sample image; a reference white determination unit that determines a reference white based on the sample information; and a color change unit that performs processing of changing a reference white of the sample image to the reference white determined by the reference white determination unit, on the data of the sample image, so as to generate data on a post color change sample image. A reference white which is used for the color change of the sample image is determined, so that an absolute value of the difference of the hue of the plurality of target colors in a predetermined color space for observation becomes greater in the post color change sample image than in the sample image before the color change.

Description

IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD
The present invention relates to a color change method for a sample image.
In conventional pathological inspection, diagnosis is performed by a pathologist observing a stained sample using an optical microscope. Recently, as a substitute, a whole slide image system (hereafter called WSI system), which digitizes a sample image so that diagnosis can be performed on a display, is receiving attention. The digital image of the entire sample acquired by the WSI system can be image-processed to support diagnosis. The WSI system is also expected to provide various advantages, such as quickening remote diagnosis, explaining to a patient using digital images, sharing rare cases, and making educational and practical training more efficient.
Patent Literature 1: Japanese Patent Application Laid-open No. 2010-079522
Patent Literature 2: Japanese Patent Application Laid-open No. 2010-181833
Patent Literature 3: Japanese Patent Application Laid-open No. 2000-261825
In pathological image diagnosis, pathologists demand images where a plurality of types of target segments, critical for diagnosis, can be accurately identified from one another. One approach for improving the identification level is changing the colors of the images, for which there are a few methods.
In Patent Literature 1, the identification level among the target segments is improved by spreading the hue angle in the color space for the colors in the target region of the image. By this method, however, the color balance among the segments is lost; therefore it is difficult to distinguish whether the characteristics that appear in the presented image originated from the sample or are artifacts generated by the image processing, which may affect diagnosis.
In Patent Literature 2, the identification level among the target segments is improved by using spectroscopic images and extracting a portion which has a conspicuous wavelength characteristic in the target segments. This method requires expensive imaging equipment for acquiring images since spectroscopic images are used. Furthermore, just like Patent Literature 1, color balance among the segments is lost, which may affect diagnosis.
Patent Literature 3 proposes that an image is converted into a color temperature image specified by the user, and observation is performed using a desired color temperature image. Since the color temperature of the image is changed, the color balance of the image is maintained. However it is unknown how the color temperature should be set to improve the identification level among the target segments, hence the user must try various color temperatures and search for the optimum color temperature by trial and error. Therefore if this method is applied to pathological image diagnosis, the operation load on the pathologist (observer) increases. Moreover, if the pathologist must select the color temperature, the selected color temperature may disperse depending on the individual, and the objectivity of diagnosis (allowing anyone to diagnose under the same conditions) may be diminished. Therefore this method cannot be directly applied to pathological image diagnosis.
With the foregoing in view, it is an object of the present invention to provide a technique to present the user with an image of which identification level among target segments has been automatically improved while suppressing the influence on diagnosis.
A first aspect of the present invention provides an image processing apparatus including: an image acquisition unit that acquires data of a sample image obtained by imaging a sample; a sample information acquisition unit that acquires sample information to specify a plurality of target colors to be identified in the sample image; a reference white determination unit that determines a reference white based on the sample information; and a color change unit that performs processing of changing a reference white of the sample image to the reference white determined by the reference white determination unit, on the data of the sample image, so as to generate data on a post color change sample image, wherein the reference white determination unit determines a reference white which is used for the color change of the sample image, so that an absolute value of the difference of the hue of the plurality of target colors in a predetermined color space for observation becomes greater in the post color change sample image than in the sample image before the color change.
A second aspect of the present invention provides a method for controlling an image processing apparatus including: an image acquisition step of acquiring data of a sample image obtained by imaging a sample; a sample information acquisition step of acquiring sample information to specify a plurality of target colors to be identified in the sample image; a reference white determination step of determining a reference white based on the sample information; and a color change step of performing processing of changing a reference white of the sample image to the reference white determined in the reference white determination step, on the data of the sample image, so as to generate data on a post color change sample image, wherein in the reference white determination step, a reference white which is used for the color change of the sample image is determined, so that an absolute value of the difference of the hue of the plurality of target colors in a predetermined color space for observation becomes greater in the post color change sample image than in the sample image before the color change.
A third aspect of the present invention provides a program causing a computer to execute each step of the method for controlling an image processing apparatus according to the present invention.
According to the present invention, an image, of which identification level among target segments has been automatically improved can be presented to the user while suppressing the influence on diagnosis.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Fig. 1 is a diagram depicting a uniform color space and three attributes of color.
Figs. 2A to 2C show flow charts of a reference white change processing.
Fig. 3 is a flow chart depicting a hue difference deriving processing of two colors.
Fig. 4 is a diagram depicting a configuration of an image display system according to an embodiment of the present invention.
Fig. 5 is a block diagram depicting hardware of an image processing apparatus.
Fig. 6A is a functional block diagram of the image processing apparatus and Fig. 6B is an example of the reference white data.
Fig. 7 is an example of a GUI according to Embodiment 1.
Fig. 8 is a flow chart depicting a color change method according to Embodiment 1.
Fig. 9 is an example of a correspondence table of a sample type, a staining type and a target color, which is used for Embodiment 2.
Fig. 10 is a flow chart depicting an LUT creation method according to Embodiment 2.
Fig. 11A shows an example of the LUT according to Embodiment 2 and Fig. 11B is a flow chart depicting a color change method.
Fig. 12A is a flow chart depicting a color frequency peak coordinate value acquisition method according to Embodiment 3 and Fig. 12B shows an example of the correspondence table.
Fig. 13 is a flow chart depicting an LUT creation method according to Embodiment 3.
Fig. 14A shows an example of the LUT according to Embodiment 3 and Fig. 14B is a flow chart depicting a color change method.
First terms common to the embodiments of the present invention, specifically "reference white", "uniform color space", "L*a*b* color space", "hue difference", "reference white change", "reference white for observation", "color space for observation" and "identification level of colors" will be described.
"Reference white" is the white that becomes the reference of the color balance of an image, and can be freely selected within the color gamut of the display monitor. It is known that if the eyes of the observer become accustomed to the reference white of a target when observing images that have different reference whites, each image is seen as completely the same image.
"Uniform color space" is a color space developed such that the distance in a color space, which is a difference of colors, matches well with the difference of colors recognized by the human eye.
"L*a*b* color space" is a type of uniform color space, and is expressed three-dimensionally by the lightness L* which indicates luminosity, and a* and b* which indicate colors (hereafter L*, a* and b* are simply expressed as L, a and b). L, a and b are standardized such that the color of the reference white becomes (L, a, b) = (100, 0, 0). Therefore if the eyes of the observer become accustomed to the reference white used for the standardization, the coordinate values in the color space can be regarded as the sensory values of the observer.
The lightness that indicates luminosity, the chroma that indicates saturation and the hue that indicates the difference of color types are important indexes to compare colors, and are called the three attributes of color. The coordinate values of the color space and the three attributes of color will be described with reference to Fig. 1.
It is assumed that the coordinates of a point P in the color space are (L, a, b). The lightness L is indicated by the coordinate value of the point 102, and the chroma C is the distance 103 from the origin on the ab plane, given by Expression 1.
C = \sqrt{a^2 + b^2} \qquad \text{(Expression 1)}
Hue h is an angle 104 formed by a line from the origin to the point P and the a axis on the ab plane, and is given by Expression 2.
h = \tan^{-1}\!\left( b / a \right) \qquad \text{(Expression 2)}
"Hue difference" is an index to express the absolute value of the difference of the hue values of two different points in a color space, and is also called the hue angle difference. The hue difference between a point P and a point Q is given by Expression 3.
\Delta h = \left| h_P - h_Q \right| \qquad \text{(Expression 3)}
Here hP and hQ denote the hue 104 of the point P and the hue 106 of the point Q, respectively.
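The three attributes and the hue difference of Expressions 1 to 3 can be computed directly from Lab coordinates. A minimal sketch; the 0-180° wrap in the hue difference is an added safeguard for hues that straddle the 0° axis, not part of Expression 3 itself:

```python
import math

def chroma(a: float, b: float) -> float:
    """Chroma C: distance from the origin on the ab plane (Expression 1)."""
    return math.hypot(a, b)

def hue(a: float, b: float) -> float:
    """Hue h in degrees: angle from the a axis on the ab plane (Expression 2)."""
    return math.degrees(math.atan2(b, a)) % 360.0

def hue_difference(hp: float, hq: float) -> float:
    """Absolute hue difference (Expression 3), wrapped into 0-180 degrees."""
    d = abs(hp - hq) % 360.0
    return min(d, 360.0 - d)

print(chroma(3.0, 4.0))             # 5.0
print(hue(0.0, 1.0))                # ~90.0
print(hue_difference(350.0, 10.0))  # 20.0
```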
"Reference white change" is a change of the reference white that is used for standardizing the uniform color space. The reference white is changed when color temperature change processing is performed on an image. If the eyes of the observer are not accustomed to the reference white after the change, the appearance does not perfectly match the expected image; however, image diagnosis is hardly affected.
"Reference white for observation" is the white which the observer becomes accustomed to during observation. The reference white defined in a standard such as sRGB or Adobe RGB, the reference white set on the observation monitor, or the ambient light corresponds to the "reference white for observation".
"Color space for observation" is a uniform color space of which the reference white is the reference white for observation. It is used as a common color space when the same observer compares a plurality of images developed with different reference whites.
"Identification level of colors" is the effect of colors that allows a plurality of different objects or segments to be distinguished as different objects or segments in the color space for observation. The greater the hue difference, the higher the identification level.
"Reference white change processing", that is common to the embodiments of the present invention, will be described next with reference to Fig. 2A to Fig. 2C. Further, "hue difference calculation processing" will be described with reference to Fig. 3. Each processing shown in Fig. 2A to Fig. 2C and Fig. 3 is executed by an image processing apparatus which is described later.
The reference white change processing shown in Fig. 2A is constituted by the RGB→Lab conversion S201 and the Lab→RGB conversion S202.
In the RGB→Lab conversion S201, the RGB values of the image are converted into the Lab values obtained when the eyes of the observer are accustomed to an arbitrary reference white. Fig. 2B shows the flow of the processing.
In the RGB→XYZ conversion S211, the RGB values of the image are converted into tristimulus values in the XYZ color space. If the reference white setting when the original image data was developed is the D65 light source color, this conversion is given by Expression 4 according to the conversion method of the sRGB standard.
\begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = Y_0 \begin{pmatrix} 0.4124 & 0.3576 & 0.1805 \\ 0.2126 & 0.7152 & 0.0722 \\ 0.0193 & 0.1192 & 0.9505 \end{pmatrix} \begin{pmatrix} R \\ G \\ B \end{pmatrix} \qquad \text{(Expression 4)}
Here the subscript 0 denotes the brightness (cd/m2) of the monitor. R, G and B denote linear RGB values normalized to 0 to 1. If an image has been developed with a white other than the D65 light source color, this conversion can be performed by determining in advance a conversion matrix or an approximate conversion expression between the RGB values and XYZ values, or by creating a lookup table (hereafter called LUT) and applying the LUT.
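The RGB→XYZ conversion S211 can be sketched as follows. Since Expression 4 is not reproduced in this text, the matrix coefficients below are the standard sRGB/D65 values, stated here as an assumption, and the `srgb_to_linear` helper is the standard sRGB transfer function for 8-bit channel values:

```python
def srgb_to_linear(c8: int) -> float:
    """Undo the sRGB transfer function: 8-bit channel value -> linear 0-1."""
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_srgb_to_xyz(r: float, g: float, b: float, y0: float = 1.0):
    """Linear RGB (0-1) -> XYZ tristimulus values, scaled by the monitor
    brightness y0 (cd/m^2). Coefficients: standard sRGB/D65 matrix."""
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return (y0 * x, y0 * y, y0 * z)

# Pure white (1, 1, 1) maps to the D65 white point, with Y = y0.
print(linear_srgb_to_xyz(1.0, 1.0, 1.0, y0=80.0))
```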
In the XYZ→Lab conversion S212, the XYZ color space is converted into the Lab color space. This conversion is given by Expression 5 to Expression 7.
L = 116\, f(Y / Y_n) - 16 \qquad \text{(Expression 5)}
a = 500\, \left[ f(X / X_n) - f(Y / Y_n) \right] \qquad \text{(Expression 6)}
b = 200\, \left[ f(Y / Y_n) - f(Z / Z_n) \right] \qquad \text{(Expression 7)}
\text{where } f(t) = t^{1/3} \ \left( t > (6/29)^3 \right), \quad f(t) = \tfrac{1}{3}\left(\tfrac{29}{6}\right)^2 t + \tfrac{4}{29} \ (\text{otherwise})
Xn, Yn and Zn in Expression 5 to Expression 7 denote the XYZ tristimulus values of the reference white used when the image was developed.
In the Lab→RGB conversion S202, the RGB values of the image after the change of the reference white are determined based on the Lab coordinate values acquired in the RGB→Lab conversion S201. Fig. 2C shows the flow of the processing.
In the Lab→XYZ conversion S221, the Lab coordinate values are converted into the XYZ tristimulus values after the reference white is changed. The Lab→XYZ conversion S221 is an inverse conversion of the XYZ→Lab conversion S212, and the XYZ tristimulus values are determined by optimization using Expression 5 to Expression 7. In the Lab→XYZ conversion S221, the tristimulus values of the reference white after the change are used for the Xn, Yn and Zn values.
In the XYZ→RGB conversion S222, the XYZ tristimulus values are converted into the RGB values of the monitor. The XYZ→RGB conversion S222 is an inverse conversion of the RGB→XYZ conversion S211, and converts the XYZ values into RGB values. In the case of sRGB, the RGB values are given by Expression 8.
\begin{pmatrix} R \\ G \\ B \end{pmatrix} = \frac{1}{Y_0} \begin{pmatrix} 3.2406 & -1.5372 & -0.4986 \\ -0.9689 & 1.8758 & 0.0415 \\ 0.0557 & -0.2040 & 1.0570 \end{pmatrix} \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} \qquad \text{(Expression 8)}
By the above steps, image data after the reference white is changed can be created.
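The Lab→RGB half of the pipeline (steps S221 and S222) can be sketched as follows. The inverse sRGB matrix coefficients are the standard values, stated here as an assumption since Expression 8 is not reproduced in this text; a closed-form inverse is used in place of the optimization mentioned above:

```python
def lab_to_xyz(L, a, b, xn, yn, zn):
    """Step S221: inverse of Expressions 5-7; (xn, yn, zn) are the tristimulus
    values of the reference white AFTER the change."""
    d = 6.0 / 29.0
    def f_inv(t):
        return t ** 3 if t > d else 3.0 * d * d * (t - 4.0 / 29.0)
    fy = (L + 16.0) / 116.0
    fx = fy + a / 500.0
    fz = fy - b / 200.0
    return (xn * f_inv(fx), yn * f_inv(fy), zn * f_inv(fz))

def xyz_to_linear_srgb(x, y, z):
    """Step S222: XYZ -> linear RGB with the standard inverse sRGB matrix."""
    r = 3.2406 * x - 1.5372 * y - 0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b = 0.0557 * x - 0.2040 * y + 1.0570 * z
    return (r, g, b)

# The reference white, (L, a, b) = (100, 0, 0), comes back as the white point
# (xn, yn, zn) and then as approximately RGB = (1, 1, 1).
white_xyz = lab_to_xyz(100.0, 0.0, 0.0, 0.9505, 1.0, 1.089)
print(xyz_to_linear_srgb(*white_xyz))
```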
In the Lab→XYZ conversion S221 and the XYZ→RGB conversion S222, the conversion may be performed by determining a conversion matrix or an approximate conversion expression between Lab and XYZ in advance for each reference white, or by creating an LUT.
The original RGB values and the RGB values after the reference white is changed may also be converted directly by creating the conversion matrix, approximate conversion expression or LUT in advance. In this case, the RGB values after the reference white is changed can be determined directly from the original RGB values without performing the processing in Fig. 2A to Fig. 2C, and the processing can be simplified and made faster.
The hue difference calculation processing shown in Fig. 3 will be described next.
In the RGB value acquisition step S301, RGB values of two target segments (target colors) in the image are acquired. If a target segment is provided as a region (a plurality of pixels), a representative value (e.g. mean value, mode) in the region may be acquired as a target color.
In the RGBLab conversion step S302, each RGB value of the two target segments is converted to a coordinate value in the color space for observation, using the reference white for observation. The flow of the processing is the same as the RGBLab conversion shown in Fig. 2B. For the reference white for observation, a reference white of the monitor or an ambient light is used.
In the hue difference calculation step S303, the hue difference between the two target segments in the color space for observation is determined by Expression 3, using the determined Lab color coordinate values.
Thus the hue difference between the target segments in the color space for observation can be determined.
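A minimal sketch of this hue difference calculation, assuming that Expression 3 (not reproduced in this excerpt) compares the hue angles h = atan2(b*, a*) of the two target colors in the Lab plane:

```python
import math

# Hue difference between two Lab colors, taken as the absolute difference of
# their hue angles h = atan2(b*, a*), wrapped to [0, 180] degrees. This is a
# hypothetical reading of Expression 3, which is not shown in this excerpt.
def hue_difference(lab1, lab2):
    h1 = math.atan2(lab1[2], lab1[1])
    h2 = math.atan2(lab2[2], lab2[1])
    d = abs(h1 - h2) % (2 * math.pi)
    return math.degrees(min(d, 2 * math.pi - d))

print(hue_difference((50, 40, 10), (50, -10, 35)))
```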
In the later mentioned embodiment, the image data for observation having a different color temperature is created by applying the reference white change processing shown in Fig. 2A to the sample image data acquired by imaging the sample. The hue difference between the two target segments in the space for observation can be determined by applying the hue difference calculation processing shown in Fig. 3 to the image data for observation. By selecting a reference white such that the hue difference increases in the color space for observation, an image, of which identification level between the target segments is high, can be presented to the user.
<Embodiment 1>
An image display system including an image processing apparatus according to Embodiment 1 of the present invention will be described with reference to Fig. 4.
The image display system is composed of a display device 401, a keyboard 402, a mouse 403, an image processing unit 404, a storage device 405, a computer 406, an image server 407 and an imaging device 408. The image processing unit 404 and the storage device 405 are integrated into the computer 406. The display device 401, the keyboard 402, the mouse 403 and the imaging device 408 are connected to the computer 406 via a general purpose I/F cable. The image server 407 and the computer 406 are connected via LAN.
The display device 401 is a display device using liquid crystals, EL (Electro-Luminescence), a CRT (Cathode Ray Tube) or the like. Image processing software of this embodiment displays a GUI (Graphical User Interface) on the display device 401. The GUI will be described in detail later.
The input device, such as the keyboard 402 and the mouse 403, is used for specifying a target segment in a sample image displayed on the GUI.
The image processing unit 404 is a device that includes a processor dedicated to image processing and a memory, and is used for executing the reference white change processing and the hue difference calculation processing, which will be described later, at high speed.
The storage device 405 is an auxiliary storage device in which the later mentioned OS (Operating System) executed by the CPU, programs and various parameters of the image processing software, and later mentioned reference white data among others are stored in a non-volatile format. For the storage device 405, any storage type, such as HDD (Hard Disk Drive), SSD (Solid State Drive) and flash memory, may be used.
The image server 407 is a computer that saves the image data and performs various types of processing outside the computer 406.
The imaging device 408 is a WSI system, a digital microscope or the like, and is a device that photographs a slide, on which a pathological sample is fixed, at high magnification, and acquires a high-resolution digital image. The image data acquired by the imaging device 408 is called sample image data.
A hardware configuration of the image processing apparatus will be described next with reference to the block diagram in Fig. 5.
The computer 406 includes a CPU (Central Processing Unit) 501, a RAM (Random Access Memory) 502, a storage device 405, a data input/output I/F 504 and an internal bus 503 that interconnects these components.
The CPU 501 accesses the RAM 502 or the like when necessary, and comprehensively controls each block of the computer 406 while executing various types of arithmetic processing.
The RAM 502 is used as an operation area of the CPU 501, and temporarily stores the OS, various programs during execution, and various data, such as image data and reference white data, which become targets of processing.
The following components are connected to the data input/output I/F 504: the image processing unit 404; the image server 407 (via a LAN I/F 505); the display device 401 (via a graphics board 506); and the imaging device 408 (via an external device I/F 507). The reference white data and the image data stored in the storage device 405 may be stored in the image server 407 or in the imaging device 408. Furthermore, input devices, such as a keyboard 402 and a mouse 403, are connected to the data input/output I/F 504 via an operation I/F 508.
In this embodiment, the configuration depicted in Fig. 5 is used, but the image processing unit 404 may be incorporated in the image server 407 or in the imaging device 408, for example. All or a part of the functions of the image processing unit 404 may be performed by the computer 406 instead. Further, here the display device 401 is assumed to be connected as an external device, but a computer integrated with the display device may be used. An example thereof is a notebook PC.
The keyboard 402 and the mouse 403 are used as examples of input devices, but such an input device as a trackball, a touchpad and a controller may be used. It is also possible to perform input and operation on screen if a touch panel display is used as the display device 401.
The image processing apparatus and the image processing method according to the embodiment of the present invention are implemented by the computer 406 executing image processing software (programs) and controlling necessary hardware resources in the above configuration.
The functional configuration of the image processing apparatus of this embodiment will be described with reference to the functional blocks in Fig. 6A.
Functionally the image processing apparatus includes an image acquisition unit 601, a GUI unit 602, a sample information acquisition unit 603, a reference white determination unit 604, reference white data 605, and a color change unit 606. The image acquisition unit 601 has a function to acquire sample image data used for observation (diagnosis), from any one of the storage device 405, the image server 407 and the imaging device 408. The GUI unit 602 has a function to generate and display a GUI for displaying images, specifying a target segment, displaying a color change result, and setting various parameters. The sample information acquisition unit 603 has a function to acquire sample information on the sample image data. This sample information is information for specifying a target color of the sample image, and is used for the processing of determining the reference white after the change. The reference white determination unit 604 has a function to determine a reference white that is suitable for observing the sample image (that is, a reference white with which the identification level of the target segments is high). The color change unit 606 has a function to execute the reference white change processing on the sample image data.
Fig. 6B shows an example of the reference white data 605 referred to by the reference white determination unit 604. The reference white data 605 is a table (list) that defines the values of a plurality of types of reference white. The reference white determination unit 604 can select a candidate of the reference white to be changed to, out of the reference white data 605. The candidates of the reference white can be the A light source, B light source, C light source, D55 light source, D65 light source, D75 light source or the like, which are standard light sources specified by the CIE (Commission Internationale de l'Eclairage). In addition to these standard light sources, any color within a 0.03 deviation from the blackbody locus in the uv chromaticity diagram can be selected. In this embodiment, the reference white data is defined as the XYZ tristimulus values, but may be defined as the coordinate values of the xy, uv or u'v' chromaticity diagram.
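For illustration, the reference white data 605 could be held as a simple table of XYZ tristimulus values. The values below are standard CIE 2-degree observer tristimulus values (Y normalized to 100), used as a plausible sketch rather than the patent's actual data:

```python
# Sketch of the reference white data 605: candidate reference whites and
# their XYZ tristimulus values (CIE standard illuminants, 2-degree observer,
# Y normalized to 100). The table layout is an assumption for illustration.
REFERENCE_WHITE_DATA = {
    'A':   (109.85, 100.0, 35.58),
    'B':   (99.09, 100.0, 85.31),
    'C':   (98.07, 100.0, 118.22),
    'D55': (95.68, 100.0, 92.14),
    'D65': (95.05, 100.0, 108.88),
    'D75': (94.97, 100.0, 122.64),
}

def candidate_whites(exclude=()):
    # The reference white determination unit may narrow the candidates,
    # e.g. by excluding whites close to the development reference white.
    return {k: v for k, v in REFERENCE_WHITE_DATA.items() if k not in exclude}

print(sorted(candidate_whites(exclude=('D65',))))
```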
GUI, which the GUI unit 602 outputs to the display device 401, will be described with reference to Fig. 7. The GUI is displayed in a window 701, and a sample image 703, before the reference white change, selected by the user, is displayed in a region 702. The user can specify a target segment in any position in the sample image displayed in the region 702. The target segment is specified, for example, by moving a pointer 704 by the keyboard 402 or the mouse 403, and clicking a predetermined button. Fig. 7 shows an example when two target segments 705 and 706 are specified. Each mark of the target segments 705 and 706 is displayed superposed on the sample image 703, so that the positions of the specified target segments are recognized, and the information 707 and 708 on the RGB values of each target segment are displayed in the window 701. If the pointer 704 is positioned at a button 709 and a specified button of the keyboard or mouse is clicked after the two target segments are specified, then color conversion is performed such that the hue difference between the target segments increases in the color space for observation.
The flow of the color change processing of this embodiment will be described with reference to the flow chart in Fig. 8. The processing shown in Fig. 8 is implemented by an image processing software (program) which the CPU 501 of the computer 406 or the processor of the image processing unit 404 executes.
In step S801, the image acquisition unit 601 acquires the sample image data from the storage device 405 or the like. The target data to be read is specified by the user. The read sample image data is displayed on the region 702 on the GUI window 701 by the GUI unit 602 (see Fig. 7). The user (e.g. pathologist, technician) observes the displayed sample image 703, and specifies, as a target segment, a segment where a lesion is suspected, or a segment of which detailed observation is required. As shown in Fig. 7, if the color conversion button 709 is clicked on after specifying two target segments for the sample image 703, processing advances to step S802.
In step S802, the sample information acquisition unit 603 acquires the RGB values (two target colors) of the two target segments specified by the user, as the sample information. In this embodiment, the RGB values of a point (one pixel) specified on the sample image 703 are acquired, but the user may specify a region (pixel group) of the target segment on the GUI window 701, and a representative value (e.g. mean value, mode) of the RGB values in this region may be acquired.
In step S803, the reference white determination unit 604 reads the XYZ tristimulus values of the candidate reference white from the reference white data 605 stored in the storage device 405. At this time, all reference whites registered in the reference white data 605 may be read, or candidates may be narrowed down to a part of the reference whites in order to shorten the processing time. The method of narrowing down the candidates of post change reference whites can be any method, such as eliminating the reference white candidates close to the reference white used for development of the original sample image data.
Then the reference white determination unit 604 executes the processing in step S804 and the processing in step S805 for each of the reference white candidates which were read in step S803.
In step S804, the reference white determination unit 604 determines the RGB values after the reference white is changed respectively, for the RGB values of the two target segments acquired in step S802. The content of the processing in step S804 is the same as the reference white change processing described in Fig. 2A to Fig. 2C. In this case, in the RGBLab conversion S201, the reference white used for the development of the sample image data (white balance) is used for the tristimulus values Xn, Yn and Zn of the reference white. The information on the reference white used for the development of the sample image data may be acquired from the header of the sample image data or from meta data, or may be acquired as sample information which the user inputted (selected) in the GUI window 701. If the reference white when sample image data was developed is fixed (known), the Xn, Yn and Zn tristimulus values may be registered in the reference white data 605 in advance. In the case of the LabRGB conversion S202 on the other hand, the tristimulus values of the reference white candidates are used as the tristimulus values Xn, Yn and Zn of the reference white.
In step S805, the reference white determination unit 604 determines the hue difference between the two target segments using the RGB values after the reference white is changed, which was calculated in step S804. The content of the processing in step S805 is the same as the hue difference calculation processing shown in Fig. 3. In the RGBLab conversion step S302 in Fig. 3, the tristimulus values of the reference white for observation are used for the tristimulus values Xn, Yn and Zn of the reference white. This is for evaluating the hue difference in the color space for observation (that is, the difference of the hue when the sample image after the color change is actually observed on the display device). The information on the reference white for observation may be acquired by reading the set value of the color temperature from the storage device 405 or on the display device 401, or may be inputted (selected) by the user in the GUI window 701. The ambient light existing when the image is observed may be measured using a sensor disposed in the display device 401 or the like, and the color space for observation may be determined based on this measurement value.
When the hue difference is determined for all the reference white candidates, the processing advances to step S806.
In step S806, the reference white determination unit 604 selects a reference white of which hue difference between the two target segments is the greatest, from all the reference white candidates. In the following processing, the selected reference white is called "optimum reference white".
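Steps S803 to S806 can be sketched as follows. This is a hypothetical condensed implementation, assuming sRGB and compact Lab helpers; the candidate white values and target colors in the usage example are illustrative, not the patent's data.

```python
import math

# For each candidate reference white, re-render the two target colors and
# keep the candidate that maximizes their hue difference in the color space
# for observation (steps S803 to S806, compact sketch).

def _lin(c):  # sRGB gamma decode
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def _enc(c):  # sRGB gamma encode, clamped to the displayable range
    c = min(max(c, 0.0), 1.0)
    return round(255 * (12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055))

def rgb_to_xyz(rgb):
    r, g, b = map(_lin, rgb)
    return (0.4124 * r + 0.3576 * g + 0.1805 * b,
            0.2126 * r + 0.7152 * g + 0.0722 * b,
            0.0193 * r + 0.1192 * g + 0.9505 * b)

def xyz_to_rgb(xyz):
    x, y, z = xyz
    return tuple(_enc(v) for v in (3.2406 * x - 1.5372 * y - 0.4986 * z,
                                   -0.9689 * x + 1.8758 * y + 0.0415 * z,
                                   0.0557 * x - 0.2040 * y + 1.0570 * z))

def _f(t):
    return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

def xyz_to_lab(xyz, wn):
    fx, fy, fz = (_f(v / n) for v, n in zip(xyz, wn))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def lab_to_xyz(lab, wn):
    L, a, b = lab
    fy = (L + 16) / 116
    inv = lambda f: f ** 3 if f > 6 / 29 else 3 * (6 / 29) ** 2 * (f - 4 / 29)
    return tuple(inv(f) * n for f, n in zip((fy + a / 500, fy, fy - b / 200), wn))

def hue_diff(l1, l2):  # hue angle difference in the Lab plane, in radians
    d = abs(math.atan2(l1[2], l1[1]) - math.atan2(l2[2], l2[1])) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def select_optimum_white(color1, color2, white_dev, white_obs, candidates):
    best, best_d = None, -1.0
    for name, wn in candidates.items():
        changed = []
        for rgb in (color1, color2):
            lab = xyz_to_lab(rgb_to_xyz(rgb), white_dev)      # S201, development white
            changed.append(xyz_to_rgb(lab_to_xyz(lab, wn)))   # S202, candidate white
        labs = [xyz_to_lab(rgb_to_xyz(c), white_obs) for c in changed]  # S805
        d = hue_diff(*labs)
        if d > best_d:
            best, best_d = name, d
    return best

CANDIDATES = {'A': (1.0985, 1.0, 0.3558), 'D65': (0.9505, 1.0, 1.0890),
              'D75': (0.9497, 1.0, 1.2264)}
D65 = (0.9505, 1.0, 1.0890)
print(select_optimum_white((200, 110, 120), (190, 130, 100), D65, D65, CANDIDATES))
```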
In step S807, the color change unit 606 changes the reference white of the whole (all pixels) of the sample image data to the optimum reference white. The content of the processing in step S807 is the same as the reference white change processing shown in Fig. 2A to Fig. 2C. In the RGBLab conversion S201 however, the tristimulus values of the reference white used for the development of the sample image data are used as the tristimulus values Xn, Yn and Zn of the reference white, and in the LabRGB conversion S202, the tristimulus values of the optimum reference white are used for the tristimulus values Xn, Yn and Zn of the reference white. Thereby the sample image of which reference white has been changed (hereafter called "post color change sample image data") is generated.
In step S808, the color change unit 606 sends the post color change sample image data to the GUI unit 602, and displays the sample image based on the post color change sample image data on the region 702 of the GUI window 701. At this time, the sample image after the color change may be displayed instead of the sample image before the color change, or these two images may be displayed side by side.
According to the image display system of this embodiment described above, a sample image, of which identification level among target segments has been improved compared with the original image, can be presented to the user. Moreover, the optimum color change is determined automatically merely by specifying a target segment (target color), hence the operational load on the user is virtually zero, which is excellent in terms of convenience. Furthermore, the identification level is improved by changing the reference white, hence the color balance of the entire image is maintained, and artifacts are barely generated. As a consequence, the influence on diagnosis can be minimized.
The present inventors confirmed through experiments that in the case of a sample image stained a reddish color by HE staining or the like, the hue difference between the target segments increases as the color temperature of the reference white becomes higher (that is, as the color becomes more bluish). In the case of a sample image stained a bluish color by Giemsa staining or the like, the hue difference between the target segments increases as the color temperature of the reference white becomes lower (that is, as the color becomes more reddish).
In this embodiment, the user specifies two target segments, but the user may specify only one target segment or three or more target segments. In the case of a method where the user specifies only one target segment, the optimum reference white is determined so as to maximize the hue difference between the color of the specified target segment and the color of the peripheral segments thereof. In the case of a method where the user specifies three or more target segments, the total value of the color differences of all the pairs is determined, and the optimum reference white is determined so as to maximize this total value, for example.
<Embodiment 2>
An image display system including an image processing apparatus according to Embodiment 2 of the present invention will be described.
In Embodiment 1, the color information of the target segments specified by the user is used as the sample information, but in Embodiment 2, data that can specify the sample type and the staining type linked to the sample image data is used as the sample information, and the optimum reference white is determined according to the sample type and the staining type. As a result, an image, of which identification level among the target segments has been improved, can be presented to the user without the trouble of the user selecting target segments.
The device configuration of the image display system is the same as the configuration of Embodiment 1, except that a "correspondence table of the sample type, staining type and target color" is stored in the storage device 405 in advance. The sample type is information to specify a type of the sample (e.g. organ from which sample was extracted, purpose of pathological diagnosis), and the staining type is information to specify a staining method used for preparing the sample. Data to specify the sample type and the staining type are provided by the header information of the sample image data, or by the meta data linked to the sample image data. IDs to indicate the sample type and the staining type may be embedded in the file name of the sample image data, or if a label in which the sample type and the staining type are recorded is attached to the slide, the sample type and the staining type may be recognized by the label portion in the sample image data.
In pathological diagnosis, the type of tissue and the segment to be observed, such as a nucleus, are determined almost uniquely for each combination of the sample type and the staining type. Further, if the sample preparation method and the observation environment are specified, the degree of staining of the sample and the color thereof can also be specified for each combination of the sample type and the staining type.
The correspondence table of the sample type, the staining type and the target color can be statistically created by acquiring the RGB values of the target segment from the sample images of which sample type and staining type are known. Fig. 9 shows an example of the correspondence table. The target color may include not only the color of the sample tissue, but also a color of the background where the sample tissue does not exist. The target color may be defined not by the RGB values but by the reference white and the values that can be converted into RGB values, such as XYZ tristimulus values and Lab coordinate values.
If the correspondence table in Fig. 9 is used, the processing in step S802 in Fig. 8 can be replaced with the "processing of acquiring the target color linked to the sample type and the staining type of the sample image data from the correspondence table", and the optimum reference white can be determined, and the post color change sample image data can be generated in the same manner as Embodiment 1. This processing is acceptable, but in this embodiment, an LUT linking the sample information (sample type and staining type) and the optimum reference white is created in advance based on the correspondence table in Fig. 9, and the color conversion processing is performed using this LUT, so as to further simplify processing.
The method for creating the LUT linking the sample type, the staining type and the optimum reference white will be described with reference to the flow chart in Fig. 10. The processing in Fig. 10 is executed by an LUT creation unit (not illustrated) which is a function of the image processing software.
In step S1001, the LUT creation unit reads the correspondence table of the sample type, the staining type and the target color which is stored in the storage device 405. Then the following processing is performed for every combination of the sample type and the staining type listed in the correspondence table.
First in step S1002, the LUT creation unit specifies two target colors (RGB values) linked to the target sample type and the staining type from the correspondence table which was read. The subsequent processing of steps S1003 to S1005 is the same as the processing of steps S803 to S805 in Fig. 8, hence description thereof is omitted. In step S1006, the LUT creation unit selects a reference white of which the hue difference between two target colors is greatest in the color space for observation, out of all the reference white candidates, and determines this reference white as the optimum reference white corresponding to the target sample type and the staining type. When the processing in steps S1002 to S1006 is applied to all combinations of the sample types and staining types, the processing advances to step S1007.
In step S1007, the LUT creation unit creates an LUT of the sample type, the staining type and the optimum reference white by combining the correspondence table of the sample type, the staining type and the target color, which were read in step S1001, and the optimum reference white acquired in step S1006, and stores the LUT in the storage device 405. Fig. 11A shows an example of the created LUT. In this embodiment, the LUT is created within the image display system, but may be created outside the image display system in advance, and be stored in the storage device 405.
A flow of the color change processing of this embodiment using the LUT will be described with reference to the flow chart in Fig. 11B.
In step S1101, the image acquisition unit 601 acquires the sample image data from the storage device 405 or the like. The target data to be read is specified by the user.
In step S1102, the sample information acquisition unit 603 acquires data to specify the sample type and the staining type corresponding to this sample image data, and specifies the sample type and the staining type of the sample image.
In step S1103, the reference white determination unit 604 reads the LUT linking the sample type, the staining type and the optimum reference white (see Fig. 11A). Then in step S1104, the reference white determination unit 604 specifies the corresponding optimum reference white based on the sample type and the staining type acquired in step S1102 and the LUT read in step S1103.
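A minimal sketch of this LUT lookup (steps S1102 to S1104); the sample types, staining types and optimum reference whites in the table are hypothetical examples, not values from the patent:

```python
# Hypothetical LUT of (sample type, staining type) -> optimum reference white,
# as created by the processing in Fig. 10; the entries are illustrative only.
OPTIMUM_WHITE_LUT = {
    ('stomach', 'HE'): 'D75',
    ('liver', 'HE'): 'D65',
    ('blood', 'Giemsa'): 'A',
}

def lookup_optimum_white(sample_type, staining_type):
    # Steps S1102 to S1104: specify the sample type and the staining type of
    # the sample image, then pull the corresponding optimum reference white.
    return OPTIMUM_WHITE_LUT.get((sample_type, staining_type))

print(lookup_optimum_white('stomach', 'HE'))
```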
In step S1105, the color change unit 606 changes the reference white of the sample image data to the optimum reference white, and in step S1106, the sample image after the color change is displayed on the display device 401. The processing is the same as steps S807 and S808 in Fig. 8.
Thus in this embodiment, the function to improve the identification level among the target segments can be implemented at a faster speed by selecting the optimum reference white from the LUT.
In this embodiment, it was described that the optimum reference white is specified by using the specific sample preparation method and the correspondence table in the observation environment. If the sample preparation method and the observation environment are different, a correspondence table of the sample type, the staining type and the target color in this case can be created.
Instead of using the correspondence table of the sample type, the staining type and the target color, a correspondence table linking only the staining type and the target color may be used. In this case, the correspondence can be determined by using the color unique to the staining pigment, or by setting the color of the background as the color of the target segment. The correspondence of the staining type and the target color may also be determined by estimating the sample type to be observed based on the frequency with which each sample type is observed.
<Embodiment 3>
An image display system including an image processing apparatus according to Embodiment 3 will be described.
To specify the optimum reference white, it is necessary for the user to specify the target segments in Embodiment 1, and information to specify the sample type and the staining type must be provided in Embodiment 2. Embodiment 3 is characterized in that the target color is automatically detected based on the color frequency distribution in the sample image data, and the optimum reference white is specified based on this detected target color. Thereby an image, of which identification level among the target segments has been improved, can be presented without the user having to select target segments.
The device configuration of the image display system is the same as the configuration of Embodiment 2 except that a correspondence table of the sample type, the staining type and the target color (peak coordinate value of the color frequency distribution in the color space for observation) is additionally stored in the storage device in advance.
The color frequency distribution of an image will be described.
As described in Embodiment 2, if the method and the environment in which the sample is prepared and observed are specified, the degree of staining and the color of the sample are determined for each combination of the sample type and the staining type. Therefore a color of which frequency of appearance is high (e.g. the color of a cell or a nucleus occupying a large area in the image) can be specified in advance for each sample type and staining type. If the color coordinate values in the color space where the frequency of appearance is high (hereafter called the color frequency peak coordinate values) are statistically determined in advance using many sample images of the sample type and staining type, then the correspondence table of the sample type, the staining type and the color frequency peak coordinate value can be created. The processing of acquiring the color frequency peak coordinate values is performed within the image display system in this embodiment, but may be performed in a device that is different from the image display system.
A method for acquiring the color frequency peak coordinate values of the image will be described with reference to the flow chart in Fig. 12A. The processing in Fig. 12A is executed by a peak coordinate value acquisition unit (not illustrated), which is a function of the image processing software.
In step S1301, the peak coordinate value acquisition unit converts the RGB values of all or a part of the pixels of the original image data into the color coordinate values (Lab values) in the color space for observation. The content of the processing is the same as the RGBLab conversion shown in Fig. 2B.
In step S1302, the peak coordinate value acquisition unit creates a frequency distribution for the color coordinate values of a plurality of pixels determined in step S1301.
In step S1303, the peak coordinate value acquisition unit acquires a plurality of color frequency peak coordinate values from the frequency distribution created in step S1302.
Thus the peak coordinate values on the color frequency distribution of the sample image data can be acquired.
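Steps S1301 to S1303 can be sketched as follows, assuming the pixels are already given as Lab tuples; the bin width and the number of peaks are illustrative parameters, not values from the patent:

```python
from collections import Counter

# Hypothetical sketch of steps S1301 to S1303: quantize each pixel's color
# coordinate value into bins and take the most frequent bins as the color
# frequency peak coordinate values.
def color_frequency_peaks(lab_pixels, n_peaks=3, bin_width=5.0):
    bins = Counter(tuple(round(c / bin_width) * bin_width for c in lab)
                   for lab in lab_pixels)
    return [coord for coord, _ in bins.most_common(n_peaks)]

# Toy "image": a dominant background color, a stained tissue color, and nuclei
pixels = [(80, 5, 10)] * 50 + [(50, 40, 20)] * 30 + [(30, 20, -10)] * 20
print(color_frequency_peaks(pixels, n_peaks=2))
```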
The color frequency peak coordinate values can be acquired for each combination of sample type and staining type by preparing a plurality of sample images for samples having a different combination of sample type and staining type, and applying the processing in Fig. 12A to each sample image. Fig. 12B is an example of a correspondence table linking a combination of the sample type and the staining type and a plurality of color frequency peak coordinate values. In the example of Fig. 12B, three color frequency peak coordinate values are listed, but it is sufficient if at least two color frequency peak coordinate values are linked to each combination of the sample type and the staining type. The number of color frequency peak coordinate values to link may differ for each combination of the sample type and the staining type.
In this embodiment, the color having a high frequency is specified based on the frequency distribution of the color coordinate values (Lab values) in the color space for observation of the image, but the frequency distribution of the RGB values or the XYZ tristimulus values, instead of the color coordinate values, may be used. Further, instead of the peak value in the frequency distribution, an average coordinate value, a center-of-gravity value or the like near the peak may be used, and instead of the frequency peak value in the space coordinate of Lab, a color frequency peak coordinate value or the like on the ab plane coordinates may be used.
The method for creating an LUT linking the color frequency peak coordinate value and the optimum reference white will be described next with reference to the flow chart in Fig. 13. The processing in Fig. 13 is executed by an LUT creation unit (not illustrated), which is a function of the image processing software.
In step S1501, the LUT creation unit creates an LUT of the sample type, the staining type and the optimum reference white (see Fig. 11A). The content of the processing is the same as the LUT creation processing shown in Fig. 10.
In step S1502, the LUT creation unit reads the correspondence table of the sample type, the staining type and the color frequency peak coordinate value (see Fig. 12B), which was created in advance and stored in the storage device 405.
In step S1503, the LUT creation unit creates a "LUT of the color frequency peak coordinate values and the optimum reference white" by combining the "LUT of the sample type, the staining type and the optimum reference white" and the "correspondence table of the sample type, the staining type and the color frequency peak coordinate value". Fig. 14B shows an example of the created LUT.
The flow of the color change processing using the LUT in Fig. 14B will be described next with reference to the flow chart in Fig. 14A.
In step S1701, the image acquisition unit 601 acquires the sample image data from the storage device 405 or the like. The target data to be read is specified by the user.
In step S1702, the sample information acquisition unit 603 reads the "LUT of the color frequency peak coordinate value and the optimum reference white" (see Fig. 14B) from the storage device 405 or the like.
In step S1703, the sample information acquisition unit 603 analyzes the sample image data that was read in step S1701, and acquires a plurality of color frequency peak coordinate values from the frequency distribution of the color coordinate values in the color space for observation. In this embodiment, the color frequency peak coordinate value corresponds to the target color, and the processing of detecting the color frequency peak coordinate value from the sample image data corresponds to acquisition of the sample information. The content of the processing in step S1703 is the same as the color frequency peak coordinate value acquisition processing shown in Fig. 12A.
In step S1704, the reference white determination unit 604 acquires the optimum reference white, which corresponds to the color frequency peak coordinate values acquired in step S1703, from the "LUT of the color frequency peak coordinate value and the optimum reference white". If the same combination as the color frequency peak coordinate values acquired in step S1703 does not exist in the LUT, then the optimum reference white linked to the most similar combination of peak coordinate values is acquired.
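The lookup of step S1704, including the fall-back to the most similar stored combination, could look as follows. The similarity measure (summed Euclidean distance between corresponding peak coordinates) is an assumption, since the document does not specify how the most similar combination is found:

```python
import math

def lookup_reference_white(peaks, lut):
    """Return the optimum reference white for a set of peak coordinates.

    peaks: tuple of (a*, b*) peak coordinates from the sample image.
    lut:   {tuple of (a*, b*) peaks: reference white}, as in Fig. 14B.
    Falls back to the most similar stored combination when no exact
    match exists.
    """
    if peaks in lut:
        return lut[peaks]

    def distance(stored):
        # Summed Euclidean distance between corresponding peaks.
        return sum(math.dist(p, q) for p, q in zip(peaks, stored))

    nearest = min(lut, key=distance)
    return lut[nearest]
```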
In step S1705, the color change unit 606 changes the reference white of the sample image data to the optimum reference white, and in step S1706, the sample image after the color change is displayed on the display device 401. The processing is the same as steps S807 and S808 in Fig. 8.
Thus, according to this embodiment, the optimum reference white can be selected from the LUT based on the color information detected most frequently in the sample image, even if no sample information is linked to the sample image, and the function of improving the level of identification among the target segments can be implemented at high speed.
<Other embodiments>
The embodiments described above are merely examples and are not intended to limit the scope of the present invention. Various other embodiments can be carried out within the scope of the technical concept of the present invention.
For example, in Embodiment 3, the color frequency peak coordinate values of the sample image are acquired, and the optimum reference white corresponding to the peak coordinate values is determined using the LUT. However, the color frequency peak coordinate values acquired from the sample image may instead be regarded as the "colors of the target segments" in Embodiment 1, and the candidate for which the hue difference between the peak coordinate values after the color change becomes the greatest, among a plurality of reference white candidates, may be selected as the optimum reference white, just as in Fig. 8 of Embodiment 1.
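Selecting the candidate that maximizes the hue difference between the peak coordinate values can be sketched as follows. The chromatic adaptation step is passed in as a callable because its exact form is not restated here, the hue is taken as the angle on the ab plane, and the score uses the smallest pairwise hue difference when there are more than two target colors; all three choices are simplifying assumptions:

```python
import math
from itertools import combinations

def hue_angle(a, b):
    """Hue angle on the a*b* plane, in degrees [0, 360)."""
    return math.degrees(math.atan2(b, a)) % 360.0

def hue_difference(h1, h2):
    """Smallest absolute angular difference between two hues."""
    d = abs(h1 - h2) % 360.0
    return min(d, 360.0 - d)

def select_optimum_white(target_colors_ab, candidates, change_white):
    """Pick the reference white candidate that maximizes the smallest
    pairwise hue difference among the target colors after the change.

    target_colors_ab: list of (a*, b*) target colors (e.g. peak coords).
    candidates:       list of reference white candidates.
    change_white:     callable(ab, white) -> (a*, b*) after adaptation;
                      its implementation is outside this sketch.
    """
    def score(white):
        hues = [hue_angle(*change_white(ab, white))
                for ab in target_colors_ab]
        return min(hue_difference(h1, h2)
                   for h1, h2 in combinations(hues, 2))

    return max(candidates, key=score)
```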
If the sample type and the staining type of the image data are not known, the background color (the color of a region where no tissue exists) and the tissue color (the color of a tissue whose frequency of appearance is high) may be defined, and a reference white which increases the hue difference between the background color and the tissue color, or the hue difference among a plurality of tissue colors, may be selected.
Any method may be used for acquiring the background color. For example, the color whose frequency of appearance is highest among the pixels whose lightness is within the top 10% in the sample image, or the color whose frequency of appearance is highest among the pixels whose lightness is within the bottom 10%, may be defined as the background color. "Lightness within the top 10%" refers to the 10% range from the maximum value of the lightness range in the sample image, and "lightness within the bottom 10%" refers to the 10% range from the minimum value of that range. The pixels evaluated as background-color candidates may be taken from the entire sample image, but preferably should be taken from a range that does not include a label attached to the slide, a marker written on the slide, or dust adhering to the slide or the optical system. The tissue color can be defined as the color of a pixel at one of the peaks of the distribution of the frequency of appearance, among the pixels whose lightness differs from the background color by a predetermined value or more (e.g. lightness outside a -5% to +5% range of the lightness of the background).
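The background and tissue color estimation described above can be sketched as follows. Quantizing colors to integer Lab bins for frequency counting, using only the top-10%-lightness rule for the background, and taking the single most frequent color as the tissue color (rather than one of several frequency peaks) are simplifying assumptions:

```python
import numpy as np

def estimate_background_and_tissue(lab_pixels, margin=5.0):
    """Estimate background and tissue colors of a stained-sample image.

    lab_pixels: (N, 3) array of L*, a*, b* values.
    The background is the most frequent color among pixels whose
    lightness falls in the top 10% of the image's lightness range; the
    tissue color is the most frequent color among pixels whose lightness
    differs from the background lightness by more than `margin`.
    """
    L = lab_pixels[:, 0]
    lo, hi = L.min(), L.max()
    # Top 10% of the lightness value range, as defined in the text.
    top_mask = L >= hi - 0.1 * (hi - lo)

    def most_frequent(pixels):
        # Quantize to integer Lab bins so identical colors group together.
        quantized = np.round(pixels).astype(int)
        colors, counts = np.unique(quantized, axis=0, return_counts=True)
        return colors[np.argmax(counts)]

    background = most_frequent(lab_pixels[top_mask])
    tissue_mask = np.abs(L - background[0]) > margin
    tissue = most_frequent(lab_pixels[tissue_mask])
    return background, tissue
```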
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)TM), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2014-42518, filed on March 5, 2014, which is hereby incorporated by reference herein in its entirety.
Reference Signs
401: display device
406: computer
408: imaging device
601: image acquisition unit
602: GUI unit
603: sample information acquisition unit
604: reference white determination unit
605: reference white data
606: color change unit

Claims (14)

  1. An image processing apparatus, comprising:
    an image acquisition unit that acquires data of a sample image obtained by imaging a sample;
    a sample information acquisition unit that acquires sample information to specify a plurality of target colors to be identified in the sample image;
    a reference white determination unit that determines a reference white based on the sample information; and
    a color change unit that performs processing of changing a reference white of the sample image to the reference white determined by the reference white determination unit, on the data of the sample image, so as to generate data on a post color change sample image, wherein
    the reference white determination unit determines a reference white which is used for the color change of the sample image, so that an absolute value of the difference of the hue of the plurality of target colors in a predetermined color space for observation becomes greater in the post color change sample image than in the sample image before the color change.
  2. The image processing apparatus according to Claim 1, wherein
    the plurality of target colors includes a color obtained from each of a plurality of target segments specified by a user on the sample image.
  3. The image processing apparatus according to Claim 1, wherein
    the sample information is information which includes information on a staining type to specify a staining method used for preparing the sample and information on a sample type to specify a type of the sample, or information on the staining type, and
    the target color is a color that is linked in advance to the staining type and the sample type, or the staining type.
  4. The image processing apparatus according to Claim 1, wherein
    the plurality of target colors includes a color selected from colors having high frequency of appearance in the data of the sample image.
  5. The image processing apparatus according to any one of Claims 1 to 4, wherein
    the predetermined color space for observation is a color space that is determined based on a reference white which has been set for the image processing apparatus or a display device that is used when the post color change sample image is displayed, or a color space that is determined based on an ambient light that exists when the post color change sample image is observed.
  6. The image processing apparatus according to Claim 4, wherein
    the sample is a sample of a stained tissue, and
    the plurality of target colors includes a color of a background of the sample image and a color of the tissue in the sample image.
  7. The image processing apparatus according to Claim 6, wherein
    the color of the background is a color of a pixel of which frequency of appearance is the highest among pixels of which lightness is within the top 10%, or a color of a pixel of which frequency of appearance is the highest among pixels of which lightness is within the bottom 10%, in the sample image.
  8. The image processing apparatus according to Claim 6 or 7, wherein
    the color of the tissue is a color of a pixel having one of peaks in a distribution of the frequency of appearance, among pixels having lightness that is different from the color of the background by a predetermined value or more, in the sample image.
  9. The image processing apparatus according to any one of Claims 1 to 8, wherein
    the reference white determination unit:
    executes the processing of changing the reference white of the plurality of target colors for each of a plurality of reference white candidates;
    calculates, for each of the plurality of reference white candidates, an absolute value of the difference of the hue in the predetermined color space for observation, based on values of the plurality of target colors after the reference white is changed; and
    selects, out of the plurality of reference white candidates, a reference white candidate of which absolute value of the difference of the hue in the predetermined color space for observation is the greatest, as the reference white that is used for the color change of the sample image.
  10. The image processing apparatus according to Claim 9, wherein
    the plurality of reference white candidates include a color selected from a range whose deviation from the black body locus in the uv chromaticity diagram is within 0.03.
  11. The image processing apparatus according to any one of Claims 1 to 8, wherein
    the reference white determination unit determines the reference white that is used for the color change of the sample image, by using a lookup table linking the sample information and the reference white.
  12. The image processing apparatus according to Claim 11, wherein
    the lookup table is:
    a lookup table linking information which includes information on a staining type to specify a staining method, and information on a sample type to specify a type of the sample, and a reference white to be used for the color change of a sample image corresponding to the staining type and the sample type;
    a lookup table linking information on a staining type and a reference white to be used for the color change of a sample image corresponding to the staining type; or
    a lookup table linking a plurality of target colors statistically determined from a plurality of sample images, and a reference white that allows a color change to increase an absolute value of the difference of the hue of the plurality of target colors.
  13. A method for controlling an image processing apparatus, comprising:
    an image acquisition step of acquiring data of a sample image obtained by imaging a sample;
    a sample information acquisition step of acquiring sample information to specify a plurality of target colors to be identified in the sample image;
    a reference white determination step of determining a reference white based on the sample information; and
    a color change step of performing processing of changing a reference white of the sample image to the reference white determined in the reference white determination step, on the data of the sample image, so as to generate data on a post color change sample image, wherein
    in the reference white determination step, a reference white which is used for the color change of the sample image is determined, so that an absolute value of the difference of the hue of the plurality of target colors in a predetermined color space for observation becomes greater in the post color change sample image than in the sample image before the color change.
  14. A program causing a computer to execute each step of the method for controlling an image processing apparatus according to Claim 13.
PCT/JP2015/001016 2014-03-05 2015-02-26 Image processing apparatus and image processing method WO2015133100A1 (en)

Applications Claiming Priority (2)

JP2014042518A (published as JP2015169991A (en)); Priority Date: 2014-03-05; Filing Date: 2014-03-05; Title: Image processor, and image processing method
JP2014-042518; Priority Date: 2014-03-05

Publications (1)

Publication Number: WO2015133100A1 (en); Publication Date: 2015-09-11

Family

ID=54054917

Family Applications (1)

Application Number: PCT/JP2015/001016; Publication: WO2015133100A1 (en); Priority Date: 2014-03-05; Filing Date: 2015-02-26; Title: Image processing apparatus and image processing method

Country Status (2)

JP (1): JP2015169991A (en)
WO (1): WO2015133100A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7467247B2 (en) 2020-06-11 2024-04-15 キヤノン株式会社 Image processing device, image processing method, and program

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997020198A2 (en) * 1995-11-30 1997-06-05 Chromavision Medical Systems, Inc. Method and apparatus for automated image analysis of biological specimens
WO2000070541A1 (en) * 1999-05-13 2000-11-23 Resolution Sciences Corporation Transformation of digital images
JP2008093225A (en) * 2006-10-13 2008-04-24 Olympus Medical Systems Corp Endoscope system and image processing method in this endoscope system
EP2040218A1 (en) * 2006-07-10 2009-03-25 Nikon Corporation Image processing device and image processing program
US20100157042A1 (en) * 2008-06-16 2010-06-24 Olympus Corporation Image data processing device, its storage medium, and its method
JP2011062261A (en) * 2009-09-15 2011-03-31 Hoya Corp Enhanced image processor and medical observation system
JP2011145264A (en) * 2010-01-18 2011-07-28 Olympus Corp Biological specimen analyzer
WO2013179581A1 (en) * 2012-05-30 2013-12-05 パナソニック株式会社 Image measurement device, image measurement method and image measurement system


Also Published As

Publication number Publication date
JP2015169991A (en) 2015-09-28


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15759219; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15759219; Country of ref document: EP; Kind code of ref document: A1)