WO2013100028A1 - Image processing device, image display system, image processing method, and image processing program - Google Patents
- Publication number
- WO2013100028A1 (PCT/JP2012/083830)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- image
- composite
- display
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/41—Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors
Definitions
- the present invention relates to an image processing apparatus, and more particularly to digital image processing for observation of an imaging target.
- The amount of data created by the virtual slide system is enormous; because the viewer can perform enlargement and reduction processing, the user can observe the target from the micro level (detailed enlarged image) to the macro level (overall bird's-eye view), which provides various conveniences.
- By acquiring all necessary information in advance, it is possible to display images immediately, from the low-magnification image to the high-magnification image, at the resolution and magnification required by the user.
- Image analysis of the acquired digital data also makes it possible to obtain various information useful for pathological diagnosis, for example, by calculating the shape and number of cells, or the area ratio between the nucleus and the cytoplasm (N/C ratio).
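As an illustration of the N/C ratio mentioned above, the following Python sketch computes the nucleus-to-cytoplasm area ratio from binary segmentation masks. The `nc_ratio` helper and the toy masks are hypothetical, not part of the patent.

```python
import numpy as np

def nc_ratio(nucleus_mask: np.ndarray, cytoplasm_mask: np.ndarray) -> float:
    """Nucleus-to-cytoplasm (N/C) area ratio from binary segmentation masks.

    Hypothetical helper: the patent only mentions that such a ratio can be
    computed from the digital image data, not how.
    """
    n_area = int(np.count_nonzero(nucleus_mask))
    c_area = int(np.count_nonzero(cytoplasm_mask))
    if c_area == 0:
        raise ValueError("cytoplasm mask is empty")
    return n_area / c_area

# Toy 4x4 masks: 4 nucleus pixels and 8 cytoplasm pixels -> N/C = 0.5.
nucleus = np.zeros((4, 4), dtype=bool)
nucleus[1:3, 1:3] = True          # 4 pixels
cytoplasm = np.zeros((4, 4), dtype=bool)
cytoplasm[0:2, :] = True          # 8 pixels
ratio = nc_ratio(nucleus, cytoplasm)
```

In practice the masks would come from a segmentation step over the composite image data.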
- Patent Literature 1 discloses a microscope system that divides an imaging target into small sections, captures images, and stitches the images in the small sections obtained thereby to display a composite image of the imaging target.
- Patent Document 2 discloses an image display system that obtains a plurality of partial images of the imaging target by moving a microscope stage and imaging a plurality of times, corrects distortion of the images, and joins them. In Patent Document 2, a composite image in which the joints are not conspicuous is created.
- However, the joint portions in the composite images obtained by the microscope system of Patent Document 1 and the image display system of Patent Document 2 unavoidably suffer from positional deviation between partial images, artifacts due to distortion correction, and the like, so there is a high possibility that the image differs from what a pathologist would observe directly. Nevertheless, if a diagnosis is performed on such a composite image without awareness of this possibility, the diagnosis may be based on a joint portion of the composite image, and making a highly accurate diagnosis becomes difficult.
- The gist of the present invention is an image processing apparatus comprising: image data acquisition means for acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; composite image data generation means for generating composite image data based on the plurality of divided image data; and composite area display data generation means for generating display image data for allowing an observer to recognize the composite areas of the composite image data.
- The present invention relates to an image processing apparatus characterized in that the composite area display data generation means changes at least one of color and luminance for all the composite areas included in the display area.
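The claimed behavior of changing color or luminance for every composite area in the display area could be sketched as follows. The tinting approach, the function name, and the parameter values are illustrative assumptions, not the patent's specified implementation.

```python
import numpy as np

def highlight_composite_areas(image: np.ndarray, composite_mask: np.ndarray,
                              tint=(255, 0, 0), alpha=0.5) -> np.ndarray:
    """Blend a tint color into every pixel flagged as a composite (seam) area.

    Illustrative sketch: changing the color of all composite areas in the
    displayed region so an observer can recognize them.
    """
    out = image.astype(np.float32).copy()
    tint_arr = np.array(tint, dtype=np.float32)
    # Only pixels inside the composite mask are modified.
    out[composite_mask] = (1.0 - alpha) * out[composite_mask] + alpha * tint_arr
    return out.astype(np.uint8)

img = np.full((8, 8, 3), 200, dtype=np.uint8)   # uniform gray display area
mask = np.zeros((8, 8), dtype=bool)
mask[:, 3:5] = True                             # a vertical seam band
shown = highlight_composite_areas(img, mask)
```

Changing luminance instead of color would amount to scaling the masked pixels rather than blending a tint.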
- Another gist of the present invention is a microscope image display system having at least an image processing device and an image display device,
- wherein the image processing device is the image processing apparatus described above,
- and the image display device displays the composite image data of the imaging target sent from the image processing device and the image data for allowing an observer to recognize the composite portions.
- Another gist of the present invention is an image processing method comprising: an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; a composite image data generation step of generating composite image data based on the plurality of divided image data; and a composite region display data generation step of generating display image data for allowing an observer to recognize the composite regions of the composite image data,
- wherein the composite region display data generation step changes at least one of color and luminance for all the composite regions included in the display area.
- Another gist of the present invention is a program that causes a computer to execute: an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of regions and imaging them; a composite image data generation step of generating composite image data based on the plurality of divided image data; and a composite region display data generation step of generating display image data for allowing an observer to recognize the composite regions of the composite image data,
- wherein the composite region display data generation step causes the computer to change at least one of color and luminance for all the composite regions included in the display area.
- According to the image processing apparatus, image display system, image processing method, and image processing program of the present invention, it is possible to prevent a situation in which a highly accurate diagnosis becomes difficult because the diagnosis is based on a composite portion of the composite image.
- FIG. 1 shows an example of an overall view of an apparatus configuration of an image display system using an image processing apparatus of the present invention.
- FIG. 2 shows an example of a functional block diagram of the imaging device in the image display system using the image processing apparatus of the present invention.
- FIG. 3 shows an example of a functional block diagram of the image processing apparatus of the present invention.
- An example of a hardware block diagram of the image processing apparatus of the present invention is also shown, together with a figure explaining the concept of the composition process.
- A preferred image processing apparatus includes image data acquisition means that acquires a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of regions and imaging them, composite image data generation means for generating composite image data based on the plurality of divided image data, and composite area display data generation means for generating display image data for allowing an observer to recognize the composite areas of the composite image data.
- The composite area display data generation means changes at least one of color and luminance for all the composite areas included in the display area.
- the image processing apparatus of the present invention can be applied to an image acquired by a microscope.
- the image processing apparatus of the present invention can be used in an image display system, and in particular, can be used in a microscope image display system and a virtual slide system.
- a plurality of image data are smoothly connected by connecting, superimposing, alpha blending, or interpolation processing.
- Methods for connecting a plurality of superimposed image data include a method of aligning and connecting based on stage position information, a method of connecting corresponding points or lines of a plurality of divided images, and a method of connecting based on the position information of the divided image data.
- Superimposing broadly means overlaying one set of image data on another.
- the method for superimposing a plurality of image data includes a case where a part or all of the plurality of image data is superimposed in a region having overlapping image data.
- Alpha blending refers to combining two images using a coefficient (α value).
- the method of smoothly connecting by interpolation processing includes processing by zero-order interpolation, linear interpolation, and high-order interpolation. In order to connect images smoothly, it is preferable to perform processing by high-order interpolation.
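As a hedged illustration of the alpha blending mentioned above, the sketch below blends two overlap strips with a linear ramp of the α coefficient across the overlap width. The ramp profile is an assumption chosen so the images connect smoothly; the patent does not prescribe a particular weighting.

```python
import numpy as np

def blend_overlap(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Alpha-blend two equally sized overlap strips.

    The weight for `left` falls linearly from 1 to 0 across the overlap
    width, so the result transitions smoothly from `left` to `right`.
    """
    h, w = left.shape[:2]
    alpha = np.linspace(1.0, 0.0, w)                       # per-column alpha
    alpha = alpha.reshape(1, w, *([1] * (left.ndim - 2)))  # broadcastable
    return (alpha * left + (1.0 - alpha) * right).astype(left.dtype)

a = np.full((2, 5), 100.0)   # overlap strip from the left tile
b = np.full((2, 5), 200.0)   # overlap strip from the right tile
blended = blend_overlap(a, b)
```

Zero-order or higher-order interpolation across the seam would follow the same pattern with a different weighting function.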
- the synthesized part data generating means is means for generating data for allowing an observer to visually recognize the synthesized part in the displayed image.
- The synthesized part is the joint between the original partial image data, or an area in the synthesized image where the synthesis process generates image data that looks different from the original partial image data.
- The connection of the original partial image data is treated not as a mere line but as an area including a certain extent around it.
- The width of this area can be changed depending on the magnification at the time of display.
- Data for displaying a composite part is not simply data about the composite part (such as its position information); it is either composite image data processed so that the composite part in the displayed image can be visually recognized, data at the stage immediately before such processing, or a portion rewritten with data different from the partial image data so that the composite part can be visually recognized (this portion forms part of the composite image data).
- The composite part data generation means may generate the image data for allowing the observer to recognize the composite part after generating the composite image data, or before it.
- The order of generating the composite image data and generating the image data for allowing the observer to recognize the composite part is not limited; they may, for example, be generated simultaneously.
- As a method for displaying the synthesized portion, it is preferable to change the color or the luminance.
- the data acquisition unit acquires the plurality of divided image data obtained by capturing a microscope image (optical microscope image) and uses the divided image data in the virtual slide system.
- The image data acquisition means may acquire a plurality of divided image data captured so that the divided image data have overlapping areas, and the composite part data generation means can generate the areas where the divided image data overlap as the image data for allowing an observer to recognize the composite parts.
- When acquiring a plurality of divided image data captured so that the divided image data have overlapping areas, it is preferable that the image data generation means generates composite image data by superimposing or alpha blending the plurality of divided image data.
- When acquiring a plurality of divided image data captured so that the divided image data have overlapping regions, it is preferable that the image data generation means generates composite image data by performing interpolation processing on the regions where the plurality of divided image data overlap.
- The composite image data generation unit may generate composite image data for connecting and displaying a plurality of divided image data, and the composite part data generation unit can generate the joint lines of the plurality of divided image data as the composite part data.
- Preferably, the image processing apparatus further includes composite part data switching means for switching the image data for allowing an observer to recognize the composite parts generated by the composite part data generation means. It is preferable that the composite part data switching means can switch both whether to display the composite parts and the display mode of the composite parts.
- the switching unit switches the image data for allowing the observer to recognize the combined portion generated by the combined portion data generating unit at a predetermined boundary.
- a predetermined magnification and a predetermined scroll speed can be used as the predetermined boundary.
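A minimal sketch of switching the composite-part display at such a predetermined boundary might look like the following. The threshold values, and the rule of suppressing the markers at low magnification or during fast scrolling, are illustrative assumptions rather than values given in the text.

```python
def show_composite_markers(magnification: float, scroll_speed: float,
                           mag_threshold: float = 10.0,
                           speed_threshold: float = 500.0) -> bool:
    """Decide whether to draw composite-area markers.

    Hypothetical policy: markers are shown only at or above a magnification
    threshold and only while scrolling slower than a speed threshold, on the
    assumption that they would otherwise clutter the view.
    """
    return magnification >= mag_threshold and scroll_speed < speed_threshold
```

The same predicate could drive switching between display modes (e.g. tint vs. outline) instead of a simple on/off decision.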
- A suitable image display system includes at least the image processing device described above and an image display device that displays the composite image data of the imaging target sent from the image processing device and the image data for allowing an observer to recognize the composite portions in the composite image data.
- An image processing method suitable for the present invention includes an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them,
- a composite image data generation step of generating composite image data based on the plurality of divided image data, and a composite region display data generation step of generating display image data for allowing an observer to recognize the composite regions of the composite image data.
- The composite region display data generation step changes at least one of color and luminance for all the composite regions included in the display region.
- the composite image data generation step and the composite portion data generation step can be performed simultaneously.
- A suitable program of the present invention causes a computer to execute an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them,
- a composite image data generation step of generating composite image data based on the plurality of divided image data, and a composite region display data generation step of generating display image data for allowing an observer to recognize the composite regions of the composite image data.
- The program also causes the computer to execute a process of changing at least one of color and luminance for all the composite regions included in the display region.
- The preferred modes described for the image processing apparatus of the present invention can also be reflected in the system, method, and program.
- the image processing apparatus of the present invention can be used in an image display system including an imaging device and an image display device. This image display system will be described with reference to FIG.
- FIG. 1 shows an image display system using the image processing apparatus according to the present invention, which includes an imaging apparatus (microscope apparatus or virtual slide scanner) 101, an image processing apparatus 102, and an image display apparatus 103.
- This is a system having a function of acquiring and displaying a two-dimensional image of an imaging target (test sample).
- The imaging apparatus 101 and the image processing apparatus 102 are connected by a dedicated or general-purpose I/F cable 104, and the image processing apparatus 102 and the image display apparatus 103 are connected by a general-purpose I/F cable 105.
- As the imaging device 101, a virtual slide device having a function of capturing a plurality of two-dimensional images at different positions in the two-dimensional direction and outputting digital images can be used.
- a solid-state image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) is used to acquire a two-dimensional image.
- the imaging device 101 can be configured by a digital microscope device in which a digital camera is attached to an eyepiece of a normal optical microscope.
- The image processing apparatus 102 is an apparatus having a function of generating composite image data from the plurality of divided original image data acquired from the imaging apparatus 101.
- the image processing apparatus 102 includes a general-purpose computer or workstation including hardware resources such as a CPU (Central Processing Unit), a RAM, a storage device, an operation unit, and an I / F.
- the storage device is a large-capacity information storage device such as a hard disk drive, and stores programs, data, OS (operating system) and the like for realizing each processing described later.
- Each function described above is realized by the CPU loading a necessary program and data from the storage device to the RAM and executing the program.
- the operation unit includes a keyboard, a mouse, and the like, and is used by an operator to input various instructions.
- the image display device 103 is a monitor that displays an observation image that is a result of the arithmetic processing performed by the image processing device 102, and includes a CRT, a liquid crystal display, or the like.
- the imaging system is configured by the three devices of the imaging device 101, the image processing device 102, and the image display device 103, but the configuration of the present invention is not limited to this configuration.
- an image processing device in which an image display device is integrated may be used, or the function of the image processing device may be incorporated in the imaging device.
- the functions of the imaging device, the image processing device, and the image display device can be realized by a single device.
- the functions of the image processing apparatus and the like may be divided and realized by a plurality of apparatuses.
- FIG. 2 is a block diagram illustrating a functional configuration of the imaging apparatus 101.
- The imaging apparatus 101 generally comprises an illumination unit 201, a stage 202, a stage control unit 205, an imaging optical system 207, an imaging unit 210, a development processing unit 216, a pre-measurement unit 217, a main control system 218, and a data output unit 219.
- The illumination unit 201 is a means for uniformly irradiating the preparation 206 arranged on the stage 202 with light, and includes a light source, an illumination optical system, and a light source drive control system.
- the stage 202 is driven and controlled by a stage control unit 205, and can move in three directions of XYZ.
- The preparation 206 is a member in which a section of tissue or smeared cells to be observed is attached to a slide glass and fixed under a cover glass together with an encapsulant.
- the stage control unit 205 includes a drive control system 203 and a stage drive mechanism 204.
- the drive control system 203 receives an instruction from the main control system 218 and performs drive control of the stage 202.
- the moving direction, moving amount, and the like of the stage 202 are determined based on the position information and thickness information (distance information) of the imaging target measured by the pre-measurement unit 217 and an instruction from the user as necessary.
- the stage drive mechanism 204 drives the stage 202 in accordance with instructions from the drive control system 203.
- the imaging optical system 207 is a lens group for forming an optical image of the imaging target of the preparation 206 on the imaging sensor 208.
- the imaging unit 210 includes an imaging sensor 208 and an analog front end (AFE) 209.
- The imaging sensor 208 is a one-dimensional or two-dimensional image sensor that converts a two-dimensional optical image into an electrical physical quantity by photoelectric conversion; for example, a CCD or CMOS device is used. In the case of a one-dimensional sensor, a two-dimensional image is obtained by scanning.
- the imaging sensor 208 outputs an electrical signal having a voltage value corresponding to the light intensity.
- a color image is desired as the captured image, for example, a single-plate image sensor to which a Bayer array color filter is attached may be used.
- the imaging unit 210 captures a divided image to be imaged by driving the stage 202 in the XY axis direction.
- the AFE 209 is a circuit that converts an analog signal output from the image sensor 208 into a digital signal.
- the AFE 209 includes an H / V driver, a CDS (Correlated double sampling), an amplifier, an AD converter, and a timing generator, which will be described later.
- the H / V driver converts a vertical synchronization signal and a horizontal synchronization signal for driving the image sensor 208 into potentials necessary for driving the sensor.
- The CDS is a correlated double sampling circuit that removes fixed-pattern noise.
- the amplifier is an analog amplifier that adjusts the gain of an analog signal from which noise has been removed by CDS.
- the AD converter converts an analog signal into a digital signal.
- the AD converter converts the analog signal into digital data quantized from about 10 bits to about 16 bits and outputs it in consideration of subsequent processing.
- the converted sensor output data is called RAW data.
- the RAW data is developed by the subsequent development processing unit 216.
- the timing generator generates a signal for adjusting the timing of the image sensor 208 and the timing of the development processing unit 216 at the subsequent stage.
- When the sensor output is analog, the AFE 209 is indispensable; in the case of a CMOS image sensor capable of digital output, the function of the AFE 209 is included in the sensor.
- Although not shown, an imaging control unit controls the imaging sensor 208 and handles its operation timing and control, such as operation control, shutter speed, frame rate, and ROI (Region Of Interest).
- the development processing unit 216 includes a black correction unit 211, a white balance adjustment unit 212, a demosaicing processing unit 213, a filter processing unit 214, and a ⁇ correction unit 215.
- The black correction unit 211 performs a process of subtracting, from each pixel of the RAW data, the black correction data obtained while the sensor is shielded from light.
- The white balance adjustment unit 212 performs a process of reproducing a desired white by adjusting the gain of each RGB color according to the color temperature of the light from the illumination unit 201. Specifically, white balance correction data is applied to the black-corrected RAW data. When handling a monochrome image, the white balance adjustment process is unnecessary.
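The black correction and white balance steps described above can be sketched as follows, assuming the white balance correction is applied as a per-channel gain. The function name and the 10-bit clipping range are illustrative choices, not values fixed by the text.

```python
import numpy as np

def develop_step(raw: np.ndarray, black: np.ndarray, wb_gain) -> np.ndarray:
    """Black correction followed by white balance adjustment.

    Subtract the per-pixel dark level from the RAW data, then scale each
    color plane by its white balance gain; values are clipped to a 10-bit
    range (an assumed sensor quantization).
    """
    corrected = raw.astype(np.float32) - black.astype(np.float32)
    balanced = corrected * np.asarray(wb_gain, dtype=np.float32)
    return np.clip(balanced, 0, 1023).astype(np.uint16)

raw = np.full((2, 2, 3), 100, dtype=np.uint16)    # toy RAW frame
black = np.full((2, 2, 3), 20, dtype=np.uint16)   # toy dark-frame data
out = develop_step(raw, black, wb_gain=(1.0, 0.5, 2.0))
```

In the actual pipeline these steps precede demosaicing, filtering, and gamma correction.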
- The development processing unit 216 generates the divided image data of the imaging target captured by the imaging unit 210.
- the demosaicing processing unit 213 performs processing for generating image data for each color of RGB from RAW data in the Bayer array.
- the demosaicing processing unit 213 calculates the value of each RGB color of the target pixel by interpolating the values of peripheral pixels (including pixels of the same color and other colors) in the RAW data.
- the demosaicing processing unit 213 also executes defective pixel correction processing (interpolation processing). Note that when the imaging sensor 208 does not have a color filter and a single color image is obtained, the demosaicing process is not necessary.
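The neighbor interpolation performed by the demosaicing processing unit can be illustrated with the simplest bilinear case: estimating the green value at a red site of an RGGB mosaic from its four green neighbors. This is a textbook sketch under that assumed pattern, not the unit's actual algorithm.

```python
import numpy as np

def green_at_red(raw: np.ndarray, y: int, x: int) -> float:
    """Bilinear estimate of green at a red site of an RGGB Bayer mosaic.

    Averages the four green neighbours (up, down, left, right); the caller
    must pass an interior red-site coordinate.
    """
    return float(raw[y - 1, x] + raw[y + 1, x]
                 + raw[y, x - 1] + raw[y, x + 1]) / 4.0

# RGGB pattern: raw[0,0]=R, raw[0,1]=G, raw[1,0]=G, raw[1,1]=B, repeating.
raw = np.array([[10, 50, 10, 50],
                [50, 90, 50, 90],
                [10, 50, 10, 50],
                [50, 90, 50, 90]], dtype=np.float32)
g = green_at_red(raw, 2, 2)   # (2, 2) is an interior red site
```

A full demosaic repeats this kind of interpolation for every missing color at every pixel, which is why it is convenient to fold defective-pixel interpolation into the same stage.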
- the filter processing unit 214 is a digital filter that realizes suppression of high-frequency components included in an image, noise removal, and enhancement of resolution.
- The gamma correction unit 215 executes processing to give the image the inverse of the gradation expression characteristic of a typical display device, or performs gradation conversion adapted to human visual characteristics, such as gradation compression of high-luminance parts or dark-part processing.
- gradation conversion suitable for the subsequent synthesis processing and display processing is applied to the image data.
- The pre-measurement unit 217 performs pre-measurement to calculate the position information of the imaging target on the preparation 206, distance information to the desired focal position, and parameters for adjusting the amount of light according to the thickness of the imaging target. Acquiring this information with the pre-measurement unit 217 before the main measurement makes efficient imaging possible.
- a two-dimensional image sensor having a lower resolving power than the image sensor 208 is used to acquire position information on the two-dimensional plane.
- the pre-measurement unit 217 grasps the position of the imaging target on the XY plane from the acquired image. For obtaining distance information and thickness information, a laser displacement meter or a Shack-Hartmann measuring instrument is used.
- the main control system 218 has the function of controlling the various units described so far.
- the functions of the main control system 218 and the development processing unit 216 are realized by a control circuit having a CPU, a ROM, and a RAM. That is, the program and data are stored in the ROM, and the functions of the main control system 218 and the development processing unit 216 are realized by the CPU executing the program using the RAM as a work memory.
- a device such as an EEPROM or a flash memory is used as the ROM, and a DRAM device such as DDR3 is used as the RAM.
- the data output unit 219 is an interface for sending the RGB color image generated by the development processing unit 216 to the image processing apparatus 102.
- the imaging apparatus 101 and the image processing apparatus 102 are connected by an optical communication cable.
- a general-purpose interface such as USB or Gigabit Ethernet (registered trademark) is used.
- FIG. 3 is a block diagram showing a functional configuration of the image processing apparatus 102 of the present invention.
- the image processing apparatus 102 generally includes a data input unit 301, a storage holding unit 302, a synthesis processing unit 303, a synthesis location extraction unit 304, a synthesis location drawing unit 305, a superimposition processing unit 306, a mode selection unit 307, and a data output unit 308.
- the storage holding unit 302 receives, through the data input unit 301, RGB color divided image data obtained by dividing and imaging an imaging target with an external device, and stores and holds the data.
- the color image data includes not only image data but also position information.
- the position information is information indicating which part of the imaging target is captured by the divided image data.
- the position information can be acquired by recording the XY coordinates at the time of driving the stage 202 together with the divided image data at the time of imaging.
- the synthesis processing unit 303 generates composite image data of the imaging target from the color image data (divided image data) obtained by dividing and imaging the imaging target, based on the position information of each piece of divided image data.
- the synthesis part extraction unit 304 extracts a synthesis part subjected to interpolation processing or the like in the composite image data generated by the synthesis processing unit 303. For example, when the divided image data is simply connected, the joint is extracted as a synthesis location. When the divided image data is smoothly connected using an interpolation process or the like, a joint area to which the interpolation process or the like is applied is extracted as a synthesis portion. In the present embodiment, a configuration is assumed in which imaging is performed so that regions corresponding to joints overlap, and smoothing is performed by applying interpolation processing to the obtained divided image data.
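- The extraction of joint regions from position information can be sketched as follows; for brevity the tiles are placed along one dimension, and the coordinates are illustrative (the actual unit works on XY position information recorded when the stage 202 was driven).

```python
# Sketch of extracting the composite (joint) locations from the tiles'
# position information: wherever two adjacent tiles' ranges overlap,
# that overlap band is the region to which interpolation was applied.

def extract_joints(tiles):
    """tiles: list of (x_start, width) pairs, sorted by x_start.
    Returns (start, end) intervals where consecutive tiles overlap."""
    joints = []
    for (x0, w0), (x1, w1) in zip(tiles, tiles[1:]):
        overlap_start = x1
        overlap_end = x0 + w0
        if overlap_end > overlap_start:  # tiles actually overlap
            joints.append((overlap_start, overlap_end))
    return joints

# Three tiles of width 100 stepped by 90 px -> two 10 px joint bands.
tiles = [(0, 100), (90, 100), (180, 100)]
joints = extract_joints(tiles)
```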
- the mode selection unit 307 selects the mode in which the composite part is displayed.
- as modes for displaying the composite part, a change in color, a change in luminance, a dotted-line display, a dot-reduction display, and the like are defined. Details will be described with reference to FIG. 5.
- the composite part drawing unit 305 draws the composite part in the form selected by the mode selection unit 307 for the composite part extracted by the composite part extraction unit 304.
- the superimposition processing unit 306 performs a superimposition process on the drawing data of the combined portion drawn by the combining portion drawing unit 305 and the combined image data generated by the combining processing unit 303.
- in the composite image data after the superimposition process, in which the obtained joint is superimposed and clearly shown, the original divided image data that has not undergone the composite process can be distinguished from the portions that have undergone it.
- the obtained composite image data after the superimposition processing that clearly indicates the composite part is sent to an external monitor or the like via the data output unit 308.
- FIG. 4 is a block diagram showing a hardware configuration of the image processing apparatus according to the present invention.
- a PC (Personal Computer) 400 is used as the information processing apparatus.
- the PC 400 includes a CPU (Central Processing Unit) 401, an HDD (Hard Disk Drive) 402, a RAM (Random Access Memory) 403, a data input / output unit 405, and a bus 404 that connects these components to each other.
- the CPU 401 appropriately accesses the RAM 403 or the like as necessary, and comprehensively controls each block of the PC 400 while performing various arithmetic processes.
- the HDD (Hard Disk Drive) 402 is an auxiliary storage device that fixedly stores, records, and reads firmware such as the OS, programs to be executed by the CPU 401, and various parameters.
- the RAM 403 is used as a work area for the CPU 401, and temporarily stores various data to be processed such as an OS, various programs being executed, and a composite image after superimposition processing, which is a feature of the present invention.
- the data input / output unit 405 is connected to the image display device 103, the input device 407, the imaging device 101 as an external device, and the like.
- the image display device 103 is a display device using, for example, liquid crystal, EL (Electro-Luminescence), CRT (Cathode Ray Tube), or the like.
- the image display device 103 is assumed to be connected as an external device, but a PC 400 integrated with the image display device may be assumed.
- the input device 407 is, for example, a pointing device such as a mouse, a keyboard, a touch panel, and other operation input devices.
- when the input device 407 includes a touch panel, the touch panel can be integrated with the image display device 103.
- the imaging apparatus 101 is an imaging apparatus such as a microscope apparatus or a virtual slide scanner.
- Composite location display: a composite image after superimposition processing displayed by the image display device 103, which is display data generated by the superimposition processing unit 306 included in the image processing device of the present invention, will be described with reference to FIG. 5.
- the image processing apparatus of the present invention generates composite image data by combining a plurality of pieces of image data obtained by divided imaging (FIG. 5A). By drawing the composite part and superimposing it on the obtained composite image data, composite image data after the superimposition processing, in which the composite part is clearly shown, is obtained (FIG. 5B).
- for example, data may be generated by changing the color information of the composite part, or by changing its luminance information. Data may also be generated that displays the composite part (or its center line) as a grid, marks the composite part with a marker such as an arrow, switches the drawn composite part in a time-sharing manner, or makes it blink by toggling the superposition on and off.
- a method of generating display data by changing the color of the combined portion is preferable because the region of the combined portion can be clearly indicated.
- a method of generating display data by changing the luminance of the synthesized portion is preferable because the region of the synthesized location is clearly indicated and the image data of the synthesized location necessary for diagnosis can be used.
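- A minimal sketch of the two preferred display modes above, marking the composite part either by replacing its color or by scaling down its luminance while leaving the underlying image data usable; the pixel values, mask, and scale factor are illustrative assumptions.

```python
# Sketch of the two display modes: pixels are (R, G, B) tuples and the
# mask flags composite-location pixels.

def mark_by_color(image, mask, color=(255, 0, 0)):
    """Replace composite-location pixels with a fixed marker color."""
    return [color if m else px for px, m in zip(image, mask)]

def mark_by_luminance(image, mask, factor=0.5):
    """Dim composite-location pixels; their image content stays visible."""
    return [tuple(int(c * factor) for c in px) if m else px
            for px, m in zip(image, mask)]

image = [(200, 180, 160)] * 4          # a flat strip of tissue-like pixels
mask = [False, True, True, False]      # middle two pixels are the seam

by_color = mark_by_color(image, mask)
by_lum = mark_by_luminance(image, mask)
```

The contrast between the two outputs mirrors the trade-off stated above: the color mode shows the region most clearly, while the luminance mode keeps the seam's image data available for diagnosis.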
- in step S601, the image processing apparatus 300 acquires, from the external imaging apparatus 101 or the like via the data input / output unit 301, a plurality of pieces of image data (divided image data) obtained by dividing the imaging range of the imaging target into a plurality of areas, and sends them to the storage holding unit 302.
- in step S602, the position information included in the divided image data stored in the storage holding unit 302, or the position information attached to the divided image data as separate data, is obtained.
- the position information is information indicating which part of the imaging target each piece of divided image data captures.
- in step S603, the divided image data are combined by the synthesis processing unit 303 based on the obtained position information to generate composite image data of the imaging target.
- the synthesis processing methods include simply joining the plurality of divided image data, superimposing them, alpha blending them, and smoothly connecting them by interpolation processing.
- the methods of joining a plurality of divided image data include aligning and joining based on the stage position information, joining corresponding points or lines of the plural divided images, and joining based on the position information of the divided image data.
- to superimpose means, broadly, to lay image data on top of one another.
- the method of superimposing a plurality of image data includes cases where part or all of the plural image data are superimposed in regions where the image data overlap.
- alpha blending refers to combining two images using a coefficient (α value).
- the method of smoothly connecting by interpolation processing includes processing by zero-order interpolation, linear interpolation, and high-order interpolation. In order to connect images smoothly, it is preferable to perform processing by high-order interpolation.
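- One way to picture smoothing a seam is the following sketch, where two overlapping one-dimensional tiles are joined by alpha blending with a linearly ramped α value (a form of linear interpolation across the overlap); the widths and pixel values are illustrative assumptions.

```python
# Sketch of joining two divided images whose edges overlap, blending
# the overlap with an alpha that ramps from the left image to the right.

def blend_join(left, right, overlap):
    """Join two rows of pixel values that share `overlap` pixels.

    In the overlap, alpha ramps from 1 (pure left) toward 0 (pure
    right), so the seam transitions smoothly instead of jumping."""
    body_l = left[:-overlap]
    body_r = right[overlap:]
    seam = []
    for i in range(overlap):
        a = 1 - (i + 1) / (overlap + 1)  # alpha for the left image
        seam.append(a * left[-overlap + i] + (1 - a) * right[i])
    return body_l + seam + body_r

left = [10.0] * 6    # left tile, uniform value 10
right = [20.0] * 6   # right tile, uniform value 20
joined = blend_join(left, right, overlap=2)
```

With a step between the tiles, the joined row rises monotonically through the seam rather than jumping from 10 to 20; higher-order interpolation would smooth the transition further, as the text prefers.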
- in step S604, the mode selection unit 307 selects the display method of the composite part.
- in the mode selection unit, first, whether or not to display the composite part is selected.
- when it is to be displayed, how to display it is selected; for example, a display mode such as color change or luminance change is chosen.
- in step S605, it is determined whether or not to display the composite part on the composite image.
- when the composite part is not to be displayed, the composite part is not drawn, or the drawn composite part is not superimposed by the superimposition processing unit 306, and the composite image data is sent out via the data input / output unit 308.
- when the composite part is to be displayed, the process proceeds to the next step S606.
- in step S606, based on the position information, the region of the composite part is extracted from the generated composite image data.
- in step S607, the composite part drawing unit 305 generates the composite part drawing data, by the display method selected in step S604, for the composite part extracted in step S606. The details of the composite part drawing data generation will be described later using another flowchart.
- in step S608, the superimposition processing unit 306 superimposes the composite part drawing data generated in step S607 on the composite image data obtained in step S603 to obtain composite image data after the superimposition process in which the composite part is clearly shown. Details of the superimposition processing will be described later using a separate flowchart.
- FIG. 7 is a flowchart showing a flow of drawing data generation for a joint which is a synthesis part.
- here, the flow in the case where either the luminance or the color of the composite part is changed for display will be described.
- in step S701, the mode set by the mode selection unit 307 is checked; here, it is determined whether the luminance or the color is to be changed as the drawing method of the composite portion.
- in step S702, it is determined whether or not the luminance is to be changed as the drawing method of the composite portion. If the luminance is changed, the process proceeds to step S703; if not, the process proceeds to step S706.
- in step S703, it is determined whether or not the luminance of the composite portion is to be reduced for the luminance change. If the luminance of the composite location is to be lowered relative to the luminance of regions other than the composite location, the process proceeds to step S704; if the luminance is not to be reduced, that is, if the luminance of regions other than the composite location is changed or the luminance is increased, the process proceeds to step S705.
- in step S704, drawing data of the composite portion with reduced luminance is generated for the composite portion of the composite image.
- in step S705, drawing data of the composite portion with its luminance unchanged or increased is generated.
- in step S706, for the color change, a color for displaying the composite portion is set.
- in step S707, the drawing data of the composite portion is generated based on the color set in step S706.
- FIG. 8 is a flowchart showing a flow of superimposing the combined portion drawing data on the combined image.
- the composite image data and the composite part drawing data are subjected to superposition processing.
- as one display method, the composite portion can be clearly indicated by reducing the luminance of the composite image. Reducing the luminance of the composite image is useful when the user gazes at the composite part while observing the displayed image.
- in step S801, the composite image data generated by the synthesis processing unit 303 is acquired.
- in step S802, the composite part drawing data generated by the composite part drawing unit 305 is acquired.
- in step S803, it is determined, based on the setting of the drawing method checked in step S701, whether or not the luminance of the composite portion is to be reduced. If the luminance of the composite portion is to be lowered relative to the luminance of regions other than the composite portion, the process proceeds to step S804; if not, that is, if the luminance of regions other than the composite portion is to be changed, the process proceeds to step S805.
- in step S804, the composite part drawing data in which the luminance of the composite portion is reduced is superimposed on the composite image data generated in step S603.
- the superimposition processing includes not only a process for generating a superimposed image in such a manner that the combined portion drawing data is overwritten on the composite image data, but also a process for generating a superimposed image by alpha blending each image data.
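- The two variants of the superimposition process mentioned above, overwriting the drawing data onto the composite image and alpha blending the two, can be sketched as follows; the grayscale values and the use of None for "nothing drawn here" are illustrative assumptions.

```python
# Sketch of the superimposition step: the composite-part drawing data is
# either overwritten onto the composite image (alpha = 1.0) or
# alpha-blended with it (0 < alpha < 1).

def superimpose(composite, drawing, alpha=1.0):
    """Return the superimposed image; None in `drawing` leaves the
    composite pixel untouched."""
    out = []
    for base, over in zip(composite, drawing):
        if over is None:
            out.append(base)
        else:
            out.append(round(alpha * over + (1 - alpha) * base))
    return out

composite = [100, 100, 100, 100]
drawing = [None, 0, 0, None]  # a dark seam drawn on the middle pixels

overwritten = superimpose(composite, drawing, alpha=1.0)
blended = superimpose(composite, drawing, alpha=0.4)
```

Blending rather than overwriting keeps some of the underlying image visible through the drawn seam, which matches the earlier preference for display modes that do not hide diagnostic image data.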
- FIG. 9 illustrates an example where display image data generated by the image processing apparatus 102 of the present invention is displayed on the image display apparatus 103.
- FIG. 9A shows a screen layout of the image display device 103.
- on the screen, a display area 902 for imaging target image data for detailed observation, a thumbnail image 903 of the imaging target for observation, and a display setting area 904 are displayed in an overall window 901.
- Each area and image may have a form in which the display area of the entire window 901 is divided into functional areas by a single document interface or a form in which each area is constituted by a separate window by a multi-document interface.
- imaging target image data for detailed observation is displayed in the display area 902 for imaging target image data.
- an enlarged or reduced image is displayed by moving the display area (selecting and moving a partial region to be observed within the entire imaging target) or by changing the display magnification.
- the thumbnail image 903 displays the position and size of the display area 902 of the imaging target image data within the whole image of the imaging target.
- the display settings can be changed by selecting and pressing the setting button 905 according to a user instruction given through an externally connected input device 407 such as a touch panel or a mouse 906.
- FIG. 9B is a display setting screen displayed in a dialog when the setting button is selected and pressed, and it is selected whether or not to display a joint area which is a composite portion in the composite image.
- here, a setting button is provided and the setting screen is opened by pressing the button; however, a UI that can display, select, and change the various detailed settings shown in FIG. 9C may instead be provided directly on the display setting area 904.
- a configuration may also be adopted in which a single integrated screen displays a list of the detailed settings, including whether or not to display the composite location.
- FIG. 9C is a screen, displayed in a dialog when the joint area of the composite part is to be displayed, for selecting and setting the display method of the joint area. Specifically, the display method of the joint area at the composite location is selected from changing the color of the joint area, changing the luminance of the joint area, changing the luminance of the composite image, and the like. The choice between color change and luminance change is exclusive. When changing the color of the joint area, the color of the joint area can be selected: a list of color swatches is presented, and the user can select a desired color.
- the luminance can be changed through an intuitive interface such as a slider, or by entering a numerical value that changes the luminance relative to the current luminance value.
- FIG. 9D is an example of a display screen when the color is changed for the composite area.
- FIG. 9E is an example of a display screen in which the synthesized part is clearly shown by lowering the luminance value of the synthesized image in the area other than the synthesized part.
- the display mode of the composite part can be changed according to a user instruction.
- a flow of changing the mode (superimposition processing) of the area display of the composite part will be described with reference to the flowchart of FIG.
- in step S1001, it is determined whether or not the user has instructed a change of the composite location display. If there is an instruction, the process proceeds to step S1002; if not, the current display content is retained.
- in step S1002, the content of the user's instruction regarding the display mode of the composite part is checked.
- in step S1003, the image data of the composite part is acquired.
- the content of this processing is the same as that of step S606 in FIG. 6.
- in step S1004, the composite part drawing unit 305 generates the composite part drawing data by the display method checked in step S1002.
- the processing content of the composite part drawing data generation is the same as that shown in FIG. 7.
- in step S1005, the composite image data after the superimposition process, in which the composite part is clearly shown, is obtained by superimposing the composite part drawing data generated in step S1004 on the composite image data generated by the synthesis processing unit 303.
- the content of the superimposition process is the same as that shown in FIG. 8.
- as described above, the display method of the composite part in the observation image can be changed according to the user's instruction and intention. For example, suppose an image is being observed with a display method selected that reduces the luminance value at the composite location. When the region of interest moves from a region other than the composite location into the composite location, the luminance value of the composite location can be returned to its original value and the luminance value of the regions other than the composite location reduced instead, making it possible to observe the morphology of tissues and cells smoothly while remaining aware of the composite part.
- the composite part drawing data may be superimposed on the entire composite image data in advance, and a partial region of the superimposed composite image may then be selected and output to the image display device 103; alternatively, only the composite part drawing data corresponding to the partial region displayed on the image display device 103 may be superimposed and output.
- in the present embodiment, the composite part is a region of a certain extent connected by the interpolation process, and by combining the display of the observation image with an explicit indication that changes the luminance of the composite part, image diagnosis can be performed without hindering the diagnosis process.
- in the first embodiment, display data is generated in which the color or luminance of the region of the composite part is changed, for image data obtained by combining a plurality of pieces of image data captured in a divided manner using interpolation processing or the like.
- in the second embodiment, display data is generated that explicitly displays the line of the joint for a composite image obtained by arranging a plurality of images captured in a divided manner along a one-dimensional (line-shaped) joint.
- the image composition method may arrange the images relying only on the positional accuracy of the stage, or may change pixel positions by a geometric transformation, such as an affine transformation, of the obtained divided image data.
- except for the configurations that differ from the first embodiment, the configurations described in the first embodiment can be used.
- FIG. 11 is an overall view of an apparatus configuration of an image display system according to the second embodiment of the present invention.
- the image display system using the image processing apparatus of the present invention includes an image server 1101, an image processing apparatus 102, and an image display apparatus 103.
- the image processing apparatus 102 can acquire a divided image to be captured from the image server 1101 and generate image data to be displayed on the image display apparatus 103.
- the image server 1101 and the image processing apparatus 102 are connected by a general-purpose I / F LAN cable 1103 via a network 1102.
- the image server 1101 is a computer including a large-capacity storage device that stores divided image data captured by the imaging device 101 that is a virtual slide device.
- the image server 1101 may store the divided image data together in local storage connected to the image server 1101, or the data may be divided and placed on a server group (cloud servers) existing somewhere on the network, with the divided image data entities and their link information held separately.
- the divided image data itself does not need to be stored on one server.
- the image processing apparatus 102 and the image display apparatus 103 are the same as those of the image processing system of the first embodiment.
- in the present embodiment, the image display system is configured by three devices, the image server 1101, the image processing device 102, and the image display device 103, but the present invention is not limited to this configuration.
- an image processing device integrated with an image display device may be used, or a part of the functions of the image processing device 102 may be incorporated in the image server 1101.
- the functions of the image server 1101 and the image processing apparatus 102 may be divided and realized by a plurality of apparatuses.
- Composite location display: a composite image after superimposition processing displayed by the image display device 103, which is display data generated by the superimposition processing unit 306 included in the image processing device of the second embodiment, will be described with reference to FIG. 12.
- a plurality of pieces of image data obtained by dividing and capturing images are connected, for example, after arbitrarily performing processing such as coordinate conversion.
- a composite image is generated by arranging the image data after the conversion processing in accordance with an arbitrary boundary.
- a color, a line width, and a line type can be set for the line at the joining point.
- the line type can be a single line or a multiple line, a dotted line, a broken line or a chain line, or a combination thereof.
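- A sketch of reflecting the selected color, line width, and line type into joint-line drawing data follows; the concrete on/off dash patterns are assumptions for illustration, since the description only lists the selectable attributes.

```python
# Sketch of joint-line drawing data generation (second embodiment):
# rows of pixels along a vertical joint, where None marks undrawn gaps.
# The dash patterns are hypothetical on/off run lengths.

PATTERNS = {"solid": [1], "dotted": [1, 1], "dashed": [3, 1]}

def joint_line(height, width, color, line_type="solid"):
    """Return `height` rows of the joint line, each `width` pixels wide,
    drawn in `color` with gaps according to the line type's pattern."""
    pattern, rows, on = PATTERNS[line_type], [], True
    i = 0
    run = pattern[0]
    for y in range(height):
        rows.append([color] * width if on else [None] * width)
        run -= 1
        if run == 0:
            i = (i + 1) % len(pattern)
            run = pattern[i]
            if len(pattern) > 1:  # solid lines never toggle off
                on = not on
    return rows
```

Multiple or chained lines, as mentioned above, could be built the same way by composing several such patterns side by side.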
- FIG. 13 is a flowchart showing a process flow in the second embodiment corresponding to the process flow of the drawing data generation 607 of the combined portion in FIG. 6 in the first embodiment.
- in steps S1301, S1302, and S1303, the color, line width, and line type are respectively selected according to the selection of the mode selection unit 307.
- in step S1304, the settings selected in steps S1301, S1302, and S1303 are reflected to generate drawing data for the line that is the composite location.
- FIG. 14 is a flowchart showing a flow of superimposing the synthesized part drawing data on the synthesized image in the second embodiment. The flow of this process corresponds to FIG. 8 in the first embodiment.
- in step S1401, the composite image data generated by the synthesis processing unit 303 is acquired.
- in step S1402, the composite part drawing data generated by the composite part drawing unit 305 is acquired.
- the content of the composite part drawing data generation process is as described with reference to FIG. 13.
- in step S1403, the composite image data acquired in step S1401 and the composite part drawing data acquired in step S1402 are superimposed.
- FIG. 15A is an example of a screen layout when image data generated by the image processing apparatus 102 is displayed on the image display apparatus 103 in the second embodiment.
- on the screen, a display area 1502 for imaging target image data for detailed observation, a thumbnail image 1503 of the imaging target for observation, and a display setting area 1504 are displayed in the overall window 1501.
- in the display area 1502 of the imaging target image data, the joint 1505 of the composite image is displayed with lines in addition to the imaging target image data for detailed observation.
- FIG. 15B is a display setting screen displayed in a dialog when the setting button is selected and pressed; it selects whether or not to display the joint, which is the composite portion in the composite image, as a line.
- FIG. 15C is a screen for various display settings of the joint line, displayed in a dialog when the joint of the composite part is to be displayed as a line; it sets the display format of the line that is the joint of the composite part in the composite image. Specifically, the line color, line width, line type, and the like can be selected and set.
- here, a setting button is provided and the setting screen is opened by pressing the button; however, a UI that can display, select, and change the various detailed settings shown in FIG. 15C may instead be provided directly on the display setting area 1504.
- a configuration may also be adopted in which a single integrated screen displays a list of the detailed settings, including whether or not to display the composite location.
- the mode of displaying the composite part of the composite image can also be changed by setting a boundary. For example, with the display magnification as the boundary, on the assumption that the joint portion is an area having a certain width, the composite part can be displayed as an area, as in the first embodiment, when the display magnification is high, and displayed as a line, as in the second embodiment, when the display magnification becomes low.
- for example, assume that the width of the joint area corresponds to data of 64 pixels at a display magnification of 40x.
- here, the display magnification corresponds to the observation magnification referred to with an optical microscope.
- at display magnifications of 40x and 20x, the numbers of pixels used for displaying the joint area are 64 pixels and 32 pixels, respectively; since the area has sufficient width, it is desirable for observation that the image of the joint area be displayed, even though its luminance changes.
- at lower display magnifications, the numbers of pixels used for displaying the joint region fall to, for example, 8 pixels and 16 pixels, respectively, which is not enough to observe the morphology of tissues and cells.
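- The magnification-dependent switching implied by these numbers can be sketched as follows; the 64-pixel base width at 40x comes from the description above, while the linear scaling and the 32-pixel switching threshold are assumptions for illustration.

```python
# Sketch of boundary-based switching between the area display (first
# embodiment) and the line display (second embodiment), driven by the
# on-screen pixel width of the joint at the current display magnification.

BASE_MAG = 40           # magnification at which the joint data is 64 px wide
BASE_WIDTH_PX = 64
MIN_AREA_WIDTH_PX = 32  # assumed threshold below which an area is too thin

def joint_width_px(display_mag):
    """On-screen pixel width of the joint, assuming linear scaling."""
    return BASE_WIDTH_PX * display_mag / BASE_MAG

def display_mode(display_mag):
    """Choose the area display when the joint is wide enough, else a line."""
    if joint_width_px(display_mag) >= MIN_AREA_WIDTH_PX:
        return "area"
    return "line"
```

Under these assumptions, 40x and 20x (64 and 32 pixels) use the area display, while the magnifications yielding 8 or 16 pixels fall back to the line display, matching the observation above.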
- the embodiments above have described the joining of image data that a user can visually observe, such as microscope images.
- however, the present invention is not limited to joining images based on such visible information; it can also be applied to joining display data obtained by apparatuses that visualize the internal structure of the human body using various means and principles, such as a magnetic resonance imaging (MRI) apparatus, an X-ray diagnostic apparatus, or a diagnostic apparatus using optical ultrasound, and it brings about the same effect.
- in such apparatuses, image data is generated from intensity information, so contrast and brightness change around the joints, and errors due to correction processing and errors due to stitching alignment occur, just as in images acquired with a microscope carrying color information.
- the object of the present invention may be achieved by the following. That is, a recording medium (or storage medium) in which a program code of software that realizes all or part of the functions of the above-described embodiments is recorded is supplied to the system or apparatus. Then, the computer (or CPU or MPU) of the system or apparatus reads and executes the program code stored in the recording medium.
- the program code itself read from the recording medium realizes the functions of the above-described embodiments, and the recording medium on which the program code is recorded constitutes the present invention.
- also included is a case where an operating system (OS) running on the computer performs part or all of the actual processing based on the instructions of the program code, and the functions of the above-described embodiments are realized by that processing.
- the program code read from the recording medium may also be written into a memory provided in a function expansion card inserted into the computer or in a function expansion unit connected to the computer. Cases in which, based on the instructions of the program code, the CPU of the function expansion card or function expansion unit then performs part or all of the actual processing, and the functions of the above-described embodiments are realized by that processing, can also be included in the present invention.
- program code corresponding to the flowchart described above is stored in the recording medium.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Microscoopes, Condenser (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
The gist of the present invention is an image processing apparatus comprising: image data acquisition means for acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; composite image data generation means for generating composite image data based on the plurality of divided image data; and composite-area display data generation means for generating display image data for allowing an observer to recognize the composite areas of the composite image data, wherein the composite-area display data generation means changes at least one of color and luminance for all the composite areas included in a display area.
The gist of the present invention is also a microscope image display system having at least an image processing device and an image display device, wherein the image processing device is the image processing apparatus according to any one of claims 1 to 6, and the image display device displays the composite image data of the imaging target sent from the image processing device and image data for allowing an observer to recognize the composite portions.
The gist of the present invention is also an image processing method comprising: an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; a composite image data generation step of generating composite image data based on the plurality of divided image data; and a composite-area display data generation step of generating display image data for allowing an observer to recognize the composite areas of the composite image data, wherein the composite-area display data generation step changes at least one of color and luminance for all the composite areas included in a display area.
The gist of the present invention is also a program that causes a computer to execute: an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; a composite image data generation step of generating composite image data based on the plurality of divided image data; and a composite-area display data generation step of generating display image data for allowing an observer to recognize the composite areas of the composite image data, wherein in the composite-area display data generation step the computer changes at least one of color and luminance for all the composite areas included in a display area.
In addition, an image processing method suitable for the present invention includes: an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; a composite image data generation step of generating composite image data based on the plurality of divided image data; and a composite-area display data generation step of generating display image data for allowing an observer to recognize the composite areas of the composite image data, wherein the composite-area display data generation step changes at least one of color and luminance for all the composite areas included in the display area. The composite image data generation step and the composite-portion data generation step can be performed simultaneously.
[First Embodiment]
The image processing apparatus of the present invention can be used in an image display system including an imaging device and an image display device. This image display system will be described with reference to FIG. 1.
(Configuration of imaging system)
FIG. 1 shows an image display system using the image processing apparatus of the present invention. It comprises an imaging apparatus (microscope apparatus or virtual slide scanner) 101, an image processing apparatus 102, and an image display apparatus 103, and has the function of acquiring and displaying a two-dimensional image of an imaging target (test sample). The imaging apparatus 101 and the image processing apparatus 102 are connected by a dedicated or general-purpose I/F cable 104, and the image processing apparatus 102 and the image display apparatus 103 are connected by a general-purpose I/F cable 105.
(Configuration of imaging device)
FIG. 2 is a block diagram illustrating the functional configuration of the imaging apparatus 101.
(Configuration of image processing apparatus)
FIG. 3 is a block diagram showing the functional configuration of the image processing apparatus 102 of the present invention.
(Hardware configuration of image processing apparatus)
FIG. 4 is a block diagram showing the hardware configuration of the image processing apparatus of the present invention. For example, a PC (Personal Computer) 400 is used as the information processing apparatus.
(Composite portion display)
A composite image after superimposition processing, i.e., the display data generated by the superimposition processing unit 306 of the image processing apparatus of the present invention and displayed by the image display apparatus 103, will be described with reference to FIG. 5.
(Method of displaying the composite portion)
The flow of composite-portion display data generation in the image processing apparatus of the present invention will be described with reference to the flowchart of FIG. 6.
(Drawing joints)
FIG. 7 is a flowchart showing the flow of generating drawing data for the joints, which are the composite portions. FIG. 7 illustrates the flow when either the luminance or the color of a composite portion is changed for display.
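One way to realize such drawing-data generation can be sketched as follows. This is a hypothetical illustration, not the exact flow of FIG. 7: the function name, the RGBA layer representation, and the default red color are assumptions. Given the joint pixels, a separate transparent drawing layer is built, and the selected mode determines whether the joints are marked by a color or by a luminance change (encoded here as semi-transparent black to be blended later).

```python
import numpy as np

def make_joint_drawing_data(shape, joint_pixels, mode="color",
                            color=(255, 0, 0), dim_factor=0.5):
    """Build an RGBA drawing layer marking the joint (seam) pixels.

    mode="color":     opaque colored pixels along the joints.
    mode="luminance": black pixels with partial alpha, so blending
                      them over the composite lowers its luminance.
    """
    h, w = shape
    layer = np.zeros((h, w, 4), dtype=np.uint8)  # fully transparent layer
    for y, x in joint_pixels:
        if mode == "color":
            layer[y, x] = (*color, 255)
        else:  # luminance change
            alpha = int(round((1.0 - dim_factor) * 255))
            layer[y, x] = (0, 0, 0, alpha)
    return layer
```

Keeping the joint marks in their own layer, rather than painting directly into the composite, is what allows the display mode to be switched later without regenerating the composite image data.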
(Superimposition of composite-portion drawing data)
FIG. 8 is a flowchart showing the flow of superimposing the composite-portion drawing data on the composite image. In FIG. 8, the composite image data and the composite-portion drawing data are subjected to superimposition processing. When a change in luminance is selected as the method for drawing the composite portions, one display method is to indicate them clearly by reducing the luminance of the composite image. Reducing the luminance of the composite image is useful when gazing at the composite portions while observing the displayed image.
(Display screen layout)
FIG. 9 illustrates an example in which display image data generated by the image processing apparatus 102 of the present invention is displayed on the image display apparatus 103.
(Changing the display of the composite area)
After the composite portions on the composite image data have been displayed in the form desired by the user, the display mode of the composite portions can be further changed in response to a user instruction. The flow of changing the mode of the area display (superimposition processing) of the composite portions will be described with reference to the flowchart of FIG. 10.
[Second Embodiment]
An image display system according to a second embodiment of the present invention will be described with reference to the drawings.
(Configuration of image processing system)
FIG. 11 is an overall view of the apparatus configuration of the image display system according to the second embodiment of the present invention.
(Composite portion display)
A composite image after superimposition processing, i.e., the display data generated by the superimposition processing unit 306 of the image processing apparatus of the second embodiment and displayed by the image display apparatus 103, will be described with reference to FIG. 12.
(Drawing joints)
FIG. 13 is a flowchart showing the flow of processing in the second embodiment corresponding to the drawing data generation 607 for the composite portions in FIG. 6 of the first embodiment.
(Superimposition of composite-portion drawing data)
FIG. 14 is a flowchart showing the flow of superimposing the composite-portion drawing data on the composite image in the second embodiment. This processing flow corresponds to FIG. 8 of the first embodiment.
(Screen layout)
FIG. 15(a) is an example of the screen layout when image data generated by the image processing apparatus 102 is displayed on the image display apparatus 103 in the second embodiment. Within the overall window 1501, a display area 1502 for the imaging-target image data for detailed observation, a thumbnail image 1503 of the imaging target under observation, and a display setting area 1504 are displayed. In the display area 1502, in addition to the imaging-target image data for detailed observation, the joints 1505 of the composite image are displayed as lines. FIG. 15(b) is the display setting screen shown in a dialog when the setting button is selected and pressed, in which the user selects whether to display the joints, i.e., the composite portions of the composite image, as lines. FIG. 15(c) is a screen, shown in a dialog when the joints are to be displayed as lines, for configuring the various display settings of the joint lines, i.e., for setting in what form the lines that are the joints of the composite portions in the composite image are displayed. Specifically, the line color, line width, line type, and so on can be selected and set. As in the first embodiment, this embodiment also assumes a form in which a setting button is provided and the setting screen opens when the button is pressed; however, a UI in which the various detailed settings shown in FIG. 15(c) can be displayed, selected, and changed may be provided directly in the display setting area 1504. A configuration that displays a screen integrating a list of the detailed settings, including whether to display the composite portions, may also be used.
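The settings of FIG. 15(c) — line color, line width, and line type — can be modeled as a small style structure applied when rasterizing a joint line. The structure, the dash encoding, and the defaults below are illustrative assumptions, not the patent's UI implementation.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class JointLineStyle:
    color: tuple = (255, 0, 0)  # line color setting
    width: int = 1              # line width in pixels
    dash: int = 0               # line type: 0 = solid; N = N px on, N px off

def draw_horizontal_joint(image, y, style):
    """Rasterize one horizontal joint line onto an RGB image in place,
    honoring the color, width, and dash settings."""
    h, w, _ = image.shape
    half = style.width // 2
    for x in range(w):
        if style.dash and (x // style.dash) % 2 == 1:
            continue  # gap of the dash pattern
        for yy in range(y - half, y - half + style.width):
            if 0 <= yy < h:
                image[yy, x] = style.color
    return image
```

Keeping the style in one object mirrors the dialog of FIG. 15(c): changing a setting only requires re-rendering the joint layer, not regenerating the composite image.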
[Third Embodiment]
In the third embodiment, the mode of displaying the composite portions of the composite image is changed by setting a boundary.
[Other Embodiments]
The first to third embodiments have described the stitching of image data, such as microscope images, that is equivalent to what a user can observe visually. The present invention is applicable not only to the stitching of images based on such visible information but also to the stitching of display data obtained by apparatuses that visualize, by various means and principles, information that is generally invisible, such as the internal structure of the human body — magnetic resonance imaging (MRI) apparatuses, X-ray diagnostic apparatuses, and diagnostic apparatuses using photoacoustic techniques — and brings about the same effect there. In particular, unlike visible-light images, such image data is generated from intensity information. For contrast and luminance changes around the joints, image degradation due to correction processing, and errors due to stitching alignment, it is therefore more difficult than with image information acquired by a color-capable microscope to determine whether a change or singular point in the image arose from the stitching or indicates an abnormal state of the diagnosed region. For that reason it is very important to present the joints clearly to the user, showing that the reliability of the image areas around the joints may be low.
DESCRIPTION OF REFERENCE NUMERALS
102 image processing apparatus
103 image display apparatus
303 synthesis processing unit
304 composite-portion extraction unit
305 composite-portion drawing unit
306 superimposition processing unit
307 mode selection unit
1101 image server
Claims (16)
- An image processing apparatus comprising: image data acquisition means for acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; composite image data generation means for generating composite image data based on the plurality of divided image data; and composite-area display data generation means for generating display image data for allowing an observer to recognize the composite areas of the composite image data, wherein the composite-area display data generation means changes at least one of color and luminance for all the composite areas included in a display area.
- The image processing apparatus according to claim 1, wherein the image processing apparatus is applied to an image acquired with a microscope.
- The image processing apparatus according to claim 1, wherein the image processing apparatus is used in a virtual slide system.
- The image processing apparatus according to any one of claims 1 to 3, wherein the image data acquisition means acquires the plurality of divided image data obtained by imaging such that the divided image data have overlapping regions, and the composite-portion data generation means generates image data for allowing an observer to recognize the composite portions using the image data of the regions where the divided image data overlap.
- The image processing apparatus according to claim 4, wherein the image data generation means generates the composite image data by superimposing or blending the plurality of divided image data.
- The image processing apparatus according to claim 4, wherein the image data generation means generates the composite image data by performing interpolation processing on the regions where the plurality of divided image data overlap.
- The image processing apparatus according to any one of claims 1 to 3, wherein the composite image data generation means generates composite image data for displaying the plurality of divided image data joined together, and the composite-portion data generation means generates the joint lines of the plurality of divided image data as composite-portion data.
- The image processing apparatus according to any one of claims 1 to 7, further comprising composite-portion data switching means for switching the image data, generated by the composite-portion data generation means, for allowing an observer to recognize the composite portions.
- The image processing apparatus according to claim 8, wherein the switching means switches, at a predetermined boundary, the image data for allowing an observer to recognize the composite portions generated by the composite-portion data generation means.
- The image processing apparatus according to claim 8, wherein the switching means switches, at a predetermined magnification, the image data for allowing an observer to recognize the composite portions generated by the composite-portion data generation means.
- A microscope image display system having at least an image processing device and an image display device, wherein the image processing device is the image processing apparatus according to any one of claims 1 to 6, and the image display device displays the composite image data of the imaging target sent from the image processing device and image data for allowing an observer to recognize the composite portions.
- The microscope image display system according to claim 11, further comprising a mechanism for displaying the composite portions when an instruction to display the composite portions is input.
- An image processing method comprising: an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; a composite image data generation step of generating composite image data based on the plurality of divided image data; and a composite-area display data generation step of generating display image data for allowing an observer to recognize the composite areas of the composite image data, wherein the composite-area display data generation step changes at least one of color and luminance for all the composite areas included in a display area.
- The image processing method according to claim 13, wherein the composite image data generation step and the composite-portion data generation step are performed simultaneously.
- A program for causing a computer to execute: an image data acquisition step of acquiring a plurality of divided image data obtained by dividing the imaging range of an imaging target into a plurality of areas and imaging them; a composite image data generation step of generating composite image data based on the plurality of divided image data; and a composite-area display data generation step of generating display image data for allowing an observer to recognize the composite areas of the composite image data, wherein in the composite-area display data generation step the computer is caused to execute a process of changing at least one of color and luminance for all the composite areas included in a display area.
- A computer-readable storage medium storing the program according to claim 15.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280064883.1A CN104025150A (en) | 2011-12-27 | 2012-12-27 | Image processing device, image display system, image processing method, and image processing program |
BR112014015375A BR112014015375A8 (en) | 2011-12-27 | 2012-12-27 | image processing apparatus, image display system and image processing method and program |
EP12863006.8A EP2800053A4 (en) | 2011-12-27 | 2012-12-27 | Image processing device, image display system, image processing method, and image processing program |
RU2014131048A RU2014131048A (en) | 2011-12-27 | 2012-12-27 | IMAGE PROCESSING DEVICE, IMAGE DISPLAY SYSTEM AND MACHINE-READABLE MEDIA FOR IMAGE PROCESSING |
KR1020147020975A KR20140105036A (en) | 2011-12-27 | 2012-12-27 | Image processing device, image display system, image processing method, and image processing program |
US13/909,847 US8854448B2 (en) | 2011-12-27 | 2013-06-04 | Image processing apparatus, image display system, and image processing method and program |
US14/475,289 US20140368632A1 (en) | 2011-12-27 | 2014-09-02 | Image processing apparatus, image display system, and image processing method and program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011286785 | 2011-12-27 | ||
JP2011-286785 | 2011-12-27 | ||
JP2012-282781 | 2012-12-26 | ||
JP2012282781A JP5350532B2 (en) | 2011-12-27 | 2012-12-26 | Image processing apparatus, image display system, image processing method, and image processing program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/909,847 Continuation US8854448B2 (en) | 2011-12-27 | 2013-06-04 | Image processing apparatus, image display system, and image processing method and program |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2013100028A1 true WO2013100028A1 (en) | 2013-07-04 |
WO2013100028A9 WO2013100028A9 (en) | 2014-05-30 |
Family
ID=48697507
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/083830 WO2013100028A1 (en) | 2011-12-27 | 2012-12-27 | Image processing device, image display system, image processing method, and image processing program |
Country Status (8)
Country | Link |
---|---|
US (2) | US8854448B2 (en) |
EP (1) | EP2800053A4 (en) |
JP (1) | JP5350532B2 (en) |
KR (1) | KR20140105036A (en) |
CN (1) | CN104025150A (en) |
BR (1) | BR112014015375A8 (en) |
RU (1) | RU2014131048A (en) |
WO (1) | WO2013100028A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6455829B2 (en) * | 2013-04-01 | 2019-01-23 | キヤノン株式会社 | Image processing apparatus, image processing method, and program |
WO2014196203A1 (en) * | 2013-06-06 | 2014-12-11 | パナソニックIpマネジメント株式会社 | Image acquisition device, image acquisition method, and program |
EP3007432B1 (en) * | 2013-06-07 | 2020-01-01 | Panasonic Intellectual Property Management Co., Ltd. | Image acquisition device and image acquisition method |
KR20150033162A (en) * | 2013-09-23 | 2015-04-01 | 삼성전자주식회사 | Compositor and system-on-chip having the same, and driving method thereof |
US9509909B2 (en) * | 2013-11-18 | 2016-11-29 | Texas Instruments Incorporated | Method and apparatus for a surround view camera system photometric alignment |
JP6355334B2 (en) * | 2013-12-27 | 2018-07-11 | 株式会社キーエンス | Magnification observation apparatus, magnification image observation method, magnification image observation program, and computer-readable recording medium |
CN106133787B (en) * | 2014-02-04 | 2020-09-01 | 皇家飞利浦有限公司 | Method for registering and visualizing at least two images |
US9892493B2 (en) | 2014-04-21 | 2018-02-13 | Texas Instruments Incorporated | Method, apparatus and system for performing geometric calibration for surround view camera solution |
US10141354B2 (en) | 2014-10-23 | 2018-11-27 | Panasonic Intellectual Property Management Co., Ltd. | Imaging device and image acquisition device |
JP6511893B2 (en) * | 2015-03-23 | 2019-05-15 | 日本電気株式会社 | Image processing apparatus, image processing method, and program |
KR20180013169A (en) * | 2016-07-28 | 2018-02-07 | 삼성전자주식회사 | Method for displaying content and electronic device using the same |
CN111094942B (en) * | 2017-06-27 | 2024-01-23 | 生命科技控股私人有限公司 | Sample analysis method, analysis device, and computer program |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000285252A (en) * | 1999-03-31 | 2000-10-13 | Fuji Photo Film Co Ltd | Method and device for picture display and method and device for picture alignment |
JP2007121837A (en) | 2005-10-31 | 2007-05-17 | Olympus Corp | Microscope system |
JP2009141630A (en) * | 2007-12-05 | 2009-06-25 | Canon Inc | Image processor, its control method, and program |
JP2010061129A (en) * | 2009-08-14 | 2010-03-18 | Claro Inc | Slide image data |
JP2010134374A (en) | 2008-12-08 | 2010-06-17 | Olympus Corp | Microscope system and method of operation thereof |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6031930A (en) * | 1996-08-23 | 2000-02-29 | Bacus Research Laboratories, Inc. | Method and apparatus for testing a progression of neoplasia including cancer chemoprevention testing |
JP3702695B2 (en) | 1999-03-11 | 2005-10-05 | 富士通株式会社 | Infrared imaging device |
US6847729B1 (en) * | 1999-04-21 | 2005-01-25 | Fairfield Imaging Limited | Microscopy |
US6711283B1 (en) * | 2000-05-03 | 2004-03-23 | Aperio Technologies, Inc. | Fully automatic rapid microscope slide scanner |
US20030210262A1 (en) * | 2002-05-10 | 2003-11-13 | Tripath Imaging, Inc. | Video microscopy system and multi-view virtual slide viewer capable of simultaneously acquiring and displaying various digital views of an area of interest located on a microscopic slide |
CN100454078C (en) * | 2006-06-22 | 2009-01-21 | 北京普利生仪器有限公司 | Method for preparing microscopic image of holographic digitalized sliced sheet |
JP4970148B2 (en) | 2007-05-31 | 2012-07-04 | 株式会社東芝 | Magnetic resonance imaging apparatus, image display apparatus, image display program, and image display system |
CA2731956A1 (en) * | 2008-07-25 | 2010-01-28 | Daniel S. Gareau | Rapid confocal microscopy to support surgical procedures |
2012
- 2012-12-26 JP JP2012282781 patent/JP5350532B2/en not_active Expired - Fee Related
- 2012-12-27 BR BR112014015375A patent/BR112014015375A8/en not_active IP Right Cessation
- 2012-12-27 RU RU2014131048A patent/RU2014131048A/en unknown
- 2012-12-27 EP EP12863006.8A patent/EP2800053A4/en not_active Withdrawn
- 2012-12-27 CN CN201280064883.1A patent/CN104025150A/en active Pending
- 2012-12-27 WO PCT/JP2012/083830 patent/WO2013100028A1/en active Application Filing
- 2012-12-27 KR KR1020147020975A patent/KR20140105036A/en active IP Right Grant
2013
- 2013-06-04 US US13/909,847 patent/US8854448B2/en not_active Expired - Fee Related
2014
- 2014-09-02 US US14/475,289 patent/US20140368632A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000285252A (en) * | 1999-03-31 | 2000-10-13 | Fuji Photo Film Co Ltd | Method and device for picture display and method and device for picture alignment |
JP2007121837A (en) | 2005-10-31 | 2007-05-17 | Olympus Corp | Microscope system |
JP2009141630A (en) * | 2007-12-05 | 2009-06-25 | Canon Inc | Image processor, its control method, and program |
JP2010134374A (en) | 2008-12-08 | 2010-06-17 | Olympus Corp | Microscope system and method of operation thereof |
JP2010061129A (en) * | 2009-08-14 | 2010-03-18 | Claro Inc | Slide image data |
Non-Patent Citations (1)
Title |
---|
See also references of EP2800053A4 |
Also Published As
Publication number | Publication date |
---|---|
JP2013152452A (en) | 2013-08-08 |
CN104025150A (en) | 2014-09-03 |
US8854448B2 (en) | 2014-10-07 |
BR112014015375A2 (en) | 2017-06-13 |
BR112014015375A8 (en) | 2017-06-13 |
KR20140105036A (en) | 2014-08-29 |
EP2800053A4 (en) | 2015-09-09 |
RU2014131048A (en) | 2016-02-20 |
US20140368632A1 (en) | 2014-12-18 |
US20130271593A1 (en) | 2013-10-17 |
EP2800053A1 (en) | 2014-11-05 |
WO2013100028A9 (en) | 2014-05-30 |
JP5350532B2 (en) | 2013-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5350532B2 (en) | Image processing apparatus, image display system, image processing method, and image processing program | |
JP6091137B2 (en) | Image processing apparatus, image processing system, image processing method, and program | |
US20200050655A1 (en) | Image processing apparatus, control method for the same, image processing system, and program | |
JP5780865B2 (en) | Image processing apparatus, imaging system, and image processing system | |
WO2013100025A9 (en) | Image processing device, image processing system, image processing method, and image processing program | |
US20140184778A1 (en) | Image processing apparatus, control method for the same, image processing system, and program | |
JP2014071207A (en) | Image processing apparatus, imaging system, and image processing system | |
WO2013100029A9 (en) | Image processing device, image display system, image processing method, and image processing program | |
WO2013100026A9 (en) | Image processing device, image processing system, image processing method, and image processing program | |
US20160042122A1 (en) | Image processing method and image processing apparatus | |
JP2014029380A (en) | Information processing device, information processing method, program, and image display device | |
US20130162805A1 (en) | Image processing apparatus, image processing system, image processing method, and program for processing a virtual slide image | |
JP2016038542A (en) | Image processing method and image processing apparatus | |
JP6338730B2 (en) | Apparatus, method, and program for generating display data | |
JP2013250574A (en) | Image processing apparatus, image display system, image processing method and image processing program | |
JP2013250400A (en) | Image processing apparatus, image processing method, and image processing program | |
JP2016038541A (en) | Image processing method and image processing apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12863006 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012863006 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20147020975 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2014131048 Country of ref document: RU Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112014015375 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112014015375 Country of ref document: BR Kind code of ref document: A2 Effective date: 20140623 |