WO2013100026A9 - Image processing apparatus, image processing system, image processing method, and image processing program - Google Patents
Image processing apparatus, image processing system, image processing method, and image processing program
- Publication number
- WO2013100026A9 (PCT/JP2012/083825)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- display
- image processing
- display image
- image
- Prior art date
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
Definitions
- the present invention relates to an image processing apparatus, an image processing method, an image processing system, and a program.
- Patent Document 1 discloses a microscope capable of producing an easy-to-see information image by controlling the amount of light of the information image when a sample image and the information image are observed simultaneously.
- The image of a virtual slide differs from the image observed with a microscope, because the image data obtained by imaging the observation target is image-processed before being displayed on a display.
- A virtual slide image display often shows a region wider than the observation field of view of a microscope. The display image on the display based on the virtual slide image data (hereinafter also referred to as the “display image”) therefore contains a large amount of information, and the observer must pay attention to a large area, which can place a burden on the observer.
- an object of the present invention is to propose an image processing apparatus capable of generating a virtual slide display image that reduces the burden on the observer.
- One aspect of the present invention is an image processing apparatus for processing a virtual slide image, comprising:
- an image data acquisition unit for acquiring image data obtained by imaging an imaging target; and
- a display image data generation unit for generating, from the image data, display image data composed of observation region display image data, for displaying on a display device an observation region determined by a predetermined method or designated by the user, and non-observation region display image data, for displaying on the display device the region other than the observation region. The display image data generation unit performs image processing on at least one of the observation region display image data and the non-observation region display image data, thereby generating display image data that displays on the display device an image different from the image obtained when uniform image processing is performed on the entire image data.
- Another aspect is an image processing method for processing a virtual slide image, comprising: an image data acquisition step of acquiring image data obtained by imaging the imaging target; and a display image data generation step of generating, from the image data acquired in the image data acquisition step, observation region display image data for displaying on the display device an observation region determined by a predetermined method or designated by the user, and non-observation region display image data for displaying the region other than the observation region.
- The display image data generation step performs image processing on at least one of the observation region display image data and the non-observation region display image data, thereby generating display image data that displays on the display device an image different from the image obtained when uniform image processing is performed on the entire image data.
- In a further aspect of the image processing method, the display image data generation step generates first display image data, for which image processing is performed on at least one of the observation region display image data and the non-observation region display image data so that a different image is displayed on the display device, and second display image data, for which either no image processing is performed on the image data or uniform image processing is performed on the entire image data.
- An image processing system comprising: the image processing device; and a display device that displays a virtual slide image processed by the image processing device in an aspect having an observation region that reproduces a field of view of a microscope.
- Another aspect of the present invention is a program for causing a computer to execute each step of the image processing method.
- According to the present invention, the burden on the observer can be reduced by displaying the observation area and the area outside it in a distinguishable manner.
- FIG. 1 is a schematic overall view showing an example of the apparatus configuration of an image processing system using the image processing apparatus of the present invention.
- FIG. 2 is a functional block diagram showing an example of the functional configuration of the imaging device in the image processing system using the image processing apparatus of the present invention.
- FIG. 3 is a functional block diagram showing an example of the functional configuration of the image processing apparatus of the present invention.
- FIG. 4 is a block diagram showing an example of the hardware configuration of the image processing apparatus of the present invention.
- FIG. 5 is a schematic diagram for explaining the concept of the microscope visual field display (circular display).
- FIG. 6 is a flowchart showing an example of the flow of the microscope visual field display process of the image processing apparatus of the present invention.
- the image processing apparatus of the present invention is a virtual slide image processing apparatus, and has an image data acquisition unit and a display image data generation unit.
- the display image data generation unit generates display image data to be displayed on a display device including the observation area display image data and the non-observation area display image data from the image data acquired by the image data acquisition unit.
- The range used as the observation area is determined by a predetermined method, for example based on information stored in advance in the image processing apparatus or an external storage device, and/or based on a user instruction.
- The observation region preferably reproduces the field of view of a microscope (generally circular). Which microscope field of view is reproduced is preferably stored in advance as part of the above information.
- The information stored in advance desirably includes initial visual field information (the information selected as the observation area when there is no instruction from the user; hereinafter also simply referred to as “initial information”) and/or field-of-view information of a plurality of specific actual microscopes (a plurality of microscope field-of-view options selectable by the user). The initial visual field information may be stored in the form of a selection of one of the plural microscope field-of-view entries. Further, a new observation area determined by a user instruction may be stored as additional visual field information so that it can later be selected as one of the options, and such observation areas may be managed separately for each user.
- the display image data generation unit performs an image process on at least one of the observation area display image data and the non-observation area display image data, thereby generating an image different from that obtained when uniform image processing is performed on the entire image data. Display image data to be displayed on the display device is generated.
- the display image data generation unit preferably generates display image data for displaying an observation region that reproduces the field of view of the microscope.
- the display image data generation unit is preferably capable of generating display image data based on information relating to the actual field of view of the microscope. In this case, it is preferable that the display image data generation unit can generate display image data based on information on the actual field of view of the microscope and information on magnification to be displayed as an image. It is preferable that the display image data generation unit can generate display image data by using, as initial information, a predetermined one of a plurality of information related to the field of view of the actual microscope.
- the display image data generation unit can generate the display image data using one of a plurality of pieces of information regarding the field of view of an actual microscope based on a user's selection. It is preferable that the display image data generation unit can generate the display image data so that the luminance outside the observation region is lower than the luminance of the observation region.
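As a hedged illustration of how stored field-of-view information might translate into a circular region on screen, the following sketch assumes the common field-number/objective-magnification relation for the actual field of view; all function names and numbers here are illustrative and not taken from the patent text:

```python
# Hypothetical sketch: deriving the on-screen diameter of a circular
# observation region from stored microscope field-of-view information.
# The field-number / objective-magnification relation and every name
# below are assumptions for illustration, not from the patent.

def fov_diameter_um(field_number_mm: float, objective_mag: float) -> float:
    """Actual field-of-view diameter on the specimen, in micrometers."""
    return field_number_mm * 1000.0 / objective_mag

def display_diameter_px(field_number_mm: float, objective_mag: float,
                        specimen_um_per_px: float, display_mag: float) -> float:
    """On-screen diameter of the observation circle, in display pixels."""
    return (fov_diameter_um(field_number_mm, objective_mag)
            * display_mag / specimen_um_per_px)

# A 40x objective with field number 22 sees a circle of roughly 550 um.
print(round(fov_diameter_um(22.0, 40.0)))  # 550
```

With image data captured at 0.25 µm per pixel and 1x display magnification, the same circle would span 2,200 display pixels, which is one way the stored field-of-view entries could drive the circular display.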
- Display image data can be generated by multiplying multi-value mask information and the image data pixel by pixel. Display image data can also be generated by an arithmetic process on the image data based on mask information indicating two processing modes. As mask information indicating these two processing modes, information expressing the positions where the image data is adopted as-is and the positions where the image data is bit-shifted can be used.
- the “position” here is a display position on the display device when an image is displayed. The expression of the position can be realized by including coordinate information in the mask information.
- the shift amount of the bit shift operation can be changed according to an instruction input from the outside.
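The two-mode mask operation described above might be sketched as follows; this is a minimal illustration assuming 8-bit pixel values and a boolean mask, with no names taken from the patent:

```python
# Minimal sketch of the two-mode mask operation: at positions the mask
# marks as "adopt" (inside the observation region), pixel values pass
# through unchanged; elsewhere the value is right-shifted to darken it.
# The shift amount is a parameter so it can be changed by an external
# instruction, as the text describes.

def apply_mask(pixels, mask, shift=2):
    """pixels: 2D list of 8-bit values; mask: 2D list of bools,
    True = inside the observation region, False = outside."""
    return [[p if inside else p >> shift
             for p, inside in zip(prow, mrow)]
            for prow, mrow in zip(pixels, mask)]

pixels = [[200, 200], [200, 200]]
mask = [[True, False], [False, True]]
print(apply_mask(pixels, mask))  # [[200, 50], [50, 200]]
```

A right shift by 2 quarters the value, so the outside of the region appears darker while the observation region keeps its original brightness.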
- The display image data generation unit can convert the image data into luminance/color-difference data and then perform the arithmetic processing on the luminance values obtained by the conversion.
- It is preferable to change the shift amount of the bit shift operation according to the distance from the center of the circular observation region.
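A hedged sketch of these two ideas together, computing a luminance value and letting the shift amount grow with distance from the circle center; the BT.601 luma weights and the radius-based ramp are assumptions of this sketch, not values from the patent:

```python
import math

# Illustrative sketch: derive a luminance value from RGB and make the
# bit-shift amount grow with distance from the circle center, so the
# area outside the observation region darkens gradually.

def luma(r, g, b):
    """Approximate luminance using BT.601 weights (an assumption here)."""
    return int(0.299 * r + 0.587 * g + 0.114 * b)

def shift_for(x, y, cx, cy, radius):
    """0 inside the circle; larger shifts further out, capped at 4."""
    d = math.hypot(x - cx, y - cy)
    if d <= radius:
        return 0
    return min(4, 1 + int((d - radius) / radius * 4))

print(luma(255, 0, 0))               # 76
print(shift_for(0, 0, 0, 0, 100))    # 0 (inside the region)
print(shift_for(300, 0, 0, 0, 100))  # 4 (far outside, maximum dimming)
```

Shifting the luminance channel rather than each RGB channel darkens the periphery without shifting its hue, which is one plausible reason for the luminance/color-difference conversion mentioned above.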
- The display image data generation unit can generate display image data that does not distinguish between the observation region and the area outside it while the position of the image displayed on the display device or the display magnification is changing.
- The display image data for the region outside the observation area may include information about the imaging target.
- a preferred image processing system of the present invention includes an image processing device and an image display device.
- image display device may be abbreviated as “display device”.
- As the image processing apparatus in the image processing system, the above-described image processing apparatus can be used.
- The image processing system of the present invention may further include an imaging device and/or an image server, described later.
- a preferred image processing method of the present invention is an image processing method for processing a virtual slide image, and includes at least an image data acquisition step and a display image data generation step.
- the image processing method of the present invention may include a display image data transmission step after the display image data generation step.
- In the image data acquisition step, image data obtained by imaging the imaging target is acquired.
- In the display image data generation step, display image data covering the observation region and the region outside it is generated. This display image data is data for displaying an image on the display device.
- the image processing method of the present invention can reflect the aspects described above or described later regarding the image processing apparatus.
- In the display image data generation step, first display image data composed of the observation region display image data and the non-observation region display image data, and/or second display image data that does not distinguish between the observation region display image data and the non-observation region display image data, is generated.
- the second display image data can be obtained by performing no image processing on the image data or by performing uniform image processing on the entire image data.
- In a preferred embodiment, the display image data transmission step transmits the first display image data to the display device when the position of the image to be displayed on the display device or the display magnification changes.
- the display image data transmission step according to a preferred embodiment of the present invention transmits the second display image data to the display device when the position of the image to be displayed or the display magnification has not changed.
- the program of the present invention is a program that causes a computer to execute each step of the image processing method.
- the image processing apparatus of the present invention can be used in an image processing system including an imaging device and a display device. This image processing system will be described with reference to FIG.
- FIG. 1 is a schematic overall view showing an example of an image processing system using an image processing apparatus of the present invention.
- In FIG. 1, an imaging apparatus (microscope apparatus or virtual slide scanner) 101, an image processing apparatus 102, and an image display apparatus 103 are shown.
- The imaging apparatus 101 and the image processing apparatus 102 are connected by a dedicated or general-purpose I/F cable 104, and the image processing apparatus 102 and the image display apparatus 103 are connected by a general-purpose I/F cable 105.
- a virtual slide device having a function of imaging a single two-dimensional image or a plurality of two-dimensional images at different positions in a two-dimensional plane direction and outputting a digital image can be suitably used.
- a solid-state imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) is preferably used for acquiring a two-dimensional image.
- a digital microscope apparatus in which a digital camera is attached to an eyepiece part of a normal optical microscope can be used instead of the virtual slide apparatus.
- the obtained image can be divided into an observation area and an outside of the observation area.
- As the image processing apparatus 102, a device having a function of generating, in response to a user request, data to be displayed on the image display apparatus 103 from one or more pieces of original image data acquired from the imaging apparatus 101 can be suitably used.
- As the image processing apparatus, an apparatus composed of a general-purpose computer or workstation having hardware resources such as a CPU (Central Processing Unit), RAM, a storage device, an operation unit, and various I/Fs can be used.
- a large-capacity information storage device such as a hard disk drive can be suitably used.
- the storage device preferably stores programs and data for realizing each process described later, an OS (operating system), and the like.
- the operation unit 106 includes, for example, a keyboard and a mouse, and is used for an operator to input various instructions.
- the operation unit 106 may be a component of the image processing apparatus 102.
- the image display device 103 in this example is a display that displays an observation image that is a result of the arithmetic processing performed by the image processing device 102, and includes a CRT, a liquid crystal display, or the like.
- the image processing system is configured by three devices of the imaging device 101, the image processing device 102, and the image display device 103, but the configuration of the present invention is not limited to this configuration.
- an image processing device integrated with the image display device may be used, or the function of the image processing device may be incorporated in the imaging device.
- the functions of the imaging device, the image processing device, and the image display device can be realized by a single device.
- the functions of the image processing apparatus and the like may be divided and realized by a plurality of apparatuses.
- FIG. 2 is a functional block diagram illustrating an example of a functional configuration of the imaging apparatus 101.
- The imaging apparatus 101 of this example is roughly composed of an illumination unit 201, a stage 202, a stage control unit 205, an imaging optical system 207, an imaging unit 210, a development processing unit 219, a pre-measurement unit 220, a main control system 221, and a data output unit (I/F) 222.
- The illumination unit 201 in this example is a means for uniformly irradiating the preparation 206 disposed on the stage 202 with light, and preferably includes a light source, an illumination optical system, and a light source drive control system.
- the stage 202 of this example is driven and controlled by a stage control unit 205, and can move in three directions of XYZ.
- the preparation 206 of this example is a member in which a tissue section or smeared cells to be observed are pasted on a slide glass, and the tissue section or smeared cells are fixed under a cover glass together with an encapsulating agent.
- the stage control unit 205 in this example includes a drive control system 203 and a stage drive mechanism 204.
- the drive control system 203 receives the instruction from the main control system 221 and controls the drive of the stage 202.
- The moving direction, moving amount, and the like of the stage 202 are determined based on the position information and thickness information (distance information) of the sample measured by the pre-measurement unit 220, and on instructions from the user input as necessary.
- the stage drive mechanism 204 in this example drives the stage 202 in accordance with instructions from the drive control system 203.
- the imaging optical system 207 of this example is a lens group for forming an optical image of the specimen of the preparation 206 on the image sensor 208.
- the imaging unit 210 of this example includes an imaging sensor 208 and an analog front end (AFE) 209.
- The imaging sensor 208 of this example is a one-dimensional or two-dimensional image sensor that converts a two-dimensional optical image into an electrical quantity by photoelectric conversion.
- As the image sensor 208 for example, a CCD or a CMOS device is used.
- When a one-dimensional sensor is used, a two-dimensional image can be obtained by scanning it in the scanning direction.
- the imaging sensor 208 of this example outputs an electrical signal having a voltage value corresponding to the light intensity.
- a color image is desired as the captured image, for example, a single-plate image sensor to which a Bayer color filter is attached can be used as the image sensor.
- the imaging unit 210 of this example can capture a divided image of the specimen by moving the stage 202 in the XY axis direction and capturing an image.
- the AFE 209 in this example is a circuit that converts an analog signal output from the image sensor 208 into a digital signal.
- the AFE 209 is preferably configured by an H / V driver, a CDS (Correlated double sampling), an amplifier, an AD converter, and a timing generator, which will be described later.
- the H / V driver of this example converts a vertical synchronization signal and a horizontal synchronization signal for driving the image sensor 208 into potentials necessary for driving the sensor.
- The CDS of this example is a correlated double sampling circuit that removes fixed pattern noise.
- the amplifier of this example is an analog amplifier that adjusts the gain of an analog signal from which noise has been removed by CDS.
- the AD converter of this example converts an analog signal into a digital signal.
- In consideration of subsequent processing, the AD converter converts the analog signal into digital data quantized to about 10 to 16 bits and outputs it.
- the converted sensor output data is referred to as RAW data.
- the RAW data is developed by the subsequent development processing unit 219.
- the timing generator of this example generates a signal for adjusting the timing of the image sensor 208 and the timing of the development processing unit 219 in the subsequent stage.
- the AFE 209 When a CCD is used as the image sensor 208, the AFE 209 is usually used. On the other hand, when a CMOS image sensor capable of digital output is used as the image sensor 208, the function of the AFE 209 is usually included in the sensor.
- The development processing unit 219 in this example includes a black correction unit 211, a white balance adjustment unit 212, a demosaicing processing unit 213, an image composition processing unit 214, a resolution conversion processing unit 215, a filter processing unit 216, a γ correction unit 217, and a compression processing unit 218.
- The black correction unit 211 of this example subtracts, from each pixel of the RAW data, the black correction data obtained while the sensor is shielded from light.
- The white balance adjustment unit 212 of this example reproduces a desired white by adjusting the gain of each RGB color according to the color temperature of the light from the illumination unit 201. Specifically, white balance correction data is applied to the black-corrected RAW data. When handling monochrome images, white balance adjustment is unnecessary.
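A minimal sketch of such per-channel gain adjustment; the gain values and function name are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the white balance step: per-channel gains chosen
# for the illumination color temperature are applied to black-corrected
# RAW values, clamped to the 8-bit range.

def white_balance(rgb, gains):
    """rgb: (r, g, b) values; gains: per-channel multipliers."""
    return tuple(min(255, int(c * g)) for c, g in zip(rgb, gains))

# Tungsten-like light is reddish, so damp red and boost blue.
print(white_balance((220, 180, 120), (0.8, 1.0, 1.5)))  # (176, 180, 180)
```

After the correction, a surface that should be neutral ends up with roughly equal R, G, and B values, which is the "desired white" the text refers to.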
- the development processing unit 219 of the present example generates hierarchical image data, which will be described later, from the divided image data of the specimen imaged by the imaging unit 210.
- the demosaicing processing unit 213 in this example performs processing for generating image data of each color of RGB from RAW data in the Bayer array.
- the demosaicing processing unit 213 of this example calculates the values of each RGB color of the target pixel by interpolating the values of peripheral pixels (including pixels of the same color and other colors) in the RAW data.
- the demosaicing processing unit 213 of the present example also executes defective pixel correction processing (interpolation processing). Note that when the imaging sensor 208 does not have a color filter and a single color image is obtained, the demosaicing process is not necessary.
- the image composition processing unit 214 of this example performs processing for generating large-capacity image data in a desired imaging range by connecting image data acquired by dividing the imaging range by the imaging sensor 208.
- One piece of two-dimensional image data is generated by joining the divided image data. For example, if a 10 mm square area on the slide 206 is imaged at a resolution of 0.25 µm, one side is 10 mm / 0.25 µm = 40,000 pixels, and the total is the square of that, i.e. 1.6 billion pixels.
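The arithmetic in the example above can be checked directly:

```python
# Verifying the figure in the text: a 10 mm square imaged at 0.25 um per
# pixel gives 40,000 pixels per side and 1.6 billion pixels in total.

side_mm = 10.0
resolution_um = 0.25
side_px = int(side_mm * 1000 / resolution_um)  # 40,000 pixels per side
total_px = side_px ** 2                        # 1,600,000,000 pixels
print(side_px, total_px)
```

At three bytes per RGB pixel, such an image would occupy several gigabytes uncompressed, which motivates both the hierarchical display data and the compression processing described below.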
- the resolution conversion processing unit 215 of this example performs processing for generating a magnification image corresponding to the display magnification in advance by resolution conversion in order to display the large-capacity two-dimensional image generated by the image composition processing unit 214 at high speed.
- a plurality of stages of image data from low magnification to high magnification are generated and configured as image data having a hierarchical structure.
- For diagnostic purposes, the image data acquired by the imaging apparatus 101 is desired to be high-resolution, large-area data. However, when displaying a reduced image of image data consisting of billions of pixels as described above, performing resolution conversion on every display request may be slow.
- Therefore, hierarchical image data for display is generated in advance by reducing the highest-resolution image data using a resolution conversion method.
- As the resolution conversion method, bilinear interpolation (a two-dimensional linear interpolation process) or bicubic interpolation (using cubic interpolation equations) can be used.
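A minimal sketch of building such hierarchical (pyramid) image data; plain 2x2 averaging stands in here for the bilinear/bicubic filters named above, and all names are illustrative:

```python
# Sketch of hierarchical image data: repeatedly halve the image until one
# side reaches a minimum. Simple 2x2 block averaging is used in place of
# a proper bilinear/bicubic resampling filter.

def halve(img):
    """img: 2D list with even dimensions; returns 2x2 block averages."""
    return [[(img[y][x] + img[y][x+1] + img[y+1][x] + img[y+1][x+1]) // 4
             for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

def build_pyramid(img, min_side=1):
    """Return the image plus progressively halved levels."""
    levels = [img]
    while len(levels[-1]) > min_side and len(levels[-1][0]) > min_side:
        levels.append(halve(levels[-1]))
    return levels

img = [[0, 0, 100, 100],
       [0, 0, 100, 100],
       [100, 100, 0, 0],
       [100, 100, 0, 0]]
pyramid = build_pyramid(img)
print(len(pyramid))   # 3 levels: 4x4, 2x2, 1x1
print(pyramid[1])     # [[0, 100], [100, 0]]
```

A display request at a given magnification can then pick the nearest pre-built level instead of resampling the full-resolution data each time.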
- The filter processing unit 216 of this example is a digital filter that suppresses high-frequency components contained in the image, removes noise, and enhances apparent sharpness.
- The gamma correction unit 217 of this example applies an inverse characteristic to the image in accordance with the gradation response of a typical display device, or performs gradation conversion matched to human vision through gradation compression of high-luminance parts and dark-part processing.
- gradation conversion suitable for the subsequent composition processing and display processing is applied to the image data.
- The compression processing unit 218 of this example performs compression encoding for the purposes of improving the transmission efficiency of the large-capacity two-dimensional image data and reducing its size for storage.
- Standardized encoding methods such as JPEG (Joint Photographic Experts Group), JPEG 2000, and JPEG XR, an improved and advanced successor of JPEG, can be used.
- The pre-measurement unit 220 of this example performs pre-measurement to calculate the position information of the specimen on the slide 206, distance information to the desired focal position, and parameters for adjusting the amount of light according to the specimen thickness.
- the pre-measurement unit 220 of this example grasps the position of the sample on the XY plane from the acquired image. For obtaining distance information and thickness information, a laser displacement meter or a Shack-Hartmann measuring instrument can be used.
- the main control system 221 of this example controls various units described so far.
- Control of the main control system 221 and the development processing unit 219 can be realized by a control circuit having a CPU, a ROM, and a RAM.
- the functions of the main control system 221 and the development processing unit 219 are realized by storing programs and data in the ROM in advance and executing the programs using the RAM as a work memory.
- a device such as an EEPROM or a flash memory can be used as the ROM, and a DRAM device such as DDR3 can be used as the RAM.
- the function of the development processing unit 219 may be replaced with an ASIC implemented as a dedicated hardware device.
- the data output unit 222 in this example is an interface for sending the RGB color image generated by the development processing unit 219 to the image processing apparatus 102.
- the imaging apparatus 101 and the image processing apparatus 102 in this example are connected by an optical communication cable. Instead of this cable, a general-purpose interface such as USB or Gigabit Ethernet (registered trademark) may be used.
- FIG. 3 is a functional block diagram illustrating an example of a functional configuration of the image processing apparatus 102 according to the present invention.
- in outline, the image processing apparatus 102 comprises an image data acquisition unit 301, a storage holding unit (memory) 302, a user input information acquisition unit 303, a display device information acquisition unit 304, a display data generation control unit 305, mask information 306, a display image data acquisition unit 307, a display image data generation unit 308, and a display data output unit 309.
- hereinafter, "display image data" may be abbreviated as "display data", and "image for display" as "display image".
- the image data acquisition unit 301 in this example acquires image data captured by the imaging device 101.
- the image data referred to in this example is at least one of: RGB color divided image data obtained by dividing and imaging a specimen, a single piece of two-dimensional image data obtained by combining the divided image data, and image data hierarchized for each display magnification based on that two-dimensional image data.
- the divided image data may be monochrome image data.
- the storage holding unit 302 of this example takes in the image data acquired from the external device via the image data acquisition unit 301 and stores and holds it. It is also desirable that the storage holding unit 302 hold the above-described field-of-view information of a plurality of actual microscopes, together with information on which of these is used initially.
- the user input information acquisition unit 303 of the present example acquires, through an operation unit such as a mouse or keyboard, input information from the user for updating the display image data, such as a display position change, enlargement/reduction, display mode selection, and observation region specification (for example, selecting one of the plurality of microscope field-of-view information items held by the storage holding unit).
- the display mode in this example includes a mode that reproduces a display form simulating a microscope observation field and a mode that does not reproduce. The user can also specify and change the bit shift amount described later.
- the shape of the microscope observation field is assumed to be circular in this example, but is not limited to this.
- the display device information acquisition unit 304 of this example acquires display magnification information of the currently displayed image, in addition to display area information (screen resolution) of the display held by the display device 103.
- the display data generation control unit 305 of this example controls the generation of display image data in accordance with an instruction from the user acquired by the user input information acquisition unit 303. Further, the display data generation control unit of this example generates and updates mask information described later.
- the mask information 306 in this example is control information for generating display image data necessary for reproducing the microscope field of view on the display screen.
- the mask information 306 in this example holds a value for each display pixel constituting the display area of the display device 103, making it possible to judge, pixel by pixel, whether the corresponding image data is displayed with its luminance value unchanged or with a modified luminance value.
- in this example, each mask value is 5 bits wide.
- when the mask value is 0, the value of the image data is used as it is as the display image data.
- when the mask value is nonzero, the luminance value is bit-shifted toward the lower bits by the number of bits indicated by the value.
- the luminance data of the target display pixel may also be set to 0.
- that is, the luminance value outside the observation area may be set to 0, or may be reduced to a nonzero value smaller than the original luminance value.
- the luminance data of each pixel is assumed to be the target of the arithmetic operation with the mask information.
- when RGB color image data is targeted, it can first be converted into luminance/color-difference signals such as YUV or YCC, and the luminance component after conversion used as the target of the arithmetic processing. Alternatively, a configuration in which the bit shift is applied to each of the R, G, and B channels may be adopted.
- the bit shift can be arbitrarily set for the display pixels in the display area.
- in the following description, the mask value inside the circular visual field is assumed to be 0, and the mask value in the other area to be 2. In the display area where the mask value is 2, the luminance value of the acquired image data is reduced to 1/4. Furthermore, by adopting a configuration that gives meaning to specific bits, processing that increases the brightness can also be applied.
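- as an illustrative sketch of the bit-shift masking described above (assuming Python with NumPy; the function names `circular_mask` and `apply_bitshift_mask` are hypothetical, not from the patent):

```python
import numpy as np

def circular_mask(height, width, radius, outside_shift=2):
    """Mask information for a circular observation field:
    0 inside the circle (keep luminance), `outside_shift` elsewhere."""
    cy, cx = height // 2, width // 2
    y, x = np.ogrid[:height, :width]
    inside = (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
    return np.where(inside, 0, outside_shift).astype(np.uint8)

def apply_bitshift_mask(luma, mask):
    """Right-shift each 8-bit luminance value by its mask value.

    A mask value of 0 keeps the pixel as-is; a value of 2 reduces
    the luminance to roughly 1/4, as in the example above."""
    return (luma >> mask).astype(np.uint8)
```

- for example, with `outside_shift=2`, a pixel of luminance 200 outside the circle becomes 50, while pixels inside the circular field keep their original value.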
- the mask information 306 in this example reflects either the above-described initial visual field information or observation area information specified by the user.
- the observation area information specified by the user is either information obtained by partially revising the actual microscope field-of-view information selected by the user, or information on an observation region specified by the user independently of any microscope field-of-view information.
- the initial visual field information may be included in advance as part of the mask information 306, or the initial visual field information held by the storage holding unit 302 may be read out by the display data generation control unit 305 or the like.
- the observation area information designated by the user can be reflected in the mask information from the user input information acquisition unit 303 via the display data generation control unit 305.
- the display image data acquisition unit 307 in this example acquires image data necessary for display from the storage unit 302 under the control of the display data generation control unit 305.
- the display image data generation unit 308 in this example generates display data to be displayed on the display device 103 using the image data acquired by the display image data acquisition unit 307 and the mask information 306. Details of the display data generation will be described later with reference to the flowcharts of FIGS.
- the display data output unit 309 of this example outputs the display data generated by the display image data generation unit 308 to the display device 103 which is an external device.
- FIG. 4 is a block diagram illustrating an example of a hardware configuration of the image processing apparatus according to the present invention.
- in this example, a PC (Personal Computer) is used as the image processing apparatus 102.
- the PC of this example includes a CPU (Central Processing Unit) 401, a RAM (Random Access Memory) 402, a storage device 403, a data input / output I / F 405, and an internal bus 404 that connects them to each other.
- the CPU 401 in this example accesses the RAM 402 and the like as appropriate, and performs overall control of the entire block of the PC while performing various arithmetic processes.
- the RAM 402 is used as a work area for the CPU 401 and temporarily holds the OS, the various programs being executed, and the various data to be processed (including the field-of-view information of a plurality of microscopes) for generating display data that simulates the microscope observation field of view, which is a feature of the present invention.
- the storage device 403 in this example is an auxiliary storage device in which information such as the OS, the programs and firmware executed by the CPU 401, and various parameters is fixedly recorded and from which it is read.
- a magnetic disk drive such as an HDD (Hard Disk Drive), or a semiconductor device using flash memory such as an SSD (Solid State Disk), is used.
- the storage device 403 in this example also stores the various types of data to be processed (field-of-view information of a plurality of microscopes, etc.) for generating display data that simulates the microscope observation field of view, which is a feature of the present invention.
- to the data input/output I/F 405 of this example are connected an image server via a LAN I/F 406, the display device 103 via a graphics board, the imaging apparatus 101 (a virtual slide device or digital microscope) via an external device I/F, and a keyboard 410 and mouse 411 via an operation I/F 409.
- the display device 103 of this example is a display device using, for example, liquid crystal, EL (Electro-Luminescence), CRT (Cathode Ray Tube), or the like.
- the display device 103 is assumed to be connected as an external device, but a PC integrated with the display device may be assumed.
- a notebook PC corresponds to this.
- as operation devices, a keyboard 410 and a pointing device such as a mouse 411 are assumed.
- a configuration in which the screen of the display device 103, such as a touch panel, is used directly as an input device is also possible. In that case, the touch panel is integrated with the display device 103.
- FIG. 5 is a schematic diagram for conceptually explaining the microscope field of view and the reproduced display form.
- FIG. 5 (a) shows the field of view observed when looking through a microscope.
- the microscope field is uniquely determined by the magnification of the objective lens of the microscope and the number of fields.
- F.O.V. (diameter of the field of view) = (field number of the eyepiece) / ((magnification of the objective lens) × (zoom magnification)).
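- as a worked example of this relation (a minimal sketch; the function name and the field number 22 are illustrative assumptions, not values from the patent):

```python
def field_of_view_mm(field_number, objective_mag, zoom_mag=1.0):
    """Diameter of the observable specimen area in millimetres:
    F.O.V. = field number / (objective magnification * zoom magnification)."""
    return field_number / (objective_mag * zoom_mag)

# e.g. a field number of 22 with a 40x objective at 1x zoom gives 0.55 mm
```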
- prior to the existence of the virtual slide device, a pathologist as a user made a diagnosis by viewing such an observation image.
- although a digital observation image can be acquired by attaching a digital camera to the eyepiece of the optical microscope, the acquired image data is in a state where the information outside the circular observation region has been lost.
- FIG. 5B shows an example in which the image data acquired by the virtual slide device is presented on the display screen of the display device 103.
- the image data acquired by the virtual slide device is prepared as a stitched image of image data captured by dividing a partial region of the specimen.
- information can be presented over the entire surface of the display device 103 in a range wider than the field of view of the microscope.
- various conveniences can thus be provided: there is no need to peer into an eyepiece, a certain viewing distance can be secured, and more image data and related specimen information can be displayed together.
- FIG. 5C is an example in which a process for performing a display simulating a microscope field of view is applied when image data acquired by the virtual slide device is displayed on the display device 103.
- while the specimen image is displayed in a wide display area, reducing the brightness of the area other than the microscope observation field to be watched reproduces the observation field familiar to the pathologist, and at the same time remains compatible with presenting abundant image information in the surrounding area, which is an advantage of the virtual slide device.
- as a method for reducing the amount of information outside the microscope observation field, the color information can be reduced and the data displayed as monochrome, in addition to reducing the luminance.
- FIG. 5 (d) is a display image display example simulating the field of view of the microscope, as in FIG. 5 (c).
- the brightness of the region other than the microscope observation field is reduced according to the distance from the center of the circle that is the gazing point, without changing the presentation of the image in the microscope observation field of view.
- in FIG. 5(d), compared with the uniform information reduction outside the microscope field described above, the amount of information in the region to be watched and its vicinity is relatively increased, making a region of interest easier to find and thereby improving convenience.
- in step S601, the display device information acquisition unit 304 acquires, from the display device 103, the size information (screen resolution) of the display area of the display serving as the display device 103.
- the size information of the display area is used when determining the size of the display data to be generated.
- in step S602, the display device information acquisition unit 304 acquires information on the display magnification of the image currently displayed on the display device 103.
- at the start of display, a prescribed magnification is set.
- the display magnification is used when selecting any image data from the hierarchical image.
- in step S603, the image data to be displayed on the display device 103 is acquired from the storage holding unit 302 based on the display area size information acquired in step S601 and the display magnification information acquired in step S602.
- in step S604, it is determined whether or not the display image is shared and used by a plurality of people. If it is not shared, that is, if it is used by a single user, the process proceeds to step S605.
- examples of sharing a display image among a plurality of people include a conference in a hospital between a pathologist and a plurality of concerned persons such as clinicians, and an educational presentation to students and trainees.
- in such cases, since the area being viewed in the displayed image may vary from user to user, it is desirable to select the normal observation mode instead of the microscope observation field mode, which might interfere with viewing.
- in step S605, it is determined whether or not the user has selected the microscope observation visual field mode. If the microscope observation visual field mode is selected, the process proceeds to step S606; if the normal observation mode is selected, the process proceeds to step S608.
- in step S606, it is determined whether or not the display area information (screen resolution) of the display device 103 acquired in step S601 is greater than or equal to a preset value. If it is greater than or equal to the set value, the process proceeds to step S607; if it is less than the set value, the process proceeds to step S608.
- as the screen resolution, which is the display area information of the display device 103, increases, the amount of information that can be displayed increases, and the area over which the user must gaze widens accordingly; to reduce the burden on the user in such a case, it is desirable to select the microscope observation field mode.
- conversely, even when the user selects the microscope observation visual field mode in step S605, if the screen resolution of the display device 103 is low, it is desirable to select the normal observation mode, which presents the displayable information as it is.
- the user can specify the set value that serves as the criterion for this determination. Note that the order of the determinations in steps S605 and S606 may be reversed.
- in step S607, in response to the selection of the microscope observation field mode, image data for microscope field display is generated.
- this image data for microscope field display is composed of the above-described observation area display image data and outside-observation-area display image data; image processing is applied to at least one of them, producing image data that allows the display device to display an image different from the case where uniform image processing is applied to the entire image data. Details will be described later with reference to FIG. 7.
- in step S608, in response to the selection of the normal observation mode, display image data for normal observation is generated.
- a resolution conversion process is applied so that the image data at the closest display magnification in the hierarchical image acquired in step S603 attains the desired resolution.
- Correction processing is performed according to the characteristics of the display device 103 as necessary.
- in step S609, the display data generated in step S607 or step S608 is output to the display device 103.
- in step S610, the display image data input to the display device 103 is displayed on its screen.
- in step S611, it is determined whether the image display is finished. If another specimen image is selected by the user or the display application is closed, the process ends; if the display screen continues to be updated, the process returns to step S602 and the subsequent processing is repeated.
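- the mode-selection branching in steps S604 to S606 above can be sketched as follows (a hypothetical helper; treating the "preset value" as a pixel-count threshold is our assumption):

```python
def select_display_mode(shared_by_multiple, microscope_mode_selected,
                        screen_pixels, preset_pixels=1920 * 1080):
    """Fall back to the normal observation mode when the display is
    shared by several viewers (S604), the user did not choose the
    microscope mode (S605), or the screen resolution is below the
    preset value (S606)."""
    if shared_by_multiple:
        return "normal"
    if microscope_mode_selected and screen_pixels >= preset_pixels:
        return "microscope"
    return "normal"
```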
- FIG. 7 is a flowchart showing an example of the detailed flow of the display image data generation processing for reproducing the microscope visual field shown in step S607 of the flowchart described above.
- the microscope field here corresponds to the observation region described above.
- the outside of the microscope field is outside the observation region.
- in step S701, the mask information 306 is acquired.
- as described above, the mask information 306 holds information for the display pixels constituting the display area of the display device 103, and makes it possible to determine for each pixel whether the corresponding image data is displayed with its luminance value unchanged or with a modified luminance value.
- in step S702, it is determined whether there is a change in the display screen. If the display screen has not changed and the currently displayed screen is maintained, the process proceeds to step S704; if a screen scroll or enlargement/reduction operation has been performed and the display screen is to be updated, the process proceeds to step S703.
- in step S703, it is determined whether the current display mode is the high-speed display mode or the normal observation visual field display mode.
- the high-speed display mode is one of the modes for reproducing the microscope observation field.
- in this mode, while the display screen is not being updated, the circular microscope observation field is reproduced in the stationary state; when the display screen is being updated, for example by screen scrolling, the circular display is stopped and a rectangular display area is used instead, so that the normal-luminance display image to be gazed at is kept separate from the rest of the display and gazing is not hindered.
- if the high-speed display mode is selected, the process proceeds to step S707; if the microscope field is to be reproduced regardless of display screen updates, the process proceeds to step S704.
- in step S704, to reproduce the microscope field of view, the mask information acquired in step S701 is referenced for each corresponding pixel. It is determined whether the mask value of the referenced display pixel is 0, that is, a pixel presented at normal luminance as the gaze area, or a nonzero value, that is, a pixel outside the microscope observation field whose luminance is to be reduced. If the mask value is 0, the process proceeds to step S705; if it is other than 0, that is, if the luminance value of the pixel is to be decreased by bit shift, the process proceeds to step S706.
- in step S705, since the mask value is 0, the luminance value of the pixel of the acquired image data is adopted as the display pixel value as it is. Note that when correction processing according to the characteristics of the display device 103 is performed, the luminance value may change.
- in step S706, since the mask value is other than 0, the luminance value of the pixel of the acquired image data is bit-shifted toward the lower bits according to the mask value acquired in step S701. This realizes a luminance reduction corresponding to the mask value.
- in step S707, since the high-speed display mode is selected, it is determined whether or not the mask shape (observation field shape) for high-speed display is a rectangle smaller than the display area. If an observation field smaller than the display area of the screen is to be displayed, the process proceeds to step S708; if the display area is used without change, the process proceeds to step S608. Note that the processing content of the normal-observation display image data generation in step S608 is the same as that described in the flowchart above.
- in step S708, the size of an observation field smaller than the display area is set.
- the size may be set or selected by the user, or a predetermined value may be used.
- in step S709, the mask information acquired in step S701 is referenced for each corresponding pixel, and it is determined whether the mask value of the referenced display pixel is 0, that is, a pixel presented at normal luminance as the gaze area, or a pixel outside the observation field whose luminance is to be reduced. Since the processing content is the same as in step S704, detailed description is omitted.
- since steps S710 and S711 are the same as steps S705 and S706, respectively, description thereof is also omitted.
- the difference is that the shape of the microscopic observation field is circular or rectangular.
- FIG. 8 is a schematic diagram illustrating an example of a display screen when display data generated by the image processing apparatus 102 of the present invention is displayed on the display apparatus 103.
- FIG. 8 illustrates a display mode and a high-speed display mode in which a microscope observation visual field is reproduced, and information presentation at the time of reproducing the microscope visual field.
- FIG. 8A is a schematic diagram illustrating a basic configuration example of the screen layout of the display device 103.
- the display screen of this example comprises, within the entire window 801, an information area 802 indicating the statuses of display and operation and information on various images, a thumbnail image 803 of the specimen to be observed, a detailed display area 804 indicating the region of detailed observation within the thumbnail image, a display area 805 for the specimen image data for detailed observation, and a display magnification 806 for the display area 805.
- Each area and image may have a form in which the display area of the entire window 801 is divided into functional areas by a single document interface, or each area and image may be constituted by separate windows by a multi-document interface.
- the thumbnail image 803 of this example displays the position and size of the display area 805 of the sample image data in the whole image of the sample.
- the position and size can be grasped by the frame of the detailed display area 804.
- the detailed display area 804 can be set, for example, directly by a user instruction from an externally connected input device such as a touch panel or the mouse 411, and can be updated by moving the display area relative to the displayed image or by enlargement/reduction operations.
- in the specimen image data display area 805, specimen image data for detailed observation is displayed.
- here, an image moved, enlarged, or reduced in response to a display area movement (selecting and moving the partial region to be observed within the entire specimen image) or a change of display magnification is displayed.
- FIG. 8B shows an example of a display screen in which the microscope field of view is reproduced and the luminance is reduced uniformly outside the microscope field of view.
- Reference numeral 806 denotes a display magnification.
- Reference numeral 808 denotes an observation region that reproduces the field of view of the microscope, and an image is displayed at a normal luminance in the circular field of view.
- the brightness of the region 807 outside the microscope field of view is reduced at a constant rate.
- FIG. 8C shows an example of a display screen in which the microscope field of view is reproduced and the brightness is reduced according to the distance from the center of the microscope field outside the microscope field of view.
- the brightness of the region 809 outside the microscope field gradually decreases according to the distance from the center of the circular region that reproduces the microscope field. The generation of such a display image will be described in the second embodiment.
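- a sketch of how such a distance-dependent mask might be built (Python/NumPy; the linear falloff over one extra radius is our own assumption — the text only requires that brightness decrease with distance from the center):

```python
import numpy as np

def radial_mask(height, width, radius, max_value=255):
    """8-bit multiplicative mask: max_value (no dimming) inside the
    circular field, decaying linearly to 0 over one further radius."""
    cy, cx = height / 2.0, width / 2.0
    y, x = np.ogrid[:height, :width]
    dist = np.sqrt((y - cy) ** 2 + (x - cx) ** 2)
    falloff = np.clip(1.0 - (dist - radius) / radius, 0.0, 1.0)
    return np.where(dist <= radius, max_value,
                    falloff * max_value).astype(np.uint8)
```

- multiplying the luminance of each pixel by this mask divided by 255 reproduces the gradual dimming of FIG. 8(c).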
- FIG. 8D is an example of a change in the microscope visual field when the display screen is updated (scrolled).
- a rectangle having the same size as the microscope field of view is used as the observation field of view, and the luminance of other regions is reduced.
- Reference numeral 810 denotes a microscope observation field in a stationary state.
- Reference numeral 811 denotes the observation visual field used while the screen is being updated; it includes the microscope observation field 810.
- Reference numeral 812 denotes an area outside the observation visual field, and the luminance is reduced at a certain rate. Note that it is also possible to adopt a configuration in which the luminance is gradually reduced according to the distance from the center of the microscope observation field of view described with reference to FIG.
- FIG. 8(e) is an example of a display screen presenting various information outside the microscope field of view. Since the gaze area is the microscope observation region, the outer area can contain, in addition to the observation image with reduced brightness, the specimen information and patient information necessary for diagnosis, a menu screen, and the like.
- Reference numeral 813 denotes an area for presenting a thumbnail image 803 showing the whole image of the specimen, and reference numeral 814 denotes an information area 802.
- as described above, according to the first embodiment, an image processing apparatus that reduces the burden on the observer can be provided.
- in addition, the processing efficiency can be improved by changing the observation field to a rectangle when scrolling the screen, which requires high-speed display.
- outside the observation visual field region (circular region), various information in addition to the image can be presented, making it easy to find a lesion and improving operability.
- in the first embodiment, the reproduction of the microscope observation field was divided into selective processing based on multi-value mask information, that is, adoption of the image data as-is and luminance reduction by bit shift.
- in the second embodiment, by multiplying the brightness held by the image data by multi-level mask information, equivalent visual field reproduction is performed without dividing the processing by region.
- except for the points that differ from the first embodiment, the configuration described in the first embodiment can be used.
- FIG. 9 is a schematic overall view showing an example of an apparatus constituting an image processing system according to the second embodiment of the present invention.
- An image processing system using the image processing apparatus illustrated in FIG. 9 includes an image server 901, an image processing apparatus 102, and a display apparatus 103.
- the image processing apparatus 102 of this example can acquire image data obtained by imaging a specimen from the image server 901 and generate image data to be displayed on the display apparatus 103.
- the image server 901 and the image processing apparatus 102 are connected by a general-purpose I / F LAN cable 903 via a network 902.
- the image server 901 of this example is a computer including a large-capacity storage device that stores image data captured by the imaging device 101 that is a virtual slide device.
- the image server 901 of this example may store the image data of the different display magnifications together in local storage connected to the image server 901, or may divide them and place the entity of each piece of divided image data, together with link information, separately on a server group (cloud servers) somewhere on the network.
- the hierarchical image data itself thus need not be stored on a single server.
- the image processing apparatus 102 and the display apparatus 103 are the same as the image processing system of the first embodiment.
- the image server 901, the image processing apparatus 102, and the display apparatus 103 constitute an image processing system.
- the present invention is not limited to this configuration.
- the image processing apparatus 102 integrated with the display apparatus 103 may be used, or a part of the functions of the image processing apparatus 102 may be incorporated in the image server 901.
- the functions of the image server 901 and the image processing apparatus 102 may be divided and realized by a plurality of apparatuses.
- FIG. 10 is a flowchart showing an example of the flow of processing in which the image data for microscope field display described with reference to FIG. 7 of the first embodiment is generated, as a feature of this embodiment, by multiplying the brightness held by the image data by the mask information, thereby reproducing the visual field without dividing the processing by region. Except for the process of generating the display image data based on the mask information, the processing is the same as that in FIG. 7.
- step S701 to step S703 are the same as the contents described in FIG. 7 of the first embodiment.
- in step S1001, the mask information corresponding to each pixel of the image data is obtained.
- the mask information is, for example, 8-bit information and takes a value from 0 to 255.
- in step S1002, the luminance value of the corresponding pixel is multiplied by the value of the mask information to calculate a new luminance value.
- by dividing the multiplied result by 255, the maximum value of the mask information, the same luminance value as before the operation is obtained when the mask information is 255.
- the microscope field of view can be reproduced as in the first embodiment.
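- a minimal sketch of this multiplication (steps S1001 to S1002), applied uniformly with no per-region branching (Python/NumPy; the helper name is hypothetical):

```python
import numpy as np

def apply_multiplicative_mask(luma, mask):
    """new_luma = luma * mask / 255, computed for every pixel alike.

    mask == 255 leaves the luminance unchanged; mask == 0 blacks the
    pixel out; intermediate values dim it proportionally."""
    # widen to uint16 so luma * mask (max 255 * 255) cannot overflow
    return (luma.astype(np.uint16) * mask // 255).astype(np.uint8)
```

- unlike the bit shift of the first embodiment, which only allows power-of-two reductions, this expresses up to 256 dimming levels per pixel.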
- in the first embodiment, the brightness was reduced by bit shift, whereas in the second embodiment the brightness can be calculated by multiplication with the mask information, giving a greater degree of freedom in brightness setting.
- the mask information can be changed or newly set by a user instruction even when a predetermined value prepared in advance is used. As a result, it is possible to flexibly cope with shapes other than the circular observation visual field shape imitating the microscope visual field.
- the shape change can be handled similarly in the first embodiment.
- since the processing from the determination of the rectangular mask display in the high-speed display mode in step S707 up to step S711 is the same as in the first embodiment, description thereof is omitted.
- an image processing apparatus that reduces the burden on the observer can be provided.
- the decision branches can be eliminated, and the load of software processing can be reduced.
- the object of the present invention may be achieved by the following. That is, a recording medium (or storage medium) in which a program code of software that realizes all or part of the functions of the above-described embodiments is recorded is supplied to the system or apparatus. Then, the computer (or CPU or MPU) of the system or apparatus reads and executes the program code stored in the recording medium. In this case, the program code itself read from the recording medium realizes the functions of the above-described embodiment, and the recording medium on which the program code is recorded constitutes the present invention.
- an operating system (OS) running on the computer may perform part or all of the actual processing based on the instructions of the program code.
- the program code read from the recording medium may be written into a memory provided in a function expansion card inserted into the computer or in a function expansion unit connected to the computer. Thereafter, the CPU of the function expansion card or function expansion unit performs part or all of the actual processing based on the instructions of the program code, and the functions of the above-described embodiments are realized by that processing. Such cases are also included in the present invention.
- program code corresponding to the flowchart described above is stored in the recording medium.
- the configurations described in the first and second embodiments can be combined with each other.
- the image processing apparatus may be connected to both the imaging apparatus and the image server, and an image used for processing may be acquired from any apparatus.
- configurations obtained by appropriately combining various techniques in the above embodiments also belong to the category of the present invention.
- the technical scope of the present invention is defined by each claim in the claims, and should not be construed as limited by the above embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Analytical Chemistry (AREA)
- Optics & Photonics (AREA)
- Computer Hardware Design (AREA)
- Signal Processing (AREA)
- Microscopes, Condensers (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
An image processing apparatus for processing a virtual slide image, comprising:
an image data acquisition unit that acquires image data obtained by imaging an imaging target; and
a display image data generation unit that generates, from the image data, display image data composed of observation area display image data for causing a display apparatus to display an observation area determined by a predetermined method or designated by a user, and non-observation-area display image data for causing the display apparatus to display the area other than the observation area,
wherein the display image data generation unit applies image processing to at least one of the observation area display image data and the non-observation-area display image data, thereby generating display image data that causes the display apparatus to display an image different from that obtained when uniform image processing is applied to the entire image data.
An image processing method for processing a virtual slide image, comprising:
an image data acquisition step of acquiring image data obtained by imaging an imaging target; and
a display image data generation step of generating, from the image data acquired in the image data acquisition step, display image data composed of observation area display image data for causing a display apparatus to display an observation area determined by a predetermined method or designated by a user, and non-observation-area display image data for causing the display apparatus to display the area other than the observation area,
wherein the display image data generation step is a step of applying image processing to at least one of the observation area display image data and the non-observation-area display image data, thereby generating display image data that causes the display apparatus to display an image different from that obtained when uniform image processing is applied to the entire image data.
An image processing method for processing a virtual slide image, comprising:
an image data acquisition step of acquiring image data obtained by imaging an imaging target; and
a display image data generation step of generating, from the image data acquired in the image data acquisition step, display image data composed of observation area display image data for causing a display apparatus to display an observation area determined by a predetermined method or designated by a user, and non-observation-area display image data for causing the display apparatus to display the area other than the observation area,
wherein the display image data generation step is a step of generating first display image data that, by applying image processing to at least one of the observation area display image data and the non-observation-area display image data, causes the display apparatus to display an image different from that obtained when uniform image processing is applied to the entire image data, and second display image data obtained by applying no image processing to the image data or by applying uniform image processing to the entire image data, and
the method further comprises a display image data transmission step of transmitting the first display image data to the display apparatus when the position or display magnification of the image displayed on the display apparatus is changing, and the second display image data when the position or display magnification is not changing.
An image processing system comprising the image processing apparatus and a display apparatus that displays the virtual slide image processed by the image processing apparatus in a form having an observation area reproducing the field of view of a microscope.
A program causing a computer to execute each step of the above image processing method.
The image processing apparatus of the present invention can be used in an image processing system including an imaging apparatus and a display apparatus. This image processing system will be described with reference to FIG. 1.
FIG. 1 is a schematic overall view showing an example of an image processing system using the image processing apparatus of the present invention. The system is composed of an imaging apparatus (microscope apparatus or virtual slide scanner) 101, an image processing apparatus 102, and an image display apparatus 103, and has a function of acquiring and displaying a two-dimensional image of a specimen (test sample) to be imaged. In this example, the imaging apparatus 101 and the image processing apparatus 102 are connected by a dedicated or general-purpose I/F cable 104, and the image processing apparatus 102 and the image display apparatus 103 are connected by a general-purpose I/F cable 105.
In the example of FIG. 1, the image processing system is composed of three apparatuses, the imaging apparatus 101, the image processing apparatus 102, and the image display apparatus 103, but the present invention is not limited to this configuration. For example, an image processing apparatus integrated with the image display apparatus may be used, or the functions of the image processing apparatus may be incorporated in the imaging apparatus. The functions of the imaging apparatus, the image processing apparatus, and the image display apparatus may also be realized by a single apparatus. Conversely, the functions of the image processing apparatus or the like may be divided and realized by a plurality of apparatuses.
FIG. 2 is a functional block diagram showing an example of the functional configuration of the imaging apparatus 101.
FIG. 3 is a functional block diagram showing an example of the functional configuration of the image processing apparatus 102 of the present invention.
The display data output unit 309 of this example outputs the display data generated by the display image data generation unit 308 to the display apparatus 103, which is an external apparatus.
(Hardware configuration of the image processing apparatus)
FIG. 4 is a block diagram showing an example of the hardware configuration of the image processing apparatus of the present invention. For example, a PC (Personal Computer) is used as the apparatus that performs the information processing.
FIG. 5 is a schematic diagram for conceptually explaining the microscope field of view and its reproduced display form.
An example of the flow of the microscope field-of-view display processing in the image processing apparatus of the present invention will be described with reference to the flowchart of FIG. 6.
(Processing for generating microscope field-of-view display image data)
FIG. 7 is a flowchart showing an example of the detailed flow of the display image data generation processing for reproducing the microscope field of view shown in step S607 of FIG. 6. The microscope field of view here corresponds to the observation area described above, while the outside of the microscope field of view corresponds to the area outside the observation area.
In step S707, since the high-speed display mode is selected, it is determined whether or not the mask shape (observation field shape) for high-speed display is to be set to a rectangle smaller than the display area. If a rectangular observation field smaller than the display area of the screen is to be displayed, the process proceeds to step S708; if the display area is used as is without changing the observation field, the process proceeds to step S608. The processing content of the display image data generation for normal observation in step S608 is the same as that described with the flowchart of FIG. 6, and is therefore omitted.
The processing in steps S710 and S711 is the same as that in steps S705 and S706, respectively, and description thereof is omitted. The difference is whether the shape of the microscope observation field is circular or rectangular.
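The circular observation field of the normal display mode and the rectangular field of the high-speed display mode contrasted above could be generated, for example, as follows (an illustrative sketch, not the patented implementation; the convention of mask value 255 inside the field and 0 outside is an assumption):

```python
def circular_mask(width, height, radius):
    """255 inside a circle centered on the display area, 0 outside."""
    cx, cy = width / 2.0, height / 2.0
    return [[255 if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 else 0
             for x in range(width)] for y in range(height)]

def rectangular_mask(width, height, rect_w, rect_h):
    """255 inside a centered rectangle, 0 outside (high-speed mode)."""
    x0, y0 = (width - rect_w) // 2, (height - rect_h) // 2
    return [[255 if x0 <= x < x0 + rect_w and y0 <= y < y0 + rect_h else 0
             for x in range(width)] for y in range(height)]

m = circular_mask(9, 9, 3)
print(m[4][4], m[0][0])  # center inside the field, corner outside
```

Only the mask contents differ between the two modes; the per-pixel operation that applies the mask to the image data can stay the same, which is what allows the shape to be swapped without branching the processing by region.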
FIG. 8 is a schematic diagram showing an example of a display screen when the display data generated by the image processing apparatus 102 of the present invention is displayed on the display apparatus 103. FIG. 8 illustrates the display mode that reproduces the microscope observation field, the high-speed display mode, and the information presentation when the microscope field of view is reproduced.
By providing an observation area in the virtual slide image, an image processing apparatus that reduces the burden on the observer can be provided. In this embodiment, in particular, processing efficiency can be improved by changing the observation field to a rectangle during screen scrolling, when high-speed display is required. In addition, unlike a microscope, presenting various information as well as an image outside the watched observation field area (circular area) makes it easier to find a lesion and improves operability.
An image processing system according to a second embodiment of the present invention will be described with reference to the drawings.
FIG. 9 is a schematic overall view showing an example of the apparatuses constituting the image processing system according to the second embodiment of the present invention.
FIG. 10 is a flowchart showing an example of the flow of processing that, in contrast to the microscope field-of-view display image data generation processing described with FIG. 7 of the first embodiment, reproduces the field of view without dividing the processing by region, by multiplying the mask information by the luminance held by the image data, which is a feature of this embodiment. Except for the process of generating the display image data based on the mask information, the processing is the same as in FIG. 7, and descriptions of the identical processing are omitted.
In step S1001, the mask information corresponding to each pixel of the image data is obtained. The mask information is, for example, 8-bit information and takes a value from 0 to 255.
In step S1002, the luminance value of the corresponding pixel is multiplied by the value of the mask information to calculate a new luminance value. In practice, the multiplication result is normalized by dividing it by 255, the maximum value of the mask information, so that when the mask information is 255 the original luminance value is obtained. Applying the same processing on a per-pixel basis in this way can also reproduce the microscope field of view as in the first embodiment. Whereas the first embodiment reduced the luminance by bit shifting, the second embodiment can calculate the luminance by multiplication with the mask information, which increases the freedom in setting the luminance. The mask information may use a predetermined value prepared in advance, or may be changed or newly set by a user instruction. As a result, shapes other than the circular observation field shape imitating the microscope field of view can be handled flexibly. The shape change can be handled similarly in the first embodiment.
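The contrast drawn above between the first embodiment's bit-shift luminance reduction and this embodiment's mask multiplication can be sketched as follows (illustration only; the specific shift amount and mask value are assumptions):

```python
def dim_by_shift(value, shift=1):
    """First embodiment: reduce luminance by a right bit shift.

    Only power-of-two reduction factors (1/2, 1/4, 1/8, ...) are possible.
    """
    return value >> shift

def dim_by_mask(value, mask):
    """Second embodiment: scale by an 8-bit mask, normalized by 255.

    Any reduction factor from 0/255 up to 255/255 can be expressed,
    giving finer control over the luminance setting.
    """
    return (value * mask) // 255

v = 200
print(dim_by_shift(v, 1))   # 100: limited to halving steps
print(dim_by_mask(v, 192))  # 150: an intermediate factor a shift cannot express
```

This illustrates why the multiplication approach increases the freedom of luminance setting: the mask can encode 256 dimming levels per pixel, while a bit shift offers only a handful of coarse steps.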
By providing an observation area in the virtual slide image, an image processing apparatus that reduces the burden on the observer can be provided. In particular, by generating the display image using the same processing inside and outside the observation field, decision branches are eliminated and the load of software processing is reduced. Moreover, smoother tone expression of the luminance reduction further reduces the burden on the user.
The object of the present invention may also be achieved as follows. That is, a recording medium (or storage medium) on which the program code of software that realizes all or part of the functions of the above-described embodiments is recorded is supplied to a system or apparatus, and the computer (or CPU or MPU) of the system or apparatus reads out and executes the program code stored in the recording medium. In this case, the program code itself read from the recording medium realizes the functions of the above-described embodiments, and the recording medium on which the program code is recorded constitutes the present invention.
101 Imaging apparatus
102 Image processing apparatus
103 Display apparatus
301 Image data acquisition unit
302 Storage unit
303 User input information acquisition unit
304 Display apparatus information acquisition unit
305 Display data generation control unit
306 Mask information
307 Display image data acquisition unit
308 Display image data generation unit
309 Display data output unit
901 Image server
Claims (28)
- An image processing apparatus for processing a virtual slide image, comprising:
an image data acquisition unit that acquires image data obtained by imaging an imaging target; and
a display image data generation unit that generates, from the image data, display image data composed of observation area display image data for causing a display apparatus to display an observation area determined by a predetermined method or designated by a user, and non-observation-area display image data for causing the display apparatus to display the area other than the observation area,
wherein the display image data generation unit applies image processing to at least one of the observation area display image data and the non-observation-area display image data, thereby generating display image data that causes the display apparatus to display an image different from that obtained when uniform image processing is applied to the entire image data. - The image processing apparatus according to claim 1, wherein the display image data generation unit generates the display image data for displaying an observation area reproducing the field of view of a microscope.
- The image processing apparatus according to claim 2, wherein the display image data generation unit generates the display image data based on information on the field of view of an actual microscope.
- The image processing apparatus according to claim 3, wherein the display image data generation unit generates the display image data based on information on the field of view of an actual microscope and information on the magnification at which the image is to be displayed.
- The image processing apparatus according to claim 3 or 4, wherein the display image data generation unit generates the display image data using, as initial information, a predetermined one of a plurality of pieces of information on the fields of view of actual microscopes.
- The image processing apparatus according to any one of claims 3 to 5, wherein the display image data generation unit generates the display image data using one of a plurality of pieces of information on the fields of view of actual microscopes, based on a selection by the user.
- The image processing apparatus according to any one of claims 1 to 6, wherein the display image data generation unit generates the display image data such that the luminance outside the observation area is lower than the luminance of the observation area.
- The image processing apparatus according to any one of claims 1 to 7, wherein the display image data generation unit generates the display image data by multiplying multi-valued mask information by the image data on a per-pixel basis.
- The image processing apparatus according to any one of claims 1 to 7, wherein the display image data generation unit generates the display image data by arithmetic processing of the image data based on mask information indicating two processing forms.
- The image processing apparatus according to claim 9, wherein the image data is color image data having RGB color information, and
the display image data generation unit converts the image data into luminance-chrominance data and then applies the arithmetic processing to the luminance values obtained by the conversion. - The image processing apparatus according to claim 9 or 10, wherein the mask information indicating the two processing forms expresses positions at which the image data is adopted as is and positions at which the image data is subjected to a bit-shift operation.
- The image processing apparatus according to claim 11, wherein the shift amount of the bit-shift operation is changed by an externally input instruction.
- The image processing apparatus according to claim 11, wherein the display image data generation unit generates the display image data for displaying the circular observation area imitating the field of view of a microscope, and
the shift amount of the bit-shift operation is changed according to the distance from the center of the circular observation area. - The image processing apparatus according to any one of claims 1 to 13, wherein the display image data generation unit generates display image data that does not distinguish between the observation area and the area other than the observation area while the position or display magnification of the image displayed on the display apparatus is changing.
- The image processing apparatus according to any one of claims 1 to 14, wherein the display image data for the area other than the observation area includes information on the imaging target.
- An image processing method for processing a virtual slide image, comprising:
an image data acquisition step of acquiring image data obtained by imaging an imaging target; and
a display image data generation step of generating, from the image data acquired in the image data acquisition step, display image data composed of observation area display image data for causing a display apparatus to display an observation area determined by a predetermined method or designated by a user, and non-observation-area display image data for causing the display apparatus to display the area other than the observation area,
wherein the display image data generation step is a step of applying image processing to at least one of the observation area display image data and the non-observation-area display image data, thereby generating display image data that causes the display apparatus to display an image different from that obtained when uniform image processing is applied to the entire image data. - The image processing method according to claim 16, further comprising a display image data transmission step of transmitting the display image data to the display apparatus.
- The image processing method according to claim 16 or 17, wherein the display image data generation step is a step of generating the display image data for displaying an observation area reproducing the field of view of a microscope.
- The image processing method according to claim 18, wherein the display image data generation step is a step of generating the display image data based on information on the field of view of an actual microscope.
- The image processing method according to claim 19, wherein the display image data generation step is a step of generating the display image data based on information on the field of view of an actual microscope and information on the magnification to be displayed.
- The image processing method according to claim 18 or 19, wherein the display image data generation step is a step of generating the display image data using, as initial information, a predetermined one of a plurality of pieces of information on the fields of view of actual microscopes.
- [Correction under Rule 91, 25.02.2014]
The image processing method according to any one of claims 19 to 21, wherein the display image data generation step is a step of generating the display image data using one of a plurality of pieces of information on the fields of view of actual microscopes, based on a selection by the user. - [Correction under Rule 91, 25.02.2014]
The image processing method according to any one of claims 16 to 22, wherein the display image data generation step is a step of generating the display image data such that the luminance outside the observation area is lower than the luminance of the observation area. - The image processing method according to any one of claims 16 to 23, wherein the display image data generation step generates the display image data by arithmetic processing of the image data based on mask information indicating two processing forms, in which positions at which the image data is adopted and positions at which the image data is subjected to a bit-shift operation are expressed.
- The image processing method according to any one of claims 16 to 24, wherein the image data acquired in the image data acquisition step is color image data having RGB color information, and
the display image data generation step converts the image data into luminance-chrominance data and then applies the arithmetic processing to the luminance values obtained by the conversion. - An image processing method for processing a virtual slide image, comprising:
an image data acquisition step of acquiring image data obtained by imaging an imaging target; and
a display image data generation step of generating, from the image data acquired in the image data acquisition step, display image data composed of observation area display image data for causing a display apparatus to display an observation area determined by a predetermined method or designated by a user, and non-observation-area display image data for causing the display apparatus to display the area other than the observation area,
wherein the display image data generation step is a step of generating first display image data that, by applying image processing to at least one of the observation area display image data and the non-observation-area display image data, causes the display apparatus to display an image different from that obtained when uniform image processing is applied to the entire image data, and second display image data obtained by applying no image processing to the image data or by applying uniform image processing to the entire image data, and
the method further comprises a display image data transmission step of transmitting the first display image data to the display apparatus when the position or display magnification of the image displayed on the display apparatus is changing, and the second display image data when the position or display magnification is not changing. - An image processing system comprising the image processing apparatus according to any one of claims 1 to 15, and a display apparatus that displays the virtual slide image processed by the image processing apparatus in a form having an observation area reproducing the field of view of a microscope.
- A program causing a computer to execute each step of the image processing method according to any one of claims 16 to 26.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112012005484.9T DE112012005484T5 (de) | 2011-12-27 | 2012-12-27 | Bildverarbeitungsgerät, Bildverarbeitungssystem, Bildverarbeitungsverfahren und Bildverarbeitungsprogramm |
CN201280064884.6A CN104185806A (zh) | 2011-12-27 | 2012-12-27 | 图像处理设备,图像处理系统,图像处理方法和图像处理程序 |
KR1020147019756A KR20140107469A (ko) | 2011-12-27 | 2012-12-27 | 화상 처리장치, 화상 처리 시스템, 화상 처리방법 및 기억매체 |
US13/909,918 US20130265322A1 (en) | 2011-12-27 | 2013-06-04 | Image processing apparatus, image processing system, image processing method, and image processing program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011286784 | 2011-12-27 | ||
JP2011-286784 | 2011-12-27 | ||
JP2012-282783 | 2012-12-26 | ||
JP2012282783A JP2013152453A (ja) | 2011-12-27 | 2012-12-26 | 画像処理装置、画像処理システム、画像処理方法および画像処理プログラム |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/909,918 Continuation US20130265322A1 (en) | 2011-12-27 | 2013-06-04 | Image processing apparatus, image processing system, image processing method, and image processing program |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2013100026A1 WO2013100026A1 (ja) | 2013-07-04 |
WO2013100026A9 true WO2013100026A9 (ja) | 2014-05-22 |
Family
ID=48697505
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/083825 WO2013100026A1 (ja) | 2011-12-27 | 2012-12-27 | 画像処理装置、画像処理システム、画像処理方法および画像処理プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20130265322A1 (ja) |
JP (1) | JP2013152453A (ja) |
KR (1) | KR20140107469A (ja) |
CN (1) | CN104185806A (ja) |
DE (1) | DE112012005484T5 (ja) |
WO (1) | WO2013100026A1 (ja) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8928613B2 (en) * | 2012-10-08 | 2015-01-06 | Chin Ten Chang | Touch control system for touch panel |
JP6455829B2 (ja) * | 2013-04-01 | 2019-01-23 | キヤノン株式会社 | 画像処理装置、画像処理方法、およびプログラム |
US10178394B2 (en) * | 2016-06-10 | 2019-01-08 | Apple Inc. | Transcoding techniques for alternate displays |
EP3682629A4 (en) * | 2018-08-27 | 2020-07-29 | SZ DJI Technology Co., Ltd. | IMAGE PROCESSING AND PRESENTATION |
CN111111163B (zh) * | 2019-12-24 | 2022-08-30 | 腾讯科技(深圳)有限公司 | 管理计算资源的方法、设备和电子设备 |
JP7063515B1 (ja) * | 2020-11-09 | 2022-05-09 | 有限会社ウィン | 人間工学的視野に基づく光学顕微鏡システム |
US20230186446A1 (en) * | 2021-12-15 | 2023-06-15 | 7 Sensing Software | Image processing methods and systems for low-light image enhancement using machine learning models |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3560663B2 (ja) * | 1994-12-07 | 2004-09-02 | Olympus Corporation | Scanning optical microscope |
US7738688B2 (en) * | 2000-05-03 | 2010-06-15 | Aperio Technologies, Inc. | System and method for viewing virtual slides |
JP2002196257A (ja) * | 2000-12-25 | 2002-07-12 | Nikon Corp | Image processing apparatus, computer-readable recording medium, and microscope system |
US20030210262A1 (en) * | 2002-05-10 | 2003-11-13 | Tripath Imaging, Inc. | Video microscopy system and multi-view virtual slide viewer capable of simultaneously acquiring and displaying various digital views of an area of interest located on a microscopic slide |
JP5658451B2 (ja) * | 2009-11-30 | 2015-01-28 | Sony Corporation | Information processing apparatus, information processing method, and program therefor |
JP2011181015A (ja) * | 2010-03-03 | 2011-09-15 | Olympus Corporation | Diagnostic information distribution apparatus and pathological diagnosis system |
JP5555014B2 (ja) * | 2010-03-10 | 2014-07-23 | Olympus Corporation | Virtual slide creation apparatus |
JP2012014668A (ja) * | 2010-06-04 | 2012-01-19 | Sony Corporation | Image processing apparatus, image processing method, program, and electronic apparatus |
JP2012003326A (ja) * | 2010-06-14 | 2012-01-05 | Sony Corporation | Information processing apparatus, information processing method, and program |
US8754902B2 (en) * | 2011-06-03 | 2014-06-17 | Apple Inc. | Color-space selective darkness and lightness adjustment |
-
2012
- 2012-12-26 JP JP2012282783A patent/JP2013152453A/ja active Pending
- 2012-12-27 KR KR1020147019756A patent/KR20140107469A/ko not_active Application Discontinuation
- 2012-12-27 CN CN201280064884.6A patent/CN104185806A/zh active Pending
- 2012-12-27 WO PCT/JP2012/083825 patent/WO2013100026A1/ja active Application Filing
- 2012-12-27 DE DE112012005484.9T patent/DE112012005484T5/de not_active Withdrawn
-
2013
- 2013-06-04 US US13/909,918 patent/US20130265322A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JP2013152453A (ja) | 2013-08-08 |
US20130265322A1 (en) | 2013-10-10 |
KR20140107469A (ko) | 2014-09-04 |
CN104185806A (zh) | 2014-12-03 |
WO2013100026A1 (ja) | 2013-07-04 |
DE112012005484T5 (de) | 2014-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013100025A1 (ja) | Image processing apparatus, image processing system, image processing method, and image processing program | |
US20200050655A1 (en) | Image processing apparatus, control method for the same, image processing system, and program | |
JP6091137B2 (ja) | Image processing apparatus, image processing system, image processing method, and program | |
WO2013100026A9 (ja) | Image processing apparatus, image processing system, image processing method, and image processing program | |
JP5350532B2 (ja) | Image processing apparatus, image display system, image processing method, and image processing program | |
JP2014130221A (ja) | Image processing apparatus, control method therefor, image processing system, and program | |
US20160042122A1 (en) | Image processing method and image processing apparatus | |
JP2013200640A (ja) | Image processing apparatus, image processing system, image processing method, and program | |
WO2013100029A9 (ja) | Image processing apparatus, image display system, image processing method, and image processing program | |
JP6035931B2 (ja) | Information processing apparatus, information processing method, and information processing program | |
JP5832281B2 (ja) | Image processing apparatus, image processing system, image processing method, and program | |
JP2016038542A (ja) | Image processing method and image processing apparatus | |
JP2018120227A (ja) | Display method, information processing apparatus, and storage medium | |
JP6338730B2 (ja) | Apparatus, method, and program for generating display data | |
JP2013250400A (ja) | Image processing apparatus, image processing method, and image processing program | |
JP2016206228A (ja) | Focusing position detection apparatus, focusing position detection method, imaging apparatus, and imaging system | |
JP6299840B2 (ja) | Display method, information processing apparatus, and storage medium | |
JP2013250574A (ja) | Image processing apparatus, image display system, image processing method, and image processing program | |
JP2016038541A (ja) | Image processing method and image processing apparatus | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12861760 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120120054849 Country of ref document: DE Ref document number: 112012005484 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 20147019756 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12861760 Country of ref document: EP Kind code of ref document: A1 |
|