EP2256688B1 - Image obtaining apparatus, image synthesis method, and microscope system - Google Patents

Image obtaining apparatus, image synthesis method, and microscope system

Info

Publication number
EP2256688B1
Authority
EP
European Patent Office
Prior art keywords
image
area
partial
area image
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP10004593.9A
Other languages
German (de)
English (en)
Other versions
EP2256688A1 (fr)
Inventor
Yuki Yokomachi
Yujin Arai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp
Publication of EP2256688A1
Application granted
Publication of EP2256688B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10056 Microscopic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20092 Interactive image processing based on input by user
    • G06T 2207/20104 Interactive definition of region of interest [ROI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20208 High dynamic range [HDR] image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30148 Semiconductor; IC; Wafer

Definitions

  • the present invention relates to image processing techniques, especially a technique for obtaining a high-quality image from a plurality of captured images.
  • image capturing apparatuses are generally configured with image sensors such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • Such image sensors have a narrower dynamic range for contrast, compared with that of photographic films and human vision. For this reason, a problem may emerge in camera shooting in a scene with strong contrast (shooting with backlight or indoor and outdoor simultaneous shooting) or in capturing images for observing industrial samples (such as an IC chip and an electronic substrate) related to microscopic measurement.
  • Japanese Laid-open Patent Publication No. 6-141229 proposes an image capturing apparatus with which variable control can be performed for the image capturing time.
  • the image capturing apparatus obtains a wide dynamic range image (an image with a wide dynamic range for contrast) by alternating long exposure-time image capturing and short exposure-time image capturing and synthesizing the two images with the different image capturing times.
  • Japanese Laid-open Patent Publication No. 2003-46857 proposes a technique for preventing the decline of the frame rate due to the capturing of a plurality of images used for generating a wide dynamic range image.
  • This technique makes it possible to generate a wide dynamic range image at the same frame rate as for taking in the image.
  • long exposure-time image capturing and short exposure-time image capturing are performed alternately. A synthesis algorithm that combines an image captured with the long exposure time with the image captured with the short exposure time immediately before or after it, and a synthesis algorithm that combines an image captured with the short exposure time with the image captured with the long exposure time immediately before or after it, are executed alternately.
  • this technique virtually generates a wide dynamic range image for one frame from captured images for one frame.
  • the capture of an observation image is performed with the entire area of the light receiving surface of the image sensor, and the wide dynamic range image is obtained by synthesizing images captured in the entire area.
  • the generation frame rate for the wide dynamic range image is limited by the time required for obtaining images with the entire area in both image capturing with the long exposure time and image capturing with the short exposure time.
  • Fig. 1 illustrates a first example of the configuration of the image capturing apparatus for a microscope that is an image obtaining apparatus with which the present invention is implemented.
  • This image capturing apparatus for a microscope captures one of a long-time exposure image and a short-time exposure image used for the generation of a wide dynamic range image, as an image (hereinafter, referred to as an "original entire-area image”) captured with the entire area of the light-receiving surface of the image sensor, and captures the other of a long-time exposure image and a short-time exposure image as an image (hereinafter, referred to as a "partial-area image”) obtained by capturing an observation image in a partial area of the light-receiving surface of the image sensor.
  • the generation of a wide dynamic range image is performed by synthesizing the original entire-area image and the partial-area image obtained as described above.
  • the image capturing apparatus has an optical system 10, an image capturing unit 11, a recording unit 12, an image output unit 13, a condition setting unit 18, a synthesizing unit 19, an input unit 20, and a display unit 21.
  • the condition setting unit 18 has a partial area extraction unit 14, an exposure control unit 15, an image capturing condition storage unit 16 and an order adjustment unit 17.
  • the optical system 10 has optical components such as a lens or an optical filter, and makes the light from the subject enter the image capturing unit 11 to form an observation image.
  • the image capturing unit 11 captures the formed observation image, and outputs a digital image signal that represents the picture of the observation image to the recording unit 12.
  • the recording unit 12 records the image signal.
  • the image output unit 13 reads out the image signal recorded in the recording unit 12, and displays and outputs the picture of the observation image represented by the image signal. Meanwhile, the image signal recorded in the recording unit 12 is also transferred to the partial area extraction unit 14 and the exposure control unit 15 of the condition setting unit 18.
  • the partial area extraction unit 14 decides a region of interest for capturing a partial-area image, in the picture of the observation image.
  • the exposure control unit 15 decides a capturing condition (here, the exposure time) for capturing the picture, in accordance with the luminance of the picture in the region of interest.
  • the results of the decision of the region of interest and the exposure time are sent to the image capturing condition storage unit 16, and stored there as the image capturing conditions for the partial-area image.
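The patent does not spell out how the exposure time is derived from the luminance of the region of interest; as a minimal illustration (the proportional-scaling rule and the target level of 128 are assumptions, not taken from the source), the decision could look like:

```python
def decide_exposure_time(roi_pixels, current_exposure_ms, target_level=128):
    """Scale the exposure time so the mean ROI luminance approaches a target
    level (hypothetical rule; the patent only says the exposure time is
    decided 'in accordance with the luminance' of the region of interest)."""
    mean_luminance = sum(roi_pixels) / len(roi_pixels)
    if mean_luminance == 0:  # all-black ROI: keep the current exposure
        return current_exposure_ms
    return current_exposure_ms * target_level / mean_luminance

# A near-saturated ROI (mean luminance 252 on the 8-bit scale) leads to a
# shorter exposure time than the current 40 ms.
shorter = decide_exposure_time([250, 252, 255, 251], 40.0)
```

A bright region of interest thus yields a shorter partial-area exposure, which is what allows its clipped detail to be recovered.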
  • the order adjustment unit 17 adjusts the order of the image capturing conditions stored in the image capturing condition storage unit 16, and controls the image capturing unit 11 so that it captures the original entire-area image and the partial-area image alternately. In addition, the order adjustment unit 17 performs control of the timing of the image synthesis in the synthesizing unit 19.
  • the condition setting unit 18 reflects, in the partial area extraction unit 14, the exposure control unit 15, the image capturing condition storage unit 16 and the order adjustment unit 17, parameters input from the user of the image capturing apparatus to the input unit 20, and also displays and outputs the result of the reflection by the display unit 21.
  • the parameters include ones related to the region of interest, the exposure time, the image capturing conditions etc. Details of the parameters are described later.
  • the original entire-area image and the partial-area image captured by the image capturing unit 11 are temporarily recorded in the recording unit 12 and after that, transferred to the synthesizing unit 19 from the recording unit.
  • the synthesizing unit 19 performs pattern matching of the original entire-area image and the partial-area image, and performs a synthesis process after that.
  • the image output unit 13 displays and outputs the wide dynamic range image of the observation image generated by the synthesis process.
  • the image capturing unit 11 has an image sensor 111, an AFE (Analog Front End) 112 and a TG (Timing Generator) 113.
  • the image sensor 111 is an image sensor such as a CCD.
  • the image sensor 111 captures the observation image of the subject formed by the optical system 10 on the effective light-receiving surface, and performs photoelectric conversion of the observation image to output an electric signal.
  • in the AFE 112, an A/D (analog-digital) conversion of the electric signal output from the image sensor 111 is performed, after a CDS (Correlated Double Sampling) process and an AGC (Automatic Gain Control) process are performed on the electric signal.
  • the AFE 112 outputs the digital image signal obtained as described above, which represents the picture of the observation image, to the recording unit 12. Meanwhile, it is assumed here that the dynamic range of the digital image signal (the dynamic range of the luminance value (pixel value) of each pixel constituting the picture) is 8 bits (the luminance value of a pixel may take 256 values from "0" to "255").
  • the TG 113 gives a drive signal to the image sensor 111, and also gives a synchronization signal to the AFE 112.
  • the TG 113 performs control of reading out of the original entire-area image and reading out of the partial-area image with respect to the image sensor 111.
  • the recording unit 12 has a frame memory A121 and a frame memory B122, and records digital image signals output from the AFE 112.
  • the frame memory A121 records a digital image signal for the original entire-area image of the observation image
  • the frame memory B122 records a digital image signal for the partial-area image of the observation image.
  • the image output unit 13 reads out the digital image signal recorded in the frame memory A121, and displays and outputs the original entire-area image of the observation image represented by the image signal.
  • a digital image signal recorded in the frame memory A121 may be represented simply as an "original entire-area image”
  • a digital image signal recorded in the frame memory B122 may be represented simply as a "partial-area image”.
  • the condition setting unit 18 has the partial area extraction unit 14, the exposure control unit 15, the image capturing condition storage unit 16 and the order adjustment unit 17, as described above.
  • the partial area extraction unit 14 decides and extracts a region of interest for capturing a partial-area image, in the picture of the observation image, with respect to the original entire-area image recorded in the frame memory A121.
  • the exposure control unit 15 receives information of the exposure time input by the user to the input unit 20, and decides the exposure time for capturing the picture, in accordance with the information and the luminance of the picture in the region of interest.
  • the image capturing condition storage unit 16 stores and holds the region of interest extracted by the partial area extraction unit 14 and the exposure time for capturing the picture in the region of interest decided by the exposure control unit 15, as the image capturing conditions for capturing the partial-area image.
  • the order adjustment unit 17 reads out the image capturing conditions held in the image capturing conditions storage unit 16, and sends the read-out image capturing conditions to the TG 113 following a predetermined order. More specifically, the order adjustment unit 17 sends the image capturing condition (first exposure condition) of the original entire-area image and the image capturing condition (second exposure condition) of the partial-area image alternately to the TG 113.
  • the TG 113 receives the first exposure condition, it sets the exposure condition to the first exposure condition, and then controls the image sensor 111 to capture the original entire-area image.
  • the TG 113 receives a second exposure condition that is different from the first exposure condition, it sets the exposure condition to the second exposure condition, and then controls the image sensor 111 to capture the partial-area image.
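The alternating hand-over of the two exposure conditions from the order adjustment unit to the TG 113 can be sketched as follows (a minimal sketch; the dictionary fields are illustrative stand-ins, not the actual signal format):

```python
from itertools import cycle

def capture_sequence(first_condition, second_condition, n_frames):
    """Emit exposure conditions in the alternating order the order
    adjustment unit sends to the TG: entire-area, partial-area, ..."""
    source = cycle([first_condition, second_condition])
    return [next(source) for _ in range(n_frames)]

frames = capture_sequence({"area": "entire", "exposure_ms": 40},
                          {"area": "partial", "exposure_ms": 10}, 4)
```

Each pair of consecutive frames then supplies one long-exposure entire-area image and one short-exposure partial-area image to the synthesizing unit.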
  • the synthesizing unit 19 has a detection unit A191, a detection unit B192, a pattern matching unit 193 and an image joining unit 194, and synthesizes an original entire-area image and a partial-area image to obtain an entire-area image having a wider dynamic range than the original entire-area image.
  • the detection unit A191 (first detection unit) reads out an original entire-area image recorded in the frame memory A121, and separates the original entire-area image into an area X that exceeds a threshold value and an area X- other than the area X. More specifically, the detection unit A191 detects, from the original entire-area image, an area X (replacement-target area) that consists of a group of pixels of which luminance value exceeds a predetermined threshold value in the pixels constituting the original entire-area image, and separates the original entire-area image into the area X and the area X- other than the area X.
  • the detection unit B192 (second detection unit) reads out a partial-area image recorded in the frame memory B122, and detects, from the partial-area image, an area Y (replacement area) that is estimated to correspond to the area X in the original entire-area image. More specifically, the detection unit B192 detects, from a partial-area image recorded in the frame memory B122, an area Y that is estimated to exceed a threshold value in the original entire-area image on the basis of the ratio of exposure times for the original entire-area image and the partial-area image.
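The two detection steps can be sketched roughly as follows (the threshold of 250 and the nested-list pixel representation are assumptions; the actual units operate on the digital image signals in the frame memories):

```python
THRESHOLD = 250  # assumed near-saturation threshold on the 8-bit scale

def detect_area_x(entire_image, threshold=THRESHOLD):
    """Mask of pixels in the original entire-area image whose luminance
    value exceeds the threshold (area X, the replacement-target area)."""
    return [[pix > threshold for pix in row] for row in entire_image]

def detect_area_y(partial_image, exposure_ratio_k, threshold=THRESHOLD):
    """Pixels of the short-exposure partial-area image estimated to exceed
    the threshold in the long-exposure image: a partial-image value v
    corresponds to roughly v * K in the entire-area image."""
    return [[pix * exposure_ratio_k > threshold for pix in row]
            for row in partial_image]

entire  = [[255, 120], [255, 80]]   # left column saturated
partial = [[ 70,  30], [ 66, 20]]   # same scene at 1/4 the exposure (K = 4)
x_mask = detect_area_x(entire)
y_mask = detect_area_y(partial, 4)
```

With a well-chosen ratio K the two masks pick out the same scene region, which is what makes the later replacement meaningful.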
  • the pattern matching unit 193 generates an area Z by performing pattern matching of the area X extracted by the detection unit A191 and the area Y extracted by the detection unit B192. More specifically, the pattern matching unit 193 changes the shape of the area Y to generate an area Z in which the shapes of the contours of the area Y and the area X are matched.
  • the image joining unit 194 receives the feedback of the pattern matching result from the pattern matching unit 193, and joins the area Z (that is, the area Y in the partial-area image after its shape is changed by the pattern matching unit 193) and the area X-, to generate a synthesized image. More specifically, the image joining unit 194 performs an image synthesis process to replace the picture in the area X in the original entire-area image with the picture in the area Z, joining the picture of the area X- in the original entire-area image and the picture of the area Z.
  • the synthesized image obtained as described above is an entire-area image (hereinafter, referred to as a "wide dynamic range image") that has a wider dynamic range than that of the original entire-area image before the synthesis.
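The replacement-based synthesis can be illustrated with a minimal sketch that assumes the two images are already aligned (the pattern-matching deformation of area Y into area Z is omitted) and uses an assumed saturation threshold of 250:

```python
def synthesize_wdr(entire_image, partial_image, k, threshold=250):
    """Replace saturated pixels of the long-exposure entire-area image with
    scaled pixels from the short-exposure partial-area image. The result can
    exceed 255, which is why the synthesized image has a wider dynamic range
    than the 8-bit original entire-area image. Minimal sketch: assumes the
    two images are pixel-aligned."""
    return [[p * k if e > threshold else e
             for e, p in zip(e_row, p_row)]
            for e_row, p_row in zip(entire_image, partial_image)]

entire  = [[255, 120], [255, 80]]   # left column clipped at 255
partial = [[ 90,  30], [ 80, 20]]   # captured with 1/4 the exposure (K = 4)
wdr = synthesize_wdr(entire, partial, 4)
```

The clipped pixels become 360 and 320 on the extended scale, recovering the contrast that the single 8-bit exposure lost.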
  • the image output unit 13 displays and outputs the synthesized image.
  • a microscope system being an implementation example of the image capturing apparatus configured as described above is illustrated in Fig. 2 .
  • the implementation of the image capturing apparatus is not limited to the example in Fig. 2 .
  • the optical system 10 (not illustrated in Fig. 2 ) is implemented in a microscope main body 1, the image capturing unit 11 is implemented in a camera head 2, and the other constituent elements are implemented in a computer 3.
  • the microscope main body 1 is for obtaining a microscopic image of a sample.
  • the microscopic image obtained by the microscope main body 1 is formed as an observation image on the light-receiving surface of the image sensor 111 provided in the image capturing unit 11 implemented in the camera head 2.
  • the recording unit 12 is implemented in the memory device 4, which is a RAM (Random Access Memory)
  • the image output unit 13 and the display unit 21 are implemented in a display device 5
  • the input unit 20 is implemented in an input device 6 such as a keyboard device and a mouse device.
  • the condition setting unit 18 (not illustrated in Fig. 2 ) having the partial area extraction unit 14, the exposure control unit 15, the image capturing condition storage unit 16 and the order adjustment unit 17, and the synthesizing unit 19 are implemented in a CPU (Central Processing Unit) 7.
  • the respective functions of the partial area extraction unit 14, the exposure control unit 15, the image capturing condition storage unit 16, the order adjustment unit 17 and the synthesizing unit 19 can be provided by the CPU 7 by making the CPU 7 read out and execute a predetermined control program that has been stored in a storage device not illustrated in the drawing in advance.
  • the memory device 4, the display device 5, the input device 6 and the CPU 7 are connected through a bus and an interface circuit not illustrated in the drawing, to be capable of exchanging data with each other.
  • the CPU 7 also performs operation management of the memory device 4, the display device 5 and the input device 6.
  • the period of the vertical synchronization signal (VD) of the image sensor 111 needs to be more than the period obtained by multiplying the period (H) of its horizontal synchronization signal by the number of light-receiving pixels (He) in the vertical direction on the light-receiving surface of the image sensor 111.
  • the reading out operation of an electric charge generated in each light-receiving pixel with the entire light-receiving surface being the effective pixel area performed with the VD set as described above is the all-pixel reading operation.
  • the CCD has 1024 horizontal scanning lines, and the VD signal needs to have a period of more than 1024H (H is the period of the horizontal synchronization signal).
  • in the partial reading operation, the period of the VD is set to H multiplied by a number that is smaller than He, in contrast to the all-pixel reading operation.
  • the reading out operation of an electric charge generated in each light-receiving pixel, with a part of the light-receiving surface as the effective pixel area, performed with the VD set as described above, is the partial reading operation.
  • the area including the light-receiving pixels for which the reading out of the generated electric charge is performed is narrower compared with that in the all-pixel reading operation, and the period of the VD is shorter accordingly, making it possible to speed up the reading out.
  • for the partial reading operation of the CCD, some techniques such as the partial scan and high-speed charge flushing have been known already.
  • the number of horizontal scanning lines in the partial-area image that are read out in the partial reading operation is 424, and every 10 lines of the remaining 600 horizontal scanning lines are transferred by the high-speed flushing.
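With the line counts above (1024 lines in total, 424 read out, the remaining 600 flushed 10 lines at a time), the shortening of the VD period can be estimated; counting one horizontal period H per 10-line flush step is an assumption about the flushing cost, not a figure from the source:

```python
H = 1                    # one horizontal-line period, in units of H
ALL_PIXEL_LINES = 1024   # total horizontal scanning lines of the CCD
ROI_LINES = 424          # lines read out in the partial reading operation
FLUSH_GROUP = 10         # lines discarded per high-speed flush step (assumed 1 H each)

entire_vd = ALL_PIXEL_LINES * H                                   # all-pixel reading: 1024 H
flush_steps = (ALL_PIXEL_LINES - ROI_LINES) // FLUSH_GROUP        # 600 / 10 = 60 steps
partial_vd = ROI_LINES * H + flush_steps * H                      # 424 H + 60 H = 484 H
speedup = entire_vd / partial_vd                                  # roughly 2.1x faster readout
```

Under these assumptions the partial-area frame reads out more than twice as fast as the full frame, which is the point of the high-speed flushing.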
  • Fig. 4 is a timing chart of a drive signal of the image sensor 111 generated by the TG 113.
  • VD is a vertical synchronization signal
  • HD represents a horizontal synchronization signal
  • SG represents a transport pulse signal for transporting the electric charge from each light-receiving pixel to a transfer line
  • SUB represents an electronic shutter pulse signal for discharging the charge from the transfer line to the substrate
  • V represents a vertical transfer clock signal for driving a vertical transfer path.
  • ReadOut represents a period in which image data captured by the image capturing unit 11 is transferred to the recording unit 12.
  • ALL is the transfer period of the original entire-area image
  • ROI is the transfer period of the partial-area image.
  • SG and SUB are generated from VD.
  • the period between when SUB that is generated continuously is discontinued and when the next SG signal is generated is the accumulation time for the electric charge after photoelectric conversion, which practically corresponds to the exposure time.
  • the time “T1" in Fig. 4 is the exposure time for the original entire-area image
  • the time “T2" is the exposure time for the partial-area image.
  • the TG 113 sets the time "T1" and "T2" alternately by controlling them as needed, to make the image sensor 111 capture the original entire-area image and the partial-area image alternately.
  • the TG 113 generates a number of vertical transfer clock signals "V" with a short period for the pixels in the area other than the effective pixel area, to make the image sensor 111 perform the high-speed flushing operation. This shortens the transfer time of the partial-area image to the recording unit 12.
  • Fig. 5A is a flowchart illustrating the process details of an original entire-area image capturing control process
  • Fig. 5B is a flowchart illustrating the process details of an image capturing mode switching control process
  • Fig. 5C is a flowchart illustrating the process details of a first example of a wide dynamic range image capturing control process
  • Fig. 5D is a flowchart illustrating the process details of a wide dynamic range image synthesis control process.
  • these control processes are performed by the CPU 7.
  • the CPU 7 becomes capable of performing these control processes by reading out and executing predetermined control programs that have been stored in a storage device not illustrated in the drawing in advance.
  • Fig. 6A and Fig. 6B are examples of screen displays on the display device 5, which are examples of screen displays by the image output unit 13.
  • Fig. 6A is a screen display example for the capture of the original entire-area image
  • Fig. 6B is a screen display example for the capture of the wide dynamic range image.
  • Fig. 7 is a diagram illustrating the synthesis of the wide dynamic range image.
  • an initial value of an image capturing condition (in this embodiment, the exposure time) for the first capturing of the original entire-area image is set and held in the image capturing condition storage unit 16 in advance.
  • an original entire-area image capturing process is performed in S21.
  • a process to control the image capturing unit 11 to make it capture only the original entire-area image is performed by the order adjustment unit 17.
  • the image capturing unit 11 captures the observation image of the subject formed on the light-receiving surface by the optical system 10 under the image capturing condition held in the image capturing condition storage unit 16, and outputs the obtained original entire-area image It.
  • an image signal recording process is performed in S22.
  • a process in which the original entire-area image It output from the image capturing unit 11 is recorded by the frame memory A121 of the recording unit 12 is performed.
  • a captured image display process is performed in S23.
  • a process to read out the original entire-area image It recorded in the frame memory A121 of the recording unit 12 and to display and output the read-out original entire-area image It by the image output unit 13 as a screen such as the one illustrated in Fig. 6A is performed.
  • the user observes the displayed original entire-area image It, and determines whether or not exposure correction is required for the original entire-area image It.
  • the user inputs a corrected image capturing condition (in this embodiment, the exposure time) to the input unit 20, to instruct the image capturing apparatus to perform the exposure correction.
  • the original entire-area image It is displayed by the process in S23.
  • the exposure time in the capture of the displayed original entire-area image It is "40" milliseconds.
  • the user observes the display of the original entire-area image It on the screen and determines whether or not exposure correction is required.
  • the user inputs a value of the exposure time that is supposed to be more appropriate on the basis of the original entire-area image It and the value of the exposure time being displayed, to the input unit 20 as the corrected exposure condition.
  • This input is the instruction to the image capturing apparatus for performing the exposure correction.
  • the user inputs the instruction for no correction to the input unit 20.
  • in S24, a process to determine whether the input content to the input unit 20 instructs the exposure correction is performed by the exposure control unit 15.
  • when the input content instructs the exposure correction (when the determination result is Yes), the process proceeds to S26, and when it instructs no exposure correction (when the determination result is No), the process proceeds to S25.
  • in S25, a process to determine whether or not the input unit 20 has received input of an instruction for the termination of the image capturing from the user is performed by the order adjustment unit 17.
  • when the termination is instructed (when the determination result is Yes), the process in Fig. 5A is terminated.
  • when the termination is not instructed (when the determination result is No), the process returns to S21, and the capturing process of the original entire-area image is performed again.
  • while no instruction for the termination is input, the determination result in S25 always becomes No, the processes from S21 to S23 are performed repeatedly, and the image output unit 13 displays and outputs the moving picture of the original entire-area image of the subject.
  • the display and output of the moving picture by the image output unit 13 is referred to as "live observation”.
  • an exposure correction process is performed in S26.
  • a process to give the input value of the exposure time to the image capturing unit 11 to change the image capturing condition for the subsequent capture of the original entire-area image It is performed by the exposure control unit 15.
  • an exposure condition storage process is performed.
  • a process in which the corrected value of the exposure time input by the user to the input unit 20 is stored, updated and held in the image capturing condition storage unit 16 as the corrected image capturing condition is performed by the exposure control unit 15. Then, when the process in S27 is completed, the process returns to S21, and the capturing process of the original entire-area image is performed again.
  • the user observes the original entire-area image obtained by the execution of the original entire-area image capturing control process described above. At this time, if neither "white blow-out" nor "black out" is observed in the original entire-area image, there is no need to generate the wide dynamic range image. On the other hand, if "white blow-out" or "black out" appears in the original entire-area image no matter how the exposure correction described above is performed, a wide dynamic range image needs to be generated.
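The patent leaves this judgment to the user's observation; a simple way to express the same check programmatically (a hypothetical helper, not part of the described apparatus) is to test whether any pixel is stuck at the limits of the 8-bit range:

```python
def needs_wdr(image, low=0, high=255):
    """Return True when the image shows 'white blow-out' (pixels stuck at
    the maximum value) or 'black out' (pixels stuck at the minimum), i.e.
    when exposure correction alone cannot recover the scene contrast."""
    flat = [pix for row in image for pix in row]
    return any(pix >= high for pix in flat) or any(pix <= low for pix in flat)

ok_image      = [[10, 120], [200, 254]]   # no clipped pixels
clipped_image = [[255, 120], [200, 0]]    # both blow-out and black out
```

An image whose histogram never touches 0 or 255 can be fixed by exposure correction alone; a clipped one calls for the WDR synthesized image capturing mode.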
  • the user determines whether or not a wide dynamic range image needs to be generated, on the basis of the observation of the original entire-area image as described above, and inputs the determination result to the input unit 20. If the input is to be performed using the screen in Fig. 6A , a selection instruction of a radio button displayed in the "SHOOTING MODE" field included in the screen may be given by the user by operating the mouse device and the like of the input unit 20.
  • when it is determined that the generation is not required, an operation to select the radio button of the "ENTIRE-AREA IMAGE CAPTURING MODE" may be performed, and when it is determined that the generation of the wide dynamic range image is required, an operation to select the "WDR SYNTHESIZED IMAGE CAPTURING MODE" may be performed.
  • here, "WDR" is an abbreviation of "wide dynamic range".
  • a process to determine whether or not the input content to the input unit 20 is for instructing the generation of the WDR synthesized image is performed by the order adjustment unit 17.
  • when the input content instructs the generation (when the determination result is Yes), a WDR synthesized image capturing control process in S12 ( Fig. 5C ) is performed, and after that, the image capturing mode switching control process is terminated.
  • when the input content does not instruct the generation (when the determination result is No), the original entire-area image capturing control process ( Fig. 5A ) described above is performed as the process in S13, and after that, the image capturing mode switching control process is terminated.
  • the original entire-area image capturing control process ( Fig. 5A ) described above is performed. This results in a state in which the frame memory A121 of the recording unit 12 holds the original entire-area image It.
  • a process to display, by the image output unit 13, a display screen for obtaining the WDR image as illustrated in Fig. 6B and to output the original entire-area image is performed.
  • a region of interest setting process is performed.
  • a process to obtain the setting of a region of interest Rb for capturing the partial-area image by the user is performed by the partial-area extraction unit 14 through the input unit 20.
  • a process to determine whether or not the setting of the region of interest has been completed is performed by the partial area extraction unit 14.
  • the screen example in Fig. 6B illustrates the situation in which the user is operating the mouse device and the like of the input unit 20 to perform the setting of a rectangle region of interest Rb in the original entire-area image It.
  • the position and the shape of the displayed rectangle change in accordance with the operation of the input unit 20 by the user.
  • the partial-area extraction unit 14 obtains shape information of the region of interest Rb and the position information of the region of interest Rb in the original entire-area image It at the time when the instruction is issued, as the parameter of the region of interest Rb.
  • the shape of the region of interest Rb is not limited to a rectangle, and may be any shape.
  • an exposure time setting process is performed.
  • a process to set the exposure time T2 in the capture of the partial-area image of the region of interest Rb is performed by the exposure control unit 15.
  • the exposure control unit 15 performs the setting of the exposure time T2 in the capturing of the partial-area image by calculating the value of the equation below.
  • T2 = T1 / K
  • T1 is the exposure time in the capture of the original entire-area image It, and is held in the frame memory A121 of the recording unit 12 by the execution of the original entire-area image capturing control process in S30.
  • K is the ratio of the image capturing times of the original entire-area image It and the partial-area image, and is a predetermined constant in this embodiment.
  • the configuration may also be made so that the user can set the value of the image capturing time ratio K arbitrarily.
  • the exposure time T1 in the capture of the displayed original entire-area image It is 40 milliseconds, and the image capturing time ratio K is "4".
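The relation T2 = T1/K above can be sketched as follows; the function name is illustrative, and the values are the ones from the example in the text:

```python
def partial_exposure_time(t1_ms, k):
    """Exposure time T2 for the partial-area image: T2 = T1 / K."""
    if k <= 0:
        raise ValueError("image capturing time ratio K must be positive")
    return t1_ms / k

# Values from the example: T1 = 40 ms, K = 4, so T2 = 10 ms.
t2_ms = partial_exposure_time(40.0, 4)
```

With T1 = 40 milliseconds and K = 4, the partial-area image is exposed for 10 milliseconds, a quarter of the original exposure, so bright areas that blow out at T1 remain within range.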
  • in S34, a process to determine whether or not the setting of the exposure time has been completed is performed by the exposure control unit 15.
  • the process proceeds to S35.
  • the process returns to S33 and the execution of the exposure time setting process is continued.
  • a threshold setting process is performed.
  • a process to obtain the setting of a threshold value for determining which group of pixels in the region of interest Rb in the original entire-area image It is to be replaced with the one obtained from the partial-area image is performed by the synthesizing unit 19.
  • a luminance value of the pixel is set as the threshold value.
  • the screen example in Fig. 6B illustrates the state in which the user operated the keyboard device and the like of the input unit 20 and set the threshold value of the luminance value to "250-255". This is the case in which the user intends to replace only pixels that are blown out to white in the region of interest Rb.
  • the synthesizing unit 19 obtains the set value as the threshold value by the execution of the threshold setting process.
  • in S36, a process to determine whether or not the setting of the luminance threshold value has been completed is performed by the synthesizing unit 19.
  • the process proceeds to S37.
  • the process returns to S35 and the execution of the threshold value setting process is continued.
  • an image capturing condition storage process is performed.
  • a process is performed to make the image capturing condition storage unit 16 store and hold the parameters of the region of interest Rb obtained by the partial-area extraction unit 14 by the process in S31 and the exposure time T2 set by the exposure control unit 15 by the process in S33 as the image capturing conditions of the partial-area image.
  • a WDR image synthesis control process ( Fig. 5D ) is performed. While the details of the process are described later, the WDR image is displayed and output by the image output unit 13 by executing this process.
  • the user observes the display of the WDR image, and determines whether or not a setting change of the image capturing conditions (the parameters of the region of interest Rb and the exposure time T2 described above) is required.
  • the user determines that a setting change of the image capturing conditions is required when, for example, "white blow-out" or "black out" is observed in the WDR image.
  • the user inputs an instruction about the setting of the image capturing conditions after the change, to the input unit 20.
  • a process to determine whether or not the input unit 20 has received input of an instruction for the termination of the image capturing from the user is performed by the order adjustment unit 17.
  • the process in Fig. 5C is terminated.
  • the process returns to S38, and the WDR image synthesis control process is performed again.
  • the image capturing condition storage unit 16 is holding image capturing conditions JT (the parameters and the exposure time T1 of the original entire-area image It) stored in the process in S30 and image capturing conditions JB (the parameters and the exposure time T2 of the region of interest Rb) stored in the process in S37.
  • an original entire-area image capturing process is performed.
  • a process to control the image capturing unit 11 to make it capture the original entire-area image is performed by the order adjustment unit 17.
  • the image capturing unit 11 captures the observation image of the subject formed on the light-receiving surface by the optical system 10 under the image capturing conditions JT held in the image capturing condition storage unit 16, and outputs the obtained original entire-area image It.
  • the output original entire-area image It is stored in the frame memory A121 of the recording unit 12.
  • a partial-area image capturing process is performed.
  • a process to control the image capturing unit 11 to make it capture the partial-area image is performed by the order adjustment unit 17.
  • the image capturing unit 11 captures a part of the observation image of the subject formed on the light-receiving surface by the optical system 10 under the image capturing conditions JB held in the image capturing condition storage unit 16, and outputs the obtained partial-area image Ib.
  • the output partial-area image Ib is stored in the frame memory B122 of the recording unit 12.
  • an image synthesis process is performed by the synthesizing unit 19, and in the following S44, an image output process to display and output the WDR image generated by the image synthesis process as the screen as illustrated in Fig. 6B is performed. After that, the process in Fig. 5D is terminated.
  • a detection unit A191 (first detection unit) reads out the original entire-area image It recorded in the frame memory A121.
  • pixels whose luminance value exceeds a predetermined threshold value ("250-255" in the screen example in Fig. 6B ) among the pixels constituting the original entire-area image It are detected from the region of interest Rb in the original entire-area image It.
  • the area composed of the detected pixels is referred to as an area X (replacement-target area).
  • the area X- obtained as described above (that is, the area other than the area X) consists of the area other than the region of interest Rb in the original entire-area image It, and the area of the pixels whose luminance value is below 250 in the region of interest Rb.
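A minimal sketch of the detection performed by the detection unit A191, assuming the entire-area image is held as a list of rows of luminance values and using the lower limit 250 of the "250-255" range as the comparison value; the function name is hypothetical:

```python
def detect_replacement_target(entire, roi, threshold=250):
    """Detection unit A191 sketch: area X is the set of pixels inside the
    region of interest roi = (x0, y0, x1, y1) whose luminance value is at
    or above `threshold` (the lower limit of the "250-255" range)."""
    x0, y0, x1, y1 = roi
    return [[y0 <= y < y1 and x0 <= x < x1 and v >= threshold
             for x, v in enumerate(row)]
            for y, row in enumerate(entire)]

entire = [[10, 200],
          [251, 255]]
area_x = detect_replacement_target(entire, (0, 0, 2, 2))
# area_x -> [[False, False], [True, True]]; area X- is everything else
```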
  • a detection unit B192 (second detection unit) first reads out the partial-area image Ib recorded in the frame memory B122.
  • the lower-limit value of the threshold value of the luminance value is divided by the image capturing time ratio K.
  • the threshold value of the luminance value is set as "250-255", so its lower-limit value is 250.
  • the detection unit B192 detects the area Y (replacement area) that is estimated as corresponding to the area X (replacement-target area) from the partial-area image Ib.
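The scaled comparison described above (the lower-limit value of the threshold divided by the image capturing time ratio K) can be sketched as follows; all names and sample values are illustrative:

```python
def detect_replacement_area(partial, threshold_lower=250, k=4):
    """Detection unit B192 sketch: because the partial-area image is
    exposed for T1/K, a pixel that would exceed the threshold in the
    entire-area image is expected to reach threshold_lower / K here."""
    scaled = threshold_lower / k  # 250 / 4 = 62.5
    return [[v >= scaled for v in row] for row in partial]

partial = [[60, 63],
           [80, 10]]
area_y = detect_replacement_area(partial)
# area_y -> [[False, True], [True, False]]
```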
  • an image joining unit 194 performs an image synthesis process to replace the picture in the area X in the original entire-area image It with the picture in the area Y in the partial-area image Ib, to join the picture in the area X- in the original entire-area image It with the picture in the area Y.
  • the WDR image is generated as described above.
  • the luminance value of each pixel in the area Y may be multiplied by the image capturing time ratio K, to compensate for the difference in sensitivity in the capture of the picture in the area X- and the picture in the area Y.
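Assuming the area Y aligns pixel-for-pixel with the area X (so that no pattern matching is needed), the replacement and the sensitivity compensation by K described above might look like this sketch; all names are hypothetical:

```python
def synthesize_wdr(entire, partial, area_x, k=4):
    """Image joining sketch: keep the picture of area X- from the
    entire-area image, take area-X pixels from the partial-area image,
    and multiply them by K to compensate for the shorter exposure."""
    return [[partial[y][x] * k if area_x[y][x] else entire[y][x]
             for x in range(len(row))]
            for y, row in enumerate(entire)]

entire = [[10, 200], [255, 255]]       # bottom row blown out to white
partial = [[2, 50], [70, 80]]          # same scene at exposure T1/K
area_x = [[False, False], [True, True]]
wdr = synthesize_wdr(entire, partial, area_x)
# wdr -> [[10, 200], [280, 320]]
```

Note that the compensated values can exceed the 8-bit range of the original image; this is exactly the wider dynamic range of the synthesized image.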
  • the borderline between the area X- and the area Y may stand out.
  • respective pixels in the area around the border in the original entire-area image It and in the partial-area image may be overlapped while giving weighting to the luminance value, and may be joined by overlaying them on each other, to generate an image in which the border part is smoothed.
  • the method for overlaying and overlapping two images with each other while giving weighting is introduced in the above-mentioned document, that is, the Japanese Laid-open Patent Publication No. 6-141229 , for example.
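A weighted overlay of the kind referred to can be sketched as a per-pixel linear blend; the weight schedule across the transition band is left out, and the function name is illustrative:

```python
def blend_border(entire_val, partial_val, weight):
    """Weighted overlay near the border: weight 1.0 keeps the entire-area
    pixel, 0.0 keeps the (gain-compensated) partial-area pixel, and
    intermediate weights smooth the transition between the two areas."""
    return weight * entire_val + (1.0 - weight) * partial_val

# Halfway across the transition band:
blended = blend_border(200.0, 280.0, 0.5)  # -> 240.0
```

In practice the weight would be ramped from 1.0 to 0.0 across a band of a few pixels around the border between the area X- and the area Y.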
  • when the shapes of the area X and the area Y differ significantly, a pattern matching unit 193 changes the shape of the picture in the area Y to match the shape to the area X.
  • as the method of the pattern matching performed by the pattern matching unit 193, various known methods may be adopted. For example, the contour of the area X and the contour of the area Y are extracted, and a correspondence relationship is obtained that associates each feature point on the contour of the area Y with its corresponding point on the contour of the area X. Then, affine conversion is performed on the picture in the area Y so as to satisfy the obtained correspondence relationship, to match its shape to the area X. The image obtained by processing the picture in the area Y in this way is referred to as the picture of the area Z.
  • the image joining unit 194 performs a process to replace the picture in the area X in the original entire-area image It with the picture of the area Z, to join the picture of the area X- in the original entire-area image It with the picture of the area Z.
  • the WDR image is generated as described above.
  • Fig. 7 "(c) SYNTHESIZED IMAGE" illustrates the WDR image obtained as described above.
  • the WDR image is generated by synthesizing the original entire-area image It and the partial-area image Ib captured following the original entire-area image It. It is also possible, together with the generation of the WDR image in that way, to further perform the generation of the WDR image by synthesizing a partial-area image Ib and an original entire-area image It captured following the partial-area image Ib. Accordingly, a synthesized image is generated every time either of the original entire-area image It and the partial-area image Ib is captured, and practically, the WDR image for one frame is generated from captured images for one frame.
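The alternating scheme above can be illustrated with a toy simulation, where strings stand in for captured images and only the pairing logic (not the pixel synthesis) is shown; all names are illustrative:

```python
def wdr_stream(captures):
    """Toy simulation of the alternating scheme: 'E' marks an entire-area
    capture, 'P' a partial-area capture.  Each new capture is paired with
    the most recent capture of the other kind, so (after the first frame)
    one synthesized frame is produced per captured frame."""
    last = {}
    frames = []
    for kind, image in captures:
        last[kind] = image
        other = 'P' if kind == 'E' else 'E'
        if other in last:
            frames.append((last['E'], last['P']))
    return frames

frames = wdr_stream([('E', 'It1'), ('P', 'Ib1'), ('E', 'It2'), ('P', 'Ib2')])
# frames -> [('It1', 'Ib1'), ('It2', 'Ib1'), ('It2', 'Ib2')]
```

Three synthesized frames are produced from four captured frames, approaching one WDR frame per captured frame, which is the frame-rate advantage the text describes.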
  • the original entire-area image It and the partial-area image Ib are captured alternately under different image capturing conditions (exposure time), and the WDR image can be generated on the basis of the obtained original entire-area image It and the partial-area image Ib. Accordingly, since the time required from the image capturing to the generation of the WDR image is shorter than the time required conventionally, the frame rate in the generation of the WDR image can be improved.
  • the setting of the exposure time for the original entire-area image It and the partial-area image Ib is performed by the user.
  • the automatic exposure (AE) control function that is broadly known may be installed to perform the setting of an appropriate exposure time for the capture of the original entire-area image It and the partial-area image Ib.
  • the configuration can also be made so that the image capturing apparatus itself performs the setting of the region of interest Rb. Accordingly, since there is no need for the operation to set the region of interest Rb, the work load for the user is reduced.
  • Fig. 8 illustrates a second example of the configuration of an image capturing apparatus for a microscope that is an image obtaining apparatus with which the present invention is implemented.
  • Fig. 8 differs from the configuration of Fig. 1 only in that a threshold setting unit 141 is particularly provided in the partial area extraction unit 14.
  • the image capturing apparatus for a microscope in Fig. 8 can be implemented in the microscope system illustrated in Fig. 2 , of course.
  • the threshold setting unit 141 sets a region of interest from the original entire-area image recorded in the recording unit 12 on the basis of the threshold input to the input unit 20.
  • the partial-area extraction unit 14 extracts the region of interest set by the threshold setting unit 141.
  • the method of setting the region of interest on the basis of the threshold value by the threshold value setting unit 141 is explained with reference to Fig. 9 .
  • the user inputs a threshold value that is the basis for setting the region of interest by operating the keyboard and the like of the input unit 20.
  • the threshold setting unit 141 reads out the original entire-area image It recorded in the frame memory A121 of the recording unit 12, and binarizes the luminance value of each pixel constituting the original entire-area image It, with the input threshold value as the reference value.
  • the image (a) is an example of the original entire-area image It
  • the image (b) is an example of the binarized image of the image (a).
  • the binarized image example presents the pixels whose luminance value is equal to or above the threshold value in white, and the pixels whose luminance value is below the threshold value in black.
  • the threshold value setting unit 141 can change the threshold value, with the maximum luminance value of the respective pixels constituting the original entire-area image It as the settable maximum value.
  • when the threshold value setting unit 141 performs such a change of the threshold value, it generates the binarized image of the original entire-area image It using the changed threshold value, and also notifies the user of the change by making the display unit 21 display the changed threshold value.
  • the threshold value setting unit 141 obtains coordinates (XY orthogonal two-dimensional coordinates) that specify the position on the original entire-area image of each pixel whose luminance value is equal to or above the threshold value in the original entire-area image It binarized on the basis of the threshold value.
  • the maximum value and the minimum value are obtained respectively for the X coordinate and the Y coordinate.
  • a rectangle is obtained with the obtained maximum values and the minimum values of the X coordinate and the Y coordinate as the vertices.
  • the obtained rectangle includes all the pixels whose luminance value is equal to or above the threshold value in the original entire-area image It.
  • the threshold setting unit 141 sets the rectangle obtained as described above as the region of interest.
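The bounding-rectangle computation in the steps above can be sketched as follows, assuming a list-of-rows grayscale image; the function name is hypothetical:

```python
def region_of_interest(entire, threshold):
    """Threshold setting unit 141 sketch: bounding rectangle
    (x_min, y_min, x_max, y_max) of all pixels whose luminance value is
    equal to or above `threshold`."""
    coords = [(x, y) for y, row in enumerate(entire)
              for x, v in enumerate(row) if v >= threshold]
    if not coords:
        return None  # no pixel reaches the threshold
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return (min(xs), min(ys), max(xs), max(ys))

entire = [[0, 0, 0, 0],
          [0, 255, 0, 0],
          [0, 0, 250, 0],
          [0, 0, 0, 0]]
roi = region_of_interest(entire, 250)
# roi -> (1, 1, 2, 2)
```

The returned rectangle necessarily contains every pixel at or above the threshold, which is the property the text relies on.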
  • Fig. 9(c) displays the region of interest set as described above on the binarized original entire-area image.
  • the rectangle to be the region of interest may be obtained while excluding such pixels.
  • the region of interest may be set so as to include all pixels within a predetermined distance (the distance may be set by the user) from the pixels whose luminance value is equal to or above the threshold value.
  • each image capturing apparatus for a microscope described above generates the WDR image using one piece of the partial-area image Ib for one piece of the original entire-area image It
  • the WDR image may be generated by using a plurality of pieces of the partial-area image Ib for one piece of the original entire-area image It.
  • Fig. 10 illustrates a variation example of the configuration of the image capturing apparatus for a microscope illustrated in Fig. 1 and Fig. 8 , which is a configuration example in a case in which the WDR image is generated using two pieces of the partial-area image for one piece of the original entire-area image.
  • Fig. 10 omits the drawing of the optical system 10, the condition setting unit 18, the input unit 20, and the display unit 21 that are the same constituent elements as the ones illustrated in Fig. 1 and Fig. 8 .
  • the same reference numbers are assigned to the constituent elements that have the same functions as those in the first example illustrated in Fig. 1 and Fig. 8 . As these constituent elements have already been described, detail description for them is omitted here, and different functions are mainly explained.
  • Fig. 10 differs from the configurations of Fig. 1 and Fig. 8 only in that a frame memory C123 is further provided in the recording unit 12, and a detection unit C195 is provided in the synthesizing unit 19.
  • the image capturing apparatus for a microscope in Fig. 10 can be implemented in the microscope system illustrated in Fig. 2 , of course.
  • the frame memory C123 records a partial-area image of an observation image for a region of interest that is different from that recorded in the frame memory B122.
  • the one recorded in the frame memory B122 is referred to as a partial-area image Ib1 and the one recorded in the frame memory C123 is referred to as a partial-area image Ib2, to facilitate the distinction between them.
  • the detection unit C195 (third detection unit) detects an area Yb whose luminance value is estimated to exceed the threshold value in the original entire-area image on the basis of the image capturing time ratio K described above, from the partial-area image Ib2 recorded in the frame memory C123.
  • the detection unit B192 detects an area Ya whose luminance value is estimated to exceed the threshold value in the original entire-area image on the basis of the image capturing time ratio K described above, from the partial-area image Ib1 recorded in the frame memory B122.
  • the WDR image is generated by the execution of the respective control processes in Fig. 5A , Fig. 5B and Fig. 5D .
  • the capturing condition (third exposure condition) of the partial-area image Ib2 is sent to the TG 113 by the order adjustment unit 17 following the sending of the capturing condition (second exposure condition) of the partial-area image Ib1.
  • the TG 113 sets the exposure condition to the second exposure condition, and then controls the image sensor 111 to make it capture the partial-area image Ib1.
  • the TG 113 sets the exposure condition to the third exposure condition, and then controls the image sensor 111 to capture the partial-area image Ib2.
  • the flowchart in Fig. 11 differs from the flowchart in Fig. 5C in that, when the determination result in S32 is Yes, the process of S51 is performed before the process of S33.
  • a process is performed, by the partial area extraction unit 14, to determine whether or not the input unit 20 has obtained an instruction from the user for further performing the setting of the region of interest.
  • the process returns to S31, and the region of interest setting process is performed again.
  • the process proceeds to S33.
  • the partial-area extraction unit 14 obtains the shape information of the regions of interest Rb1 and Rb2, and the position information, in the original entire-area image It, of the regions of interest Rb1 and Rb2, as the respective parameters of the regions of interest Rb1 and Rb2.
  • a process is performed to store the parameters of the regions of interest Rb1 and Rb2 obtained in the process of S31 and the exposure time T2 set in the process of S33 in the image capturing condition storage unit 16 as the image capturing conditions of the partial-area image Ib1 and the partial-area image Ib2, respectively.
  • This image synthesis process is for obtaining one piece of entire-area image (WDR image) having a wider dynamic range than that of the original entire-area image It, by synthesizing the original entire-area image It and two pieces of partial-area images Ib1 and Ib2.
  • the image joining unit 194 replaces the picture in the area Xa and the picture in the area Xb in the original entire-area image It with the picture in the area Ya in the partial-area image Ib1 and the picture in the area Yb in the partial-area image Ib2, respectively. Then, an image synthesis process is performed to join the picture in the area Xab- in the original entire-area image It (the picture other than the area Xa and other than the area Xb in the original entire-area image It) with the picture in the area Ya and the picture in the area Yb.
  • the WDR image is generated as described above.
  • the shapes of the area Ya detected by the detection unit B192 and the area Xa detected by the detection unit A191 may differ significantly, or the shapes of the area Yb detected by the detection unit C195 and the area Xb detected by the detection unit A191 may differ significantly.
  • the pattern matching unit 193 changes the shape of the picture in the area Ya or Yb to match the shape to the area Xa or Xb in such a case.
  • the method of the pattern matching performed by the pattern matching unit 193 at this time may be the same method as in the image capturing apparatus for a microscope in Fig. 1 and Fig. 8 .
  • the images obtained by processing the pictures in the areas Ya and Yb by the pattern matching unit 193 are referred to as the "picture of the area Za" and the "picture of the area Zb", respectively.
  • "(b) pattern matching" illustrates that the areas Za and Zb are obtained as described above.
  • the image joining unit 194 replaces the picture in the area Xa in the original entire-area image It with the picture of the area Za and joins them.
  • the image joining unit 194 replaces the picture in the area Xb in the original entire-area image It with the picture of the area Zb and joins them.
  • an image synthesis process is performed to join the picture in the area Xab- in the original entire-area image It and the pictures of the areas Za and Zb.
  • the WDR image is generated as described above.
  • "(c) SYNTHESIZED IMAGE" illustrates that the WDR image is obtained as described above.
  • one piece of synthesized image can be generated from an original entire-area image and a plurality of pieces of partial-area images captured by specifying a plurality of regions of interest.
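Under the same alignment assumption as in the earlier sketch (each detected area in a partial-area image lines up with the corresponding replacement-target area), the multi-region synthesis might be sketched as repeated masked replacement; all names are illustrative:

```python
def synthesize_wdr_multi(entire, replacements):
    """Synthesis with several regions of interest: each entry of
    `replacements` is (mask, partial_image, k).  Masked pixels are taken
    from the corresponding partial-area image with gain compensation K;
    the remaining pixels (area Xab-) come from the entire-area image."""
    out = [row[:] for row in entire]
    for mask, partial, k in replacements:
        for y, row in enumerate(mask):
            for x, hit in enumerate(row):
                if hit:
                    out[y][x] = partial[y][x] * k
    return out

entire = [[255, 10], [20, 255]]
mask_a = [[True, False], [False, False]]
mask_b = [[False, False], [False, True]]
partial_a = [[70, 0], [0, 0]]
partial_b = [[0, 0], [0, 60]]
wdr = synthesize_wdr_multi(entire, [(mask_a, partial_a, 4),
                                    (mask_b, partial_b, 4)])
# wdr -> [[280, 10], [20, 240]]
```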


Claims (7)

  1. Appareil (2, 3) d'obtention d'images comprenant :
    un capteur (11) d'image configuré pour capturer une image d'observation formée sur une surface de réception de lumière ;
    un moyen (17) de commande de capture d'image de zone entière d'origine configuré pour commander le capteur (11) d'image dans une première condition d'exposition pour amener le capteur (11) d'image à capturer une image (It) de zone entière d'origine étant une photo de l'image d'observation pour une zone entière de la surface de réception de lumière ;
    un moyen (17) de commande de capture d'image de zone partielle configuré pour commander le capteur (11) d'image dans une deuxième condition d'exposition étant différente de la première condition d'exposition pour amener le capteur (11) d'image à capturer une image (Ib) de zone partielle étant une photo de l'image d'observation formée sur la surface de réception de lumière ; et
    un moyen (19) de synthétisation comprenant
    un premier moyen (191) de détection configuré pour détecter, à partir de l'image (It) de zone entière d'origine, une zone (X) cible de remplacement composée d'un groupe de pixels dont la valeur de luminance dépasse une valeur de seuil prédéterminée dans les pixels constituant l'image (It) de zone entière d'origine ;
    un deuxième moyen (192) de détection configuré pour détecter une zone (Y) de remplacement estimée pour correspondre à la zone (X) cible de remplacement à partir de l'image (Ib) de zone partielle, la zone (Y) de remplacement étant estimée dépasser la valeur de seuil dans l'image (It) de zone entière d'origine sur la base d'un rapport des temps d'exposition pour l'image (It) de zone entière d'origine et l'image (Ib) de zone partielle ; et
    un moyen (194) de fusion d'images configuré pour remplacer une photo dans la zone (X) cible de remplacement dans l'image (It) de zone entière d'origine par une photo dans la zone (Y) de remplacement dans l'image (Ib) de zone partielle pour fusionner une photo dans une zone autre que la zone (X-) cible de remplacement dans l'image (It) de zone entière d'origine et une photo dans la zone (Y) de remplacement dans l'image (Ib) de zone partielle pour obtenir une image de zone entière ayant une plage dynamique plus large que l'image (It) de zone entière d'origine,
    caractérisé par
    une unité (20) d'entrée configurée pour obtenir à partir d'un utilisateur de l'appareil (2, 3) d'obtention d'images une entrée d'information qui spécifie une région d'intérêt (Rb) d'une zone partielle dans l'image d'observation réfléchie sur une image (It) de zone entière, dans lequel
    l'information qui spécifie une région d'intérêt (Rb) indique un seuil d'une valeur de luminance d'un pixel constituant l'image (It) de zone entière d'origine, l'appareil (2, 3) d'obtention d'images comprenant en outre
    un moyen (14) d'établissement de région d'intérêt configuré pour établir, dans l'image (It) de zone entière d'origine, une région d'intérêt (Rb) qui comprend tous les pixels ayant une valeur de luminance égale ou supérieure à la valeur de seuil indiquée par l'information qui spécifie une région d'intérêt (Rb) dans les pixels constituant une image (It) de zone entière d'origine, dans lequel
    le moyen (14) d'établissement de région d'intérêt obtient, concernant chacun des pixels ayant une valeur de luminance égale ou supérieure à la valeur de seuil prédéterminée, des coordonnées en deux dimensions orthogonales XY représentant une position des pixels sur l'image de zone entière ; obtient une valeur maximale et une valeur minimale pour les coordonnées X et les coordonnées Y dans les coordonnées obtenues des pixels ; et établit un rectangle comportant la valeur maximale et la valeur minimale obtenues des coordonnées X et des coordonnées Y en tant que région d'intérêt (Rb) dans l'image (It) de zone entière d'origine, dans lequel
    l'image (Ib) de zone partielle est une photo de l'image d'observation uniquement pour une zone partielle de la surface de réception de lumière et l'image (Ib) de zone partielle est une image de la région d'intérêt (Rb) spécifiée par l'information obtenue par l'unité (20) d'entrée.
  2. Appareil (2, 3) d'obtention d'images selon la revendication 1, dans lequel
    le moyen (19) de synthétisation comprend en outre un moyen (193) de changement de forme configuré pour changer une forme de la photo dans la zone (Y) de remplacement pour faire correspondre les formes des contours de la zone (Y) de remplacement et de la zone (X) cible de remplacement, et
    le moyen (194) de fusion d'images remplace une photo dans la zone (X) cible de remplacement dans l'image (X) de zone entière d'origine par une photo dans la zone de remplacement après le changement (Z) de forme par le moyen (193) de changement de forme dans l'image (Ib) de zone partielle, pour fusionner une photo dans une zone autre que la zone (X-) cible de remplacement dans l'image (It) de zone entière d'origine et la photo dans la zone de remplacement après le changement (Z) de forme par le moyen (193) de changement de forme dans l'image (Ib) de zone partielle.
  3. Appareil (2, 3) d'obtention d'images selon l'une quelconque des revendications 1 ou 2, comprenant en outre
    un moyen (15) de commande d'exposition configuré pour établir, lorsque le moyen (17) de commande de capture d'image (It) de zone entière d'origine amène le capteur (11) d'image à capturer l'image (It) de zone entière d'origine, une condition d'exposition de capture à la première condition d'exposition et configuré pour établir, lorsque le moyen (17) de commande de capture d'image de zone partielle amène le capteur (11) d'image à capturer l'image (Ib) de zone partielle, une condition d'exposition de capture à la deuxième condition d'exposition.
  4. Appareil (2, 3) d'obtention d'images selon l'une quelconque des revendications 1 à 3, dans lequel
    le moyen (17) de commande de capture d'image de zone partielle amène le capteur (11) d'image à capturer une pluralité d'images (Ib1, Ib2) de zones partielles, et
    le moyen (19) de synthétisation obtient une partie de l'image de zone entière ayant une plage dynamique plus large que l'image (It) de zone entière d'origine en synthétisant l'image (It) de zone entière d'origine et une pluralité d'images (Ib1 Ib2) de zones partielles.
  5. Procédé de synthèse d'images comprenant les étapes consistant à :
    détecter, à partir d'une image (It) de zone entière d'origine capturée par un capteur (11) d'image dans une première condition d'exposition étant une photo d'une image d'observation formée sur une surface de réception de lumière du capteur (11) d'image et étant une photo de l'image d'observation pour une zone entière de la surface de réception de lumière du capteur (11) d'image, une zone (X) cible de remplacement composée d'un groupe de pixels ayant une valeur de luminance dépassant une valeur de seuil prédéterminée dans les pixels constituant l'image (It) de zone entière d'origine ;
    detecting, from a partial-area image (Ib) captured by the image sensor (11) under a second exposure condition different from the first exposure condition, the partial-area image being a picture of an observation image formed on a light-receiving surface of the image sensor (11), a replacement area (Y) estimated to correspond to the replacement target area (X), the replacement area (Y) being estimated to exceed the threshold value in the original whole-area image (It) on the basis of a ratio of the exposure times of the original whole-area image (It) and the partial-area image (Ib); and
    performing image processing to replace a picture in the replacement target area (X) in the original whole-area image (It) with a picture in the replacement area (Y) in the partial-area image (Ib), so as to merge the picture in an area other than the replacement target area (X) in the original whole-area image (It) and a picture in the replacement area (Y) in the partial-area image (Ib),
    characterized by
    obtaining, from a user, an input of information that specifies a region of interest (Rb) being a partial area in the observation image reflected in the whole-area image, wherein the information specifying the region of interest (Rb) indicates a threshold of a luminance value of a pixel constituting the original whole-area image (It), and a region of interest (Rb) is set in the original whole-area image (It) as a region of interest (Rb) that includes all pixels having a luminance value equal to or greater than the threshold value indicated by the information specifying the region of interest (Rb), among the pixels constituting the original whole-area image (It), wherein setting the region of interest (Rb) comprises: obtaining, for each of the pixels having a luminance value equal to or greater than the predetermined threshold value, coordinates in a two-dimensional orthogonal XY coordinate system representing a position of the pixel in the whole-area image; obtaining a maximum value and a minimum value of the X coordinates and of the Y coordinates among the obtained pixel coordinates; and setting a rectangle defined by the obtained maximum and minimum values of the X and Y coordinates as the region of interest (Rb) in the original whole-area image (It), wherein
    the partial-area image (Ib) is a picture of the observation image for only a partial area of the light-receiving surface of the image sensor (11), and the partial-area image (Ib) is a picture of the region of interest (Rb) specified by the information specifying the region of interest (Rb).
  6. Image synthesis method according to claim 5, further comprising the step of
    changing a shape of the picture in the replacement area (Y) so that the contour shapes of the replacement area (Y) and the replacement target area (X) match, wherein
    in the image processing, image processing is performed to replace a picture in the replacement target area (X) in the original whole-area image (It) with the picture in the changed replacement area (Z) in the partial-area image (Ib), so as to merge a picture in an area other than the replacement target area (X) in the original whole-area image (It) and the picture in the changed replacement area (Z) in the partial-area image (Ib).
  7. Microscope system comprising a microscope (1) that obtains a microscopic observation image of a specimen, and the image obtaining apparatus (2, 3) according to any one of claims 1 to 4 that obtains a picture of the microscopic observation image.
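The region-of-interest and replacement steps recited in claim 5 can be illustrated with a short sketch. The following is a hypothetical NumPy rendering, not the patented apparatus's implementation: function names, array shapes, and the linear exposure model are assumptions, and the contour-matching step of claim 6 is omitted.

```python
import numpy as np

def bounding_roi(whole, threshold):
    """Bounding rectangle (Rb): min/max X and Y coordinates over all pixels
    whose luminance is at or above the threshold value."""
    ys, xs = np.nonzero(whole >= threshold)
    if ys.size == 0:
        return None  # no over-threshold pixels: no region of interest
    return int(xs.min()), int(xs.max()), int(ys.min()), int(ys.max())

def synthesize(whole, partial, roi, threshold, t_whole, t_partial):
    """Replace over-threshold pixels (replacement target area X) inside the
    ROI of the whole-area image with the corresponding partial-area pixels
    (replacement area Y), rescaled by the ratio of the exposure times."""
    x_min, x_max, y_min, y_max = roi
    out = whole.astype(np.float64).copy()
    crop = out[y_min:y_max + 1, x_min:x_max + 1]  # view into out
    mask = crop >= threshold                      # replacement target area X
    ratio = t_whole / t_partial                   # exposure-time ratio
    crop[mask] = partial.astype(np.float64)[mask] * ratio
    return out
```

Here the partial-area image is assumed to be already cropped to the bounding rectangle (Rb); pixels at or above the threshold form the replacement target area (X), and multiplying by t_whole / t_partial maps the shorter-exposure pixel values back onto the radiometric scale of the original whole-area image (It).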
EP10004593.9A 2009-05-25 2010-04-30 Image obtaining apparatus, image synthesis method, and microscope system Not-in-force EP2256688B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009125859A JP5214538B2 (ja) 2009-05-25 2009-05-25 Image acquisition device, image synthesis method, and microscope system

Publications (2)

Publication Number Publication Date
EP2256688A1 EP2256688A1 (fr) 2010-12-01
EP2256688B1 true EP2256688B1 (fr) 2014-12-03

Family

ID=42308300

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10004593.9A Not-in-force EP2256688B1 (fr) 2009-05-25 2010-04-30 Appareil d'obtention d'images, procédé de synthèse d'images et système de microscope

Country Status (3)

Country Link
US (1) US20100295932A1 (fr)
EP (1) EP2256688B1 (fr)
JP (1) JP5214538B2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017115432A1 (de) 2017-07-10 2019-01-10 Karl Storz Se & Co. Kg Medical imaging system

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102405431B (zh) 2009-03-11 2015-09-16 Autofocus method and autofocus device
US9025850B2 (en) * 2010-06-25 2015-05-05 Cireca Theranostics, Llc Method for analyzing biological specimens by spectral imaging
US10139613B2 (en) 2010-08-20 2018-11-27 Sakura Finetek U.S.A., Inc. Digital microscope and method of sensing an image of a tissue sample
JP5869239B2 (ja) * 2011-06-21 2016-02-24 浜松ホトニクス株式会社 Optical measurement device, optical measurement method, and optical measurement program
JP6006205B2 (ja) * 2011-06-21 2016-10-12 浜松ホトニクス株式会社 Optical measurement device, optical measurement method, and optical measurement program
US8773543B2 (en) * 2012-01-27 2014-07-08 Nokia Corporation Method and apparatus for image data transfer in digital photographing
KR102330743B1 (ko) * 2012-06-26 2021-11-23 케이엘에이 코포레이션 Algorithmic removal of diffraction from scanning and optical metrology in angle-resolved reflectometry
US9307207B2 (en) * 2013-01-07 2016-04-05 GM Global Technology Operations LLC Glaring reduction for dynamic rearview mirror
DE102013103971A1 (de) 2013-04-19 2014-11-06 Sensovation Ag Method for generating an overall image of an object composed of multiple partial images
JP6143096B2 (ja) * 2013-08-07 2017-06-07 ソニー株式会社 Fundus image processing device and program, and fundus image capturing device
JP6415037B2 (ja) 2013-11-12 2018-10-31 キヤノン株式会社 Imaging apparatus, client apparatus, control methods therefor, and program
WO2015072306A1 (fr) * 2013-11-14 2015-05-21 日本電気株式会社 Image processing system
US10007102B2 (en) 2013-12-23 2018-06-26 Sakura Finetek U.S.A., Inc. Microscope with slide clamping assembly
JP6485105B2 (ja) * 2015-02-23 2019-03-20 株式会社寺岡精工 Merchandise sales data processing device
DE102015219709A1 (de) * 2015-10-12 2017-04-13 Carl Zeiss Microscopy Gmbh Image correction method and microscope
US11280803B2 (en) 2016-11-22 2022-03-22 Sakura Finetek U.S.A., Inc. Slide management system
CN110149484B (zh) * 2019-04-15 2020-07-10 浙江大华技术股份有限公司 Image synthesis method, device, and storage device

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0449896A (ja) * 1990-06-15 1992-02-19 Toshiba Corp Operation monitoring device for multiple electric motors
JPH05260372A (ja) * 1992-03-11 1993-10-08 Konica Corp Image reading device
JP3074967B2 (ja) 1992-10-27 2000-08-07 松下電器産業株式会社 High-dynamic-range imaging/synthesis method and high-dynamic-range imaging device
JP4178608B2 (ja) * 1998-04-16 2008-11-12 株式会社ニコン Solid-state imaging device
JP4282113B2 (ja) * 1998-07-24 2009-06-17 オリンパス株式会社 Imaging apparatus, imaging method, and recording medium storing an imaging program
US6825884B1 (en) * 1998-12-03 2004-11-30 Olympus Corporation Imaging processing apparatus for generating a wide dynamic range image
US20020075563A1 (en) * 2000-02-04 2002-06-20 Olympus Optical Co., Ltd. Camera for microscope and microscope system
JP3740394B2 (ja) * 2001-07-27 2006-02-01 日本電信電話株式会社 Method and apparatus for generating high-dynamic-range video, execution program for the method, and recording medium for the program
JP2004054045A (ja) * 2002-07-22 2004-02-19 Olympus Corp Microscope imaging device
JP2004104561A (ja) * 2002-09-11 2004-04-02 Ikegami Tsushinki Co Ltd CCD camera device
JP4119290B2 (ja) * 2003-03-28 2008-07-16 松下電器産業株式会社 Video processing device and imaging system
US7492391B1 (en) * 2003-07-14 2009-02-17 Arecont Vision, Llc. Wide dynamic range network camera
JP4388327B2 (ja) * 2003-08-25 2009-12-24 オリンパス株式会社 Microscope image capturing apparatus and microscope image capturing method
US7446812B2 (en) * 2004-01-13 2008-11-04 Micron Technology, Inc. Wide dynamic range operations for imaging
JP2005294913A (ja) * 2004-03-31 2005-10-20 Victor Co Of Japan Ltd Imaging device
JP2006030713A (ja) * 2004-07-16 2006-02-02 Alpine Electronics Inc Display device
US7612804B1 (en) * 2005-02-15 2009-11-03 Apple Inc. Methods and apparatuses for image processing
US7623683B2 (en) * 2006-04-13 2009-11-24 Hewlett-Packard Development Company, L.P. Combining multiple exposure images to increase dynamic range
JP4878914B2 (ja) * 2006-05-24 2012-02-15 オリンパス株式会社 Microscope system, microscope control method, and program
JP4878913B2 (ja) * 2006-05-24 2012-02-15 オリンパス株式会社 Microscope system, microscope image synthesis method, and program
JP4898578B2 (ja) * 2007-07-03 2012-03-14 オリンパス株式会社 Image processing system, imaging system, and microscope imaging system
JP5063234B2 (ja) * 2007-07-20 2012-10-31 キヤノン株式会社 Imaging apparatus, imaging system, and operation method of imaging apparatus
TWI351219B (en) * 2007-09-28 2011-10-21 Altek Corp Image capturing apparatus with image-compensating function and method for compensating image thereof
KR101257942B1 (ko) * 2008-04-23 2013-04-23 고려대학교 산학협력단 Preprocessing method and apparatus in wide-dynamic-range backlight compensation image processing
JP5370056B2 (ja) * 2008-11-04 2013-12-18 オムロン株式会社 Image processing device
US8406569B2 (en) * 2009-01-19 2013-03-26 Sharp Laboratories Of America, Inc. Methods and systems for enhanced dynamic range images and video from multiple exposures
US8248481B2 (en) * 2009-04-08 2012-08-21 Aptina Imaging Corporation Method and apparatus for motion artifact removal in multiple-exposure high-dynamic range imaging


Also Published As

Publication number Publication date
JP2010272094A (ja) 2010-12-02
JP5214538B2 (ja) 2013-06-19
US20100295932A1 (en) 2010-11-25
EP2256688A1 (fr) 2010-12-01

Similar Documents

Publication Publication Date Title
EP2256688B1 (fr) Image obtaining apparatus, image synthesis method, and microscope system
CN104038702B (zh) Imaging apparatus and control method thereof
JP4980982B2 (ja) Imaging apparatus, imaging method, focus control method, and program
US20040061796A1 Image capturing apparatus
US20100026853A1 Method and system for synchronizing a flash to an imager
JP2013515442A (ja) Generation of a high-dynamic-range image using a still image and a preview image
JP2007150643A (ja) Solid-state imaging element, driving method of solid-state imaging element, and imaging apparatus
JP2001086395A (ja) Imaging apparatus
JP2012050073A (ja) Imaging apparatus and image synthesis program
US20050264688A1 Pre-strobo light emission in solid image pickup device
WO2017170716A1 (fr) Image capturing device, image processing device, and electronic apparatus
JP2003324656A (ja) Imaging apparatus and method, recording medium, and program
JP2009300831A (ja) Microscope imaging system, exposure adjustment program, and exposure adjustment method
JP6261397B2 (ja) Imaging apparatus and control method thereof
JP2009284136A (ja) Electronic camera
JP5387341B2 (ja) Imaging apparatus
JP2008099260A (ja) Image processing device, electronic camera, and image processing program
JP5452269B2 (ja) Imaging apparatus
JP5515852B2 (ja) Imaging apparatus and image generation program
JP2010192957A (ja) Electronic camera
JP2009021909A (ja) Image addition apparatus, method, and program
JP7086774B2 (ja) Imaging apparatus, imaging method, and program
JPWO2017170717A1 (ja) Imaging device, focus adjustment device, and electronic apparatus
JP2008283477A (ja) Image processing device and image processing method
JP5022802B2 (ja) Imaging apparatus and control method thereof

Legal Events

Date Code Title Description

PUAI: Public reference made under article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)
AK: Designated contracting states (kind code of ref document: A1): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR
AX: Request for extension of the European patent; extension states: AL BA ME RS
17P: Request for examination filed; effective date: 20101125
17Q: First examination report despatched; effective date: 20111117
GRAP: Despatch of communication of intention to grant a patent (original code: EPIDOSNIGR1)
INTG: Intention to grant announced; effective date: 20140710
GRAS: Grant fee paid (original code: EPIDOSNIGR3)
GRAA: (Expected) grant (original code: 0009210)
AK: Designated contracting states (kind code of ref document: B1): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR
REG (references to national codes): GB FG4D; AT REF, ref document 699746 (kind code T), effective 20141215; CH EP; IE FG4D; DE R096, ref document 602010020639, effective 20150115; NL VDEP, effective 20141203; AT MK05, ref document 699746 (kind code T), effective 20141203; LT MG4D
PG25: Lapsed in a contracting state because of failure to submit a translation of the description or to pay the fee within the prescribed time limit, effective 20141203: LT, ES, NL, FI, LV, CY, SE, HR, AT, CZ, SK, RO, EE, PL, DK, MC, IT, SI, BE, MT, BG, SM, TR, MK
PG25: Same ground, other dates: NO (20150303), GR (20150304), PT (20150403), IS (20150403), LU (20150430), HU (20100430, invalid ab initio)
PG25: Lapsed in a contracting state because of non-payment of due fees: GB (20150430), LI (20150430), CH (20150430), IE (20150430), FR (20150430), DE (20201103)
REG: DE R097, ref document 602010020639
PLBE: No opposition filed within time limit (original code: 0009261)
STAA: Status: no opposition filed within time limit
26N: No opposition filed; effective date: 20150904
REG: CH PL; IE MM4A; FR ST, effective 20151231
GBPC: GB: European patent ceased through non-payment of renewal fee; effective date: 20150430
PGFP: Annual fee paid to national office: DE, payment date 20190418, year of fee payment 10
REG: DE R119, ref document 602010020639