WO2005022901A1 - Imaging system diagnostic apparatus, imaging system diagnostic program, imaging system diagnostic program product, and imaging apparatus - Google Patents
- Publication number
- WO2005022901A1 (PCT/JP2004/012168)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging
- unit
- focus
- monitoring
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
Definitions
- Imaging system diagnostic apparatus, imaging system diagnostic program, imaging system diagnostic program product, and imaging apparatus
- the present invention relates to an imaging system diagnostic apparatus, an imaging system diagnostic program, an imaging system diagnostic program product, and an imaging apparatus for suitably using an electronic camera or the like.
- Patent Document 1 discloses an optical character reader.
- in Patent Document 2, if it is determined that there is a pixel whose read-out value is equal to or less than a predetermined value with respect to a shading correction plate that provides the reference white level for shading correction, a notice or warning is displayed indicating that a stain adheres to the correction plate.
- Patent Document 3 discloses a method of detecting a contaminated portion in an optical path and physically cleaning the contaminated location with compressed air.
- Patent Document 1 JP-A-63-221766
- Patent Document 2 JP-A-4-271663
- Patent Document 3 JP-A-11-27475
- the present invention provides an apparatus for suitably diagnosing the state of dust adhesion when using an electronic camera.
- according to one aspect, an imaging system diagnostic apparatus includes an image acquisition unit that acquires an image of a uniform surface photographed through an optical system having a variable aperture with the aperture stopped down beyond a predetermined aperture value, and a monitoring unit that monitors a foreign substance in the optical path based on the image.
- according to another aspect, an imaging system diagnostic apparatus includes an image acquisition unit that acquires an image captured in an out-of-focus state through an optical system, and a monitoring unit that monitors a foreign substance in the optical path based on the image.
- preferably, the monitoring unit monitors the amount of foreign matter in the optical path.
- preferably, the monitoring unit creates defect information of pixels affected by the foreign matter based on the image acquired by the image acquisition unit, calculates the area ratio of defective pixels in the image from the created defect information, and monitors the amount of foreign matter from that ratio.
- preferably, when the area ratio of defective pixels exceeds a predetermined value, the monitoring unit notifies the photographer with a warning to that effect.
- according to another aspect, an imaging system diagnostic apparatus includes an image acquisition unit that acquires an image, and a monitoring unit that creates defect information of pixels affected by foreign matter in the optical path based on the acquired image, calculates the area ratio of defective pixels in the image from the created defect information, monitors the amount of foreign matter in the optical path, and issues a warning notification to the photographer if the area ratio of defective pixels exceeds a predetermined value.
- the warning notification is a warning that recommends physical cleaning of foreign matter.
- preferably, for each pixel constituting the image acquired by the image acquisition unit, the monitoring unit calculates the relative ratio between the value of the pixel of interest and the average value of a plurality of pixels within a predetermined range including the pixel of interest, creates defect information for the image based on the relative ratio, and monitors the amount of foreign matter based on the defect information.
- preferably, the monitoring unit compares the relative ratio with a plurality of thresholds different from 1, calculates, for each threshold, the area ratio of pixels whose relative ratio lies on the side of the threshold farther from 1, and determines for each threshold whether the calculated area ratio exceeds a predetermined area ratio.
- preferably, the monitoring unit sets the predetermined area ratio smaller as the threshold compared with the relative ratio is set farther below 1.
- preferably, the aperture value stopped down beyond the predetermined aperture value is substantially the most stopped-down (minimum) aperture.
- according to another aspect, an imaging apparatus includes an imaging unit that images a subject through an optical system having a variable aperture, a mode setting unit that sets a foreign substance monitoring mode for monitoring foreign substances in the optical path from the optical system to the imaging unit, an aperture control unit that, when the foreign substance monitoring mode is set, controls the aperture of the optical system to an aperture value stopped down beyond a predetermined aperture value, and a monitoring unit that, when the foreign substance monitoring mode is set, monitors foreign substances based on an image captured by the imaging unit with the aperture stopped down beyond the predetermined aperture value.
- preferably, the aperture value stopped down beyond the predetermined aperture value is substantially the most stopped-down aperture.
- according to another aspect, an imaging apparatus includes an imaging unit that captures an image of a subject through an optical system, a mode setting unit that sets a foreign substance monitoring mode for monitoring foreign substances on the optical path from the optical system to the imaging unit, an instruction unit that, when the foreign substance monitoring mode is set, instructs the photographer to capture an image of a nearby object, a focus control unit that sets the focal point of the optical system to infinity, and a monitoring unit that monitors foreign substances based on an image of the close subject captured by the imaging unit with the focus of the optical system set to infinity.
- according to another aspect, an imaging apparatus includes a focus control unit that automatically controls the focus of the optical system, an imaging unit that captures an image of a subject through the optical system, a mode setting unit that sets a foreign substance monitoring mode for monitoring foreign substances in the optical path from the optical system to the imaging unit, and a monitoring unit that monitors foreign substances based on an image captured by the imaging unit when the foreign substance monitoring mode is set.
- when the monitoring mode is set, the focus of the optical system is shifted from the in-focus state to an out-of-focus state, and the monitoring unit monitors the foreign matter based on an image captured by the imaging unit with the focus of the optical system in the out-of-focus state.
- preferably, the monitoring unit monitors the amount of foreign matter in the optical path.
- an imaging system diagnostic program causes a computer to execute the function of any one of the eleventh to thirteenth aspects.
- an imaging system diagnostic program product has the program of the nineteenth aspect.
- FIG. 1 is a diagram showing a configuration of an electronic camera of an interchangeable lens system.
- FIG. 2 is a block diagram of an electronic camera, showing a personal computer (PC) and peripheral devices.
- FIG. 3 is a flowchart of a process of monitoring dust attached to the imaging surface and diagnosing the amount of dust attached.
- FIG. 4 is a diagram showing a state where local normalization processing is performed on a luminance plane.
- FIG. 5 is a diagram showing a histogram of a transmittance map.
- FIG. 6 is a flowchart showing a dust accumulation amount monitoring process.
- FIG. 7 is a diagram showing classification according to the size of dust.
- FIG. 8 is a flowchart illustrating an out-of-focus image capturing process according to a second embodiment.
- FIG. 9 is a flowchart illustrating an out-of-focus image capturing process according to a third embodiment.
- FIG. 10 is a flowchart showing an out-of-focus image capturing process according to a fourth embodiment.
- FIG. 11 is a diagram showing how the transmittance of a relatively large dust is converted by F-number conversion.
- FIG. 12 is a diagram showing how a program is provided via a recording medium such as a CD-ROM or a data signal from the Internet or the like.
- FIG. 1 is a diagram showing a configuration of an interchangeable lens type single-lens reflex electronic still camera (hereinafter, referred to as an electronic camera).
- the electronic camera 1 includes a camera body 2 and a variable optical system 3 including a mount-type interchangeable lens.
- the variable optical system 3 has a lens 4 and an aperture 5 inside.
- although the lens 4 is actually composed of a plurality of optical lens groups, it is represented in the figure by a single lens, and the position of the lens 4 is referred to as the main pupil position (hereinafter, simply referred to as the pupil position).
- the variable optical system 3 may be a zoom lens.
- the pupil position is a value determined by the lens type and the zoom position of the zoom lens, and may change depending on the focal length.
- the camera body 2 has a shutter 6, optical components 7 such as an optical filter and a cover glass, and an imaging element 8.
- the variable optical system 3 is detachable from the mount 9 of the camera body 2. Further, the variable optical system 3 transmits optical parameters such as information on the pupil position and information on the aperture value to the control unit 17 (FIG. 2) of the electronic camera 1 via the mount unit 9.
- the aperture value changes, for example, from F2.8 to F22.
- reference numeral 10 denotes dust adhering to the surface of the optical component 7 in front of the image sensor 8. A method for diagnosing the amount of dust adhering to the imaging surface and prompting the user to perform cleaning is described below.
- FIG. 2 is a block diagram of the electronic camera 1 and shows a PC (personal computer) 31 and peripheral devices.
- the electronic camera 1 includes a variable optical system 3, an optical component 7, a shutter 6 (not shown in FIG. 2), an image sensor 8, an analog signal processing unit 12, an A/D conversion unit 13, a timing control unit 14, an image processing unit 15, an operation unit 16, a control unit 17, a memory 18, a compression/expansion unit 19, a display image generation unit 20, a monitor 21, a memory card interface unit 22, and an external interface unit 23.
- the imaging element 8 captures an image of a subject through the variable optical system 3, and outputs an image signal (imaging signal) corresponding to the captured subject image.
- the image sensor 8 has a rectangular imaging region composed of a plurality of pixels, and sequentially outputs to the analog signal processing unit 12, pixel by pixel, an image signal that is an analog signal corresponding to the electric charge accumulated in each pixel.
- the imaging device 8 is configured by, for example, a single-plate color CCD.
- the analog signal processing unit 12 includes a CDS (correlated double sampling) circuit, an AGC (automatic gain control) circuit, and the like, and performs predetermined analog processing on the input image signal.
- the A/D conversion unit 13 converts the analog signal processed by the analog signal processing unit 12 into a digital signal.
- the timing control unit 14 is controlled by the control unit 17 and controls the timing of each operation of the image sensor 8, the analog signal processing unit 12, the A / D conversion unit 13, and the image processing unit 15.
- the memory card interface section 22 interfaces with a memory card (card-shaped removable memory) 30.
- the external interface unit 23 interfaces with an external device such as the PC 31 via a predetermined cable or a wireless transmission path.
- the operation unit 16 corresponds to a release button, a selection button for mode switching, and the like.
- the monitor 21 displays various menus and displays a reproduced image based on a subject image captured by the image sensor 8 and image data stored in a memory card.
- the output of the operation unit 16 is connected to the control unit 17, and the output of the display image generation unit 20 is connected to the monitor 21.
- the image processing unit 15 is formed of, for example, a one-chip microprocessor dedicated to image processing.
- the A/D conversion unit 13, the image processing unit 15, the control unit 17, the memory 18, the compression/expansion unit 19, the display image generation unit 20, the memory card interface unit 22, and the external interface unit 23 are connected to one another via a bus 24.
- a monitor 32, a printer 33, and the like are connected to the PC 31, and an application program recorded on a CD-ROM 34 is installed in advance.
- the PC 31 includes a memory card interface unit (not shown) that interfaces with the memory card 30, and an external interface unit that interfaces with the electronic camera 1 or the like via a predetermined cable or wireless transmission path.
- the control unit 17 performs timing control of the image sensor 8, the analog signal processing unit 12, and the A/D conversion unit 13 via the timing control unit 14.
- the imaging element 8 generates an image signal corresponding to the optical image formed on the imaging area by the variable optical system 3.
- the image signal is subjected to predetermined analog signal processing in an analog signal processing unit 12 and output to the A / D conversion unit 13 as an image signal after analog processing.
- the A / D conversion unit 13 digitizes the image signal after the analog processing and supplies it to the image processing unit 15 as image data.
- the image processing unit 15 performs image processing such as interpolation, gradation conversion, and edge enhancement on such image data.
- the image data on which such image processing has been completed is subjected to predetermined compression processing by a compression / expansion unit 19 as necessary, and is recorded on a memory card 30 via a memory card interface unit 22.
- the image data on which the image processing has been completed may be recorded on the memory card 30 without performing the compression processing.
- it is assumed that interpolation processing has been completed on the image data on which the image processing is completed, and that each pixel has color information for all of the RGB color components. Based on such image data, a program stored in the memory 18 of the electronic camera 1 is executed according to the procedure described below to determine the amount of dust attached. It is also possible to provide the image data to the PC 31 via the memory card 30 and determine the amount of dust attached with a program stored in the PC; in that case, the data may also be provided to the PC 31 via the external interface 23 and a predetermined cable or wireless transmission path. Hereinafter, in the present embodiment, processing is performed by the program in the camera.
- FIG. 3 is a flowchart of a process of diagnosing the amount of dust adhering to the imaging surface and the like based on a captured reference image, and of notifying the user when the accumulated amount of dust exceeds the limit for normal use.
- in step S1, it is determined whether or not the electronic camera 1 is being operated with the “dust diagnosis mode (foreign matter monitoring mode)” set.
- the dust diagnosis mode is a mode set when performing dust diagnosis of the electronic camera 1, and is a mode set when the user shoots a reference image described below.
- the electronic camera 1 automatically performs the settings of the electronic camera 1 necessary for capturing the reference image, and issues an instruction to the user to capture the reference image.
- This instruction may be a method of displaying a message on the monitor 21 or a method of giving an instruction by voice.
- in step S20, a luminance plane is generated. For each pixel [i, j] of the reference image data, a luminance signal is generated from the RGB signals using equation (1); [i, j] indicates the position of the pixel.
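As a concrete illustration of this step, the sketch below generates a luminance plane from RGB reference-image data. Since equation (1) is not reproduced in this text, the (R + 2G + B) / 4 weighting used here is an assumption chosen for illustration, not necessarily the patent's actual formula.

```python
import numpy as np

def luminance_plane(rgb):
    """Generate a luminance plane Y[i, j] from RGB reference-image data.

    The (R + 2G + B) / 4 weighting is an illustrative assumption; the
    patent's equation (1) is not reproduced in the text above.
    """
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    return (r + 2.0 * g + b) / 4.0
```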
- in step S30, a transmittance map is generated by the following processing (gain map extraction).
- as described above, the reference image data is not always completely uniform, so the generated luminance plane is not completely uniform either.
- local normalization of the pixel values is therefore performed on such a luminance plane, and the transmittance signal T[i, j] of each pixel is calculated using equation (2). That is, the relative ratio between the value of the pixel of interest [i, j] and the average value of the pixels in a local range including this pixel is calculated for each pixel.
- in this way, non-uniformities such as gradation and shading included in the uniform surface data are eliminated algorithmically without any problem, and only the decrease in transmittance due to dust shadows, which is what matters, can be extracted.
- the transmittance over the entire image obtained in this way is called a transmittance map (gain map).
- the transmittance map represents the defect information of the reference image.
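The local normalization of equation (2) can be sketched as follows: each pixel of the luminance plane is divided by the mean of a small window centred on it, so uniform gradation cancels out and only dust shadows leave a transmittance below 1. The 3 × 3 default window here is for illustration only; the text recommends a window larger than the largest expected dust shadow.

```python
import numpy as np

def transmittance_map(lum, a=1, b=1):
    """Sketch of equation (2): divide each pixel of interest by the mean
    of the (2a+1) x (2b+1) window centred on it.  Edge pixels use the
    window clipped to the image (an implementation choice made here)."""
    h, w = lum.shape
    t = np.empty((h, w), dtype=np.float64)
    for i in range(h):
        for j in range(w):
            win = lum[max(0, i - b):i + b + 1, max(0, j - a):j + a + 1]
            t[i, j] = lum[i, j] / win.mean()
    return t
```

On a perfectly uniform plane the map is 1 everywhere; a dust shadow shows up as a local value below 1 regardless of overall shading.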
- the pixel value is the value of the color signal (color information) of a color component or the luminance signal (luminance information) in each pixel; for example, if represented by 1 byte, it takes a value from 0 to 255.
- the range of (2a + 1) × (2b + 1) pixels over which the local average is taken is set larger than the maximum assumed dust diameter. Ideally, if the area is about three times the size of the dust shadow, accurate transmittance data can be obtained.
- a indicates the number of pixels extending left and right around the pixel of interest [i, j], and b indicates the number of pixels extending up and down around the pixel of interest [i, j].
- for example, when the pixel pitch of the image sensor 8 is 12 μm and the distance between the imaging surface and the dust-adhering surface is 1.5 mm, the diameter of a giant dust shadow is about 15 pixels when the aperture value is F22, and becomes even larger when the aperture value is F4; accordingly, the local-average range may be fixed at, for example, 101 × 101 pixels. These are merely examples, and a pixel range based on another number of pixels may be used.
- FIG. 4 is a diagram showing a state in which local normalization processing has been performed on the luminance plane.
- FIG. 4(a) illustrates the luminance signals of pixels along a certain horizontal line in the luminance plane. Reference numerals 41 and 42 indicate portions where the luminance signal is reduced due to dust.
- FIG. 4(b) shows the result of performing the above-described local normalization processing on the luminance signal of FIG. 4(a).
- reference numerals 43 and 44 correspond to reference numerals 41 and 42 in FIG. 4(a) and indicate the transmittance at portions where dust exists. In this way, non-uniformities such as gradation and shading included in the uniform surface data are eliminated, and only the decrease in transmittance due to dust shadows is extracted, so that the position of the dust and the degree of its transmittance can be determined simultaneously.
- the low-pass processing of the transmittance map may be made selectable, but it is preferable to include it because it is effective in most cases. Since the transmittance signal T[i, j] contains random noise due to quantum fluctuation of the luminance signal, at transmittance levels close to 1 where the influence of a dust shadow faintly remains, that randomness may cause the dust shadow to be extracted as scattered spots when the threshold determination in 3-3) below is performed. Grouping such dust shadows with a low-pass filter according to the following equation (3) makes the result somewhat cleaner.
- T[i, j] = {4·T[i, j] + 2·(T[i−1, j] + T[i+1, j] + T[i, j−1] + T[i, j+1]) + (T[i−1, j−1] + T[i−1, j+1] + T[i+1, j−1] + T[i+1, j+1])} / 16 ... (3)
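A minimal sketch of this low-pass step, assuming the 4/2/1-weighted 3 × 3 kernel suggested by the leading 4·T[i, j] term of equation (3):

```python
import numpy as np

def lowpass_transmittance(t):
    """3x3 weighted smoothing of the transmittance map with 4/2/1
    weights normalized by 16 (assumed form of equation (3)).  Border
    pixels are left unchanged for simplicity, an implementation choice
    made here rather than taken from the patent."""
    out = t.copy()
    for i in range(1, t.shape[0] - 1):
        for j in range(1, t.shape[1] - 1):
            out[i, j] = (4 * t[i, j]
                         + 2 * (t[i - 1, j] + t[i + 1, j]
                                + t[i, j - 1] + t[i, j + 1])
                         + (t[i - 1, j - 1] + t[i - 1, j + 1]
                            + t[i + 1, j - 1] + t[i + 1, j + 1])) / 16.0
    return out
```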
- next, the average value M of the transmittance map is calculated by the following equation (4):
- M = (1 / (Nx·Ny)) · Σ T[i, j] ... (4)
- and a statistical analysis is performed to determine the standard deviation σ by the following equation (5):
- σ = sqrt( (1 / (Nx·Ny)) · Σ (T[i, j] − M)² ) ... (5)
- here, Nx and Ny represent the total numbers of pixels in the x and y directions.
- this holds because the area ratio of dust signals in the transmittance map is sufficiently small.
- the result of the statistical analysis is considered to evaluate the random noise (shot noise) caused by quantum fluctuation of the transmittance signal.
- reference numeral 46, an enlargement of reference numeral 45 in FIG. 4, shows this fine, random noise.
- the distribution is a normal distribution with standard deviation σ centered on the average value M (M is a value close to 1).
- FIG. 5 is a diagram showing a histogram of the transmittance map. Transmittance variations within this fluctuation range cannot be regarded as changes in transmittance due to dust shadows, so such values may be forcibly set to 1. That is, threshold determination is performed according to conditions (6) and (7).
- since the average value M used for the judgment is always a value close to 1, it may be replaced with 1.
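The statistical clean-up described by equations (4) to (7) can be sketched as below; since the exact thresholds of conditions (6) and (7) are not reproduced in this text, treating fluctuations within ±3σ of the mean as shot noise is an assumption made here for illustration.

```python
import numpy as np

def clean_transmittance(t, k=3.0):
    """Compute the mean M (eq. (4)) and standard deviation sigma
    (eq. (5)) of the transmittance map, then forcibly set to 1 every
    pixel whose deviation from M lies within k*sigma.  The k = 3 band
    is an assumption; fluctuations of that size are attributed to shot
    noise rather than dust shadows."""
    m = t.mean()
    sigma = t.std()
    out = t.copy()
    out[np.abs(t - m) <= k * sigma] = 1.0
    return out, m, sigma
```

A deep dust shadow survives this clamp while noise-level ripple is flattened to exactly 1, which is what makes the later area-ratio counting stable.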
- the transmittance map described above may be called a gain map because it represents a local relative gain.
- conventionally, detection of defects such as dust has been performed with a differential filter for edge detection.
- however, when dust in the middle of the optical path is targeted, it appears as a shadow with very low contrast against its surroundings because of optical blurring, and the sensitivity of a differential filter is so poor that such faint dust is almost impossible to detect.
- in contrast, the judgment method using the statistical properties of the transmittance described above enables extremely sensitive dust detection.
- in step S40, the dust accumulation amount is monitored according to the flowchart shown in FIG. 6. Using the transmittance map detected with high sensitivity above, the number of pixels affected by dust is counted and the area ratio to the total number of pixels is calculated. At this time, the dust is classified by transmittance, and the area ratio is calculated for each class.
- in the present embodiment, dust is separated into the four classes shown in FIG. 7. The class containing all dust, from small to large, is defined as first-class dust 7a, and the number of pixels considered to be affected by the first-class dust 7a is denoted N1.
- the class excluding extremely small dust from the first-class dust 7a is defined as second-class dust 7b, and the number of pixels considered to be affected by the second-class dust 7b is denoted N2.
- the class obtained by further removing small dust from the second-class dust 7b is defined as third-class dust 7c, and the number of pixels considered to be affected by the third-class dust 7c is denoted N3.
- the class including only large dust is defined as fourth-class dust 7d, and the number of pixels considered to be affected by the fourth-class dust 7d is denoted N4.
- in step S120, threshold determination on the transmittance is performed based on conditions (11), (12), (13), and (14), and according to that determination, the numbers of dust pixels for each class, N1 (step S130), N2 (step S140), N3 (step S150), and N4 (step S160), are counted.
- note that if the thresholds th1, th2, th3, and th4 are instead set to values larger than 1 and transmittances larger than those thresholds, that is, values on the side farther from 1, are detected, the number of pixels affected by striae or the like can be counted, so that defects in optical members other than dust in the optical path can also be detected.
- in step S170, the number of dust pixels in each class is converted into an area ratio.
- with the area ratio of the first-class dust denoted R1, that of the second-class dust R2, that of the third-class dust R3, and that of the fourth-class dust R4, the area ratio for each class is calculated according to the following equations (15) to (18):
- R1 = N1 / (total number of pixels) ... (15)
- R2 = N2 / (total number of pixels) ... (16)
- R3 = N3 / (total number of pixels) ... (17)
- R4 = N4 / (total number of pixels) ... (18)
- R4 represents the accumulated amount of only extremely large dust, while R1 represents the accumulated amount of all dust, from small to large.
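The counting and conversion of steps S120 to S170 can be sketched as follows. The four threshold values below are illustrative assumptions (conditions (11) to (14) are not reproduced in this text); th1 is chosen closest to 1 so the first class captures all dust, and th4 farthest below 1 so the fourth class captures only the largest, darkest shadows.

```python
import numpy as np

def dust_area_ratios(t, thresholds=(0.95, 0.9, 0.8, 0.7)):
    """Count pixels whose transmittance falls at or below each of four
    thresholds (N1..N4) and convert each count to an area ratio
    R = N / (total number of pixels), as in equations (15)-(18).
    The threshold values are hypothetical, not from the patent."""
    total = t.size
    counts = [int(np.count_nonzero(t <= th)) for th in thresholds]
    ratios = [n / total for n in counts]
    return counts, ratios
```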
- if the accumulated amount for any dust class calculated in step S170 exceeds the limit that causes a problem in normal use, the user is notified.
- in step S180, a determination is made based on the conditions of the following equations (19) and (20):
- R1 > rate_th1 ... (19)
- R4 > rate_th4 ... (20)
- if the accumulated amount of dust satisfies either condition, a determination to warn the user is made in step S190 (hereinafter, warning ON); if it satisfies neither condition, a determination not to issue a warning is made in step S200 (hereinafter, warning OFF).
- here, rate_th1 is a value on the order of 0.1%, rate_th4 is a value on the order of 0.01% to 0.001%, and rate_th2 and rate_th3 may be set to intermediate values. That is, the threshold for the area ratio on the first-class dust side is made larger than the threshold for the area ratio on the fourth-class dust side. This is because the reference image is taken at the minimum aperture, and small dust may disappear if the aperture is opened even slightly: extremely small dust that appears in the reference image is unlikely to cause a problem when, in normal use, the aperture is set wider than the minimum.
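The warning ON/OFF decision of steps S180 to S200 can be sketched as below; the default limits (0.1% for R1 and 0.001% for R4) follow the orders of magnitude given in the text and are assumptions, not exact patent values.

```python
def cleaning_warning(r1, r4, rate_th1=0.001, rate_th4=0.00001):
    """Decide warning ON (True) or OFF (False) from the class-1 and
    class-4 area ratios, as in conditions (19) and (20).  The default
    limits correspond to 0.1% and 0.001% and are illustrative
    assumptions."""
    return r1 > rate_th1 or r4 > rate_th4
```

Because rate_th4 is far smaller than rate_th1, even a tiny area of very large dust triggers the warning, while the same area of faint, small dust does not.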
- a warning is issued to the user in step S60 in FIG. 3 to prompt the user to remove dust.
- when a warning is issued, it means that the image quality of the output image of the electronic camera will inevitably deteriorate unless the dust is physically removed.
- a warning lamp may be provided in the electronic camera 1 and the warning lamp may blink, or a message urging cleaning may be displayed on the monitor 21.
- a warning by voice may be issued.
- as described above, the amount of accumulated dust is calculated based on the reference image, but strict uniformity is not required when photographing the uniform surface for the reference image, so the diagnosis can be performed relatively easily. Furthermore, detection with far higher sensitivity than conventional dust detection methods is possible.
- in the second embodiment, an image photographed with the focus not achieved (hereinafter, an out-of-focus image) is used to calculate the amount of accumulated dust, and the user is notified as needed when the amount of dust exceeds the limit that poses a problem in normal use, as explained below.
- the electronic camera 1 has an autofocus control unit (not shown).
- the autofocus control unit is controlled by the control unit 17, detects the distance to the subject and the focal position, and controls the variable optical system 3 to automatically focus on the basis of the detected distance and the focal position.
- to detect the distance to the subject and the focal position, an electronic camera typically moves a focus lens (focusing optical system, not shown) back and forth to find the position with the highest contrast, and determines that position as the focal point.
- by controlling the variable optical system 3 so that the focal point thus obtained is in focus, a focused image can be obtained.
- the above-described method of determining a position having a high contrast as a focal point is an example, and another autofocus mechanism may be used.
- in the present embodiment, instead of moving the focus lens to the in-focus position determined by the autofocus control unit, the focus lens is controlled to move in the opposite direction, that is, toward the infinity side located on the opposite side from the in-focus position near the close subject, so that an out-of-focus image is obtained.
- FIG. 8 is a flowchart of the process of diagnosing the amount of dust attached to the imaging surface based on the captured out-of-focus image and notifying the user when the accumulated amount of dust exceeds the limit for normal use. The following processing is performed by executing a program stored in the memory 18 of the electronic camera 1.
- first, an out-of-focus image is captured in the following procedure.
- an out-of-focus image here is an image of a subject closer than the shortest shooting distance, photographed with the focus deliberately set to the side opposite to the in-focus state that autofocus would select. Thereby, even when no uniform subject is available nearby, a blurred image corresponding to the reference image in the first embodiment can be obtained.
- when photographing the out-of-focus image, as when photographing the above-mentioned reference image, it is assumed that the photograph is taken with the aperture stopped down to the maximum of the variable range provided in the variable optical system 3.
- the smallest aperture value is around F22 for a standard lens, for example.
- in step S220, the electronic camera 1 instructs the user to photograph a nearby object.
- as a method of giving the instruction to the user, displaying a message on the monitor 21 or giving the instruction by voice is conceivable.
- before that, the user needs to tell the electronic camera 1 that an out-of-focus image will be shot. To do this, the user sets the “dust diagnosis mode” or “out-of-focus image shooting mode” on the electronic camera 1 using the menu screen; when one of these modes is activated, the camera can determine that an out-of-focus image is to be captured.
- as the close target, a subject closer than the shortest shooting distance is used for safety. The shortest shooting distance is the closest distance at which focus can still be achieved, and its value differs depending on the lens; in general, it is about 20 cm or more. Therefore, if a close subject at about 2 cm to 10 cm is photographed, a blurred image suitable as an out-of-focus image equivalent to the reference image can be reliably obtained even with a wide-angle lens.
- in step S230, the electronic camera 1 automatically sets the focus to infinity using the autofocus control unit.
- since the subject is closer than the shortest shooting distance, the image is out of focus to begin with, and by controlling the focus in the direction opposite to the subject, an even more blurred image, that is, an out-of-focus image, is obtained.
- an out-of-focus image is photographed in step S240.
- The subsequent processing is the same as in the first embodiment: a luminance plane is generated in step S20, a transmittance map is generated in step S30, the amount of adhering dust is monitored in step S40, whether to notify the cleaning time is determined in step S50, and the cleaning time is notified in step S60. In this way, the amount of adhering dust can be detected, and the user can be notified of the cleaning time.
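The chain of steps S20–S40 can be sketched as below. This is a minimal illustration under stated assumptions, not the patent's actual algorithm: the luminance plane is taken as a standard RGB-to-luma weighting, and the transmittance map as the ratio of each pixel to its local mean, so that dust shadows show up as values below 1.0:

```python
import numpy as np

def luminance_plane(rgb):
    """Step S20: collapse an RGB image into a single luminance plane
    (standard Rec. 601 luma weights, assumed here for illustration)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def local_mean(lum, win=5):
    """Box-filter mean over a win x win neighborhood (edge-padded)."""
    pad = win // 2
    p = np.pad(lum, pad, mode="edge")
    out = np.zeros_like(lum, dtype=float)
    h, w = lum.shape
    for dy in range(win):
        for dx in range(win):
            out += p[dy:dy + h, dx:dx + w]
    return out / (win * win)

def transmittance_map(lum, win=5):
    """Step S30: ratio of each pixel to its local mean. A uniformly
    blurred area gives ~1.0; dust shadows give values below 1.0."""
    return lum / np.maximum(local_mean(lum, win), 1e-6)

def dust_amount(tmap, thresh=0.9):
    """Step S40: fraction of pixels whose transmittance fell below the
    threshold, used as the monitored amount of adhering dust."""
    return float(np.mean(tmap < thresh))
```

A clean, uniformly blurred frame yields a transmittance map of roughly 1.0 everywhere, so the monitored dust fraction stays at zero until shadows appear.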
- According to the second embodiment described above, even when no uniform subject is available in the surroundings, the user can obtain a blurred image corresponding to the reference image in the first embodiment, so dust detection can be performed regardless of location. Further, as in the first embodiment, detection with far higher sensitivity than conventional dust detection methods is possible.
- In the second embodiment, the user is instructed to photograph a nearby object and, at the same time, the autofocus is driven in the out-of-focus direction, that is, toward infinity, to obtain an out-of-focus image. The third embodiment also handles the case in which an in-focus state is detected even though the autofocus has been set to infinity.
- The configurations of the electronic camera 1 and of the PC 31 as the image processing device are the same as in the second embodiment, and their description is therefore omitted.
- In the third embodiment, the autofocus control unit controls the movement of the focus lens from the autofocused position toward the opposite side, that is, toward the infinity side or the shortest-shooting-distance side located opposite the in-focus position, to obtain an out-of-focus image.
- FIG. 9 is a flowchart of the process of diagnosing the amount of dust adhering to the imaging surface based on the captured out-of-focus image and notifying the user when the accumulated amount of dust exceeds the limit for normal use. The following processing is performed by executing a program stored in the memory 18 of the electronic camera 1.
- When it is determined in step S310 that the mode is the dust diagnosis mode, an out-of-focus image is captured by the following procedure. As with the reference image, it is assumed that the shot is taken with the aperture stopped down to the smallest opening within the variable range provided by the variable optical system 3.
- In step S320, the electronic camera 1 instructs the user to photograph a nearby object. This is the same as the process of step S220 in FIG. 8 in the second embodiment, and its description is therefore omitted.
- In step S330, the electronic camera 1 automatically sets the autofocus to infinity using the autofocus control unit. Since the user was instructed in step S320 to photograph the nearest object, the image is normally out of focus, and driving the focus in the direction opposite to the subject allows an even more blurred image to be taken.
- If it is determined in step S340 that there is no in-focus point, an out-of-focus image can be captured, so the out-of-focus image is photographed in step S370.
- If it is determined in step S340 that there is an in-focus point, that is, if an in-focus state has been achieved despite the autofocus being driven in the out-of-focus direction, an out-of-focus image for monitoring the amount of adhering dust cannot be captured. This can occur when the user has photographed a distant object despite being instructed to photograph a nearby one.
- In step S350, the electronic camera 1 automatically sets the autofocus to the direction opposite to infinity, that is, to the near side (shortest shooting distance), using the autofocus control unit.
- If it is determined in step S360 that there is no in-focus point, the process proceeds to step S370, where the out-of-focus image is photographed. If the camera is still in focus, the process returns to step S320, the user is again instructed to photograph the nearest object, and the processing is repeated.
- When re-taking, the electronic camera 1 may notify the user that an out-of-focus image could not be captured and, in step S320, instruct the user to photograph another nearby object different from the previous one; that is, the content of the instruction message may be changed.
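The focus-control loop of steps S320–S370 in FIG. 9 can be sketched as follows. `Camera` here is a hypothetical stand-in for the camera body and its autofocus control unit, not an API from the patent:

```python
def capture_out_of_focus(camera, max_retries=3):
    """Sketch of FIG. 9, steps S320-S370: drive the focus away from the
    subject and only shoot once no in-focus point is found."""
    for _ in range(max_retries):
        camera.instruct_user("Point the camera at a nearby object")  # S320
        camera.set_focus("infinity")                                 # S330
        if not camera.has_focus_point():                             # S340
            return camera.shoot()                                    # S370
        camera.set_focus("nearest")                                  # S350
        if not camera.has_focus_point():                             # S360
            return camera.shoot()                                    # S370
        # Still in focus: re-instruct the user (e.g. with a changed
        # message, per the text) and repeat the process.
    return None  # could not obtain an out-of-focus image
```

Note that the shot is taken on whichever side (infinity or nearest) first yields no in-focus point, mirroring the two checks in the flowchart.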
- The out-of-focus image captured in this way corresponds to the reference image in the first embodiment, and the subsequent processing is the same as in the first embodiment: a luminance plane is generated in step S20, a transmittance map is generated in step S30, the amount of adhering dust is monitored in step S40, whether to notify the cleaning time is determined in step S50, and the cleaning time is notified in step S60.
- In this way, the amount of adhering dust can be detected, and the user can be notified of the cleaning time.
- According to the third embodiment described above, when an in-focus state is achieved even though the autofocus has been set to infinity, the autofocus control unit automatically drives the focus in the direction opposite to infinity, that is, toward close focus (the shortest shooting distance), so that an out-of-focus image can be obtained.
- In the second and third embodiments, the user is first instructed to photograph a nearby object in order to photograph an out-of-focus image. However, when the variable optical system 3 connected to the electronic camera 1 is a telephoto lens or a micro lens for close-up photography, an out-of-focus image can be obtained by driving the focus to the end of its range in the direction opposite to the autofocus in-focus direction, regardless of the distance to the object to be photographed. In the fourth embodiment, a control method is described in which the electronic camera 1 obtains an out-of-focus image automatically, without instructing the user to photograph a nearby object.
- The configurations of the electronic camera 1 and of the PC 31 as the image processing device are the same as in the second embodiment, and their description is therefore omitted.
- In the fourth embodiment, the autofocus control unit does not instruct the user about the object to be photographed; instead, it controls the movement of the focus lens from the autofocused position toward the infinity side or the shortest-shooting-distance side located on the opposite side, to obtain an out-of-focus image.
- FIG. 10 is a flowchart of a process for diagnosing the amount of dust adhering to the imaging surface based on the captured out-of-focus image and notifying the user when the accumulated amount of dust exceeds the limit for normal use. The following processing is performed by executing a program stored in the memory 18 of the electronic camera 1.
- When it is determined in step S410 that the mode is the dust diagnosis mode, an out-of-focus image is captured by the following procedure. As with the reference image, it is assumed that the shot is taken with the aperture stopped down to the smallest opening within the variable range provided by the variable optical system 3.
- In step S420, it is determined whether the variable optical system 3 connected to the electronic camera 1 is a telephoto lens or a micro lens for close-up photography. This is determined by transmitting the type of the connected lens from the variable optical system 3 to the control unit 17 of the electronic camera 1 via the mount unit 9 of the camera body 2. If it is determined in step S420 that the variable optical system 3 is neither a telephoto lens nor a close-up micro lens, the process proceeds to step S430. In this case, as in the second and third embodiments, the user must be instructed to photograph the nearest object. The process of step S430 is the same as steps S320 to S360 of FIG. 9 in the third embodiment, and its description is therefore omitted. If it is determined in step S420 that the variable optical system 3 is a telephoto lens or a close-up micro lens, the process proceeds to step S440, and the following processing is performed.
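The branch in step S420 amounts to the following check. The lens-type strings are illustrative assumptions, since the patent only specifies that the lens type is reported from the variable optical system 3 over the mount unit 9:

```python
def dust_diagnosis_branch(lens_type: str) -> str:
    """Step S420 of FIG. 10: telephoto and close-up micro lenses can be
    driven to the end of their focus range to guarantee blur regardless
    of subject distance, so only other lenses require the user to
    photograph a nearby object first."""
    if lens_type in ("telephoto", "close-up micro"):
        return "S440"  # defocus automatically, no user instruction
    return "S430"      # instruct user, then steps S320-S360 of FIG. 9
```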
- The out-of-focus image captured in this way corresponds to the reference image in the first embodiment, and the subsequent processing is the same as in the first embodiment: a luminance plane is generated in step S20, a transmittance map is generated in step S30, the amount of adhering dust is monitored in step S40, whether to notify the cleaning time is determined in step S50, and the cleaning time is notified in step S60. In this way, the amount of adhering dust can be detected, and the user can be notified of the cleaning time.
- According to the fourth embodiment described above, when the variable optical system 3 connected to the electronic camera 1 is a telephoto lens or a micro lens for close-up photography, an out-of-focus image can be obtained automatically, regardless of the distance to the object to be photographed. The invention according to the present embodiment is most effective in that case, but it is also applicable to other lenses such as standard lenses.
- In the first to fourth embodiments described above, the electronic camera 1 performs "1) shooting a reference image" (first embodiment) or "1) shooting an out-of-focus image" (second to fourth embodiments); the captured image may then be imported to the PC 31, and the processing of 2)-5) above may be performed on the PC.
- In that case, the reference image or the out-of-focus image taken by the electronic camera 1 is provided to the PC 31 via the memory card 30, or via the external interface 23 and a predetermined cable or wireless transmission line.
- The PC 31 that has taken in the reference image or the out-of-focus image functions as an imaging system diagnostic device and, according to a pre-installed imaging system diagnostic program, performs the processing common to the first to fourth embodiments. If, based on the result, it is determined by the processing 5) common to the first to fourth embodiments that the accumulated amount of dust exceeds the limit that causes a problem in normal use, a message prompting the user to clean the electronic camera 1 that captured the image is displayed on the monitor 32. As a result, the dust accumulation amount can be diagnosed on the PC 31 after the image has been taken in.
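The PC-side flow can be sketched as follows; `diagnose` stands in for the common dust-diagnosis processing, and the `limit` threshold and message wording are illustrative assumptions, not values from the patent:

```python
def diagnose_on_pc(image, diagnose, limit=0.01):
    """PC 31 acting as imaging-system diagnostic device: run the common
    dust diagnosis on an imported reference/out-of-focus image and return
    a cleaning prompt when the accumulated dust exceeds the limit that
    causes a problem in normal use (otherwise None)."""
    amount = diagnose(image)
    if amount > limit:
        return (f"Dust level {amount:.3f} exceeds {limit}: "
                "please clean the camera's imaging surface.")
    return None
```

The returned message corresponds to the prompt shown on the monitor 32; a `None` result means no notification is needed.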
- the program executed by the PC 31 can be provided through a recording medium such as a CD-ROM or a data signal such as the Internet.
- FIG. 12 is a diagram showing this state.
- the PC 31 receives the program via the CD-ROM.
- the PC 31 has a function of connecting to the communication line 401.
- the computer 402 is a server computer that provides the program, and stores the program in a recording medium such as the hard disk 403.
- the communication line 401 is a communication line such as the Internet or a computer communication, or a dedicated communication line.
- The computer 402 reads the program from the hard disk 403 and transmits it to the PC 31 via the communication line 401. That is, the program is transmitted as a data signal on a carrier wave via the communication line 401.
- the program can be supplied as a computer-readable computer program product in various forms such as a recording medium and a carrier wave.
- As described above, a reference image or an out-of-focus image is photographed and the diagnosis is performed based on that image, so a diagnosis can be made regardless of the shooting location. Furthermore, detection with far higher sensitivity than conventional dust detection methods is possible.
- Although the interpolation method has been described using the example of an RGB color system with a Bayer array, it goes without saying that the method does not depend at all on the arrangement of the color filters, as long as interpolation processing is ultimately performed. The same applies to other color systems (for example, complementary color systems).
- The present invention is not necessarily limited to the above; it is also applicable to a camera with a non-interchangeable lens, in which case the pupil position and the aperture value may be acquired as appropriate by a known method.
- In the above embodiments, an example has been described in which the image data captured by the electronic camera 1 is processed by the PC (personal computer) 31 to determine the amount of adhering dust, but the invention is not limited to this; a printer or a projection device may also be provided with such a program. That is, the present invention can be applied to any device that handles image data.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04772129A EP1667434A4 (en) | 2003-08-29 | 2004-08-25 | IMAGING SYSTEM DIAGNOSTIC DEVICE, PROGRAM AND PROGRAM PRODUCT, AND IMAGING DEVICE |
JP2005513438A JP4196124B2 (ja) | 2003-08-29 | 2004-08-25 | 撮像系診断装置、撮像系診断プログラム、撮像系診断プログラム製品、および撮像装置 |
US11/362,235 US7598991B2 (en) | 2003-08-29 | 2006-02-27 | Image-capturing system diagnostic device, image-capturing system diagnostic program product and image-capturing device for monitoring foreign matter |
US12/461,946 US8098305B2 (en) | 2003-08-29 | 2009-08-28 | Image-capturing system diagnostic device, image-capturing system diagnostic program product and image-capturing device for monitoring foreign matter |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003307354 | 2003-08-29 | ||
JP2003-307354 | 2003-08-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/362,235 Continuation US7598991B2 (en) | 2003-08-29 | 2006-02-27 | Image-capturing system diagnostic device, image-capturing system diagnostic program product and image-capturing device for monitoring foreign matter |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005022901A1 true WO2005022901A1 (ja) | 2005-03-10 |
Family
ID=34269437
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/012168 WO2005022901A1 (ja) | 2003-08-29 | 2004-08-25 | 撮像系診断装置、撮像系診断プログラム、撮像系診断プログラム製品、および撮像装置 |
Country Status (4)
Country | Link |
---|---|
US (2) | US7598991B2 (ja) |
EP (2) | EP2448245A3 (ja) |
JP (1) | JP4196124B2 (ja) |
WO (1) | WO2005022901A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007124002A (ja) * | 2005-10-25 | 2007-05-17 | Nikon Corp | カメラシステム |
JP2008072416A (ja) * | 2006-09-14 | 2008-03-27 | Samsung Techwin Co Ltd | 撮像装置および撮像方法 |
JP2010534000A (ja) * | 2007-06-15 | 2010-10-28 | アイイーイー インターナショナル エレクトロニクス アンド エンジニアリング エス.エイ. | Tofレンジカメラにおける汚れ検出方法 |
WO2015008566A1 (ja) * | 2013-07-18 | 2015-01-22 | クラリオン株式会社 | 車載装置 |
WO2023112259A1 (ja) * | 2021-12-16 | 2023-06-22 | 株式会社Fuji | 画像処理装置 |
JP7429756B2 (ja) | 2021-10-27 | 2024-02-08 | 阿波▲羅▼智▲聯▼(北京)科技有限公司 | 画像処理方法、装置、電子機器、記憶媒体及びコンピュータプログラム |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7817874B2 (en) | 2006-12-06 | 2010-10-19 | Micron Technology, Inc. | Image sensor occlusion localization and correction apparatus, systems, and methods |
US8103121B2 (en) * | 2007-08-31 | 2012-01-24 | Adobe Systems Incorporated | Systems and methods for determination of a camera imperfection for an image |
US8109442B2 (en) * | 2008-12-23 | 2012-02-07 | Gtech Corporation | Optical reader quality factor |
JP2011078047A (ja) * | 2009-10-02 | 2011-04-14 | Sanyo Electric Co Ltd | 撮像装置 |
DE102009049203B4 (de) * | 2009-10-13 | 2016-10-13 | Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg | Diagnoseeinheit für eine elektronische Kamera und Kamerasystem |
JP5683099B2 (ja) * | 2009-12-18 | 2015-03-11 | キヤノン株式会社 | 画像読取装置、画像読取装置の制御方法、およびプログラム |
TW201326799A (zh) * | 2011-12-21 | 2013-07-01 | Hon Hai Prec Ind Co Ltd | 相機模組污點測試系統及其測試方法 |
US9313379B2 (en) * | 2012-09-24 | 2016-04-12 | Illinois State Toll Highway Authority | Camera washing system |
US9076363B2 (en) * | 2013-01-07 | 2015-07-07 | Apple Inc. | Parallel sensing configuration covers spectrum and colorimetric quantities with spatial resolution |
US10154255B2 (en) * | 2014-01-10 | 2018-12-11 | Hitachi Automotive Systems, Ltd. | In-vehicle-camera image processing device |
JP6472307B2 (ja) * | 2014-04-17 | 2019-02-20 | キヤノン株式会社 | 撮像装置 |
JP2017121679A (ja) * | 2016-01-06 | 2017-07-13 | ファナック株式会社 | ワイヤ放電加工機 |
JP7319597B2 (ja) * | 2020-09-23 | 2023-08-02 | トヨタ自動車株式会社 | 車両運転支援装置 |
US11496728B2 (en) * | 2020-12-15 | 2022-11-08 | Waymo Llc | Aperture health monitoring mode |
WO2023077426A1 (en) * | 2021-11-05 | 2023-05-11 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing device, imaging device, and program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002374445A (ja) * | 2001-06-12 | 2002-12-26 | Nikon Corp | 電子カメラ |
JP2003023563A (ja) * | 2001-07-10 | 2003-01-24 | Canon Inc | カメラの異物検出装置及び異物検出方法 |
JP2004172835A (ja) * | 2002-11-19 | 2004-06-17 | Minolta Co Ltd | 撮像装置 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63221766A (ja) | 1987-03-11 | 1988-09-14 | Canon Inc | 画像読取装置 |
JPH04271663A (ja) | 1991-02-27 | 1992-09-28 | Tokyo Electric Co Ltd | イメージスキャナー |
JPH087332A (ja) * | 1994-06-16 | 1996-01-12 | Olympus Optical Co Ltd | 情報記録媒体及び読取装置 |
JPH0951459A (ja) | 1995-08-03 | 1997-02-18 | Sanyo Electric Co Ltd | ビデオカメラ |
US6125213A (en) * | 1997-02-17 | 2000-09-26 | Canon Kabushiki Kaisha | Image processing method, an image processing apparatus, and a storage medium readable by a computer |
JP3785520B2 (ja) * | 1997-03-19 | 2006-06-14 | コニカミノルタホールディングス株式会社 | 電子カメラ |
JPH1127475A (ja) | 1997-07-01 | 1999-01-29 | Sharp Corp | 画像読取方法および装置 |
US6529618B1 (en) * | 1998-09-04 | 2003-03-04 | Konica Corporation | Radiation image processing apparatus |
US6625318B1 (en) * | 1998-11-13 | 2003-09-23 | Yap-Peng Tan | Robust sequential approach in detecting defective pixels within an image sensor |
JP2000217039A (ja) * | 1999-01-21 | 2000-08-04 | Sanyo Electric Co Ltd | 点欠陥検出方法および点欠陥画素値補正方法 |
JP3461482B2 (ja) * | 1999-02-24 | 2003-10-27 | オリンパス光学工業株式会社 | デジタルカメラ及びデジタルカメラのゴミ位置検出方法 |
US7215372B2 (en) * | 2002-05-17 | 2007-05-08 | Olympus Corporation | Optical apparatus having dust off function |
JP4167401B2 (ja) * | 2001-01-12 | 2008-10-15 | 富士フイルム株式会社 | ディジタル・カメラおよびその動作制御方法 |
JP2003209749A (ja) * | 2002-01-11 | 2003-07-25 | Olympus Optical Co Ltd | 撮像装置 |
US7492408B2 (en) * | 2002-05-17 | 2009-02-17 | Olympus Corporation | Electronic imaging apparatus with anti-dust function |
JP2003338926A (ja) | 2002-05-21 | 2003-11-28 | Canon Inc | 画像処理方法及び画像処理装置 |
JP2004112010A (ja) * | 2002-09-13 | 2004-04-08 | Canon Inc | 画像読み取り装置及び該装置の制御プログラム |
CN101778203B (zh) * | 2002-12-27 | 2012-06-06 | 株式会社尼康 | 图像处理装置 |
JP4466017B2 (ja) * | 2002-12-27 | 2010-05-26 | 株式会社ニコン | 画像処理装置および画像処理プログラム |
-
2004
- 2004-08-25 WO PCT/JP2004/012168 patent/WO2005022901A1/ja active Application Filing
- 2004-08-25 EP EP12151692.6A patent/EP2448245A3/en not_active Withdrawn
- 2004-08-25 JP JP2005513438A patent/JP4196124B2/ja not_active Expired - Fee Related
- 2004-08-25 EP EP04772129A patent/EP1667434A4/en not_active Ceased
-
2006
- 2006-02-27 US US11/362,235 patent/US7598991B2/en not_active Expired - Fee Related
-
2009
- 2009-08-28 US US12/461,946 patent/US8098305B2/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002374445A (ja) * | 2001-06-12 | 2002-12-26 | Nikon Corp | 電子カメラ |
JP2003023563A (ja) * | 2001-07-10 | 2003-01-24 | Canon Inc | カメラの異物検出装置及び異物検出方法 |
JP2004172835A (ja) * | 2002-11-19 | 2004-06-17 | Minolta Co Ltd | 撮像装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1667434A4 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007124002A (ja) * | 2005-10-25 | 2007-05-17 | Nikon Corp | カメラシステム |
JP4640108B2 (ja) * | 2005-10-25 | 2011-03-02 | 株式会社ニコン | カメラ |
JP2008072416A (ja) * | 2006-09-14 | 2008-03-27 | Samsung Techwin Co Ltd | 撮像装置および撮像方法 |
JP4695571B2 (ja) * | 2006-09-14 | 2011-06-08 | 三星電子株式会社 | 撮像装置および撮像方法 |
JP2010534000A (ja) * | 2007-06-15 | 2010-10-28 | アイイーイー インターナショナル エレクトロニクス アンド エンジニアリング エス.エイ. | Tofレンジカメラにおける汚れ検出方法 |
WO2015008566A1 (ja) * | 2013-07-18 | 2015-01-22 | クラリオン株式会社 | 車載装置 |
US10095934B2 (en) | 2013-07-18 | 2018-10-09 | Clarion Co., Ltd. | In-vehicle device |
JP7429756B2 (ja) | 2021-10-27 | 2024-02-08 | 阿波▲羅▼智▲聯▼(北京)科技有限公司 | 画像処理方法、装置、電子機器、記憶媒体及びコンピュータプログラム |
WO2023112259A1 (ja) * | 2021-12-16 | 2023-06-22 | 株式会社Fuji | 画像処理装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2005022901A1 (ja) | 2007-11-01 |
EP1667434A1 (en) | 2006-06-07 |
EP1667434A4 (en) | 2009-11-18 |
JP4196124B2 (ja) | 2008-12-17 |
US20060146178A1 (en) | 2006-07-06 |
US20090316002A1 (en) | 2009-12-24 |
EP2448245A2 (en) | 2012-05-02 |
EP2448245A3 (en) | 2015-07-22 |
US7598991B2 (en) | 2009-10-06 |
US8098305B2 (en) | 2012-01-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8098305B2 (en) | Image-capturing system diagnostic device, image-capturing system diagnostic program product and image-capturing device for monitoring foreign matter | |
JP4186699B2 (ja) | 撮像装置および画像処理装置 | |
US7991241B2 (en) | Image processing apparatus, control method therefor, and program | |
KR100815512B1 (ko) | 촬상장치 및 그 제어 방법 | |
JP4957850B2 (ja) | 撮像装置、警告方法、および、プログラム | |
TWI511546B (zh) | 灰塵檢測系統及數位相機 | |
CN101001319B (zh) | 图像处理设备和图像处理方法 | |
JP5237978B2 (ja) | 撮像装置および撮像方法、ならびに前記撮像装置のための画像処理方法 | |
JP4466015B2 (ja) | 画像処理装置および画像処理プログラム | |
JP2003209749A (ja) | 撮像装置 | |
JPWO2012133427A1 (ja) | 撮像装置及びそのオートフォーカス制御方法 | |
JPH07151946A (ja) | カメラ | |
US6831687B1 (en) | Digital camera and image signal processing apparatus | |
JP2009009072A (ja) | カメラの動的フォーカスゾーン | |
US8576306B2 (en) | Image sensing apparatus, image processing apparatus, control method, and computer-readable medium | |
WO2014125837A1 (ja) | 撮像装置の異物情報検出装置および異物情報検出方法 | |
JP4419479B2 (ja) | 画像処理装置および画像処理プログラム | |
JP4466017B2 (ja) | 画像処理装置および画像処理プログラム | |
JP4269344B2 (ja) | 電子カメラ | |
JP2004333924A (ja) | カメラ | |
JP4466016B2 (ja) | 画像処理装置および画像処理プログラム | |
KR100252159B1 (ko) | 디지털스틸카메라 | |
JP2009118515A (ja) | 電子カメラ | |
JP2006020112A (ja) | デジタルカメラ | |
JP2982073B2 (ja) | カメラの制御装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005513438 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11362235 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004772129 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2004772129 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 11362235 Country of ref document: US |