US11282175B2 - Sample imaging and image deblurring - Google Patents
- Publication number
- US11282175B2 (application US15/593,143)
- Authority
- US
- United States
- Prior art keywords
- image
- sample
- capture device
- relative movement
- image capture
- Prior art date
- Legal status: Active, expires (the status is an assumption and is not a legal conclusion)
Classifications
- G06T5/00—Image enhancement or restoration (G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL)
- G06T5/73—Deblurring; Sharpening
- G06T5/003
- G02B21/00—Microscopes (G—PHYSICS; G02—OPTICS; G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS)
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
- G02B21/24—Base structure
- G02B21/26—Stages; Adjusting means therefor
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof (H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION)
- H04N23/80—Camera processing pipelines; Components thereof
- H04N5/23229
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20201—Motion blur correction
Definitions
- The present technique relates to imaging, and in particular to the field of sample imaging and the deblurring of images.
- an apparatus comprising: a sample holder to hold a sample to be imaged; an image capture device having a field of view, to capture an image of the field of view; an actuator; a controller to control the actuator to cause relative movement between the sample holder and the image capture device at a given speed and at a given direction during an exposure time of the image capture device such that, in use, the sample moves across at least a portion of the field of view during the exposure time; and a processor to perform a deblur algorithm to deblur the image using the given speed and the given direction.
- an image processing method comprising the steps: holding a sample to be imaged; capturing an image of a field of view; causing relative movement between the sample and an image capture device at a given speed and a given direction such that the sample moves across a portion of the field of view; and performing a deblur algorithm to deblur the image using the given speed and the given direction, wherein an exposure time of the image capture device when capturing the image corresponds with a time taken for the sample to move across the portion of the field of view.
- an image processing apparatus comprising: means for holding a sample to be imaged; means for capturing an image of a field of view; means for actuating; means for controlling the means for actuating to cause relative movement between the means for holding the sample to be imaged and the means for capturing during an exposure time of the means for capturing such that, in use, the sample moves across at least a portion of the field of view during the exposure time, wherein the relative movement is at a given speed and a given direction; and means for performing a deblur algorithm to deblur the image using the given speed and the given direction.
- an image processing method comprising: receiving an input image on which deblurring is to be performed, wherein the input image comprises a plurality of rows of pixels; receiving a given speed and given direction; performing a deblurring operation on the image by performing a plurality of independent row processing operations using the given speed and the given direction, each corresponding to a given row of the plurality of rows, wherein at least some of the row processing operations are performed in parallel.
- an apparatus comprising: a sample holder to hold a sample to be imaged; an image capture device having a field of view, to capture an image of the field of view as a plurality of rows of pixels; an actuator; a controller to control the actuator to cause relative movement between the sample holder and the image capture device at a given speed and at a given direction during an exposure time of the image capture device such that, in use, the sample moves across at least a portion of the field of view during the exposure time, wherein an axis of the rows of pixels is aligned with the given direction.
- FIG. 1 illustrates an apparatus in accordance with some embodiments
- FIG. 2 shows an example of relative movement between the image capture device and sample in accordance with some embodiments
- FIG. 3 shows an example of a plurality of images produced as a consequence of the relative movement in some embodiments
- FIG. 4 illustrates a relationship in overlap between consecutive images in the plurality of images in accordance with some embodiments
- FIG. 5 shows the effect of performing multiple iterations of a deblur algorithm on an image in accordance with some embodiments
- FIG. 6 is a flow chart illustrating a method of image processing in accordance with one embodiment.
- FIG. 7 is a flow chart illustrating a method of image processing in accordance with one embodiment.
- an apparatus comprising: a sample holder to hold a sample to be imaged; an image capture device having a field of view, to capture an image of the field of view; an actuator; a controller to control the actuator to cause relative movement between the sample holder and the image capture device at a given speed and at a given direction during an exposure time of the image capture device such that, in use, the sample moves across at least a portion of the field of view during the exposure time; and a processor to perform a deblur algorithm to deblur the image using the given speed and the given direction.
- a blurred image (e.g. of the sample) is intentionally created.
- Because the speed and direction of the relative movement are known, a deblur algorithm can be applied to undo much, if not all, of the blurring. Consequently, the camera can keep moving, and the imaging process can be completed more quickly than if the camera had to start and stop.
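The blur described here can be modelled as a convolution of each row with a one-dimensional "box" kernel whose length is the streak length (speed multiplied by exposure time). The following sketch illustrates that model; the function name, parameters, and numbers are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def motion_blur_kernel(speed_px_per_s, exposure_s):
    """1-D box kernel modelling a streak of speed * exposure pixels.

    Assumes the motion is aligned with the pixel rows, as in the
    apparatus described here; the kernel sums to 1 so intensity is
    spread, not amplified.
    """
    length = max(1, int(round(speed_px_per_s * exposure_s)))
    return np.full(length, 1.0 / length)

# Blurring a row is then a 1-D convolution with this kernel:
row = np.zeros(32)
row[10] = 1.0  # a point source in an otherwise dark row
kernel = motion_blur_kernel(speed_px_per_s=100.0, exposure_s=0.05)  # 5-pixel streak
blurred = np.convolve(row, kernel, mode="same")
```

The point source is smeared into a five-pixel streak of equal intensities, with the total light preserved.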
- the given speed of the relative movement between the sample and the image capture device is substantially constant.
- the speed of the relative movement may differ, for example, by an extent caused by defects in the manufacturing process of parts of the apparatus.
- the relative movement between the sample and the image capture device occurs substantially only in the given direction.
- the direction of the relative movement may differ, for example, by an extent caused by defects in the manufacturing process of parts of the apparatus.
- the apparatus further comprises speed determining circuitry to determine the given speed.
- the exact speed at which the relative movement occurs could be initially unknown.
- Using the speed determining circuitry, it is possible to determine the speed at which the relative movement occurs.
- the apparatus further comprises direction determining circuitry to determine the given direction.
- the exact direction in which the relative movement occurs could be initially unknown.
- Using the direction determining circuitry, it is possible to determine the direction in which the relative movement occurs.
- the image capture device is to capture a plurality of images of a plurality of fields of view of the image capture device; and the actuator is further to cause relative movement between the sample and the image capture device between each of the plurality of images such that the image capture device obtains the plurality of fields of view.
- the field of view of the image capture device will change. Accordingly, a plurality of fields of view will be imaged. As the number of images taken increases, a further improvement in processing time may be experienced, by virtue of the camera being required to start and stop less often.
- the sample holder holds a plurality of samples to be imaged; and the plurality of images comprises at least one image of each of the plurality of samples.
- the sample holder could be a well plate, for example, with each well in the well plate holding a different sample to be imaged.
- two consecutive images in the plurality of images overlap by an amount greater than a product of the exposure time of the image capture device and the given speed.
- the exposure time of the image capture device multiplied by the given speed can be used to determine a “streak length”, e.g. the length of a streak caused by an object moving across the portion of the field of view while exposure occurs. Since the overlap is greater than the maximum streak length, there will be a single image showing the streak in its entirety. Since no information will be “lost” as a consequence of the streak disappearing off the end of an image, the deblur algorithm can be applied to remove the blur in an effective manner. In other embodiments, the streak length is longer than the overlap and so image data can be “lost”. Note that in some embodiments, all pairs of consecutive images overlap by an amount greater than the product of the exposure time of the image capture device and the given speed. In those embodiments, there is at least one image of every streak in its entirety.
- two consecutive images in the plurality of images overlap by an amount less than 120% of a product of the exposure time of the image capture device and the given speed.
- it is desirable to have long streaks since this provides more data with which to perform the deblur algorithm and so can result in more accurate deblurred images.
- However, the overlap must be at least as large as the maximum streak length, and if the overlap is too extensive then the efficiency of the apparatus is reduced, since a large number of images will be produced unnecessarily. Consequently, the amount of deblur processing that occurs will be increased, and the time taken to produce the deblurred images will be longer than if a smaller number of images with less overlap were produced.
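The overlap constraints discussed above — covering at least one streak length, but staying under 120% of it — can be captured in a small helper. The names and example values below are illustrative assumptions, not from the patent.

```python
def overlap_ok(overlap_px, exposure_s, speed_px_per_s):
    """Check an image overlap against the constraints described above.

    The overlap must cover at least one streak length
    (exposure * speed) so no streak is cut off between images,
    but stay under 120% of it for efficiency.
    """
    streak = exposure_s * speed_px_per_s
    return streak <= overlap_px < 1.2 * streak

# For a 5-pixel streak: 5.0 or 5.9 pixels of overlap pass; 4.0 or 6.0 do not.
```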
- two consecutive images in the plurality of images overlap by 50%.
- An overlap of 50% represents a good tradeoff between the desire to create longer streaks for accurate deblur processing, the need to have an overlap at least as large as the streak length to avoid losing information, and the desire to have an efficient processing time for processing the deblur algorithm.
- the deblur algorithm is iterative. In other words, a block of instructions is executed repeatedly.
- the iterative algorithm might be recursive, such that the solutions to one or more sub-problems are used to solve the overall problem.
- the output from one iteration is provided as an input to the next or a future iteration.
- the image comprises a plurality of rows of pixels; an axis of the rows of pixels is aligned with the given direction; and the deblur algorithm comprises a plurality of independent row processing operations each corresponding to a given row of the plurality of rows of pixels. Since the cause of the blurring is as a consequence of the relative movement between the sample and the image capture device occurring in a given direction, and since the rows of pixels are aligned with the given direction, blurring that occurs in respect of one row of pixels is independent from the blurring that occurs in an adjacent row of pixels. Consequently, the deblur algorithm can occur as a plurality of row processing operations that occur independently, and each correspond with one of the rows in the plurality of rows of pixels.
- the row processing operations are performed in parallel. Given that the row processing operations are independent, the row processing operation performed in respect of one row does not affect the row processing operation in respect of another row. The processing of the rows can therefore be parallelised in order to complete processing of the image in a faster time.
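Because the rows never interact, a per-row operation can simply be mapped across rows in parallel. A minimal sketch follows, with a placeholder `deblur_row` (a hypothetical stand-in for the actual per-row deblurring operation):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def deblur_row(row):
    """Hypothetical stand-in for a per-row deblurring operation; any
    1-D deconvolution along the motion direction could be substituted."""
    return row - row.mean()  # placeholder transform, not a real deblur

def deblur_image(image, workers=4):
    """Apply the row operation to every row independently, in parallel.
    Rows do not interact, so the result matches sequential processing."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.stack(list(pool.map(deblur_row, image)))

image = np.arange(12.0).reshape(3, 4)
out = deblur_image(image)
```

`pool.map` preserves row order, so the parallel result is identical to processing the rows one at a time.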
- the deblur algorithm is iterative; and at each iteration, an evaluation value for a row of pixels is determined; and based on the evaluation value for the row of pixels in one iteration and the evaluation value for the row of pixels in a next iteration, the deblur algorithm is to disregard that row of pixels in subsequent iterations.
- the evaluation value for a row of pixels between two iterations can be used to determine whether the row processing operation for that row has completed or not.
- the deblur algorithm is to ignore that row of pixels in subsequent iterations. For example, if the evaluation value changes by less than some threshold amount between two consecutive iterations, then it may be determined that additional iterations are unlikely to further improve the deblurring of the image.
- the evaluation value could be an array of values representing a score for each pixel in the row. The difference could then represent a maximum difference between two corresponding pixels in two iterations of the deblur algorithm. In this way, the algorithm would continue until there was no pixel that changed more than some threshold value.
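The stopping rule described above — continue until no pixel in the row changes by more than some threshold between consecutive iterations — might be sketched as follows; the function name and threshold value are illustrative assumptions.

```python
import numpy as np

def row_converged(prev_row, next_row, threshold=1e-3):
    """Per-row stopping rule sketched above: the evaluation value is the
    per-pixel intensity, and the row is considered done when no pixel
    changed by more than the threshold between two consecutive iterations."""
    return np.max(np.abs(next_row - prev_row)) < threshold

prev_row = np.array([0.10, 0.50, 0.20])
nearly_same = np.array([0.1004, 0.5001, 0.2002])  # max change 0.0004
different = np.array([0.15, 0.50, 0.20])          # max change 0.05
```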
- the deblur algorithm is based on a Lucy-Richardson deconvolution process.
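As a rough illustration of such a process, a minimal one-dimensional Richardson-Lucy loop applied to a single row blurred by a known streak kernel is sketched below. This is the textbook multiplicative form of the algorithm, not the patent's implementation, and all names and numbers are assumptions.

```python
import numpy as np

def richardson_lucy_1d(blurred, psf, iterations=30):
    """Minimal 1-D Richardson-Lucy deconvolution for one row of pixels.

    `psf` is the known motion-blur kernel (the streak profile). Each
    iteration re-blurs the estimate, compares it to the observed row,
    and applies the classic multiplicative correction.
    """
    estimate = np.full_like(blurred, 0.5)  # flat initial guess
    psf_flipped = psf[::-1]
    eps = 1e-12  # guard against division by zero
    for _ in range(iterations):
        reblurred = np.convolve(estimate, psf, mode="same")
        ratio = blurred / (reblurred + eps)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Recover a point source from a 5-pixel streak:
psf = np.full(5, 0.2)
truth = np.zeros(32)
truth[16] = 1.0
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy_1d(blurred, psf, iterations=50)
```

After 50 iterations the energy is re-concentrated around the original point, with the total intensity approximately conserved, consistent with the diminishing-returns behaviour described for FIG. 5.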
- the image capture device performs fluorescence imaging.
- Fluorescence imaging relates to a process in which a sample is illuminated with light of a particular wavelength. Once exposed, the sample then continues to fluoresce by emitting light of a second wavelength for a short period. This emitted light can be detected.
- the image capture device is a grayscale image capture device. Deblurring can be performed more effectively when a grayscale image is provided, since it may only be necessary to consider the intensity of each pixel, rather than its colour value.
- the apparatus is a digital microscope.
- The term “row” is used to refer to an array of pixels.
- The term “row” includes “column”, which is also an array of pixels.
- FIG. 1 shows an apparatus 100 in accordance with some embodiments.
- the apparatus 100 is a digital microscope.
- the digital microscope 100 includes a Charge-Coupled Device (CCD) camera 110 (an example of an image capture device), which photographs a sample held by a sample holder 120.
- the sample holder could be a well plate for holding a plurality of samples, each one of which is to be imaged using the CCD camera 110.
- An actuator 130 is able to move the sample holder 120, thereby providing relative movement between the CCD camera 110 and the sample holder 120.
- the relative movement is in a given direction and occurs at a given speed. In the current embodiment, both the given direction and the given speed are known and need not be detected.
- the system of the present embodiment is constrained in terms of its motion.
- In other embodiments, further circuitry provides this information, possibly by detecting the actual achieved speed and direction while the relative movement occurs.
- the given speed is substantially constant and the given direction is substantially the only direction in which the relative movement occurs. Deviations from this can occur as a consequence of manufacturing defects in, for example, the actuator.
- the CCD camera 110 could be moved in order to create the relative movement.
- a controller 140 is used to cause the relative movement to take place during an exposure time of the CCD camera 110.
- the imaging technique used in the embodiment shown in FIG. 1 is fluorescence imaging.
- a mercury lamp 150 and excitation filter 160 are used to produce light of a particular wavelength. This light is reflected by dichroic mirror 170 towards the sample held by the sample holder 120. As a consequence of the illumination, the sample in the sample holder emits light of a different wavelength.
- the dichroic mirror is designed to transmit light of this emitted wavelength, while reflecting light of the wavelength produced by the mercury lamp 150 and excitation filter 160.
- the light therefore passes through dichroic mirror 170 and is instead reflected by mirror 180.
- the light passes through an emission filter before being received by the CCD camera 110. Since the given direction and given speed are known, these are provided to a processor 190 of the CCD camera, which then performs deblurring on the received image. Suitable processes, such as Lucy-Richardson deconvolution, will be known to the skilled person.
- the processor 190 may be entirely separate from the rest of the apparatus.
- blurred images are produced by the CCD camera.
- the images could then be deblurred at a later time or date.
- the images could be outsourced for the deblurring algorithm to be performed.
- this embodiment uses a CCD as an image capture device, other image capture technology (such as CMOS) can also be used.
- FIG. 2 shows an example of relative movement between the image capture device and sample in accordance with some embodiments.
- the sample holder 120 moves at a constant speed relative to the camera. This constant speed is maintained whether the camera is being exposed or not.
- Three different exposure times are shown, lasting from t1 to t2, t3 to t4, and t5 to t6.
- each of the exposure times is substantially constant, and the exposure times are larger than the non-exposure times.
- At time t1 the centre of the camera is pointed at position p1; at time t2, at position p3; at time t3, at position p4; at time t4, at position p7; at time t5, at position p8; and at time t6, at position p10.
- These positions represent only the centre position of the camera. They do not represent the full field of view of the camera, which depends on the optical configuration of the camera. In the example of FIG. 2, the field of view can be considered to be twice the distance between p1 and p0.
- the overall area swept by the three exposures is from p0 to p5, p2 to p9, and p6 to p11 respectively. These areas are shown by the three sets of arrows in FIG. 2. The arrows overlap by 50% in the case of FIG. 2. Consequently, the images produced at the three exposure times will overlap by 50%.
- FIG. 3 shows an example of a plurality of images 210, 220, 230 produced as a consequence of the relative movement in some embodiments.
- the plurality of images 210, 220, 230 correspond with the three exposure times shown in FIG. 2, and the images 210, 220, 230 have been arranged to illustrate the overlap between the images.
- an overlap between the first two images 210, 220 exists between point p2 and point p5.
- an overlap between the second and third images 220, 230 exists between points p6 and p9.
- a streak 240 is shown in each of the plurality of images 210, 220, 230 in FIG. 3.
- each sample in the sample holder will be imaged and so there will be at least one image of each sample.
- the streak is caused by the sample (which in these embodiments is treated as a sphere of light), which blurs as a consequence of the relative movement during exposure.
- the length of the streak is the product of the exposure time of the image and the speed of the relative movement between the CCD camera and sample holder (measured as the number of pixels in a row captured by the camera per second).
- the overlap is arranged to be at least as large as the maximum streak length. Consequently, all of a streak will appear on a single one of the images 210, 220, 230.
- Each of the images comprises a plurality of rows of pixels 250a, 250b, partially illustrated in FIG. 3.
- the axis of the rows of pixels is aligned with the given direction, i.e. the direction of relative movement.
- the rows run left to right and the direction of movement occurs from left to right.
- the rows and the relative movement might be from top to bottom. Consequently, the blurring that occurs in each row is independent.
- the overlap is limited to being less than 120% of the maximum streak length, since this produces a high overlap while reducing the probability that numerous images will include the same streak.
- the overlap is exactly equal to the maximum streak length at 50% of the image. This allows for the maximum streak length to appear on a single image without the streak fully appearing on multiple images.
- the range of permissible overlap is 45% to 50% to allow for unexpected deviations in streak length.
- FIG. 4 illustrates a relationship in overlap between consecutive images in the plurality of images in accordance with some embodiments.
- two different exposure times are shown. The first occurs from t1 to t2 and the second occurs from t3 to t4. Each exposure lasts for a period of T seconds.
- a period of S seconds elapses between the first exposure time and the second exposure time.
- the field of view of the image capture device is defined as C.
- As shown in FIG. 4, due to the relative movement between the image capture device and the sample holder at speed V, an area is “swept” by the field of view. The effective field of view is therefore equal to C+TV.
- an overlap between two consecutive images is shown as L. Using this information, it is possible to determine a relationship for S, the time between consecutive exposures.
- With two exposures, the swept areas satisfy 2(C+TV)−L=C+V(2T+S) (Equation 1), which gives C−VS=L (Equation 2). Requiring the overlap to cover a full streak, L≥TV (Equation 3), so C−VS≥TV (Equation 4) and therefore S≤C/V−T (Equation 5).
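Equation 5 can be checked numerically. With illustrative values for the field of view C, speed V and exposure T (not taken from the patent), the largest permissible gap S leaves an overlap of exactly one streak length:

```python
def max_gap_between_exposures(C, V, T):
    """Longest allowable gap S between exposures (Equation 5): any longer
    and the overlap L = C - V*S drops below one streak length T*V.
    C: field of view (pixels), V: speed (pixels/s), T: exposure (s)."""
    return C / V - T

# Illustrative numbers, not from the patent:
C, V, T = 200.0, 100.0, 0.5
S = max_gap_between_exposures(C, V, T)  # 1.5 s
L = C - V * S                           # 50.0 px, one streak length (T*V)
```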
- FIG. 5 shows the effect of performing multiple iterations of a deblur algorithm on an image in accordance with some embodiments.
- a deblur algorithm is performed in order to obtain a deblurred image.
- the deblurred image should approximately correspond with the image that would be produced if the relative movement between the CCD camera and sample holder was stopped during exposure.
- Since the CCD camera is grayscale, the movement occurs in a single direction at a known speed (and so the distance moved can be determined), and the image comprises a small point of light in an otherwise dark image, the effectiveness of applying a deblur algorithm is improved.
- Such a process also deals well with the sort of noise seen in CCD device sensors.
- From FIG. 5 it can be seen that as the number of iterations of the algorithm increases from 0, to 1, to 2, to 5, to 10, to 20, to 50, to 100, the quality of the deblurred image improves with diminishing returns. Indeed, in the example shown in FIG. 5, little improvement can be seen between 50 iterations and 100 iterations, as compared with the improvement between 0 iterations and 50 iterations.
- FIG. 6 is a flow chart 300 illustrating a method of image processing in accordance with one embodiment.
- the flowchart corresponds with a processing operation that runs on a single row (a row processing operation that is part of the overall deblur algorithm). Given that the axis of the rows of pixels is aligned with the given direction, the blurring that occurs in one row is independent of the blurring that occurs in another row. Each row can therefore be processed independently of the others and so at least some of the rows can be processed in parallel to each other.
- the row processing operation can begin, for example, at step 310 where deblurring is performed on the current row. The deblurring makes use of the fact that the given direction and given speed are known.
- an evaluation process is performed on the row.
- the evaluation process is used to determine the extent of change that is effected by the deblurring.
- the evaluation process might involve determining an intensity of each pixel in the row.
- The overall operation is therefore a loop: the deblurring continues until its overall effect falls below the threshold value, the overall effect being evaluated as the maximum change in intensity value across corresponding pixels.
- FIG. 7 is a flow chart 400 illustrating a method of image processing in accordance with one embodiment.
- the process can begin, for example, at step 410 in which a sample is held by a sample holder 120.
- image capture of the sample begins by an image capture device, such as a CCD camera 110.
- relative movement between the sample and the image capture device 110 occurs at a given speed and in a given direction. This causes a streak 240 to occur in the corresponding image.
- a deblur algorithm is then applied in order to produce a deblurred image.
- the deblur algorithm can take advantage of the fact that the given speed and the given direction are both known, and so deblurring can occur effectively.
- the process can be repeated for multiple images that are taken.
- steps 410-430 could be repeated for a plurality of images, and step 440 could be performed at the end once the images have been produced.
- the words “configured to . . . ” are used to mean that an element of an apparatus has a configuration able to carry out the defined operation.
- a “configuration” means an arrangement or manner of interconnection of hardware or software.
- the apparatus may have dedicated hardware which provides the defined operation, or a processor or other processing device may be programmed to perform the function. “Configured to” does not imply that the apparatus element needs to be changed in any way in order to provide the defined operation.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Optics & Photonics (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Microscopes, Condenser (AREA)
Description
2(C+TV)−L=C+V(2T+S) (Equation 1)
C−VS=L (Equation 2)
L≥TV (Equation 3)
C−VS≥TV (Equation 4)
Therefore:
S≤C/V−T (Equation 5)
Claims (10)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1608423 | 2016-05-13 | ||
GB1608423.8 | 2016-05-13 | ||
GB1608423.8A GB2550202B (en) | 2016-05-13 | 2016-05-13 | Sample imaging and image deblurring |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170330310A1 US20170330310A1 (en) | 2017-11-16 |
US11282175B2 true US11282175B2 (en) | 2022-03-22 |
Family
ID=56320346
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/593,143 Active 2038-03-12 US11282175B2 (en) | 2016-05-13 | 2017-05-11 | Sample imaging and image deblurring |
Country Status (2)
Country | Link |
---|---|
US (1) | US11282175B2 (en) |
GB (1) | GB2550202B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110121016B (en) * | 2019-05-08 | 2020-05-15 | 北京航空航天大学 | Video deblurring method and device based on double exposure prior |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050047672A1 (en) * | 2003-06-17 | 2005-03-03 | Moshe Ben-Ezra | Method for de-blurring images of moving objects |
JP2005164707A (en) | 2003-11-28 | 2005-06-23 | Effector Cell Institute Inc | Cell measurement support system and cell observation apparatus |
WO2005093654A2 (en) | 2004-03-25 | 2005-10-06 | Fatih Ozluturk | Method and apparatus to correct digital image blur due to motion of subject or imaging device |
WO2008131438A2 (en) | 2007-04-23 | 2008-10-30 | Fotonation Ireland Limited | Detection and estimation of camera movement |
EP2420970A1 (en) | 2010-08-06 | 2012-02-22 | Honeywell International, Inc. | Motion blur modeling for image formation |
US20120288157A1 (en) | 2011-05-13 | 2012-11-15 | Sony Corporation | Image processing apparatus, image processing method, and image processing program |
WO2013183267A1 (en) | 2012-06-05 | 2013-12-12 | Canon Kabushiki Kaisha | Vibration type driving apparatus, two-dimensional driving apparatus, image-blur correction apparatus, interchangeable lens, image-pickup apparatus, and automatic stage |
WO2015060181A1 (en) | 2013-10-22 | 2015-04-30 | 国立大学法人東京大学 | Blurless image capturing system |
WO2017195442A1 (en) | 2016-05-09 | 2017-11-16 | 富士フイルム株式会社 | Imaging device and method, and imaging device control program |
- 2016-05-13: GB GB1608423.8A patent/GB2550202B/en active Active
- 2017-05-11: US US15/593,143 patent/US11282175B2/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050047672A1 (en) * | 2003-06-17 | 2005-03-03 | Moshe Ben-Ezra | Method for de-blurring images of moving objects |
JP2005164707A (en) | 2003-11-28 | 2005-06-23 | Effector Cell Institute Inc | Cell measurement support system and cell observation apparatus |
WO2005093654A2 (en) | 2004-03-25 | 2005-10-06 | Fatih Ozluturk | Method and apparatus to correct digital image blur due to motion of subject or imaging device |
WO2008131438A2 (en) | 2007-04-23 | 2008-10-30 | Fotonation Ireland Limited | Detection and estimation of camera movement |
EP2420970A1 (en) | 2010-08-06 | 2012-02-22 | Honeywell International, Inc. | Motion blur modeling for image formation |
US20120288157A1 (en) | 2011-05-13 | 2012-11-15 | Sony Corporation | Image processing apparatus, image processing method, and image processing program |
WO2013183267A1 (en) | 2012-06-05 | 2013-12-12 | Canon Kabushiki Kaisha | Vibration type driving apparatus, two-dimensional driving apparatus, image-blur correction apparatus, interchangeable lens, image-pickup apparatus, and automatic stage |
WO2015060181A1 (en) | 2013-10-22 | 2015-04-30 | The University of Tokyo | Blurless image capturing system |
WO2017195442A1 (en) | 2016-05-09 | 2017-11-16 | FUJIFILM Corporation | Imaging device and method, and imaging device control program |
EP3457192A1 (en) | 2016-05-09 | 2019-03-20 | FUJIFILM Corporation | Imaging device and method, and imaging device control program |
Non-Patent Citations (3)
Title |
---|
Combined Search and Examination Report issued in the corresponding United Kingdom Application No. 1608423.8 dated Nov. 21, 2016 (5 pages). |
Examination Report issued in corresponding application No. GB1608423.8 dated Apr. 8, 2019 (3 pages). |
Examination Report issued in corresponding United Kingdom application No. GB1608423.8 dated Sep. 10, 2019 (2 pages). |
Also Published As
Publication number | Publication date |
---|---|
GB201608423D0 (en) | 2016-06-29 |
GB2550202B (en) | 2020-05-20 |
GB2550202A (en) | 2017-11-15 |
US20170330310A1 (en) | 2017-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Abdelhamed et al. | A high-quality denoising dataset for smartphone cameras | |
JP2022509034A (en) | Bright spot removal using a neural network | |
WO2011099244A1 (en) | Image processing device and method | |
EP3657784B1 (en) | Method for estimating a fault of an image capturing system and associated systems | |
IES20070229A2 (en) | Image acquisition method and apparatus | |
JP4454657B2 (en) | Blur correction apparatus and method, and imaging apparatus | |
US20190155012A1 (en) | Artefact reduction for angularly-selective illumination | |
CN107667310B (en) | Fluorescence imaging system | |
WO2014038629A1 (en) | Moving body detection method | |
US7599576B2 (en) | Image subtraction of illumination artifacts | |
CN110400281B (en) | Image enhancement method in digital slice scanner | |
CN111433811B (en) | Reducing image artifacts in images | |
US11282175B2 (en) | Sample imaging and image deblurring | |
WO2017217325A1 (en) | Data recovery device, microscope system, and data recovery method | |
US20080170772A1 (en) | Apparatus for determining positions of objects contained in a sample | |
JP7278714B2 (en) | Film scanning |
US11841324B2 (en) | Method and device for estimating a STED resolution | |
JP6732809B2 (en) | Multi-line detection method | |
US11256078B2 (en) | Continuous scanning for localization microscopy | |
JP2015033006A (en) | Image processing apparatus, image processing method, image processing program and microscope system | |
Nagalakshmi et al. | Image acquisition, noise removal, edge detection methods in image processing using Matlab for prawn species identification | |
US11763434B2 (en) | Image processing system | |
EP4164211B1 (en) | Method and system for stray light compensation | |
JP6559354B2 (en) | Imaging device | |
JP7491667B2 (en) | Artifact reduction for angle-selective illumination |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SOLENTIM LTD, UNITED KINGDOM |
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FIGG, AARON;JIANG, YONGGANG;REEL/FRAME:042458/0262 |
Effective date: 20170502 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCV | Information on status: appeal procedure |
Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
|
STCV | Information on status: appeal procedure |
Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
|
STCV | Information on status: appeal procedure |
Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |
|
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
|
AS | Assignment |
Owner name: GLAS TRUST COMPANY LLC, NEW JERSEY |
Free format text: SECURITY INTEREST;ASSIGNOR:SOLENTIM LTD.;REEL/FRAME:057624/0960 |
Effective date: 20210927 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: ADVANCED INSTRUMENTS LTD., UNITED KINGDOM |
Free format text: CHANGE OF NAME;ASSIGNOR:SOLENTIM LTD;REEL/FRAME:062001/0001 |
Effective date: 20220531 |