EP3626160A1 - Handheld imaging element with a movement sensor - Google Patents

Handheld imaging element with a movement sensor

Info

Publication number
EP3626160A1
Authority
EP
European Patent Office
Prior art keywords
movement
image data
imaging
radiation
image
Prior art date
Legal status
Withdrawn
Application number
EP18196325.7A
Other languages
English (en)
French (fr)
Inventor
Morten Bo Søndergaard Svendsen
Jan Bertholdt Hansen
Current Assignee
Dianova AS
Original Assignee
Dianova AS
Priority date
Filing date
Publication date
Application filed by Dianova AS filed Critical Dianova AS
Priority to EP18196325.7A
Priority to PCT/EP2019/075719
Publication of EP3626160A1
Legal status: Withdrawn


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7207 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
    • A61B5/721 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts using a separate sensor to detect motion or using motion information derived from signals other than the physiological signal to be measured
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/026 Measuring blood flow
    • A61B5/0261 Measuring blood flow using optical means, e.g. infrared light
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S1/00 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith
    • G01S1/02 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith using radio waves
    • G01S1/08 Systems for determining direction or position line
    • G01S1/44 Rotating or oscillating beam beacons defining directions in the plane of rotation or oscillation
    • G01S1/46 Broad-beam systems producing at a receiver a substantially continuous sinusoidal envelope signal of the carrier wave of the beam, the phase angle of which is dependent upon the angle between the direction of the receiver from the beacon and a reference direction from the beacon, e.g. cardioid system
    • G01S1/50 Broad-beam systems producing at a receiver a substantially continuous sinusoidal envelope signal of the carrier wave of the beam, the phase angle of which is dependent upon the angle between the direction of the receiver from the beacon and a reference direction from the beacon, e.g. cardioid system wherein the phase angle of the direction-dependent envelope signal is compared with a non-direction-dependent reference signal, e.g. VOR
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T7/0014 Biomedical image inspection using an image reference approach
    • G06T7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681 Motion detection
    • H04N23/6812 Motion detection based on additional sensors, e.g. acceleration sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction
    • H04N23/683 Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B2090/3614 Image-producing devices, e.g. surgical cameras using optical fibre
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular
    • G06T2207/30104 Vascular flow; Blood flow; Perfusion

Definitions

  • the present invention relates to an imaging element with a movement sensor and in particular to a handheld imaging element comprising an imaging device for generating image data and a movement sensor for movement compensation of the image data provided by the imaging device.
  • Imaging elements may be seen in WO2017062759, US2017017858, CN107307848, CN204765619U, WO2014009859, WO2011117779, RU2573053, US2016270672, WO2010017508, CN102357033; "Correcting for motion artifact in handheld laser speckle images", Lertsakdadet B et al., Journal of Biomedical Optics, vol. 23, no. 3, March 2018, p. 036006 (7 pp.); "Development of a handheld blood flow measurement system using laser speckle flowgraphy", Min-Chul Lee et al., Optical Society of Japan (co-published with Springer), vol. 22, no. 2, April 2015, pp. 308-314; and "Handheld, point-of-care laser speckle imaging", Farraro R et al., Journal of Biomedical Optics, vol. 21, no. 9, September 2016, p. 094001 (6 pp.).
  • the invention relates to an imaging element for providing image data, the imaging element comprising:
  • Image data is data representing a physical surface or structure, such as a portion of a human or animal body, a structure or the like, which may be represented in an at least two-dimensional fashion.
  • The image data may be an image of the structure or surface in which a colour of the actual surface/object has been replaced by, or supplemented with, information relating to an analysis performed on the basis of the image data, such as flow or relative movement of one portion of the surface/structure vis-à-vis another portion of the surface/structure.
  • images are 2D representations of 2D or 3D elements, where the 2D representation is a projection of the surface of the elements on to a surface or image plane.
  • an image has, at a position often called a pixel, a colour or grey tone of a corresponding position of the element.
  • Image data in the present context may, at a position, comprise data or information in addition to or instead of such colour/grey tone.
  • Such data or information may represent a value, such as a relative movement between a portion of the element at the corresponding position and other elements.
  • a vast number of measurement types may be employed, such as quantification of surface stress and the like.
  • Flow velocity and/or direction may be represented in the image data, at a position of the image data corresponding to a position or portion of the element where a flow or relative movement is seen vis-à-vis other portions of the element.
  • the image data may be represented in any manner.
  • a large number of image formats exist, such as JPEG, TIFF, GIF, BMP or the like.
  • the image data may represent a number of portions or pixels each preferably at least representing a coordinate or position in an at least 2D surface and a value corresponding to a result of an analysis of the image data.
  • the radiation source is configured to launch the monochromatic radiation on to the above 2D or 3D element, and the imaging device is able to detect radiation reflected or scattered by the element.
  • the imaging device may be a camera, such as a video camera, which is able to detect received radiation, in a number of individual sensors, such as a sensor array.
  • a camera may determine the received intensity and/or colour in each sensor.
  • the imaging device may comprise a filter removing radiation with a colour or wavelength different (such as outside a wavelength interval comprising the wavelength of the monochromatic radiation) from that of the radiation source.
  • a camera may comprise a 2D sensor, such as a CCD, or a 1D sensor which may be moved or scanned during providing of the imaging data.
  • monochromatic radiation is radiation having a predetermined wavelength.
  • Here, "monochromatic" describes a situation where the radiation has a predetermined wavelength and an output intensity at that wavelength, and where no radiation is output with an intensity exceeding 50%, such as 40%, such as 20%, such as 10%, of the output intensity at a wavelength within 10nm, such as within 20nm, of the predetermined wavelength.
  • A source of monochromatic radiation may be a laser or a gas-discharge lamp, such as a mercury-vapour lamp. Clearly, radiation of other wavelengths may be emitted. If the wavelength of undesired radiation is too close to the desired output wavelength, an optical filter may be employed. Additionally or alternatively, an optical filter may be positioned in front of the imaging device to remove undesired wavelengths.
  • the radiation source may comprise a source and a radiation guide for transporting radiation from the source toward an emission portion from which the radiation is emitted.
  • a radiation guide may be an optical fibre, a fibre bundle, a lens, a prism or the like.
  • a movement sensor is configured to sense a movement thereof.
  • the movement sensor preferably is attached to, secured to and/or fixed to the imaging device so that the movement sensor determines movement of the imaging device.
  • a movement sensor may sense any type of movement, such as translation, rotation, vibration or combinations thereof.
  • the movement may be a movement in relation to e.g. the earth or relative to another object, such as an operator or an element on to which the monochromatic radiation is launched.
  • The movement sensor may comprise e.g. a gyrometer, an accelerometer, or the like. Any type of measurement determining a movement may be used, such as detection of changes, relative to the sensor, of the direction of the earth's magnetic field, of air pressure, or the like.
  • the movement signal may represent something as simple as a distance, an angle or a direction, but may be more complex so as to be able to describe a movement being a combination of a rotation and a translation.
  • the movement signal may also comprise additional information, such as a direction as well as a rotation/translation.
  • the direction, rotation etc. may be determined in relation to a fixed coordinate system, which may be that of the earth, the radiation emitter, the sensor, the imaging device or the like.
  • the processing device may be any type of processor, controller, DSP, ASIC or the like, as well as a combination thereof.
  • the processing device may be monolithic or formed by a number of such elements.
  • the processing device is configured to receive the image data and the movement signal and output movement corrected image data.
  • Movement may create different types of movement artefacts in image data depending on the actual measurement set-up. If a normal camera is moved during exposure of the sensor to take a standard image, the resulting image will be blurred: the movement causes neighbouring pixels to view the same element in the imaged scene (be it a tree, a person or a mountain) over the time of the exposure.
  • In some methods, such as the LSCI method, two sets of image data are obtained and the processing thereof is a comparison of the two images.
  • movement between the deriving of the two sets of image data is also, or alternatively, desirably compensated for.
  • the movement sensor may be configured to determine the movement between the providing of the two sets of image data.
  • the movement during the providing of a set or both sets of image data may also be determined.
  • the compensation may take this movement or these movements into account, as is also described further below.
  • The movement sensing may determine how the sensor has been moved, and the processing device may then calculate "backwards" to assess what each pixel would have "seen" if the camera had not been moved.
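  • As a brief illustration of such a backward calculation for pure translation, a minimal sketch is given below. It assumes the sensed movement has already been converted into a pixel displacement (dx, dy) between two frames; the function name and this conversion are illustrative, not taken from the patent.

        import numpy as np
        from scipy.ndimage import shift

        def align_second_frame(frame2: np.ndarray, dx: float, dy: float) -> np.ndarray:
            # Shift the second frame back by the sensed displacement so it is
            # spatially aligned with the first frame before the two sets of
            # image data are compared.
            # scipy expects offsets in (row, column) order, i.e. (dy, dx).
            return shift(frame2, shift=(-dy, -dx), order=1, mode="nearest")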
  • The imaging may have the purpose of determining movement of portions of an imaged element relative to the imaging device. This may be used for determining relative movement of one portion of the element vis-à-vis another portion. If no movement is detected, one portion may be stationary while another is not. However, movement of the imaging device may make an otherwise static portion appear to move, and may make a moving portion appear to move more, differently, or less.
  • the compensation will depend on exactly what is moving. If the radiation source moves, the compensation may differ from the situation where the imaging device moves. This is described further below.
  • Determining the overall movement may enable the processing device to determine the artefact caused in each point or pixel and thus to remove the effect of the movement detected, leaving the relative movement in the corrected image data.
  • an overall movement artefact may be determined and applied to all points/pixels in the image data.
  • the movement artefact created may differ for different portions of the image data.
  • the image data usually represent a number of points or pixels positioned in a 2D pattern, such as a matrix.
  • The movement detected may cause different artefacts in different portions of the image data depending on what the image data represent. If the image data represent a usual image with pixels and colours, translating the imaging device closer to the imaged element will not alter the data represented by the centre of the image data, as that pixel will see the same colour at all times. However, if the image data represent movement relative to the imaging device, the movement would also affect this pixel.
  • a rotation of the imaging device would of course generate another artefact at the individual pixels or points. Again, this may be determined and compensated for.
  • the movement corrected image data may have the same format as the image data. Thus, if the image data represents a number of points or pixel values, so may the movement corrected image data. The pixel values or points may, however, be amended due to the correction for the movement detected.
  • the source is configured to launch the monochromatic radiation toward a predetermined direction, the imaging device having a field of view comprising the predetermined direction at least within a predetermined distance, so that the imaging device is able to determine radiation reflected by or scattered by element(s) receiving the radiation.
  • the radiation emitter may be off axis with the imaging device so that it is desired that irradiated elements are at least a predetermined minimum distance from the imaging device or source.
  • the imaging device has an imaging plane wherein the movement sensor is configured to output a movement signal correlated to movement of the imaging plane.
  • An imaging plane is often defined by a lens or lens system imaging the desired scene on to a sensor.
  • the imaging plane thus may be defined as the sensitive surface of the sensor.
  • the imaging plane may be a surface of a fibre bundle on to which the scene is focused, which bundle may then guide the radiation to the sensor.
  • the movement of interest is that of the imaging plane vis-à-vis the element irradiated, as the image is subsequently "frozen" by the operation of the fibre bundle.
  • the actual sensor may be moved relative to the imaging plane without affecting the measurement.
  • Size requirements may be seen in e.g. laparoscopes or endoscopes which have a narrow rod at the end of which the radiation is to be received, but where it may be desired to have the imaging device at a more remote position where more space is available.
  • the imaging element has a handheld unit comprising the imaging device, or at least the imaging plane and the movement sensor.
  • a unit may be handheld if it weighs no more than 2kg, such as no more than 1kg, such as no more than 750g, such as no more than 500g.
  • a handheld unit preferably has rather small dimensions, such as no more than 40cm along any dimension.
  • the imaging element may be propelled such as if provided on or with a moving structure, such as a car, a vehicle, an airplane, a drone or the like.
  • the processing device may be in the handheld or propelled unit or remotely therefrom.
  • the processing device may be a central unit receiving image data and movement signals from multiple units and may then perform the correction.
  • a handheld/propelled unit may comprise a wired or wireless connection, such as for transferring image data to the processing device, if this is not in the unit.
  • a wired connection may be used for powering the handheld unit and/or for holding a fibre bundle, if the sensor of the imaging device is not in the unit.
  • the handheld unit has a handle portion configured to be grabbed by an operator, as well as a head portion comprising the radiation emitter and the imaging plane, the sensor or a lens of the imaging device.
  • The imaging element has a triggering element configured to activate the imaging device and the movement sensor. In this manner, it may be ensured that the providing of the image data and the detection of the movement are simultaneous, or at least substantially simultaneous. It is preferred that the two detections are performed simultaneously so that the movement detected is the one desirably corrected for.
  • The trigger may be used for triggering multiple sets of image data and movement signals; as described below, image data representing different portions of an imaged element may be detected and then combined into a larger image.
  • the trigger may be provided in the handle part to arrive at a unit with a pistol-like shape and operation.
  • In one embodiment, the movement sensor is configured to determine a distance between the imaging element and the imaged element, and the processing element is configured to include the distance in the determination of the corrected image data.
  • the distance will e.g. determine the extent of a single pixel in the image data.
  • When the radiation emitter is configured to emit coherent radiation, the radiation will create, on the imaged element, a speckle pattern comprising a vast number of very small, high-intensity spots.
  • The distance from the imaging plane to the speckles will thus decide how much of the speckle pattern is imaged on one pixel.
  • If the imaging device has a fixed-focus lens system between the imaging plane and the speckle pattern, it is desired that the distance be close to the focus distance in order for the image data not to be blurred.
  • Numerous distance sensing principles are known, such as laser-based principles, time-of-flight-based principles, stereo-camera-based principles and the like. Another principle is the providing of an off-axis light source emitting a beam into the field of view of the imaging device, so that the beam has different distances to the centre of the imaged area at different distances to the imaging device.
  • the imaging element further comprises a display, the processing device being configured to control the display to display an image representing the corrected image data.
  • the image illustrates relative movement between elements illustrated in different portions of the image.
  • The surface of an element may also represent a parameter of the deeper-lying structure; for example, stress in the surface of a metallic or polymer tube may be an indication of fractures or the like in the tube material.
  • the display may be fixed to the movement sensor, such as if provided in the above handheld unit.
  • providing the display in the handheld unit will allow the operator to focus on the job at hand without needing to revert his/her attention to a display positioned away from the operator or the imaged element.
  • the image may represent different movements, such as different velocities, in different manners.
  • different colours indicate different velocities.
  • Red is used for more extreme cases, such as higher velocities, while yellow, then green and then blue often represent progressively lower values.
  • Arrows may be used, the length thereof representing the velocity or the movement. The direction of the arrow may then indicate the direction of the movement.
  • the velocity indicated may be relative to any coordinate system or element.
  • the velocity is relative to the camera or image plane.
  • Velocity may be simply indicated (present or not - such as compared to a threshold velocity) or may be quantified.
  • Monochromatic radiation may be used for a number of different types of analysis, such as where an absorption of the radiation from the source is determined or where the radiation from the source causes fluorescence which is then determined.
  • the diameter of a deep underlying structure is determined in a multi-layered object, where the layers closest to the sensor are fully or partially transparent to the monochromatic light.
  • this is done using radiation from the infrared spectrum for detecting and measuring blood vessel structure.
  • this is called spectroscopy imaging. Motion artefacts will in this case cause blurring and thus affect the resultant measurement of the underlying structure.
  • When fluorescence is determined, motion artefacts present the same errors as when monochromatic light is used for imaging different absorptions.
  • The difference is that different substances present different (auto)fluorescence, enabling detection of bodies with properties different from those of the surrounding material or bodies.
  • Detecting and localising sand and stones in corn, and determining whether the sand is above or below a threshold requiring removal, may be achieved using e.g. (auto)fluorescence. So may sorting different minerals in a sample, where the measurement enables immediate quality control and thus pricing and prioritisation options; detecting, identifying and/or grading cells in a multicellular organism, such as using fluorescence; and finding and evaluating parasites in host organisms (parasites and host organisms comprising any combination of assemblies of virus, assemblies of eukaryote cells and/or assemblies of prokaryote cells in a matrix or array of virus, eukaryote or prokaryote cells).
  • the movement may cause a variation of the intensity of the radiation from the source in different portions of the element irradiated. This again will cause a variation in the absorption and fluorescence and thereby in the intensity received by the imaging device in each point or pixel.
  • The movement may thus, in addition to blurring, call for a compensation across the image data of the intensities represented thereby.
  • a preferred radiation source is a laser, as it emits monochromatic radiation.
  • the radiation from a laser actually is coherent, which brings about even further advantages.
  • Launching laser light or coherent radiation on to an element will automatically generate a speckle pattern on that element. Speckles are formed by the positive and negative interference of the coherent radiation on the element, as is described further below.
  • In one embodiment, the processing device is configured to, if the determined movement does not exceed a first threshold, output, as the corrected image data, the first image data. Then, no correction is made if the movement sensed is sufficiently small. Also, it may be desired that the processing device is configured to, if the determined movement exceeds a second threshold exceeding the first threshold, not output corrected image data. Movements may be so violent that it is not possible to perform a suitable correction. In that case, the original image data may be output, so that the operator can verify this.
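  • A minimal sketch of this two-threshold decision is shown below; the threshold values, units and names are illustrative assumptions, not from the patent.

        import numpy as np

        def select_output(original: np.ndarray, corrected: np.ndarray,
                          movement: float, t1: float, t2: float) -> np.ndarray:
            if movement <= t1:
                # Movement small enough: output the original data uncorrected.
                return original
            if movement > t2:
                # Movement too violent for a suitable correction: output the
                # original data so that the operator can verify this.
                return original
            # Otherwise output the movement corrected image data.
            return corrected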
  • a second aspect of the invention relates to a method of obtaining image data of an element, the method comprising:
  • the element is a part of a person or animal, such as an exposed part of the person or animal.
  • the exposed part may be a wound, such as during operation.
  • The exposed part may be a part of a person's or animal's skin.
  • Other types of surfaces of interest may be liquid surfaces or the like. Above, a vast number of surfaces and elements are described in relation to which parameters may be derived.
  • launching monochromatic radiation on to the element facilitates the use of a number of measurement or analysis methods.
  • Movement between the camera or image plane and the element may deteriorate the quality of the analysis results; therefore, movement is detected during the providing of the image data.
  • the movement is detected simultaneously with the providing of the image data.
  • the detected movement preferably is relative to particular portions of the element or relative to the camera or image plane. If portions of the element are stationary relative to the camera/image plane, this would be the same thing.
  • Obtaining image data may be performed by using the imaging device described above.
  • Determining the movement may be as described above.
  • the determined movement may be indicated in the movement signal described.
  • Generation of the movement corrected image data may also be as described above.
  • the imaging device has an imaging plane and wherein the step of determining movement comprises determining movement of the imaging plane vis-à-vis the imaged element. This is described further above.
  • the movement may be determined during obtaining of image data and/or between the providing of two sets of image data.
  • the generation of the corrected image data may take into account the movement between the two sets of image data.
  • the movement during the providing of the (one or both) sets of image data may be taken into account.
  • the movement determination comprises determining a distance from the imaging device, such as the imaging plane, to the imaged element, the generating step comprising generating the corrected image data also based on the distance determined. This distance is relevant in certain measurement types and especially when the providing of the image data is performed using a fixed focus.
  • the method further comprises the step of displaying an image representing the corrected data.
  • the image data represent relative movement between different portions of the imaged element. This may be as described above.
  • the image may illustrate movement in a number of manners. Also, movement may be quantified, such as using colours, arrows or the like.
  • the motion correction assists in correction of perceived size and peak intensity/absorbance.
  • Motion artefacts during recording result in blurring that will make the objects of interest appear deformed compared to their natural shape. Such artefacts will also result in decreased measures and a wrong location of peak intensity/absorbance. If the motion artefact is within a set range of motion, the correction can be applied to the calculated size based on prior knowledge of the imaging characteristics (angle of view) and the distance to the object.
  • the launching step comprises illuminating the element with coherent radiation. This radiation often stems from a laser.
  • coherent radiation facilitates the use of a number of laser speckle analysis methods as described further below, as coherent radiation will create a speckle pattern on the irradiated element.
  • It may be desired, if the determined movement does not exceed a first threshold, to output, as the corrected image data, the first image data. Thus, when the movement is sufficiently small, no correction is made. Also, if the determined movement exceeds a second threshold, exceeding the first threshold, it may be desired to not output corrected image data. This may be due to the movement being so violent that no correction is possible.
  • Another aspect of the invention relates to an imaging element for providing image data, the imaging element comprising:
  • the imaging device, the processing device, and the source may be as those described above.
  • Coherent radiation has the advantage that it will produce a speckle pattern on the irradiated element. Speckles are high intensity radiation spots created by positive interference. These spots are visible in the image data.
  • the movement is determined from the original image data. This may be performed in a number of manners.
  • the image data may be corrected. This may remove the streaks but may also facilitate deriving other information from the image.
  • Speckle patterns are used for a number of different analyses, as is described below, such as LSCI/LASCA. Being able to perform movement analysis, as is described both below and above, enables the use of a handheld unit instead of the stationary legacy units.
  • a final aspect relates to a method of obtaining image data of an element, the method comprising:
  • The launching step and the step of obtaining the image data may be as described above. Now, however, no separate movement determination is made. Instead, the movement is determined from the first image data.
  • the first image data may be corrected based on the movement determined and corrected image data output.
  • the movement determined is as that described above and below.
  • a Laser Speckle Contrast Imaging (LSCI) or Laser Speckle Contrast Analysis (LASCA) system 10 comprising a sensing head 20 launching a beam forming a speckle pattern on an element 30, such as a surface portion of a human or animal body.
  • LSCI is used for e.g. illustrating blood flow in tissue, such as during operations or in wounds.
  • A related phenomenon is bio-speckle, which has also been used for estimation of fruit quality, biofilm characterisation, polymer stress evaluation, dental health assessment (plaque), assessment of paint drying, assessment of crystallinity, and surface roughness estimation.
  • Perfusion measurement in medicine is used for e.g. determining regional microvascular blood flow in different bodily tissues. Its usage ranges from assessing severity of burn wounds to monitoring ulcers on diabetic patients.
  • Perfusion assessment and quantification of microvascular blood flow has among others been based on optical coherence tomography, laser Doppler velocimetry, and laser speckle imaging.
  • Laser Speckle imaging is a well-established technology, where a coherent light source emits a beam forming a speckle pattern arising from the physical nature of coherent light. This pattern is recorded via a camera, and then several different algorithms can be used to analyse the recorded speckle pattern to determine spatio-temporal changes in the element on which the speckle pattern is provided. Depending on the algorithm used, the changes of the spatio-temporal patterns, caused by blood flow, can be more or less correlated to the absolute level of blood flowing in the tissues. These algorithms all relate to the laser contrast.
  • LASCA Laser Speckle Contrast Analysis
  • LSCI Laser Speckle Contrast Imaging
  • SFI Speckle Flow Index
  • Legacy laser speckle devices are bulky, mounted on sturdy arms, and possess a range of settings intended for adapting the recording to a given clinical situation.
  • All this severely restricts the use of laser speckle analysis in e.g. hospitals, as the time consumed in positioning the equipment, applying settings, recording and then making a decision based on the measurement is simply too troublesome for the technology to be used in situations other than monitoring cases, such as the aforementioned burn wounds and diabetic foot ulcers.
  • the analysis is desired from different angles.
  • The present sensor head 20 is portable, such as handheld, making it much faster and more versatile in use. Also, viewing from different angles etc. is facilitated. However, relative movement now is a larger problem.
  • The sensor head therefore has a movement sensor for outputting information relating to movement of the sensor head. This movement may then be compensated for in the analysis.
  • a sensor head 20 is illustrated launching radiation on to an area 30 and imaging at least that area while logging movement of the sensor head relative to the area 30.
  • a processor calculates the resulting image which is illustrated on a monitor 40. In this image, relative movement of some portions of the area 30 will be seen relative to other portions thereof. In the image, the central area has an upward flow (arrows) compared to outer portions (no arrows).
  • the movement sensed may be that of the sensor head 20 relative to the earth, assuming that the area 30 is stationary relative to that coordinate system.
  • the relevant portion of the sensor head may be a lens or image plane of the sensor in the sensor head.
  • an image plane 201 such as the actual image sensor or CCD, is illustrated together with a lens or lens system/assembly 202 imaging the area 30 on to the image plane.
  • To the right in figure 1, another system type is seen, also launching radiation on to the area 30 but having a remotely positioned camera or radiation sensor, now provided in a portion 22. An optical fibre bundle 24 is used for guiding radiation from the left portion to the camera. In front of the left portion of the fibre bundle 24 is a lens 242, again imaging the area 30 on to the left end 241 of the fibre bundle. The image plane is then that defined by the lens 242, now at the end of the fibre bundle. In this manner, the portion 22 may be moved in any manner in relation to the area 30 without affecting the imaging thereof.
  • Set-ups of this type may be used in laparoscopes, rigid endoscopes as well as in the tips of flexible endoscopes. Clearly, in this type of set-up, the coherent radiation may also be fed in fibres from the portion 22 to the end 241 for launching on to the area 30.
  • the present set-up may be used for a wide variety of analysis and used in connection with a large number of processes.
  • The present set-up may be provided on a propelled vehicle, such as a remotely operated vehicle, such as a drone.
  • the movement may therefore be the movement of the vehicle.
  • a sensor head 20 is illustrated having a head portion 50 and a handle portion 54.
  • the back side of the head portion has a display 40.
  • a source 57 of coherent radiation is provided together with a camera 58 and a distance sensor 59.
  • a trigger 56 is provided for the operator to activate the imaging.
  • the overall shape is pistol like making handling and activation as well as viewing of the display simple.
  • a movement sensor 53 is provided in the sensor head. Also, a processor 55 is provided for receiving the output from the movement sensor, the camera, the trigger, and the distance sensor and for controlling the source 57 and generating the movement corrected image data fed to the display 40.
  • the distance sensor is not required, but the quality or contrast of the speckle pattern provided by the emitter 57 will depend on the distance between the emitter 57 and the area 30. Then, it may be desired to provide a feedback, such as via the display 40 or via e.g. another visible indication (LED) or a sound for an operator to indicate when the distance to the area 30 is as recommended or e.g. is too short or too large. Also, clearly, the distance may be used in the compensation of the image data, as a rotation of the head 50 will generate different movement artefacts for different distances to the area 30.
  • the distance sensor may be implemented in a number of manners.
  • optically based distance sensors already exist, such as based on time-of-flight measurement of radiation/sound emitted by the sensor, reflected by the object/element 30 and detected by the sensor.
  • A number of other manners exist, however, such as triangulation-based methods, or simply providing a radiation emitter off-axis from the camera, with a beam at an angle to the axis of the camera.
  • The beam centre will thus, at different distances, be at different positions vis-à-vis the centre of the image.
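  • For the simplest geometry of this kind, where the off-axis beam is parallel to the optical axis at a lateral baseline b, similar triangles give a spot offset of x = f·b/d pixels at distance d (f being the focal length in pixels), so d = f·b/x. The sketch below rests on that assumption; an angled beam changes the relation, and the names are illustrative.

        def distance_from_spot(x_offset_px: float, focal_px: float, baseline_m: float) -> float:
            # Off-axis emitter parallel to the optical axis: the image offset
            # of the beam spot from the image centre shrinks with distance.
            if x_offset_px == 0:
                return float("inf")  # spot at the centre: object at infinity
            return focal_px * baseline_m / x_offset_px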
  • Also, the speckle pattern provided on the element 30 will depend on the distance between the radiation source and the element 30; this distance may then be determined from the image created by the camera, so that no separate distance sensor is required.
  • The radiation source 57, for LSCI analysis for example, provides a speckle pattern on the area 30.
  • a speckle pattern is generated by coherent radiation due to positive and negative interference.
  • This coherent radiation may be fed to the area 30 directly from a radiation source, such as a laser diode, or guided to the head 50 or area 30 by an optical fibre. Naturally, the radiation may, before impact on the area 30, be reflected on a stationary or moving surface, such as when scanning the radiation on the area 30.
  • The movement sensor 53 may be an accelerometer, gyrometer, rotation sensor, IMU (Inertial Measurement Unit), AHRS (Attitude and Heading Reference System) or the like, such as a solid-state or MEMS based sensor. Preferably, 3D movement/translation/rotation is sensed. As mentioned, a distance sensor may be implemented in the movement sensor 53 or separately therefrom. Movement compensation will depend on the movement in question as well as the parameter(s) represented by the image data. If the image data is a usual image, as described above, blurring may be caused, as well as an intensity variation in the individual pixels. On the other hand, if the determination is speckle based or based on coherent radiation, the movement may cause a difference in the speckle pattern and thus in the homogeneity and contrast thereof.
  • The determined size of the object can be corrected to the expected size without movement, using only the exposure time (t) and the recorded motion (v), which provide the apparent distance travelled during the exposure (L), together with the determined centerline width (w) of the imaged object.
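  • Numerically, with L = v·t, the expected width follows by subtracting the apparent travel from the measured width. The values below are illustrative only, and the simple subtraction model is an assumption of how the motion blur widens the profile:

        exposure_t = 0.005    # s, exposure time (t)
        speed_v = 0.02        # m/s, recorded motion (v)
        width_w = 0.0013      # m, determined centerline width (w)

        blur_L = speed_v * exposure_t      # apparent travel, L = v*t = 0.0001 m
        width_expected = width_w - blur_L  # expected width without movement = 0.0012 m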
  • Consider the situation where the movement artefact is based solely on rotation around the Z-axis. The placement of the object in relation to the optical axis then determines the degree of correction: if the object is at the rim of the image, a larger apparent movement is observed, and if the object is placed perfectly at the centre of the optical axis, no correction is needed unless the object is elongated.
  • The rotational artefact will appear as streaks, as in the XY-translation artefact; however, when sensor readings determine rotational movement, the streaks may be considered as arc lengths (s) instead of linear displacements (see the sketch after this item).
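  • A sketch of this arc-length view (names are illustrative): for a rotation rate ω about the optical axis over an exposure of duration t, a point at radius r from the axis is smeared along an arc s = r·θ with θ = ω·t.

        def streak_arc_length(radius_px: float, omega_rad_s: float, exposure_s: float) -> float:
            # Rotation about the optical axis moves each pixel along a circle
            # around that axis; the streak length is the arc s = r * theta.
            theta = omega_rad_s * exposure_s
            return radius_px * theta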
  • The motion artefact may also cause apparent motion and deformation as if the object were moving on a sphere determined by the optical angle and the distance to the object. This will result in the same issues and solutions as above, but also in a change in perspective.
  • Registration of the difference between the expected light field and the observed radiation can be used to determine whether there is an angulation and thus an observed perspective.
  • This change in perspective on size can be corrected using a programmatical implementation of orthogonal projection correction.
  • the intensity of the light from the object can be corrected in the perspective gradient using the intensity equations above.
  • movement may also take place between the radiation source and the irradiated element. This may give rise to a variation in the radiation intensity over the irradiated element. From the movement and e.g. the emission intensity characteristic (often Gaussian for lasers), the intensity variation over the irradiated element and thus the image data may be determined. This, naturally, may cause a variation in the intensity of radiation received over the image data, such as when this radiation is reflected/scattered by the irradiated element or when the radiation emitted causes e.g. fluorescence to be emitted from the irradiated object.
  • Often, the movement will be a combination of the above movement types.
  • Then, the movement may be decomposed into the above types, whereafter compensation may be performed.
  • This type of artefact may be added to the others and a combined compensation performed.
  • The movement may cause other or additional artefacts.
  • the speckle pattern is generated by the interference of the coherent radiation with itself on the irradiated object.
  • Movement of the radiation source will create a variation in the speckle pattern.
  • a distance variation of half the wavelength will give dramatic variations in the speckle pattern, so compensation is desired.
  • the speckle pattern is imaged. Movement will be a movement relative to the radiation source and will thus vary the speckle pattern, whereas stationary portions will retain the speckle pattern. A simple manner of quantifying the movement would be to determine the contrast in different portions of the image. A high contrast is seen where no change is seen in the speckle pattern. A low contrast will be seen where the speckle pattern varies a lot - i.e. where the movement is large.
  • the distance variation between the radiation source and the portion of the irradiated element/object may be determined. This will describe the variation created in the speckle pattern at that particular portion, as the speckle pattern is generated by positive/negative interference of the radiation (and thus depends on the wavelength). This again may describe a contrast or homogeneity in that particular pixel or portion. This will be taken as a sign of movement (if the parameter sought for is movement) in addition to the movement of the portion itself (the parameter desired).
  • σ/μ is the standard manner of determining contrast (the standard deviation divided by the mean value), and Σ is the sum of the errors.
  • K then is the resulting value, such as a velocity.
  • Differences from one image to the next may be caused by movement and may thus be used for quantifying this movement.
  • Rotational artefacts create a change in angle. Translational motion between frames is not so much an issue for the light source as it is an issue of object location in the image.
  • The sum of motion artefacts is based on object motion, source motion and imaging device motion. Knowing the motion of the imaging plane and the light source enables easier correction for object motion artefacts, as the sources of error can be separated.
  • Then, the analysis of the image data may be performed.
  • First, the recorded image might need to be split from an image with multiple channels or a colour-intensity representation (e.g. RGB, CMYK, HSV, etc.) into a single-channel grey-scale image, where the grey scale represents the monochromatic light intensity.
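  • A minimal sketch of such a split, assuming a red laser so that the red channel of an RGB recording carries the monochromatic intensity (the channel choice is an assumption; other sources need another channel or a weighted combination):

        import numpy as np

        def to_single_channel(rgb: np.ndarray) -> np.ndarray:
            # rgb has shape (height, width, 3); keep only the red channel as a
            # float grey-scale image representing the monochromatic intensity.
            return rgb[..., 0].astype(np.float64)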
  • LSCI Laser Speckle Contrast Imaging
  • LSI Laser speckle imaging
  • LASCA Laser Speckle Contrast Analysis
  • LSTCA Laser speckle temporal contrast analysis
  • LSFG Laser speckle flowgraphy
  • LSPI Laser speckle perfusion imaging
  • t-LASCA temporal LASCA
  • mLSI modified LSI
  • SFI Speckle Flow Index
  • the degree of blurring of the speckle pattern is quantified by the speckle contrast.
  • If the exposure time of the camera is longer than the time scale of the speckle fluctuations, the recorded image exhibits a blurring of the speckle pattern in areas of high motion.
  • The speckle contrast is K = σ/⟨I⟩, where ⟨I⟩ is the average intensity of each region and σ is the standard deviation of the pixel intensities in each region.
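  • A sketch of this spatial contrast computed over a sliding window (the window size is an illustrative choice):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def speckle_contrast(image: np.ndarray, window: int = 7) -> np.ndarray:
            # Local contrast K = sigma / mean over a sliding window, using
            # var = E[I^2] - E[I]^2 computed with two box filters.
            img = image.astype(np.float64)
            mean = uniform_filter(img, size=window)
            mean_sq = uniform_filter(img * img, size=window)
            var = np.maximum(mean_sq - mean * mean, 0.0)
            return np.sqrt(var) / np.maximum(mean, 1e-12)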
  • The kurtosis is a different measure that describes the shape of a distribution based on the ratio between the central tendency and the deviation of the data, and it is as such related to the inverse of the contrast (K). Its advantage is that the imaging will have a higher signal-to-noise ratio, appearing less noisy and providing a more precise foundation for segmentation of underlying objects.
  • This implementation allows single ROI values to be corrected for motion artefacts, or can be applied over the whole image if the motion is below a threshold at which the image still contains the same objects.
  • Frequency distributions of the motion data can be extracted using e.g. the fast Fourier transform.
  • The frequency distribution can then be used to automatically select the frequency representing the most powerful signal, whereafter digital filtering, such as a Butterworth filter, is used to remove the motion frequencies from the contrast data series (a sketch follows below).
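  • A sketch of this filtering chain; the sampling rate handling, the band width around the dominant motion frequency and the filter order are illustrative assumptions:

        import numpy as np
        from scipy.signal import butter, filtfilt

        def remove_motion_frequency(contrast_series: np.ndarray,
                                    motion_series: np.ndarray, fs: float) -> np.ndarray:
            # Locate the most powerful motion frequency with an FFT.
            spectrum = np.abs(np.fft.rfft(motion_series - np.mean(motion_series)))
            freqs = np.fft.rfftfreq(len(motion_series), d=1.0 / fs)
            f0 = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
            # Band-stop Butterworth filter around that frequency.
            band = np.clip([0.8 * f0, 1.2 * f0], 1e-3, 0.499 * fs)
            b, a = butter(2, band, btype="bandstop", fs=fs)
            return filtfilt(b, a, contrast_series)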
  • The visualisation of the laser speckle contrast can be further improved by using active noise reduction, by modifying the coherent light source, or what it is reflected on, toward the object of interest.
  • The active noise reduction has an advantage both in situations without device motion and in situations including device motion. In situations without motion, the controlled changes of the emitted light function as a running average when using time-based contrast analysis.
  • The whole device could be motion-dampened by mounting it on a mechanical gyro; however, both the camera and the light source could also be mounted on separate gyros or similar mechanical contraptions, allowing for active control of the device parts based on the motion of the device mount, as each element needs a specific correction approach depending on the nature of the motion.
  • The camera may have a fixed focus in order to make the operation very simple.
  • The distance sensing may then be used for guiding the operator to provide a correct distance, whereafter the camera will be "point and shoot".
  • This feedback may be audible or visible, such as on a display or on separate visible indicators (a red/green LED, for example).
  • Also, radiation may be launched on to the area 30 and thus be visible to the operator, indicating whether the distance is OK or not. For example, if the distance is not OK, a red dot, cross or the like may be visible on the area 30; when it is no longer there, the distance may be OK.
  • the trigger has multiple advantages, such as to allow the operator to obtain the data when desired. Also, the coordination of the imaging and the movement sensing (or the logging or correlation of such data) may be handled by the trigger.
  • A feedback may also be given to the operator when pressing the trigger as to whether the image taken is acceptable or not. For example, if the distance was unacceptable, if the movement was too large, if the laser speckle contrast was not sufficient, or if any other parameter was not within range, an "unsuccessful" sound could be output (or a corresponding other indication could be made), so that the operator need not refer to a display to know that another "shot" is required.
  • Multiple images may be provided between which or during which movement takes place. This movement may then be used for determining the relative positions of the portions of the element 30 imaged in the different images, which may then be combined or stitched into a single image if desired. Thus, zoomed-in or higher-resolution images may be assembled into a larger image (see the sketch below).
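  • A minimal sketch of such stitching, assuming the sensed movement has been converted into integer pixel offsets and that all sub-images fit on the canvas (names are illustrative):

        import numpy as np

        def stitch(images, offsets_px, canvas_shape):
            # Paste each sub-image at its sensed (row, column) offset; later
            # images simply overwrite earlier ones where they overlap.
            canvas = np.zeros(canvas_shape, dtype=np.float64)
            for img, (r, c) in zip(images, offsets_px):
                h, w = img.shape
                canvas[r:r + h, c:c + w] = img
            return canvas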
  • Absorption of radiation by the area 30 or elements therein, such as blood or oxygen, may be used for determining e.g. the presence, concentration or amount of molecules therein.
  • Radiation may be emitted from the head 50 and detected by the camera, where absorption of the radiation may indicate the concentration of oxygen (oxygenation) for example.
  • visible radiation may be used for indicating demarcations on the element 30 so that the operator may see the boundaries of the image or camera viewing angle, demarcations of the boundaries of the speckle pattern, indications of whether the distance should be increased/decreased, whether other analysis, such as the oxygenation analysis, is turned on or off. Then, the operator need only point the head portion toward the area 30 but need not view the display 40 or other indications on the handle in order to know the mode of operation, the pointing direction as well as the acceptability of the measurement/analysis which may then be activated using the trigger.
  • multiple speckle patterns may be provided, such as sequentially, by launching multiple radiation wavelengths on to the area 30.
  • Providing multiple flow measurements may increase the overall precision of the analysis.
  • The same camera may be used (as may the same source of coherent radiation if it has multiple wavelengths), as the two speckle patterns may be separated due to their different wavelengths.
  • Separation can be fulfilled using optical filters, matching the emitted light (bandpass, lowpass, highpass or an appropriate combination), placed in front of a camera that is sensitive to a wide range of wavelengths.
  • This implementation enables the filters to be changed to the appropriate wavelength for a given situation (light condition, measured quantity, surface characteristic, relative motion regime, distance to subject, or the presence of fluorescent materials in the illumination area). With several light sources, this can be done using multiple cameras, each equipped with a specific optical filter, or by a mechanical implementation placing the filters in the imaging plane.
  • Multiplexed control of the light sources, coordinated with the camera recording, may also be performed, such that the camera records and handles the separate light sources independently. This would provide the same benefits as optical filters.
  • LDF Laser Doppler Flowmetry
  • the optional distance measurement may be performed using Laser Triangulation. This would also enable high precision movement measurements, as the operator motion can be calculated and subtracted.
  • the movement of the head 20/50 relative to the element 30 is determined solely from the image data provided by the camera.
  • the speckle pattern on the element 30 is defined by the coherence of the radiation and thus forms a pattern of spots created by negative and positive interference of the radiation when scattered/reflected. This pattern clearly will vary on the element 30 depending not only on the distance but also on the angle between the head (the radiation source) and the element 30. Thus, this pattern will vary, if the head or radiation source is moved relative to the element 30. This movement thus may be determined based only on the variation of the speckle pattern.
  • Programmatic correction is needed if the sensor is of a progressive-scan nature, i.e. does not expose all parts of the sensor at the same time.
  • To determine the movement from the image data, an optimization approach is normally used.
  • The optimization can be completed in fixed-size blocks or in areas of varying size; the comparison of regions can then be made by calculating the difference between regions.
  • A match of regions between two images is determined when a test statistic, such as the sum of squares, partial derivatives or the absolute error/difference, is within a predetermined threshold (a sketch of such block matching follows below).
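  • A sketch of such block matching by exhaustive search with a sum-of-squared-differences statistic (block and search sizes are illustrative choices):

        import numpy as np

        def match_block(prev, curr, top, left, size=16, search=8):
            # Slide a block from the previous frame over a small search window
            # in the current frame; keep the offset with the smallest SSD.
            block = prev[top:top + size, left:left + size].astype(np.float64)
            best_ssd, best_offset = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = top + dy, left + dx
                    if y < 0 or x < 0 or y + size > curr.shape[0] or x + size > curr.shape[1]:
                        continue
                    cand = curr[y:y + size, x:x + size].astype(np.float64)
                    ssd = float(np.sum((cand - block) ** 2))
                    if ssd < best_ssd:
                        best_ssd, best_offset = ssd, (dy, dx)
            return best_offset, best_ssd  # (dy, dx) of the best match and its SSD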
EP18196325.7A 2018-09-24 2018-09-24 Handheld imaging element with a movement sensor Withdrawn EP3626160A1 (de)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18196325.7A EP3626160A1 (de) 2018-09-24 2018-09-24 Handheld imaging element with a movement sensor
PCT/EP2019/075719 WO2020064737A1 (en) 2018-09-24 2019-09-24 A handheld imaging element with a movement sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP18196325.7A EP3626160A1 (de) 2018-09-24 2018-09-24 Handheld imaging element with a movement sensor

Publications (1)

Publication Number Publication Date
EP3626160A1 (de) 2020-03-25

Family

ID=63685595

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18196325.7A Withdrawn EP3626160A1 (de) 2018-09-24 2018-09-24 Handheld imaging element with a movement sensor

Country Status (2)

Country Link
EP (1) EP3626160A1 (de)
WO (1) WO2020064737A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4093018A1 (de) * 2021-05-19 2022-11-23 Karl Storz SE & Co. KG Method for medical imaging and medical imaging apparatus

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL2027331B1 (en) 2021-01-18 2022-07-25 Univ Twente Handheld laser-based perfusion imaging apparatus and method of using said apparatus
WO2023237929A1 (en) * 2022-06-10 2023-12-14 Rockley Photonics Limited System and method for calibrating speckle-based sensor

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010017508A1 (en) 2008-08-07 2010-02-11 Verathon Inc. Device, system, and method to measure abdominal aortic aneurysm diameter
WO2011117779A2 (en) 2010-03-26 2011-09-29 Aïmago S.A. Optical coherent imaging medical device
CN102357033A (zh) 2011-09-27 2012-02-22 Huazhong University of Science and Technology Laser speckle blood flow imaging processing system and method
WO2014009859A2 (en) 2012-07-10 2014-01-16 Aïmago S.A. Perfusion assessment multi-modality optical medical device
US20140276099A1 * 2013-03-14 2014-09-18 Koninklijke Philips N.V. Device and method for determining vital signs of a subject
RU2573053C1 (ru) 2014-09-10 2016-01-20 Samsung Electronics Co., Ltd. Laser speckle interferometric systems and methods for mobile devices
US20160270672A1 2015-03-20 2016-09-22 East Carolina University Multi-Spectral Laser Imaging (MSLI) Methods and Systems for Blood Flow and Perfusion Imaging and Quantification
CN204765619U (zh) 2015-06-15 2015-11-18 Shanghai Jiao Tong University Laser speckle blood flow imaging device
US20170017858A1 2015-07-15 2017-01-19 Samsung Electronics Co., Ltd. Laser speckle contrast imaging system, laser speckle contrast imaging method, and apparatus including the laser speckle contrast imaging system
WO2017062759A1 2015-10-09 2017-04-13 Vasoptic Medical, Inc. System and method for rapid examination of vasculature and particulate flow using laser speckle contrast imaging
US20170354392A1 * 2016-06-14 2017-12-14 Novadaq Technologies Inc. Methods and systems for adaptive imaging for low light signal enhancement in medical visualization
CN107307848A (zh) 2017-05-27 2017-11-03 Tianjin Hairen Medical Technology Co., Ltd. Face recognition and skin detection system based on high-speed wide-range scanning optical micro-angiography imaging

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FARRARO R ET AL.: "Handheld, point-of-care laser speckle imaging", JOURNAL OF BIOMEDICAL OPTICS, vol. 21, no. 9, September 2016 (2016-09-01), article 094001, XP060075652, DOI: 10.1117/1.JBO.21.9.094001
KUMAR MAYANK ET AL.: "PulseCam: High-resolution blood perfusion imaging using a camera and a pulse oximeter", 38TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 16 August 2016 (2016-08-16), pages 3904-3909, XP032980023, DOI: 10.1109/EMBC.2016.7591581
LERTSAKDADET B ET AL.: "Correcting for motion artifact in handheld laser speckle images", JOURNAL OF BIOMEDICAL OPTICS, vol. 23, no. 3, March 2018 (2018-03-01), article 036006
MIN-CHUL LEE ET AL.: "Development of a handheld blood flow measurement system using laser speckle flowgraphy", OPTICAL SOCIETY OF JAPAN, SPRINGER, vol. 22, April 2015, pages 308-314

Also Published As

Publication number Publication date
WO2020064737A1 (en) 2020-04-02

Similar Documents

Publication Title
US11051002B2 (en) Focus scanning apparatus
US8078265B2 (en) Systems and methods for generating fluorescent light images
CA3031905C (en) Instrument for acquiring co-registered orthogonal fluorescence and photoacoustic volumetric projections of tissue and methods of its use
US9204952B2 (en) Method for optical 3D measurement of teeth with reduced point spread function
US9175945B2 (en) Evaluating fit of an earpiece based on dynamic data
US7873407B2 (en) Systems and methods for in-vivo optical imaging and measurement
WO2020064737A1 (en) A handheld imaging element with a movement sensor
CN105286785A (zh) Multispectral medical imaging apparatus and method therefor
US9095255B2 Method and device for locating function-supporting tissue areas in a tissue region
US20070038122A1 Diffuse optical tomography system and method of use
US20090240138A1 Diffuse Optical Tomography System and Method of Use
US20140276105A1 Scanning techniques for probing and measuring anatomical cavities
CN101730498A (zh) Low-coherence dental optical coherence tomography imaging
JP2010220894A (ja) Fluorescence observation system, fluorescence observation apparatus, and fluorescence observation method
CN111031888B (zh) System for endoscopic imaging and method for processing images
US8718398B2 Image processing method and apparatus
JP2018179918A (ja) Shape measuring system and shape measuring method
CN114504293B (zh) Endoscope exposure control method and endoscope
JP5278878B2 (ja) Pipe inner surface shape measuring apparatus
WO2021099127A1 Device, apparatus and method for imaging an object
JP6593866B2 (ja) Three-dimensional tongue shape measuring system
JP2019010346A (ja) Subject information acquiring apparatus and ultrasonic probe
US20240156349A1 Method of measuring a fluorescence signal and of determining a 3d representation, image capturing and processing device
JP2018033567A (ja) Endoscope system
JPWO2022230563A5 (de)

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20190402

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20200817