CN114236803A - Microscope system and method for checking the calibration of a microscope - Google Patents

Microscope system and method for checking the calibration of a microscope

Info

Publication number
CN114236803A
CN114236803A (application CN202110790405.XA / CN202110790405A)
Authority
CN
China
Prior art keywords
microscope
panoramic image
panoramic
calibration
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110790405.XA
Other languages
Chinese (zh)
Inventor
Christian Dietrich
Daniel Haase
Manuel Amthor
Johannes Knoblich
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carl Zeiss Microscopy GmbH
Original Assignee
Carl Zeiss Microscopy GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carl Zeiss Microscopy GmbH filed Critical Carl Zeiss Microscopy GmbH
Publication of CN114236803A

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/365 Control or image processing arrangements for digital or video microscopes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B 21/02 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
    • G01B 21/04 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points
    • G01B 21/042 Calibration or calibration artifacts
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 9/00 Measuring instruments characterised by the use of optical techniques
    • G01B 9/04 Measuring microscopes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Optics & Photonics (AREA)
  • Medical Informatics (AREA)
  • Microscopes, Condenser (AREA)
  • Image Analysis (AREA)

Abstract

A microscopy system comprises a panoramic camera (9) for taking at least one panoramic image (10) of a sample environment and a computing device (20) arranged to evaluate the at least one panoramic image (10). The computing device (20) has calibration parameters (P) with which the image coordinates of the at least one panoramic image (10) are interpreted. From the at least one panoramic image (10), geometric information (G) is determined about at least one reference structure (15) imaged in the panoramic image (10), the position or shape of which in the panoramic image (10) depends on the position of at least one microscope component (3, 5, 6, 9B). By calculating the determined geometric information (G) with predefined reference data (A, B), it is determined whether a change affecting the validity of the calibration parameters (P) has occurred in the microscope component (3, 5, 6, 9B). A corresponding method is described, as well as a method that determines the mentioned changes by means of a trained machine learning model.

Description

Microscope system and method for checking the calibration of a microscope
Technical Field
The present disclosure relates to a microscopy system and a method for verifying microscope calibration.
Background
Automation plays an increasingly important role in modern microscopy systems. A microscopy system should capture images, approach sample regions and examine them in more detail in a partially or fully automated manner. The image information should be displayable to the microscope user with the highest possible quality without the user having to optimize numerous settings manually. A relevant contribution to these goals is made by a panoramic camera that takes a panoramic image of the sample and the sample environment. For example, a navigation map can be formed from the panoramic image, in which the user can select a position that is then automatically approached by the motorized stage and examined at a higher magnification. The panoramic image may also be used for automatic sample identification, for example by identifying the wells of a microtiter plate and optionally inspecting them automatically in more detail. The panoramic image may also be used for autofocusing, for example by estimating the sample height from the panoramic image or by determining a suitable position at which an autofocusing method is subsequently performed.
In order to be able to use the panoramic image in this way, the relationship between the image coordinates of the panoramic image and the spatial information, such as the angle of view of the panoramic camera and/or the position of the panoramic camera relative to a reference point, must be known. For this purpose, the panoramic camera is calibrated. For example, a panoramic image of a calibration object having a known size may be taken so that the relationship to the image coordinates may be determined. The distortion of the panoramic camera can also be determined by such calibration and corrected by calculation.
A microscope system of this type therefore comprises a panoramic camera for taking at least one panoramic image of the sample environment and a computing device which is provided for evaluating the at least one panoramic image. The computing device has calibration parameters with which to interpret the image coordinates of the at least one panoramic image.
For example, in US9344650B2 and DE102013012987A1, the applicant describes an optical device in which a calibration is carried out by means of a reference object in order to process subsequently captured images with the calibration data. In DE102017109698A1, the applicant discloses a microscope in which the panoramic image of a calibrated panoramic camera is evaluated, for example in order to classify microscope components and to load information about the identified components. Another calibration method for microscopes with a rotating support is described by the applicant in DE102013222295A1 in order to achieve automatic focusing and image center tracking. In DE102013006994A1, the applicant describes the evaluation of panoramic images, for example for determining and automatically approaching sample positions. In DE102019114117, the applicant describes a further image evaluation for automatically carrying out microscope workflows, for which purpose calibration or special samples, which may for example also be labels on sample stages, are identified in a panoramic image. Furthermore, the applicant describes in DE102020118801 the evaluation of panoramic images of microscopes. In particular, the distance between the panoramic camera and the sample plane is estimated from the panoramic image, for which purpose reference markers in the field of view of the panoramic camera can be used. In DE102020101191, the applicant describes the evaluation of panoramic images of microscopes in which a homography is determined by which the panoramic image is converted into another representation. A perspectively correct homography can be determined, for example, by imaging a calibration sample of known dimensions.
Ensuring proper calibration becomes more difficult if the user can reconfigure, replace or reposition components. For example, the limit switches of a movable stage may be reconfigured. Even if the operating instructions specify that a new calibration is to be performed after such actions, the user may ignore this. Changes may also be caused unknowingly by vibration, loose connections or temperature changes. There is therefore a risk that navigation takes place only with reduced accuracy and results in collisions that damage device parts or the sample. Frequent calibration, for example at each system start-up, is not feasible because it requires time and a trained user. Although large safety tolerances can avoid collisions even in the case of an incorrect calibration, they unnecessarily limit the travel range of the motorized sample stage and do not eliminate inaccuracies in image-based navigation.
In order to identify whether maintenance is required, in particular whether a calibration is required, the applicant proposes a machine learning model in DE102018133196A1. The machine learning model can identify deviations from the normal case in the panoramic image, for example by means of a neural network trained for anomaly detection. Thus, if maintenance is required, an error message can in principle be generated on the basis of the panoramic image. It would be desirable to determine more precisely whether, or to what extent, a calibration is still applicable.
Disclosure of Invention
It may be seen as an object of the present invention to provide a microscope system and a method for checking the calibration of a microscope, which enable a panoramic image to be used accurately and error conditions to be recognized as accurately as possible.
This object is achieved by a microscope system according to the invention and a method according to the invention for checking the calibration of a microscope.
In a microscopy system of the above-mentioned type, according to the invention, the computing device is arranged for determining, from at least one panoramic image, geometric information about at least one reference structure imaged in the panoramic image, the position or shape of the reference structure in the panoramic image depending on the position of at least one microscope component. By calculating the determined geometric information with the predefined reference data, the computing device determines whether a change of the microscope component has occurred which affects the validity of the calibration parameters.
In a method for checking the calibration of a microscope, according to one embodiment of the invention, at least one panoramic image is evaluated in order to determine geometric information about at least one reference structure imaged in the panoramic image, wherein the position or shape of the reference structure in the panoramic image depends on the position of at least one microscope component. The determined geometric information is calculated with predefined reference data in order to determine whether a change of the microscope component has occurred which affects the validity of calibration parameters used for interpreting the image coordinates of the at least one panoramic image.
The invention allows checking whether the calibration parameters in use are still applicable without requiring special measurements for this purpose. Determining the geometric information from the reference structure in the panoramic image allows a calibration deviation not only to be detected but also to be quantified. This makes it possible to state more precisely whether the previous calibration parameters are still applicable. In optional refinements, the quantitative evaluation of the calibration deviation is used to correct the image representation and/or to control microscope components. Advantageous variants of the microscope system according to the invention and of the method according to the invention are the subject matter of the dependent claims and are explained in the following description.
Geometric information
The geometric information may relate, for example, to the position, orientation and/or shape of the reference structure in the panoramic image or to the distance between two or more predefined points of the reference structure. The geometry and dimensions of the reference structure in space are known. For example, the reference structure may comprise a circle, a rectangle or a straight line of a microscope component. The edges of a slide, for example, form straight lines or a rectangle. A circle may be printed on a reference label or may be given by a circular opening of a holding frame for the sample carrier. The shape, position, orientation and size of these geometries in the panoramic image depend on the position of the panoramic camera, of the reference structure and of any optional components through which the panoramic camera views the reference structure. A circle of the reference structure may, for example, appear as an ellipse in the panoramic image, in which case in particular the shape, size and/or position of the ellipse in the panoramic image can be determined as geometric information.
The geometric information can be determined by classical image processing algorithms without machine learning models. For example, edge filters can be used, by means of which a barcode serving as reference structure, or in principle any differently designed reference structure, can be measured in a simple manner.
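As an illustration of such a classical approach, the following minimal Python sketch (not part of the patent disclosure; library calls, thresholds and the choice of ellipse fitting are assumptions) locates an ellipse-like reference structure with an edge filter and contour fitting and returns its position and size as geometric information.

```python
# Minimal sketch: measure an elliptical reference structure (e.g. a circular
# holding-frame opening seen obliquely) with classical image processing.
import cv2
import numpy as np

def measure_reference_ellipse(panoramic_image: np.ndarray):
    """Return centre (Ph, Pv), axes (Dh, Dv) and angle of the largest ellipse-like contour."""
    gray = cv2.cvtColor(panoramic_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                      # edge filter
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    contours = [c for c in contours if len(c) >= 5]       # fitEllipse needs >= 5 points
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (ph, pv), (dh, dv), angle = cv2.fitEllipse(largest)   # centre, axis lengths, rotation
    return {"position": (ph, pv), "size": (dh, dv), "angle": angle}
```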
Alternatively, the computing device may have a trained machine learning model that obtains at least one panoramic image as input and outputs the geometric information. The machine learning model may comprise, for example, a convolutional neural network (CNN) which obtains as input at least one panoramic image or image data derived therefrom. The machine learning model can be trained by a supervised learning process in which training panoramic images with corresponding annotations/labels are predefined. The labels may correspond to the geometric information. For example, the position of the reference structure in a training panoramic image may be specified manually. A learning algorithm is used to determine the model parameters of the machine learning model from the annotated training panoramic images. For this purpose, a predetermined objective function can be optimized, for example a loss function can be minimized. The loss function describes a deviation between the predefined labels and the current outputs of the machine learning model, which are computed from the training panoramic images with the current model parameter values. Depending on this deviation, the model parameter values are adjusted, for example by (stochastic) gradient descent. In the case of a CNN, the model parameters may in particular comprise the entries of the convolution kernels of the different CNN layers. Instead of a CNN, other deep neural network architectures are also possible.
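A minimal training sketch follows, assuming a PyTorch setup and an annotated dataset of training panoramic images with geometric information (e.g. x, y, width, height of the reference structure) as labels; the architecture and hyperparameters are illustrative assumptions, not specifications from the disclosure.

```python
# Sketch: supervised training of a small CNN that regresses geometric
# information of a reference structure from a panoramic image.
import torch
import torch.nn as nn

class GeometryRegressor(nn.Module):
    def __init__(self, n_outputs: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_outputs)        # outputs the geometric information

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train(model, loader, epochs=10, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()                           # deviation between labels and outputs
    for _ in range(epochs):
        for image, label in loader:                  # annotated training panoramic images
            opt.zero_grad()
            loss = loss_fn(model(image), label)
            loss.backward()                          # gradient step (stochastic gradient descent)
            opt.step()
```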
The training panoramic images can show various reference structures for which the geometric information is specified as labels. A model is thus learned which can identify not only a single reference structure in a panoramic image but all reference structures contained in the training panoramic images.
Instead of a supervised training, a partially supervised training may also be used, wherein not all training panoramic images are annotated.
Reference structure
The reference structure can in principle be any element in the field of view of the panoramic camera whose position or shape in the panoramic image depends on the position of the microscope component. For example, the reference structure may be or comprise an element of the microscope component, in particular a (threaded) hole or a screw of the microscope component. The reference structure may also be or comprise a contour, step or edge of the microscope component. Furthermore, the reference structure may comprise reference marks, such as labels, reference patterns or lettering, in particular on the microscope component.
It suffices that the position or shape of the reference structure in the panoramic image depends on the position of the microscope component; the reference structure does not necessarily have to be part of the microscope component or rigidly connected to it. For example, the reference structure may also be formed by or on a fixed apparatus frame or microscope stand, and the microscope component may be the panoramic camera itself: where the reference structure on the stationary microscope stand appears in the panoramic image depends on the positioning and orientation of the panoramic camera. A misalignment of the panoramic camera, such as a rotation or displacement, can thus be determined.
When a machine learning model is used, what constitutes the reference structure can be defined by the annotations of the training panoramic images. For example, if the size (e.g. measured in image pixels) or the shape of an object imaged in a training panoramic image is specified as an annotation, this object represents a reference structure.
From the same panoramic image, geometrical information about a plurality of imaged reference structures may optionally be determined. The position or shape of the reference structure in the panoramic image may depend on the position of the various microscope components. For example, one reference structure may be present on the sample stage and another reference structure may be present on the objective revolver or on a microscope component held by the objective revolver. By calculating the determined geometric information using the predefined reference data, it is now possible to determine which of the various microscope components have undergone a change, in particular a misalignment. For example, if the panoramic camera views the sample stage through a mirror fixed to the objective revolver, it can be distinguished by a plurality of reference structures whether the sample stage or the mirror has undergone an unintentional change with respect to the arrangement on which the calibration parameters are based.
Reference data
The reference data may be determined during the most recently performed calibration or recorded after the calibration has been performed. By comparing or calculating the determined geometric information with the reference data, it can thus be determined whether a misalignment has occurred, i.e. whether a relevant change has occurred with respect to the last performed calibration.
As a simple example, the reference data may represent geometric information about the reference structure, i.e. for example the position or size of a label (representing the reference structure) in a panoramic image taken during the calibration process, that is to say a panoramic image for which it is known or predefined that the calibration parameters are valid. If differences are now found between the reference data and the geometric information determined from the current panoramic image, changes in the microscope components can be inferred therefrom. Whether the change is large enough that the validity of the calibration parameters is considered to be impaired can be decided by a predetermined threshold value. If the difference exceeds the threshold in absolute value, the calibration parameters are considered invalid or compromised.
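A minimal sketch of this threshold comparison, assuming the geometric information and the reference data are given as numeric vectors (e.g. Ph, Pv, Dh, Dv) and the threshold is expressed in pixels:

```python
# Sketch: flag the calibration as compromised when the deviation between the
# determined geometric information and the reference data exceeds a threshold.
import numpy as np

def calibration_still_valid(geometric_info, reference_data, threshold_px=3.0):
    deviation = np.abs(np.asarray(geometric_info) - np.asarray(reference_data))
    return bool(np.all(deviation <= threshold_px))   # False -> change affecting calibration
```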
For the above case, provision may be made for the panoramic image to be recorded at a setting of the movable component that corresponds to the setting at the time the reference data were determined. For example, the stage position of the movable sample stage is set to be the same as when the reference data were determined. If the reference structure is, for example, a threaded hole in the sample stage, the geometric information can then be compared directly with the reference data. However, panoramic images taken with different settings of the movable component can also be evaluated for flexible monitoring during continuous operation. In this case, the influence of these settings on the representation of the reference structure in the panoramic image has to be taken into account, which can be done by means of the calibration parameters. For example, if the motorized sample stage has moved by a known distance, the corresponding offset of the reference structure (e.g. the threaded hole of the sample stage) in the panoramic image can be calculated by means of the calibration parameters. After the geometric information of the reference structure has been determined from the panoramic image, it can therefore be converted computationally as a function of the stage position (or generally as a function of the settings of the microscope components) with the aid of the calibration parameters before the comparison with the reference data.
One or more panoramic images taken during or immediately after the last performed calibration, or data derived therefrom, may also be used as reference data. In particular, these may be panoramic images taken for calibration purposes. Immediately after a calibration, it can be assumed that no changes affecting the calibration have yet occurred to the microscope components. A panoramic image taken immediately after the calibration can therefore also represent the reference data or be used to derive the reference data.
In some embodiments, the calibration parameters are used to calculate a mapping or homography from the panoramic image which perspectively images or projects one plane in space onto another plane. The homography thus describes how the image content of the panoramic image would appear from a different viewing direction, for example as a top-view image. In such a perspective-corrected representation of a device part, of the sample-holding frame or of the sample, geometric structures such as edges, discs or cuboids are represented in a geometrically correct manner, for example as straight lines, circles or rectangles, so that geometric measurements can be performed more reliably. The calculation with the reference data can therefore also take place after the homography has been applied. In this case, the reference data may relate, for example, to the shape of imaged parts, for example certain objects having a right-angled, straight-sided or circular shape. If the shape in the processed panoramic image differs from this, it can be determined that the calibration parameters are no longer applicable.
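The following sketch illustrates one way such a check could look, under the assumption that the calibration parameters provide a 3x3 homography H and that a selected reference contour should appear circular after rectification; the function names and tolerance are hypothetical.

```python
# Sketch: rectify the panoramic image with an assumed homography H and check
# whether a contour that should be circular after rectification is circular.
import cv2
import numpy as np

def check_shape_after_rectification(panoramic_image, H, contour_points, tol=0.1):
    # contour_points: Nx2 array (N >= 5) of the reference contour in the original image
    top_view = cv2.warpPerspective(panoramic_image, H, panoramic_image.shape[1::-1])
    pts = cv2.perspectiveTransform(contour_points.reshape(-1, 1, 2).astype(np.float32), H)
    (_, _), (major, minor), _ = cv2.fitEllipse(pts)
    is_circular = abs(major - minor) / max(major, minor) <= tol
    return top_view, is_circular   # not circular -> calibration parameters no longer apply
```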
The calculation of the determined geometric information with the predefined reference data can also be carried out by means of a trained machine learning model. This machine learning model takes the geometric information as input and outputs information about a change of a/the microscope component. The reference data are represented here by the learned model parameter values. The reference data therefore need not be readily interpretable geometric information but may generally be any information that helps to assess whether the currently determined geometric information is consistent with the arrangement for which the calibration parameters are applicable. For example, the machine learning model may be trained with annotated training data that comprise geometric information about a reference structure determined from panoramic images as described. As annotation, a value of the change or classification information (e.g. the classes "no relevant change" and "change affecting the calibration parameters") can be predefined in each case. Accordingly, the machine learning model may in particular be designed as a classification model or a regression model. It may be connected downstream of the above-mentioned machine learning model, which takes the panoramic image as input and outputs the geometric information. In a further embodiment, the machine learning model can obtain, as an additional input, the settings of the microscope components (e.g. the sample stage position) used for taking the panoramic image. It is thus possible to take into account how different settings of the microscope components during the calibration and during the acquisition of the panoramic image influence the determined geometric information. In this embodiment, various settings of the microscope components are also taken into account in the training of the machine learning model.
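A minimal sketch of such a downstream model, assuming the geometric information and the microscope settings are given as small feature vectors; the layer sizes and the two-class layout are illustrative assumptions.

```python
# Sketch: downstream classifier that receives the determined geometric
# information, optionally concatenated with the current microscope settings,
# and outputs logits for "no relevant change" / "change affecting calibration".
import torch
import torch.nn as nn

class ChangeClassifier(nn.Module):
    def __init__(self, n_geometry=4, n_settings=3, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_geometry + n_settings, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, geometry, settings):
        return self.net(torch.cat([geometry, settings], dim=-1))  # class logits
```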
Calibration parameters and calibration procedure
Calibration parameters are understood to be parameter values on the basis of which the image coordinates of at least one panoramic image are interpreted. For example, a calibration model can be specified which uses homography parameters and/or scaling factors as calibration parameters. A homography describes how one plane is mapped onto another plane. For example, the panoramic camera may view the sample carrier obliquely such that the viewing direction onto the sample carrier forms an angle with the normal of its surface. Such a panoramic image can be converted by a homography into an image corresponding to a perpendicular viewing direction onto the sample carrier surface.
The calibration model, or the values of the calibration parameters contained therein, makes it possible to calculate a position in space (world coordinate system) from a position in the panoramic image. In particular, lateral x- and y-coordinates and/or a z-coordinate (in the direction of the optical axis of the microscope objective) associated with a point in the panoramic image can thus be determined in space. Position information from the panoramic image can thus be converted into position information relative to a reference position of the microscope. In particular, a drive direction, for example of the sample stage, can be related to the camera coordinate system. The calibration parameters can thus also be used to avoid collisions between microscope components. For example, for a sample carrier shown in a panoramic image, its height and/or lateral dimensions can be determined by means of the calibration parameters. From this height and/or these lateral dimensions and from the dimensions of the microscope objective currently used, a permissible movement range for the sample stage is derived. Collision avoidance is advantageous in particular in the case of exchangeable additional modules (e.g. movable sample stages). Operating errors by the user can thereby be at least partially avoided and/or corrected.
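As a simple illustration of such a derived movement range, the following sketch (all parameter names and the safety margin are assumptions) computes the maximum additional upward stage travel from the sample-carrier height estimated via the calibration parameters and the free space below the objective.

```python
# Sketch: derive a permissible upward stage travel for collision avoidance.
def allowed_z_travel(carrier_height_mm, stage_to_objective_mm, safety_margin_mm=1.0):
    """Maximum additional upward stage travel (mm) before the sample carrier
    would reach the objective front lens; negative free space is clamped to 0."""
    return max(0.0, stage_to_objective_mm - carrier_height_mm - safety_margin_mm)
```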
The calibration parameters can thus be used to employ the panoramic image as a navigation map. If the user selects a position in the panoramic image, the motorized sample stage and/or the focus drive can be controlled by means of the calibration parameters so as to approach the selected position and perform a further examination there.
The values of the calibration parameters may be determined during a calibration process. For example, during the calibration process, a panoramic image of a reference object is recorded at a predefined/known microscope setting. The predefined microscope setting can in particular specify the lateral position of the sample stage and the height setting of the sample stage. More generally, the microscope setting may relate to all settings of microscope components that affect the position, size or shape of the reference object in the panoramic image. The reference object may be the already described reference structure, which is later used to check whether the calibration parameters are still applicable. Alternatively, however, the reference object may also be different from this and may, for example, comprise a reference sample, for example a checkerboard pattern, which is arranged at a predetermined position on the sample stage for the calibration process and may have known dimensions.
Alternatively, the reference data may be derived from the calibration parameters and the current microscope settings, and the determined geometric information of the reference structure may be compared with these reference data. For example, the current microscope settings may indicate the (assumed) lateral and height coordinates of the sample stage; the calibration parameters can then be used to derive the image coordinates at which a known threaded bore (or generally another reference structure) of the sample stage should be located in the panoramic image (nominal coordinates). These nominal coordinates may represent the reference data and may be compared with the actual coordinates of the reference structure determined in the current panoramic image. This comparison makes it possible to assess whether the calibration parameters are still applicable or whether a change of a microscope component has occurred which is not taken into account by the calibration model using the calibration parameters.
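A sketch of this nominal-versus-actual comparison, under the simplifying assumption that the calibration model is a single 3x3 homography H mapping lateral stage coordinates to image coordinates; the limit value is hypothetical.

```python
# Sketch: compare nominal image coordinates derived from the stage setting
# with the coordinates of the reference structure found in the panoramic image.
import numpy as np

def nominal_image_coords(H, stage_xy):
    p = H @ np.array([stage_xy[0], stage_xy[1], 1.0])
    return p[:2] / p[2]

def change_detected(H, stage_xy, actual_image_coords, limit_px=5.0):
    deviation = np.linalg.norm(nominal_image_coords(H, stage_xy) - np.asarray(actual_image_coords))
    return deviation > limit_px   # True -> calibration parameters presumably no longer apply
```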
Microscope component
The at least one microscope component can in principle be any component of the microscope, in particular a replaceable or adjustable component. The microscope component may be, for example, the sample stage, an objective changer or objective revolver, a mirror or other optical element that directs light to the panoramic camera, or the panoramic camera itself.
The target position can be predefined for one or more microscope components. The determined change then describes the difference between the actual position of the microscope component and the target position.
The change affecting the validity of the calibration parameters may in particular be a displacement, rotation, bending, deformation or other spatial change of the microscope component. The geometric information of the reference structure may accordingly describe in particular a displacement, a rotation, a bending or a twisting of the reference structure. For example, when a mirror through which the panoramic camera views the sample environment is bent or deformed, a corresponding distortion of the reference structure in the panoramic image may occur. A change of the microscope component can be determined from a single panoramic image or also on the basis of a plurality of panoramic images. In particular, a plurality of panoramic images may be taken at different settings of the microscope component. A difference in the geometric information of the reference structure between the panoramic images is determined and calculated with the predefined reference data in order to determine whether a change of the microscope component has occurred. For example, the sample stage may be moved and the distance by which a reference structure on the sample stage moves in the panoramic image may be detected as a difference in the geometric information. In the case of an adjustable holding frame for the sample carrier, panoramic images can also be taken and evaluated at different settings of the holding frame in order to determine whether the position or the model of the holding frame corresponds to the calibration parameters. The two or more panoramic images may also come from different panoramic cameras. In the sense of a stereo measurement, these panoramic cameras view the sample environment from different directions. When a plurality of panoramic images from different viewing directions or with different settings of the microscope components are evaluated, the position of a microscope component can in some cases be determined more accurately.
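For the two-image variant with a known stage movement, a minimal sketch could look as follows; the pixel-per-millimetre scale stands in for the relevant calibration parameter, the assumption that image axes are aligned with stage axes and the tolerance are illustrative.

```python
# Sketch: compare the observed shift of the reference structure between two
# panoramic images with the shift expected from a known stage movement.
import numpy as np

def stage_move_consistent(pos_before_px, pos_after_px, stage_move_mm,
                          px_per_mm, tolerance_px=4.0):
    observed_shift = np.asarray(pos_after_px) - np.asarray(pos_before_px)
    expected_shift = np.asarray(stage_move_mm) * px_per_mm   # assumes aligned axes
    return np.linalg.norm(observed_shift - expected_shift) <= tolerance_px
```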
Continuous self-monitoring
Taking panoramic images and determining whether changes have occurred to the microscope components may be performed during, or in alternation with, a microscopy measurement run in which a sample is examined through a microscope objective. This allows continuous monitoring so that misalignments are identified early. If the panoramic camera views the sample area through a mirror on the objective changer, wherein the mirror can be selected instead of a microscope objective, panoramic images can be taken in alternation with the sample examinations performed through the microscope objective.
The insertion of special calibration or reference structures is not absolutely necessary for the calibration check, so that the training or guidance of the user associated therewith can also be dispensed with.
Determining device-specific maintenance periods
The determined change may be recorded by the computing device together with time or usage information, such as the duration of use since the last calibration process. From this, device-specific maintenance intervals can be determined. Preventive maintenance can thus be introduced before a relevant change is detected. The detected changes can therefore be used to optimize the maintenance interval, wherein the necessity of a device calibration is identified preventively (predictive maintenance). The execution of an indicated maintenance measure can be initiated directly by the user or online by the manufacturer after the user has given consent.
Determining compromised results of calibration parameters
If the computing device determines that a change has occurred in a microscope component and that the validity of the calibration parameters is compromised by this change, various actions may be taken:
Thus, in the case of a determined change, a control instruction for a movement compensating the change may be output. The control instruction may be directed at a manual adjustment by a user or at one or more actuators. An actuator may be designed to move or adjust one of the microscope components. By means of the compensating movement, it is possible in particular to restore or approximate the arrangement to which the calibration parameters apply. Motorized equipment may be present by which the microscope components can be brought to a desired reference or working position. Alternatively, the manual adjustment can also be carried out by means of corresponding detents, wherein the detents can also be designed to be adjustable.
In particular for the above case, the computing device may be designed to determine not only whether a change affecting the calibration parameters has occurred, but also in which direction and optionally over which travel distance a change in position has occurred. If the change is a rotation, the angle of rotation of the microscope component can additionally or alternatively be determined. This is a significant advantage over pure anomaly detection, in which it is merely determined that the current panoramic image deviates from the normal case, without the change that has occurred being specified in more detail.
The computing device may also be configured to output a calibration start instruction in the event of a determined change. This variant can be implemented as an alternative or in addition to the above-described corrective measures. Thus, in the case of a determined change, the computing device may optionally check whether a compensating movement (in particular a translation and/or rotation) is possible or can be determined. If this is the case, a corresponding control command is output; otherwise, a calibration start command is output. The calibration start instruction may be directed at the user or may also initiate an automated calibration process. The calibration start instruction may also be sent to the microscope manufacturer via an internet connection after approval by the microscope user, and the microscope manufacturer may then start or accompany the calibration process via the internet connection.
In particular, a computational correction or update of the calibration parameters can also be carried out if the determined change is smaller than a predefined limit value. The computational correction can be performed without the user having to interrupt his application. The identified deviation from the target state can thus be corrected by a software or firmware offset, for example by updating the coordinate origin of a virtual coordinate system. A rotation of the virtual coordinate system can also be used for the correction.
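A sketch of such a purely computational correction for the two-dimensional case, assuming the calibration is represented as a 3x3 homogeneous transform from stage to image coordinates; the determined offset and rotation angle are absorbed into the transform without a new calibration procedure.

```python
# Sketch: absorb a small determined change into the calibration by shifting
# the coordinate origin and rotating the virtual coordinate system.
import numpy as np

def corrected_transform(T_old, offset_xy, angle_rad):
    """T_old and the returned matrix: 3x3 homogeneous stage-to-image transforms."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    correction = np.array([[c, -s, offset_xy[0]],
                           [s,  c, offset_xy[1]],
                           [0.0, 0.0, 1.0]])
    return correction @ T_old
```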
The determined change may be taken into account in navigation based on this panoramic image or another panoramic image for the purpose of approaching sample regions. The calibration parameters used for navigation are thus adjusted by the determined change. The determined change can also be taken into account in the control of movable microscope components in order to avoid collisions. In particular, by corresponding control of the sample stage and the Z/focus drive, it can be ensured on a motorized microscope that collisions between microscope components and the sample are avoided.
For collision-free navigation, it is advantageous if, in addition to possible position or rotation changes, position data about the sample carrier used and/or the system components used (objective, objective revolver, stage position and focus position) are also acquired from this panoramic image or these panoramic images. For example, the sample carrier used can be classified in the panoramic image and its lateral position can additionally be determined. By means of the classification, for example, the height of this sample carrier type stored in a database can be looked up. If the mirror on the objective revolver through which the panoramic camera views the sample carrier is unintentionally misaligned, this change can be determined from the panoramic image in the manner already described. The lateral position of the sample carrier determined in the panoramic image can then be corrected computationally on the basis of the determined change. The corrected lateral position enables collision-free navigation.
Calibration verification by end-to-end machine learning model
The invention further relates to a method for checking a microscope calibration by means of a machine learning model, which may also be referred to as an end-to-end machine learning model. The method comprises obtaining at least one panoramic image of a sample environment of a microscope and inputting it into a trained machine learning model. In addition, calibration parameters or associated reference data are input into the machine learning model. The calibration parameters serve to interpret the image coordinates of the at least one panoramic image. The machine learning model then generates an output based on the input at least one panoramic image and the calibration parameters or reference data. In one embodiment, the output indicates whether a change affecting the validity of the calibration parameters has occurred in a microscope component. Alternatively, the machine learning model is designed to generate an output that specifies a correction for a change of a microscope component affecting the validity of the calibration parameters.
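An illustrative sketch of how such an end-to-end model could be structured (architecture, input sizes and the two-class layout are assumptions, not taken from the disclosure): CNN features of the panoramic image are concatenated with a vector of calibration parameters, and the head outputs whether the image content matches these parameters.

```python
# Sketch: end-to-end check combining a panoramic image with calibration parameters.
import torch
import torch.nn as nn

class CalibrationCheckNet(nn.Module):
    def __init__(self, n_calib_params=8, n_classes=2):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 + n_calib_params, 64), nn.ReLU(),
            nn.Linear(64, n_classes),            # "matches" / "does not match"
        )

    def forward(self, panoramic_image, calib_params):
        features = self.cnn(panoramic_image)
        return self.head(torch.cat([features, calib_params], dim=1))
```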
The machine learning model may be trained with training data that comprise, for each of a plurality of sets of calibration parameters, a plurality of different training panoramic images. For example, the training panoramic images may be taken with the described microscopy system or another microscopy system of the same type. Since the calibration parameters are also part of the training data, the examination of a panoramic image can take into account whether the image content of the panoramic image matches the currently used calibration parameters. This is relevant because the same image content of a panoramic image may either be correct or indicate an incorrect position of a microscope component, depending on the current calibration parameters.
Training may be performed by supervised learning. Here, it can be specified as an annotation or label whether the respective training panoramic image matches the calibration parameters input at the same time. The annotation can be given either in the form of a yes/no classification or as a number/vector within a range of values, thereby providing a measure of the change in the microscope component (position). Training panoramic images belonging to a particular set of calibration parameters may also be used in training as examples of inapplicable calibration parameters when paired with other sets of calibration parameters. If corrections are given as annotations, a model can be learned from the training data which directly outputs a correction for the change that has occurred to a microscope component. The correction given as annotation may be, for example, corrected calibration parameters or change information relative to the previous calibration parameters. Alternatively, the annotated correction may also comprise instructions for a modified control of the microscope, in particular in order to perform a compensating movement, as already described.
Alternatively, unsupervised training can also be performed, in which no annotations are given in advance and only training panoramic images with correctly matching associated calibration parameters are used as input data. Partially supervised training or reinforcement learning methods are also possible.
The panoramic image may be input into the machine learning model as captured by the panoramic camera, or may be processed first. The processing may involve, for example, segmentation by another machine learning model. Feature extraction may also be performed by another machine learning model, thereby generating feature vectors from the panoramic image, which still include information from the panoramic image, but cannot be displayed directly as an image. Such processing results generated from the panoramic image may be used as input in the machine learning model described above (or in other machine learning models also described herein). Training data can be obtained from the panoramic image by similar processing.
The panoramic image to be examined can be taken with assumed settings of the microscope components, which correspond to those settings when determining the calibration parameters or the reference data. For example, an assumed stage position can be provided, which is assumed to be the same as the stage position in the determination of the calibration parameters or the reference data. This is referred to as a hypothetical setting, since the actual setting can deviate from the hypothetical setting due to the changes to be determined of the microscope components.
However, panoramic images taken at any other setting of the microscope components may also optionally be examined for greater flexibility. In this case, information about the current (assumed) settings of the microscope components can also be input into the machine learning model. For example, the machine learning model obtains information of the assumed stage position. The stage position may be estimated by measurement or based on stage control instructions.
Since the reference data are associated with the calibration parameters, the reference data and/or the calibration parameters may be provided to the machine learning model. For example, the reference data may be a panoramic image taken with certain (known) microscope settings when the calibration parameters were determined. The reference data can optionally be omitted entirely if the calibration parameters are provided to the machine learning model.
The described machine learning model may be trained as a single model. Alternatively, it may also be trained by two or more interconnected models.
General characteristics
A microscope system is understood to be a device comprising at least one computing device and a microscope. A microscope is understood to mean in principle any magnifying measuring device, in particular an optical microscope, an X-ray microscope, an electron microscope, a macro-scope or also a magnifying image recording device of different design.
The computing device may be physically designed as part of the microscope, may be separately disposed in the microscope environment, or may be disposed at any location remote from the microscope. The computing device may also be designed in a decentralized manner and communicate with the microscope via a data connection. It may generally be formed by any combination of electronic devices and software, and in particular comprises a computer, a server, a cloud-based computing system, or one or more microprocessors or graphics processors. The computing device may also be configured to control the microscope camera, image capture, stage control, and/or other microscope components.
In addition to a sample camera for taking more strongly magnified images of a sample area, a separate panoramic camera may be present for taking panoramic images. Alternatively, however, this may also be the same camera, with different objectives or optical systems being used for taking the panoramic image and the more strongly magnified sample image. The panoramic camera enables state and position measurements of components over a large spatial area, for which a relatively large depth of field of the panoramic camera may be helpful. The panoramic camera may be attached to a fixed equipment frame, such as a microscope stand, or to a movable part, such as a sample stage, focus drive or objective revolver. The panoramic image may be a raw image as taken by a camera, or an image processed from one or more raw images. A captured raw/panoramic image may be further processed before it is evaluated in the manner described herein. Method variants of the invention may be based on a previously taken panoramic image, and the panoramic image may be obtained, for example, from a memory. Alternatively, the taking of the panoramic image may also be part of the claimed method. The panoramic image covers an area surrounding the location where the sample is to be placed, referred to herein as the sample environment. In order to take a panoramic image in which the reference structure can then be evaluated in the described manner, it is not absolutely necessary for a sample to be located in the field of view of the panoramic camera. Rather, the reference structure may be located on other components, such as the sample stage. The microscope component for which it is to be determined whether a change has occurred need not itself be imaged in the panoramic image. Rather, it is sufficient that the position or shape of the imaged reference structure depends on the position of this microscope component.
The computer program according to the invention comprises instructions which, when executed by a computer, cause one of the described method variants to be performed.
The features of the invention described as additional apparatus features also yield, when used as intended, variants of the method according to the invention. Conversely, the microscope system can also be configured to carry out the described method variants. In particular, the computing device may be configured to carry out the described method variants and/or to output control instructions for carrying out the described method steps. Furthermore, the computing device may comprise the described computer program. Whereas some variants use a ready-trained machine learning model when checking the calibration parameters, further variants of the invention result from carrying out the corresponding training steps.
Drawings
Further advantages and features of the invention are described below with reference to the accompanying schematic drawings:
FIG. 1 is a schematic view of one embodiment of a microscopy system of the present invention;
FIG. 2 schematically shows a panoramic image;
FIG. 3 schematically shows a panoramic image;
FIG. 4 schematically illustrates a reference structure;
FIG. 5 schematically illustrates a portion of the microscopy system of FIG. 1;
FIG. 6 is a schematic diagram of a process of a method of an embodiment of the invention; and
FIG. 7 is a schematic diagram of the process of a method of another embodiment of the invention.
Detailed Description
Various embodiments are described below with reference to the drawings. Identical and identically functioning parts are generally identified with the same reference numerals.
FIG. 1
Fig. 1 shows an embodiment of a microscopy system 100 according to the invention. The microscopy system comprises a computing device 20 and a microscope 1, which in the example shown is an optical microscope but could in principle also be a different type of microscope. The microscope 1 comprises a stand 2 by means of which the other microscope components are held. These include in particular: an objective changer or objective revolver 3, on which an objective 4 is mounted in the example shown; a sample stage 5 with a holding frame 6 for holding a sample carrier 7; and a microscope camera 8. If the objective 4 is rotated into the microscope beam path, the microscope camera 8 receives detection light from one or more samples held by the sample carrier 7 in order to take a sample image.
The microscope 1 further comprises a panoramic camera 9 for taking a panoramic image of the sample environment. The panoramic image may thus particularly show the sample carrier 7 or a part thereof. The field of view 9A of the panoramic camera 9 is larger than the field of view when the sample image is taken. In the example shown, the panoramic camera 9 views the sample carrier 7 through a mirror 9B. The mirror 9B is arranged on the objective revolver 3 and may be selected instead of the objective 4. In a variant of this embodiment, the mirror or another deflecting element can also be arranged in another position. Alternatively, the panoramic camera 9 may also be arranged such that it directly observes the sample carrier 7 without the mirror 9B. In principle, the microscope camera 8 can also be a panoramic camera if a further objective, in particular a macro objective, is selected by the objective turret 3 for capturing a panoramic image.
The computing device 20 processes the panoramic image using a computer program 80 according to the invention and optionally controls microscope components based on the processing result. For example, the computing device 20 can evaluate the panoramic image to determine where the wells of a microtiter plate are located, in order then to control the sample stage 5 so as to approach a particular well. The computing device 20 uses calibration parameters P in order to process the panoramic image correctly and in order to know how microscope components are to be controlled on the basis of position information from the panoramic image. These calibration parameters allow the panoramic image to be interpreted, in particular quantitatively, e.g. how a position in the panoramic image relates to a position on the sample stage. The calibration parameters P may also include information about the imaging scale, in particular about the shape or size with which an object appears in the panoramic image at a particular stage height. For evaluating the panoramic image, the calibration parameters P and optionally the current settings of the microscope components, for example the current height of the motorized sample stage, can then be taken into account. Alternatively, the height of the sample carrier can, for example, also be estimated from the panoramic image by means of the calibration parameters P without knowledge of the settings of the microscope components. The relationship between position information from the panoramic image and position information relative to a reference position of the microscope can in particular be described by the calibration parameters P.
The calibration parameters P may lose their validity if the user replaces or repositions microscope components. Vibrations, loose connections or variations in environmental parameters such as air humidity or temperature may also lead to a misalignment of microscope components, making the calibration parameters P inaccurate or unsuitable. The computing device 20 evaluates one or more panoramic images in order to determine such changes automatically. This is described in more detail with reference to the following figures.
FIGS. 2 to 4
Fig. 2 and 3 schematically show a panoramic image 10, in which the holding frame 6 can be seen. The holding frame 6 comprises a circular opening 6A in which the sample carrier can be held. If the panoramic camera views the sample stage and thus the holding frame 6 obliquely, the imaging can be distorted in perspective as shown in fig. 3. In particular, the circular holding frame opening 6A appears as an ellipse in the panoramic image 10. With the aid of the calibration parameters, a perspective correction can be calculated, resulting in the representation according to fig. 2. A prerequisite for this calculation is that no changes occur which would cause the calibration parameters to lose effectiveness. Otherwise, the image transformation does not result in a rectified representation, but in a distorted representation.
Conversely, in the case of a camera that actually views the holding frame 6 perpendicularly, a misalignment can cause the computational processing of the panoramic image to become incorrect because the calibration parameters no longer apply: from a captured panoramic image which, with a perpendicular camera viewing direction, corresponds to the illustration of fig. 2, the inapplicable calculation can produce a distorted image according to fig. 3.
In order to check whether changes affecting the calibration parameters have occurred in a microscope component, geometric information about one or more reference structures 15 is determined in the panoramic image 10. The reference structure 15 may be a marking or label, as shown in fig. 4, in which further information may optionally be encoded, such as the model name of the microscope component on which the marking is located. Corresponding markings may be applied to a plurality of (sub-)components. Alternatively, elements of a microscope component, such as threaded holes of the sample stage or screws on the holding frame 6, can also be used as reference structures 15. In this case, the reference structure 15 is already structurally present on the microscope component to be monitored and is therefore part of this microscope component. Furthermore, the geometry or shape of a known structure, which may be part of the microscope component to be monitored or part of a mountable sample carrier or sample, can also be used as a reference structure.
As introduced in fig. 3, for example, the position coordinates Ph and Pv of the reference structure 15 in the panoramic image 10 and/or the size or shape information of the reference structure 15 can be determined as geometrical information, as indicated in fig. 3 by the horizontal dimension Dh and the vertical dimension Dv.
The geometric information is then calculated with respect to the reference data. The reference data may for example give the position coordinates of the reference structure 15 for the situation in which the calibration parameters are valid. If the deviation between them exceeds a predefined limit value, a change affecting the calibration parameter is identified.
Although fig. 3 shows an exemplary case in which the reference structure 15 is recognized on the holding frame 6, reference structures relating to other microscope components can in principle also be detected. This will be described in more detail below with reference to fig. 5.
FIG. 5
Fig. 5 schematically shows how the panoramic camera 9 observes the microscope components. In this example, the field of view 9A of the panoramic camera extends to the holding frame 6 via a mirror 9B on the objective turret 3. In this case, the sample stage 5 may also be visible in the panoramic image taken. A reference structure 15, such as a label or marking, may be present on the mirror 9B. In addition, the outer circumference of the mirror 9B is suitable as a reference structure, because the edge direction, edge position and corner angles of the mirror edge can be determined accurately and reliably from the panoramic image. This evaluation is generally also possible if the mirror edges appear blurred due to the different focal distances involved for the panoramic camera 9. Alternatively, a panoramic camera 9 with adjustable focal length may capture a plurality of panoramic images at different focal lengths. In some cases, reference structures relating to various microscope components can thereby be analyzed more accurately.
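Purely as an illustration, the outer edge of the mirror 9B could be extracted with classical image processing roughly as follows; the use of OpenCV, the Canny thresholds and the assumption that the mirror outline is the largest contour are illustrative assumptions.

```python
import cv2
import numpy as np

panoramic = cv2.imread("panoramic_image.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(panoramic, 50, 150)                      # edge map
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)               # assumed: mirror outline
corners = cv2.approxPolyDP(largest, 0.02 * cv2.arcLength(largest, True), True)

# The corner positions (and the angles between successive edges) serve as
# geometric information that can be compared with stored reference data.
print(corners.reshape(-1, 2))
```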
The reference structure 15 can likewise be arranged on the sample stage 5, or an element of the sample stage 5, for example a threaded hole 5A, can be used as the reference structure 15. It is thus possible to distinguish whether the mirror 9B, the holding frame 6 or the sample stage 5 has undergone an undesired change. If a change of the respective reference structure 15 is found for several or all of the microscope components imaged in the panoramic image, this may indicate that it is not these microscope components but rather the panoramic camera 9 itself that is rotated or incorrectly positioned. Changes to a microscope component that is not rigidly connected to the reference structure can thus, in certain cases, also be inferred from the reference structure 15.
The individual steps for detecting and evaluating the reference structure and the subsequent actions in the case of a change being determined are described below with reference to fig. 6.
FIG. 6
Fig. 6 schematically shows a flow of a method according to the invention, as it may be executed by a computer program of the computing device of fig. 1.
The determination of the geometric information about the reference structure in the panoramic image 10 is here performed by two trained machine learning models M1 and M2. As step S1 of the method, the machine learning model M1 receives the panoramic image 10 and computes the segmentation mask 30 from it. The segmentation mask may in particular be a binary mask in which one pixel value denotes affiliation with a reference structure and the other pixel value denotes all remaining structures. The machine learning model M1 may in particular be designed as a convolutional neural network (CNN) which performs an image-to-image mapping.
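The image-to-image mapping of the machine learning model M1 can be sketched, for example, with a small fully convolutional network; the following PyTorch snippet is a minimal sketch whose architecture, layer sizes and input resolution are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

class ReferenceStructureSegmenter(nn.Module):
    """Minimal fully convolutional net: panoramic image in, per-pixel
    probability of 'reference structure' out (thresholded to a binary mask)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(32, 1, 1)   # 1-channel mask logits

    def forward(self, x):
        return torch.sigmoid(self.head(self.features(x)))

model = ReferenceStructureSegmenter()
panoramic = torch.rand(1, 3, 256, 320)        # dummy panoramic image 10
mask = (model(panoramic) > 0.5).float()       # binary segmentation mask 30
```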
Optionally, the machine learning model M1 may also be designed to use context information i input together with the panoramic image 10. The context information i may comprise, for example, information about the (assumed) position or height of the sample stage or information about the type of holding frame or sample carrier used. In this case, the machine learning model M1 can take the context information i into account when deciding which reference structures to determine and how to determine them.
The segmentation mask 30 is then fed to a second machine learning model M2, which is here designed as a regression model and, in step S2, computes geometric information G about the structure characterized by the corresponding pixel values in the segmentation mask 30. In this example, the dimensions Dv, Dh of the structure are determined and output as the geometric information G. The machine learning model M2 may also optionally use context information i, which may be identical to or different from the context information used for the machine learning model M1. For example, if the machine learning model M1 performs a semantic segmentation or instance segmentation, information regarding which reference structure is involved may also be conveyed to the machine learning model M2.
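For the variant mentioned further below, in which the geometric information G is computed by a classical algorithm instead of the machine learning model M2, a minimal sketch could look as follows; the function name and the simple bounding-box definition of the dimensions are assumptions made for illustration.

```python
import numpy as np

def geometry_from_mask(mask):
    """Derive position (Ph, Pv) and size (Dh, Dv) of the segmented
    reference structure from a binary mask (1 = reference structure)."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None                           # no reference structure found
    Dh = int(cols.max() - cols.min() + 1)     # horizontal extent
    Dv = int(rows.max() - rows.min() + 1)     # vertical extent
    Ph, Pv = float(cols.mean()), float(rows.mean())   # centre position
    return {"Ph": Ph, "Pv": Pv, "Dh": Dh, "Dv": Dv}

mask = np.zeros((200, 300), dtype=np.uint8)
mask[80:120, 140:190] = 1                     # dummy segmented structure
print(geometry_from_mask(mask))               # {'Ph': 164.5, 'Pv': 99.5, 'Dh': 50, 'Dv': 40}
```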
Next, in step S3, the geometric information G is compared with the reference data a, b. From this comparison, it is determined either that no change affecting the calibration parameters has occurred (step S4) or that a change affecting the calibration parameters has occurred (step S5).
In step S5 it is decided whether the influence on the calibration parameters can be compensated for by a computational adjustment of the calibration parameters based on the determined change; if so, this adjustment is carried out as step S6.
If a computational adjustment is not possible, control instructions for moving the microscope component may be output in step S7 in order to compensate for the change. If motorized hardware actuators are present, the recognized deviation from the target state (for example from the zero coordinates of a virtual coordinate system) can be corrected by a mechanical compensating movement, or manually, in which case the values or positions to be set are indicated to the user. By measuring the deviation of the actual value of the microscope component from the target value, the position and orientation of the microscope component can thus be corrected so that the actual value returns to the target value.
If this is not possible either, for example because the relevant microscope component is not motorized, because no useful adjustability of the microscope component is available, or because it is unclear whether the panoramic camera itself or an optical element guiding light to the panoramic camera is misaligned, a calibration start instruction can be output in step S8. This may, for example, be an instruction to the user to perform a calibration. Furthermore, information about the change that has occurred may be stored in step S9, in particular together with information about the time or duration of use. This information can then be used to predict maintenance intervals. A maintenance interval can thus be determined and specified individually for the present microscopy system.
In modifications, not all of steps S6 to S9 need to be provided. Furthermore, the order of preference governing which of steps S6 to S8 is carried out may be changed. For example, the computational adjustment according to step S6 may only take place when a compensating movement according to step S7 is not possible, e.g., when an ongoing measurement run must not be disturbed by a compensating movement.
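The decision logic of steps S5 to S9 can be summarized in a schematic sketch; the function signature, the placeholder callables and the order of preference shown here (computational adjustment before compensating movement) are one possible variant and purely illustrative.

```python
def handle_change(can_adjust, has_actuator, adjust, move, request_calibration, log_event):
    """Schematic flow once a calibration-affecting change has been determined
    (step S5); the callables are application-specific placeholders."""
    if can_adjust():
        adjust()                  # S6: computational adjustment of the calibration parameters P
    elif has_actuator():
        move()                    # S7: compensating movement of the microscope component
    else:
        request_calibration()     # S8: instruct the user to start a calibration
    log_event()                   # S9: store the change to predict maintenance intervals

# Minimal usage with placeholder callables:
handle_change(
    can_adjust=lambda: False,
    has_actuator=lambda: True,
    adjust=lambda: print("adjusting calibration parameters P"),
    move=lambda: print("issuing compensation motion"),
    request_calibration=lambda: print("please run a calibration"),
    log_event=lambda: print("change logged for maintenance prediction"),
)
```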
The comparison of the geometric information G with the reference data a, b described for step S3 of fig. 6 is only a simple example; more complex evaluations are possible. Furthermore, step S3 may optionally be carried out by a machine learning model M3. This machine learning model defines the reference data a, b in the form of model parameter values established during training. In addition to the geometric information, the machine learning model M3 may additionally receive as input context information, namely the information 40 about the settings of the microscope components (e.g., the assumed stage position) with which the panoramic image 10 was taken. This information 40 may also optionally be used when step S3 is performed by a classical algorithm without a machine learning model.
In another variation of the illustrated embodiment, step S1 and the machine learning model M1 may be omitted such that the machine learning model M2 obtains the panoramic image 10 instead of the segmentation mask 30. In yet another variant, the machine learning model M2 may be replaced by a classical image processing algorithm that calculates the geometric information G from the panoramic image 10 or from the segmentation mask 30 without using a machine learning model.
FIG. 7
Fig. 7 schematically shows a flow of another method according to the invention, as it may be performed by a computer program of the computing device of fig. 1.
In this case, an end-to-end machine learning model M is used which, directly from the input panoramic image 10, generates an output indicating whether a change affecting the validity of the calibration parameters P has occurred or indicating the change of the microscope components. Based on the output of the machine learning model M, it can thus be decided whether to proceed according to the already described step S4 or step S5.
In addition to the panoramic image 10, information 40 about microscope settings used to take the panoramic image 10 and the calibration parameters P are input to the machine learning model M.
The machine learning model M may be trained with training data that likewise comprises panoramic images 10, information 40 about the associated microscope settings, and calibration parameters P. The training data may comprise a plurality of sets of calibration parameters P, wherein for each set of calibration parameters P there are a plurality of panoramic images 10 with respective information 40 about the associated microscope settings, the microscope settings differing from one another. The training may be a supervised training, in which the machine learning model M is to reproduce predefined target outputs as closely as possible. Alternatively, an unsupervised training may also be performed. As already mentioned at the outset, the present case differs from known anomaly detection methods in that the training data further comprises the calibration parameters P and the information 40 about the microscope settings.
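As a sketch of such an end-to-end model, the following PyTorch snippet fuses image features with the calibration parameters P and the microscope-setting information 40 and outputs a probability that a calibration-affecting change has occurred; the architecture, the feature sizes and the binary output are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

class CalibrationCheckNet(nn.Module):
    """End-to-end sketch of model M: panoramic image + calibration parameters P
    + microscope settings 40 -> probability of a calibration-affecting change."""
    def __init__(self, n_calib_params=8, n_settings=4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 + n_calib_params + n_settings, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, image, calib_params, settings):
        features = self.backbone(image)
        fused = torch.cat([features, calib_params, settings], dim=1)
        return torch.sigmoid(self.head(fused))

model = CalibrationCheckNet()
image = torch.rand(1, 3, 256, 320)      # panoramic image 10
P = torch.rand(1, 8)                    # calibration parameters P (assumed 8 values)
settings = torch.rand(1, 4)             # information 40 about microscope settings
p_change = model(image, P, settings)    # close to 1 -> change determined (step S5)
```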
Using the information 40 about the microscope settings is useful, for example, to distinguish whether the sample stage position in the panoramic image 10 represents an unintended change in the position of the sample stage, in particular a displacement, tilt or rotation, or an intended setting at the time the panoramic image 10 was taken.
In principle, however, the information 40 about the microscope settings can also be dispensed with if only panoramic images 10 taken with the same microscope settings at all times are fed to the machine learning model M. The training data may be adjusted accordingly in this embodiment.
In a variant of the described embodiment, the calibration parameter P may be replaced by other information related to the calibration parameter P. For example, a reference panoramic image may be stored, which is taken at a point in time at which the validity of the calibration parameter P is known, for example during or directly after the calibration procedure is performed. Such a reference panoramic image may be used in the flow of fig. 7 in place of the calibration parameter P. Together with the reference panoramic image, the information of the relevant microscope settings may be another input into the machine learning model M. Alternatively, the reference panoramic image and the panoramic image 10 to be examined may also be taken with the same (assumed/estimated) microscope settings, so that no information 40 of the relevant microscope settings has to be entered in the machine learning model M. Instead of the reference panoramic image, other reference data may also be used, for example geometric information derived from the reference panoramic image, as has been described for other embodiments.
The embodiment of fig. 7 may also be modified to the extent that the output of the machine learning model M directly gives a correction. The correction may be, for example, a calculated adjustment of calibration parameters or control instructions for motion compensation of the microscope components, similar to steps S6 or S7 in fig. 6.
The described embodiments are purely illustrative and modifications thereof are possible within the scope of the appended claims.
List of reference numerals
1 microscope
2 support
3 objective lens rotary base and microscope component
4 microscope objective and microscope component
5 sample stage, microscope part
5A threaded hole
6 holding frame, microscope part
6A holding frame opening
7 sample carrier
8 microscope camera
9 panoramic camera and microscope component
Field of view of a 9A panoramic camera
9B reflecting mirror and microscope component
10 panoramic image
15 reference structure
20 computing device
30 segmentation mask
40 information about the current setting of the microscope components
80 computer program of the invention
100 microscope system of the invention
a. b reference data
G geometric information
i context information
M machine learning model
M1 machine learning model
M2 machine learning model
M3 machine learning model
P calibration parameters
Image coordinates of the reference structure 15 of Ph, Pv in the panoramic image 10
Dh. Size of reference structure 15 in Dv panoramic image 10
S1-S9 steps of an embodiment of a method according to the invention

Claims (20)

1. A microscopy system, comprising:
panoramic camera (9) for recording at least one panoramic image (10) of a sample environment, and method for recording a panoramic image
A computing device (20) provided for evaluating at least one panoramic image (10), wherein the computing device (20) has calibration parameters (P) with which image coordinates of the at least one panoramic image (10) are interpreted;
characterized in that
the computing device (20) is arranged for
-determining from the at least one panoramic image (10) geometrical information (G) about at least one reference structure (15) imaged in the panoramic image (10), the position or shape of the reference structure in the panoramic image (10) depending on the position of at least one microscope component (3, 5, 6, 9B); and
-determining whether a change of the microscope component (3, 5, 6, 9B) has occurred which affects the validity of the calibration parameters (P) by comparing the determined geometric information (G) with predefined reference data (a, b).
2. The microscopy system according to claim 1,
characterized in that the geometric information (G) relates to the position and/or shape of the reference structure (15) in the panoramic image (10).
3. The microscopy system according to claim 1,
characterized in that the computing device (20) has a trained machine learning model (M1, M2) that obtains the at least one panoramic image (10) as input and outputs the geometric information (G).
4. The microscopy system according to claim 1,
characterized in that the comparison of the determined geometric information (G) with the predefined reference data (a, b) is carried out by means of a trained machine learning model (M3) in which the reference data (a, b) are represented by learned model parameter values.
5. The microscopy system according to claim 1,
characterized in that the microscope component (3, 5, 6, 9B) is a sample stage (5), a mirror (9B) or another optical element guiding light to the panoramic camera (9), an objective changer or objective turret (3), or the panoramic camera (9) itself.
6. The microscopy system according to claim 1,
characterized in that a target position is predefined for the microscope component (3, 5, 6, 9B) and the determined change of the microscope component (3, 5, 6, 9B) describes a difference between an actual position and the target position.
7. The microscopy system according to claim 1,
characterized in that a plurality of panoramic images (10) with different settings of one of the microscope parts (3, 5, 6, 9B) are taken and
wherein a difference in the geometric information (G) of the reference structure (15) between the panoramic images (10) is determined and compared with the predefined reference data (a, b) in order to determine whether one of the microscope components (3, 5, 6, 9B) has changed.
8. The microscopy system according to claim 1,
characterized in that the computing device (20) is provided for, in the event of a change determined in the microscope component (3, 5, 6, 9B), outputting a control instruction (S7) for motion compensation with respect to the change;
wherein there is at least one actuator for moving or adjusting the microscope component (3, 5, 6, 9B) and the computing device (20) is arranged for providing control instructions for motion compensation to the at least one actuator.
9. The microscopy system according to claim 1,
characterized in that the computing device (20) is provided for outputting a calibration start instruction in the event of a determined change (S8).
10. The microscopy system according to claim 1,
characterized in that the computing device (20) is arranged for performing a computational correction or update (S6) of the calibration parameters (P).
11. The microscopy system according to claim 1,
characterized in that the computing device (20) is arranged for performing a navigation based on the panoramic image (10) in order to access a sample area, wherein the determined change is taken into account in the navigation.
12. The microscopy system according to claim 1,
characterized in that the computing device (20) is provided for taking into account the determined change in the control of the movable microscope part (3, 5, 6, 9B) for avoiding a collision.
13. The microscopy system according to claim 1,
characterized in that the reference structure (15) comprises elements of the microscope component (3, 5, 6, 9B), threaded holes or screws of the microscope component (3, 5, 6, 9B), an outer shape of the microscope component (3, 5, 6, 9B), or reference marks on the microscope component (3, 5, 6, 9B).
14. The microscopy system according to claim 1,
characterized in that the computing device (20) is arranged to,
determining from the at least one panoramic image (10) geometrical information (G) about a plurality of reference structures (15) imaged in the panoramic image (10), wherein the positions or shapes of the reference structures in the panoramic image (10) depend on the positions of different microscope parts (3, 5, 6, 9B); and
determining which of the different microscope components (3, 5, 6, 9B) have changed by comparing the determined geometric information (G) with predefined reference data (a, b).
15. The microscopy system according to claim 1,
characterized in that the computing device (20) is provided for carrying out the recording of the panoramic image (10) and determining whether a change has occurred in the microscope component (3, 5, 6, 9B) during or alternately with a microscope measurement run in which the sample is examined by means of a microscope objective.
16. The microscopy system according to claim 1,
characterized in that the computing device (20) is arranged for recording the determined changes of the microscope components (3, 5, 6, 9B) and thereby determining a device-specific maintenance cycle (S9) in order to introduce preventive maintenance.
17. A method for verifying microscope calibration, comprising:
obtaining at least one panoramic image (10) of the sample environment of the microscope (1);
evaluating the at least one panoramic image (10) to determine geometrical information (G) of at least one reference structure (15) imaged in the panoramic image (10), wherein the position or shape of the reference structure in the panoramic image (10) depends on the position of at least one microscope component (3, 5, 6, 9B); and
comparing the determined geometric information (G) with predefined reference data (a, b) in order to determine whether a change of the microscope component (3, 5, 6, 9B) has occurred which affects the validity of calibration parameters (P) which are used to interpret the image coordinates of the at least one panoramic image (10).
18. A method for verifying microscope calibration, comprising:
obtaining at least one panoramic image (10) of the sample environment of the microscope (1);
inputting the at least one panoramic image (10) into a trained machine learning model (M);
inputting calibration parameters (P) for interpreting image coordinates of the at least one panoramic image (10), or predetermined reference data (a, b) belonging to the calibration parameters (P) into the machine learning model (M); and
calculating an output with the machine learning model (M) based on the input at least one panoramic image (10) and the calibration parameters (P) or reference data (a, b), wherein the output indicates whether a change of a microscope component (3, 5, 6, 9B) affecting the validity of the calibration parameters (P) has occurred, or the output indicates such a change of a microscope component (3, 5, 6, 9B) or a correction for it.
19. The method according to claim 18,
characterized in that information about the current settings of the microscope components (3, 5, 6, 9B) is also input into the machine learning model (M).
20. A computer program having instructions which, when executed by a computer, cause performance of the method according to any one of claims 17 to 19.
CN202110790405.XA 2020-09-09 2021-07-13 Microscope system and method for checking the calibration of a microscope Pending CN114236803A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020123562.3A DE102020123562A1 (en) 2020-09-09 2020-09-09 MICROSCOPY SYSTEM AND METHOD OF VERIFYING A MICROSCOPE CALIBRATION
DE102020123562.3 2020-09-09

Publications (1)

Publication Number Publication Date
CN114236803A (en) 2022-03-25

Family

ID=80266706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110790405.XA Pending CN114236803A (en) 2020-09-09 2021-07-13 Microscope system and method for checking the calibration of a microscope

Country Status (2)

Country Link
CN (1) CN114236803A (en)
DE (1) DE102020123562A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022106435A1 (en) 2022-03-18 2023-09-21 Carl Zeiss Microscopy Gmbh Microscopy system and method for determining an orientation of a sample carrier

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013006994A1 (en) 2013-04-19 2014-10-23 Carl Zeiss Microscopy Gmbh Digital microscope and method for optimizing the workflow in a digital microscope
DE102013012987A1 (en) 2013-08-03 2015-02-05 Carl Zeiss Microscopy Gmbh Method for calibrating a digital optical device and optical device
DE102013222295A1 (en) 2013-11-04 2015-05-07 Carl Zeiss Microscopy Gmbh Digital microscope, method for calibration and method for automatic focus and image center tracking for such a digital microscope
DE102017109698A1 (en) 2017-05-05 2018-11-08 Carl Zeiss Microscopy Gmbh Determining context information for change components of an optical system
DE102018133196A1 (en) 2018-12-20 2020-06-25 Carl Zeiss Microscopy Gmbh IMAGE-BASED MAINTENANCE PROPERTY AND MISUSE DETECTION
DE102019114117B3 (en) 2019-05-27 2020-08-20 Carl Zeiss Microscopy Gmbh Automatic workflows based on recognition of calibration samples
DE102020101191A1 (en) 2020-01-20 2021-07-22 Carl Zeiss Microscopy Gmbh Microscope and method for determining a measurement location of a microscope
DE102020118801A1 (en) 2020-07-16 2022-01-20 Carl Zeiss Microscopy Gmbh MICROSCOPE AND PROCEDURE FOR DISTANCE DETERMINATION OF A SAMPLE REFERENCE PLANE

Also Published As

Publication number Publication date
DE102020123562A1 (en) 2022-03-10

Similar Documents

Publication Publication Date Title
US10035268B2 (en) Measurement system used for calibrating mechanical parameters of robot
US11754392B2 (en) Distance determination of a sample plane in a microscope system
JP6280525B2 (en) System and method for runtime determination of camera miscalibration
JP6662527B2 (en) Slide glass holder to detect slide glass placement in microscope
CA2669922C (en) Method for assessing image focus quality
US10379387B2 (en) Method and device for checking refractive power distribution and centering
CN114227791B (en) Cutting machine and machine-readable carrier
CN109715307B (en) Bending machine with work area image capturing device and method for representing work area
JP6366875B1 (en) Information processing apparatus and processing defect identification method
US8885945B2 (en) Method for improving repeatability in edge location results of a machine vision inspection system
EP3410395A1 (en) System and method for assessing virtual slide image quality
JP2021113805A (en) Microscope and method for determining measuring location of microscope
EP4195142A1 (en) System for detecting the defects of the wooden boards and their classification into quality classes
CN114236803A (en) Microscope system and method for checking the calibration of a microscope
JP4842782B2 (en) Defect review method and apparatus
CN114326078A (en) Microscope system and method for calibration checking
CN111475016A (en) Assembly process geometric parameter self-adaptive measurement system and method based on computer vision
JP2005283267A (en) Through hole measuring device, method, and program for through hole measurement
US20220236551A1 (en) Microscopy System and Method for Checking a Rotational Position of a Microscope Camera
JP6884077B2 (en) Surface inspection equipment and surface inspection method
US20220238396A1 (en) Laser repair method and laser repair device
US12007547B2 (en) Microscopy system, method and computer program for aligning a specimen carrier
CN116634134B (en) Imaging system calibration method and device, storage medium and electronic equipment
US20220091405A1 (en) Microscopy System, Method and Computer Program for Aligning a Specimen Carrier
EP4339679A1 (en) Stage insert, microscope system, apparatus, method and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination