EP4111363A1 - Method for analyzing a structure in a fluidic system - Google Patents

Method for analyzing a structure in a fluidic system

Info

Publication number
EP4111363A1
Authority
EP
European Patent Office
Prior art keywords
image
analysis
analyzed
section
mask
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21725685.8A
Other languages
German (de)
English (en)
Inventor
Anna-Lina HAHN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP4111363A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/695Preprocessing, e.g. image segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/698Matching; Classification

Definitions

  • the present invention relates to a method for analyzing a structure within a fluidic system using image analysis methods.
  • WO 2005/011947 A2 describes a method for processing an image of a microfluidic device, a first image of the microfluidic device being obtained in a first state and a second image of the microfluidic device being obtained in a second state. The first image and the second image are transformed into a third coordinate space, a difference being determined between the first image and the second image. With this method, crystals can be recognized in individual chambers of the microfluidic device.
  • US Pat. No. 8,849,037 B2 likewise describes a method for image processing for microfluidic devices in which several images are analyzed by dynamic comparison. This technique can detect bubbles using a baseline correction.
  • An edge detection method is known from image processing. However, edge detection can only be used for closed structures, that is to say for structures that have continuous edges.
  • Structures that are not closed, such as a channel section of a microfluidic device or a chamber with an inlet and outlet, cannot be analyzed with conventional edge detection.
  • An analysis of such incomplete structures therefore generally requires an examination of a plurality of sequentially recorded images, with differences being recognizable by dynamic comparison of the images.
  • the invention provides a method for analyzing a structure within a fluidic system, in which both closed structures and, with particular advantage, open structures can be examined using image processing methods.
  • Edge detection is used here.
  • open structures are not accessible to edge detection for the reasons mentioned.
  • the proposed method allows the use of edge detection also on open structures, so that the method can be used for various fluidic systems in which open structures, for example channel sections or chambers with inlets and / or outlets, are often to be evaluated.
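The patent does not specify a particular edge detector. As an illustration only, a classic Sobel gradient-magnitude edge map can be computed with plain NumPy; the function name and kernel choice are assumptions, not taken from the source:

```python
import numpy as np

def sobel_edges(img):
    """Gradient-magnitude edge map of a 2-D grayscale array (zero-padded borders)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # horizontal gradient
    ky = kx.T                                                   # vertical gradient
    p = np.pad(img.astype(float), 1)
    gx = np.zeros(img.shape, float)
    gy = np.zeros(img.shape, float)
    for i in range(3):
        for j in range(3):
            win = p[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)
```

An open structure such as a channel section yields a non-closed edge map with such a detector, which is why the method first completes the structure before edge detection is exploited.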
  • the proposed method uses a reference image and at least one object image as well as at least one analysis image, the latter being evaluated with the proposed method. This evaluation can take place, for example, with regard to bubble detection or some other evaluation of a fluidic system.
  • the method initially provides a reference image section with the structure to be analyzed (step a.),
  • the structure to be analyzed being isolated from a reference image.
  • the reference image was taken with the first camera setting.
  • the reference image section with the structure to be analyzed can be isolated and stored from the reference image that was recorded with a first camera setting. It is also possible to use a previously isolated reference image section.
  • an object image (default image) is selected which has the same fluidic state as the reference image and which was recorded with the first camera setting or another, second camera setting (step b.).
  • an image registration of the object image is carried out with the reference image section (step c.).
  • Image registration, also known as co-registration, is a method of digital image processing known per se in which two or more images are fused or superimposed with one another. The aim is to bring two or more images of the same, or at least a similar, scene into agreement with one another as well as possible. To adapt the images to one another, a compensating transformation is usually calculated in order to bring one image into agreement with the other image as well as possible. This method is often used in medical image processing, for example.
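Image registration can use many transformation models (translation, rotation, scaling, and more). As a minimal sketch, a pure-translation registration via phase correlation, a standard technique but not necessarily the one used in this method, can be written with NumPy's FFT:

```python
import numpy as np

def register_translation(ref, img):
    """Estimate the integer pixel shift that aligns img to ref via phase correlation."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates to signed shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

Applying the recovered shift to the object image would then superimpose it on the reference image section before edge detection; handling rotation or zoom differences would require a richer model.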
  • the proposed method uses the image registration to merge and further process the object image with the reference image section, which can be recorded with different camera settings and at different times. Edge detection is applied to the merged image in order to create a mask on this basis.
  • the image registration also allows the analysis of a non-closed structure in a fluidic system and in particular in a microfluidic device.
  • the key point here is that the image registration and the use of edge detection make it possible to isolate the structure to be analyzed so that it is transferred from a possibly incomplete state to a closed state.
  • the mask created on the basis of the image registration is applied to the analysis image so that the image section of the analysis image to be analyzed can be isolated on the analysis image (step e.).
  • the analysis image or images to be examined can be selected beforehand or during the course of the method (step d.).
  • the at least one analysis image should have been recorded with the same camera setting as the object image.
  • the image section to be analyzed, isolated from the analysis image with the aid of the mask, can then be examined by means of an image analysis evaluation (step f.), for example with regard to a proportion of bubbles.
  • This evaluation can in particular take place on the basis of a determination of pixel intensities, so that, for example, a percentage of bubbles within a chamber of a microfluidic device can be determined at times t1 and t2. For example, it can be determined that at time t1 the chamber was filled to 50% with bubbles and at time t2 to 20%.
  • the structure to be examined can in principle be any conceivable shape, for example a rectangle, circle, any polygon or the like.
  • This structure represents, for example, a specific chamber within a microfluidic device or a specific section from a channel of a microfluidic device or the like.
  • the particular advantage of the invention is that the proposed method can be used to analyze incomplete structures, such as, for example, a section of a channel that does not have completely continuous edges.
  • a non-closed structure is isolated once in such a way that it is converted into a closed structure. This takes place in particular in the context of the isolation and storage of a reference image section with the structure to be analyzed from the reference image in accordance with step a.
  • image processing can take place on the object image in order to align it with the reference image section.
  • the object image can be rotated accordingly to match the reference image section, so that there is a match with the reference image section.
  • Further possible image processing steps are, for example, a conversion of colors into gray levels and / or a smoothing of the image and / or the use of edge detection.
  • filling a closed structure and/or removing certain elements of the structure and/or calculating the circumference and/or calculating other parameters of the structure can be carried out. Whether and which of these optional steps are sensible and/or advantageous depends on the respective object image. In general, such image processing steps can optimize the subsequent image registration.
  • Further image processing can also take place for the creation of the mask after the image registration, for example a conversion of colors into gray levels and / or a smoothing of the image.
  • As image processing, for example, the filling of a closed structure and/or the removal of certain elements of the structure and/or an extraction of edges and/or a calculation of the circumference and/or of other parameters of the structure are possible.
  • the entire background of the structure can be set to one color, for example white.
  • artifacts at the edge of the image can be eliminated, if any, in order to further clean up the boundaries of the structure.
  • the structure can be completely filled in order to eliminate edges within the structure.
  • the closed edge of the structure to be analyzed can be extracted as a mask.
  • this mask can be filled, for example to simplify a later analysis using a histogram.
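The mask-creation operations just described (thickening the edges, filling the closed structure, extracting the peripheral edge) can be sketched with elementary NumPy morphology. The helper names and the 3x3 structuring element are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def dilate(mask, n=1):
    """Binary dilation with a 3x3 square, repeated n times ("thickening" edges)."""
    out = mask.astype(bool)
    for _ in range(n):
        p = np.pad(out, 1)
        acc = np.zeros_like(out)
        for di in range(3):
            for dj in range(3):
                acc |= p[di:di + out.shape[0], dj:dj + out.shape[1]]
        out = acc
    return out

def fill_closed(edges):
    """Fill a closed edge contour by flood-filling the background from the border."""
    e = edges.astype(bool)
    h, w = e.shape
    outside = np.zeros((h, w), bool)
    stack = [(i, j) for i in range(h) for j in (0, w - 1) if not e[i, j]]
    stack += [(i, j) for j in range(w) for i in (0, h - 1) if not e[i, j]]
    while stack:
        i, j = stack.pop()
        if outside[i, j] or e[i, j]:
            continue
        outside[i, j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                stack.append((ni, nj))
    return ~outside  # contour plus interior

def outline(filled):
    """Extract the one-pixel peripheral edge of a filled region as a mask."""
    return filled & dilate(~filled)
```

In a real implementation these operations would typically come from a library such as `scipy.ndimage` or OpenCV rather than being hand-written.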
  • one or more image processing steps also take place on the at least one analysis image, so that the analysis image can be matched to the object image before the image registration.
  • this image processing step can include rotating the image so that the position of the structure to be analyzed, for example the position of a chamber within the microfluidic device, corresponds to the object image.
  • a conversion of the colors of the analysis image into gray levels is particularly preferred in order to facilitate the subsequent evaluation, for example on the basis of a pixel distribution. Cutting out the affected image section can also be advantageous. This measure also facilitates the subsequent evaluation.
  • the subsequent evaluation or examination of the image section to be analyzed is carried out using a threshold value method; the evaluation can preferably be based on a frequency distribution of pixels, in particular pixels whose intensity is above or below a predefinable threshold value.
  • the proposed method can be used in a particularly advantageous manner to detect bubbles within a microfluidic device as a fluidic system, for example to detect bubbles within a specific chamber or a specific reaction space or a specific channel section of a microfluidic device.
  • the method is not limited to such applications.
  • the method can also be used to determine other parameters of a fluidic system.
  • the proposed method can be used for a large number of fluidic systems, for example with regard to monitoring or control of manufacturing processes and/or for quality controls of fluidic and in particular of microfluidic systems, for example to determine the size and position of solids, such as crystals, within the system.
  • Another parameter that can be examined with the proposed method is, for example, the leakage of liquid from the system into the environment, this leakage being noticeable through changes in intensity that can be detected with the proposed method.
  • FIG. 1 flow diagram of an algorithm for carrying out the proposed method
  • Fig. 2 various steps of the proposed method, illustrated in part on the basis of image details
  • Fig. 1 illustrates various steps of the proposed method in the form of an algorithm.
  • In step 1, the query is first made as to whether an object image (default image) has been selected. If this is the case, the initial preparation of the object image takes place in step 2, possibly with image processing, together with the image registration of the object image with the previously isolated and stored reference image section containing the structure to be analyzed, in order to create a mask.
  • In step 3, the query is then made as to whether one or more analysis images have been selected. If this is the case, the query in step 4 determines whether only one analysis image has been selected. If so, the further analysis takes place in step 5 by applying the created mask to the analysis image, followed by the image analysis evaluation of the image section to be analyzed.
  • Otherwise, in step 6, one image is selected and analyzed analogously to step 5. Subsequently, in step 7, the number of analysis images is reduced by one and the method jumps back to step 3, so that the various analysis images can be analyzed one after the other according to steps 5 and 6, respectively.
  • The program can be ended in step 8. If no more analysis images to be analyzed are selected, a jump is made from step 3 to the end of the program in step 8. If the query in step 1 shows that no object image has been selected, the method likewise jumps directly to the end of the program in step 8. The method thus provides a loop for processing several selected analysis images.
  • If more than one image (for example ten images) was selected in step 4, one image is taken and analyzed. The new number of remaining images is then calculated (now nine), and the loop is executed a total of nine times until only one image is left. This last image is analyzed and the program ends. If only one image was selected from the start, the loop is skipped.
  • FIG. 2 illustrates various possible steps of the proposed method, in part on the basis of image details (sub-figures 2/1 to 2/10).
  • In step 20, the structure to be analyzed is first cut out once from an image of the fluidic system (the reference image) and saved as a new image with a white background.
  • This reference image section can be used for a large number of implementations of the method described below.
  • In step 21, an object image (default image) is selected which shows the same fluidic status as the reference image. In this example, this is a chamber of the microfluidic device that is not filled with liquid, which is represented by a white circle.
  • The white circle can be caused by a solid which has been introduced into the microfluidic device and which will be dissolved in the subsequent operation of the microfluidic device. Even if the object image and the reference image show the same fluidic status, the camera settings, such as orientation or zoom, can deviate from one another. If no such object image is selected, the program can be terminated, as has already been explained with reference to FIG. 1. In the next or in a subsequent step, in principle, any number of images to be analyzed (analysis images) can be selected. This selection of the analysis images can take place now or at a later point in time, but the object image and the analysis images should be recorded with the same camera setting.
  • the object image obtained in step 21 can be rotated, for example, by 180 degrees in the optional step 22 in order to facilitate the subsequent image registration (image fusion) with the reference image section.
  • the structure to be analyzed can also be cut out or isolated from the object image to simplify processing of the images. This can be done by recognizing the white circle in step 23 and defining a frame around the corresponding image section (step 24).
  • The subsequent image processing steps 25 to 34 are likewise optional and can be carried out to simplify and optimize the subsequent image registration in step 35.
  • These image processing steps of the object image can include a conversion of the colors of the image into gray levels (step 25).
  • edge detection can be applied to the image.
  • the edges can be thickened.
  • the area between the connected edges can be filled.
  • various parameters of the image can be determined, for example a determination of the circumference.
  • In step 30, all pixels that belong to a region with fewer than, for example, 400 connected pixels can be removed so that the representation is cleaned up further.
  • In step 31, the outline can be filled.
  • In step 32, the parameters of the completed structure can be calculated in order to find the position and size of the circle in the object image.
  • In step 33, for example, the first and the last white pixel can be determined in the x and y directions in order to find and isolate the image section as a function of the chamber position.
  • In step 34, the image section can be isolated as a function of the chamber position in order to avoid variances that result from the position of the circle in the object image.
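Removing regions with fewer than, for example, 400 connected pixels (step 30) is a connected-component filter. A simple flood-fill based sketch in NumPy follows; the function name and the choice of 4-connectivity are assumptions:

```python
import numpy as np

def remove_small_regions(mask, min_pixels=400):
    """Delete connected regions (4-connectivity) smaller than min_pixels."""
    h, w = mask.shape
    seen = np.zeros((h, w), bool)
    out = mask.copy()
    for si in range(h):
        for sj in range(w):
            if mask[si, sj] and not seen[si, sj]:
                stack, region = [(si, sj)], []
                seen[si, sj] = True
                while stack:                      # collect one connected region
                    i, j = stack.pop()
                    region.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and mask[ni, nj] and not seen[ni, nj]:
                            seen[ni, nj] = True
                            stack.append((ni, nj))
                if len(region) < min_pixels:      # erase undersized regions
                    for i, j in region:
                        out[i, j] = False
    return out
```

In practice a labeling routine such as `scipy.ndimage.label` would do the same job more efficiently.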
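Determining the first and last white pixels in the x and y directions and cutting out the section (steps 33 and 34) is a bounding-box computation. A possible NumPy sketch, with an illustrative whiteness threshold not taken from the patent:

```python
import numpy as np

def white_bbox(gray, thresh=250):
    """First/last 'white' pixel in y and x -> inclusive box (y0, y1, x0, x1)."""
    ys, xs = np.nonzero(gray >= thresh)
    return ys.min(), ys.max(), xs.min(), xs.max()

def crop_to_structure(gray, pad=0, thresh=250):
    """Cut out the image section around the white structure (e.g. the chamber)."""
    y0, y1, x0, x1 = white_bbox(gray, thresh)
    return gray[max(y0 - pad, 0):y1 + 1 + pad, max(x0 - pad, 0):x1 + 1 + pad]
```

Cropping relative to the detected chamber position, rather than to fixed coordinates, is what makes the method robust to the circle sitting at different positions in different object images.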
  • the edge detection can be used to obtain a black and white representation of the chamber and to find the chamber accordingly within the image.
  • This can be useful, for example, if any solid material that may be present in the chamber is at an extreme position within the chamber and this is not completely present in the image, for example, because a part is cut off. If, for example, the solid is on the far left of the chamber and the image is isolated or cut using the solid in the chamber, it can happen that the chamber is not completely captured. Isolating and cutting out the chamber as such is generally not possible because of the different background and the open structure. In addition, there would be the problem that the chamber would not be cut out correctly with (slightly) different zoom settings. In other cases it is entirely possible that these optional steps and the isolation and cutting out can be dispensed with.
  • the size of the structure to be analyzed can be obtained and, if necessary, further parameters can be determined.
  • these image processing steps are only to be understood as examples and can generally improve the subsequent image registration in step 35. Which steps are sensible, however, depends in particular on the respective object image, with the subsequent image registration being able to be optimized through this image processing.
  • In step 35, the object image processed in this way and the reference image section are merged with one another.
  • In step 36, the colors can optionally be converted into gray levels.
  • In step 37, the entire background can be set to one color tone, for example white, or the black borders at the edge of the merged image section can be removed.
  • Edge detection is then applied to the merged image in step 38.
  • The edges can be thickened in step 39 in order to avoid gaps in the edge of the structure. In this example, a threefold thickening is shown.
  • In step 40, artifacts at the edge of the image, if any, can be eliminated. This allows the boundaries of the structure to be cleaned up further.
  • In step 41, in this example, the structure is completely filled in order to eliminate edges within the structure.
  • the image can then be smoothed in step 42.
  • the peripheral edge of the structure is extracted in step 43 in order to generate the mask.
  • the mask can be filled in step 44. This can be useful in particular with regard to a later analysis using a histogram.
  • In step 46, the analysis image or images are now used. This corresponds to step 3 in FIG. 1. If several analysis images are available, they can be processed individually one after the other.
  • the analysis images show, for example, a microfluidic device in different fluidic states which can deviate from the fluidic state of the object image.
  • the edge of the mask from step 43 can be applied to the analysis image, for example in order to carry out a visual check.
  • the analysis image to be examined can be rotated in step 47, for example by 180 degrees, so that it corresponds to the object image in the state before the image registration.
  • The image section which corresponds to the chamber position, that is, to the position of the structure to be analyzed in the object image, can be cut out.
  • the colors can be converted into gray levels.
  • the previously created mask is used or applied to this section of the analysis image. In this way, the corresponding image can be cut out and the structure isolated from the background, so that a structure that was not completed before is converted into a completed structure.
  • In this evaluation example, a histogram is then created in step 51 that represents the number of pixels of the masked image having a certain intensity. This histogram therefore represents the masked analysis image.
  • a comparison histogram can be generated which represents the proportion of white and black pixels in the mask, that is to say the mask from step 44.
  • the histogram of the masked analysis image differs from the comparison histogram mainly in the background and possibly in the number of pixels.
  • an evaluation can be carried out, for example, with regard to the presence of bubbles within the structure.
  • For this purpose, a histogram of the completed structure is created by applying the mask to the analysis image (step 51), and a histogram of the filled mask is created (step 52).
  • the number of white pixels in the filled mask corresponds to the total number of pixels.
  • a threshold value procedure is carried out for the completed structure.
  • Pixels below a defined value are counted and the pixels above this defined value are ignored.
  • This evaluation or counting of the pixels can also be carried out in reverse.
  • the percentage of pixels that were determined using this threshold value method can now be determined within the chamber or the structure being examined.
  • black pixels or very dark gray pixels are evaluated as bubbles, so that the percentage filling of the chamber or the percentage of bubbles in the chamber can be calculated with this method.
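The threshold-value evaluation of the masked structure (steps 51 to 53 in this example) can be sketched as follows. The darkness threshold of 50 is an arbitrary illustrative value, and counting dark pixels as bubbles follows the example given above:

```python
import numpy as np

def bubble_fraction(gray, filled_mask, thresh=50):
    """Percentage of masked pixels darker than thresh (treated here as bubbles)."""
    total = int(filled_mask.sum())                  # white pixels of the filled mask
    dark = int((gray[filled_mask] < thresh).sum())  # counted; brighter pixels ignored
    return 100.0 * dark / total
```

Evaluating the same chamber in analysis images from times t1 and t2 would then yield, for example, fill levels of 50% and 20% as described above; the counting can equally be inverted to count the bright pixels instead.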
  • bubbles can be determined by recognizing circles.
  • steps 54 to 60 illustrate the corresponding processing and evaluation of the further analysis image from step 46, with steps 54 to 60 corresponding to steps 47 to 53.
  • The reference image or the reference image section can be used for different object images with the same fluidic status which, for example, were recorded at an earlier or later point in time. It is particularly advantageous here that the different object images can be recorded with different settings, such as, in particular, zoom, section or orientation. This particularly advantageously enables an automated analysis of images recorded at different times with the same fluidic status.
  • the object image and the analysis image are one and the same image.
  • The analysis image, which is also used as the object image, should not show any strong formation of bubbles, so that there are no problems with the image registration between the object image and the reference image section.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for analyzing a structure in a fluidic system, in which a reference image, at least one object image and at least one analysis image are used. A reference image section containing the structure to be analyzed, which is isolated from the reference image, is provided, the reference image having been recorded with a first camera setting. An object image which shows the same fluidic state as the reference image and which was recorded with the first or a second camera setting is selected. An image registration is carried out using the object image and the reference image section, and edge detection is used to create a mask. Before or afterwards, at least one analysis image is selected, this analysis image or these analysis images and the object image having been recorded with the same camera setting. The mask is applied to the analysis image in order to isolate the image section to be analyzed from the analysis image. The image section to be analyzed can then be examined by means of an image analysis evaluation.
EP21725685.8A 2020-02-27 2021-02-26 Procédé pour analyser une structure dans un système fluidique Pending EP4111363A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020202528.2A DE102020202528A1 (de) 2020-02-27 2020-02-27 Verfahren zur Analyse einer Struktur innerhalb eines fluidischen Systems
PCT/DE2021/100196 WO2021170183A1 (fr) 2020-02-27 2021-02-26 Procédé pour analyser une structure dans un système fluidique

Publications (1)

Publication Number Publication Date
EP4111363A1 (fr) 2023-01-04

Family

ID=75919167

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21725685.8A Pending EP4111363A1 (fr) 2020-02-27 2021-02-26 Procédé pour analyser une structure dans un système fluidique

Country Status (5)

Country Link
US (1) US20230085663A1 (fr)
EP (1) EP4111363A1 (fr)
CN (1) CN115136208A (fr)
DE (2) DE102020202528A1 (fr)
WO (1) WO2021170183A1 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6031935A (en) 1998-02-12 2000-02-29 Kimmel; Zebadiah M. Method and apparatus for segmenting images using constant-time deformable contours
AU2004261655A1 (en) 2003-07-28 2005-02-10 Fluidigm Corporation Image processing method and system for microfluidic devices
US8055034B2 (en) 2006-09-13 2011-11-08 Fluidigm Corporation Methods and systems for image processing of microfluidic devices

Also Published As

Publication number Publication date
WO2021170183A1 (fr) 2021-09-02
DE102020202528A1 (de) 2021-09-02
DE112021001284A5 (de) 2023-01-12
US20230085663A1 (en) 2023-03-23
CN115136208A (zh) 2022-09-30


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220927

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)