EP3853616B1 - Hypothesizing and verification networks and methods for specimen classification - Google Patents

Hypothesizing and verification networks and methods for specimen classification

Info

Publication number
EP3853616B1
EP3853616B1
Authority
EP
European Patent Office
Prior art keywords
classification
specimen
neural network
index
verification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP19862904.0A
Other languages
English (en)
French (fr)
Other versions
EP3853616A1 (de)
EP3853616A4 (de)
Inventor
Venkatesh NARASIMHAMURTHY
Vivek Singh
Yao-Jen Chang
Benjamin S. Pollack
Ankur KAPOOR
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Healthcare Diagnostics Inc
Original Assignee
Siemens Healthcare Diagnostics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Healthcare Diagnostics Inc filed Critical Siemens Healthcare Diagnostics Inc
Publication of EP3853616A1
Publication of EP3853616A4
Application granted
Publication of EP3853616B1
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/00584 Control arrangements for automatic analysers
    • G01N35/00722 Communications; Identification
    • G01N35/00732 Identification of carriers, materials or components in automatic analysers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/02 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor using a plurality of sample containers moved by a conveyor system past one or more treatment or analysis stations
    • G01N35/026 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor using a plurality of sample containers moved by a conveyor system past one or more treatment or analysis stations having blocks or racks of reaction cells or cuvettes
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/98 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993 Evaluation of the quality of the acquired pattern
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/698 Matching; Classification
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N2021/8411 Application to online plant, process monitoring
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/00584 Control arrangements for automatic analysers
    • G01N35/00722 Communications; Identification
    • G01N35/00732 Identification of carriers, materials or components in automatic analysers
    • G01N2035/00742 Type of codes
    • G01N2035/00752 Type of codes bar codes
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30024 Cell structures in vitro; Tissue sections in vitro

Definitions

  • multiple labels may have been provided (such as from multiple facilities that have handled the specimen container 106), and the multiple labels may overlap each other to some extent.
  • two labels (e.g., a manufacturer's label and a barcode label) may be present on the specimen container 106.
  • the label(s) 210 may occlude some portion of the specimen 104 (an occluded portion)
  • some portion of the specimen 104 and serum or plasma portion 204SP may still be viewable from at least one viewpoint (an un-occluded portion).
  • embodiments of the SCNN configured to carry out the characterization method can be trained to recognize the occluded and un-occluded portions, such that improved HILN detection may be provided.
  • the characterization methods may still be able to distinguish the boundaries of the serum or plasma portion 204SP through the one or more labels 210.
  • the carrier 120 may be stopped at a predetermined location in the quality check module 100, such as at the imaging location 116. At this location, normal vectors from each of the image capture devices 110A-110C intersect each other.
  • a gate or a linear motor (not shown) of the carrier 120 may be provided to stop the carrier 120 at the imaging location 116, so that multiple images may be captured thereat.
  • one or more sensors may be used to determine the presence of the carrier 120 at the quality check module 100.
  • the quality check module 100 may include a housing 128 that may at least partially surround or cover the track 118 to minimize outside lighting influences.
  • the specimen container 106 may be located inside the housing 128 during the image-capturing sequences.
  • Housing 128 may include one or more openings (and/or doors) 128D to allow the carriers 120 to enter into and/or exit from the housing 128.
  • the ceiling may include an opening 128O to allow a specimen container 106 to be loaded into the carrier 120 by a robot (not shown) including moveable robot fingers from above.
  • 4-8 images may be captured by image capture device 110A at viewpoint 1 while the specimen 104 is backlit illuminated with light source 126A that has a red spectrum. Additional like images may be captured sequentially at viewpoints 2 and 3. Other numbers of images may be captured.
  • capturing the multiple spectral images may be accomplished using different light sources 126A, 126B, and 126C emitting different spectral illumination.
  • the light sources 126A-126C may back light the specimen container 106 (as shown).
  • a light diffuser may be used in conjunction with the light sources 126A-126C in some embodiments.
  • the multiple different spectra light sources 126A-126C may be red, green, blue (RGB) light sources, such as light-emitting diodes (LEDs) emitting nominal wavelengths of 634nm +/- 35nm (Red), 537nm +/- 35nm (Green), and 455nm +/- 35nm (Blue).
  • the light sources 126A-126C may be white light sources. In cases where the label 210 obscures multiple viewpoints, infrared (IR) backlighting or near infrared (NIR) backlighting may be used. Furthermore, RGB light sources may be used in some instances even when label occlusion is present. In other embodiments, the light sources 126A-126C may emit one or more spectra having a nominal wavelength between about 700 nm and about 1200 nm. Other light sources and/or wavelengths may be used.
  • three red light sources 126A-126C may be used to sequentially illuminate the specimen 104 from three lateral locations.
  • the red illumination by the light sources 126A-126C may occur as the multiple images (e.g., 4-8 images or more) at different exposure times are captured by each image capture device 110A-110C from each viewpoint 1-3.
  • the exposure times may be between about 0.1 ms and 256 ms. Other exposure times may be used.
  • each of the respective images for each of the image capture devices 110A-110C may be taken sequentially, for example.
  • a group of images may be sequentially obtained that have red spectral backlit illumination and multiple exposures (e.g., 4-8 exposures, such as different exposure times).
  • the images may be captured in a round robin fashion, for example, where all images from viewpoint 1 are captured followed sequentially by viewpoints 2 and 3.
  • green spectral light sources 126A-126C may be turned on (nominal wavelength of about 537 nm with a bandwidth of about +/- 35 nm), and multiple images (e.g., 4-8 or more images) at different exposure times may be sequentially captured by each of the image capture devices 110A-110C. This may be repeated with blue spectral light sources 126A-126C (nominal wavelength of about 455 nm with a bandwidth of about +/- 35 nm) for each of the image capture devices 110A-110C.
  • the different nominal wavelength spectral light sources 126A-126C may be accomplished by light panels including banks of different desired spectral light sources (e.g., R, G, B, W, IR, and/or NIR) that can be selectively turned on and off, for example. Other means for backlighting may be used.
  • optimal image intensity may be exhibited by pixels (or patches) that fall within a predetermined range of intensities, such as between 180 and 254 on a scale of 0-255, for example. In another embodiment, optimal image intensity may be between 16 and 254 on a scale of 0-255, for example. If pixels (or patches) at the corresponding pixel (or patch) locations of two exposure images are both determined to be optimally exposed, the higher of the two may be selected, for example.
  • the selected pixels (or patches) exhibiting optimal image intensity may be normalized by their respective exposure times.
  • the result is a plurality of normalized and consolidated spectral image data sets for the illumination spectra (e.g., R, G, B, white light, IR, and/or NIR - depending on the combination used) and for each image capture device 110A-110C where all of the pixels (or patches) are optimally exposed (e.g., one image data set per spectrum) and normalized.
  • the data pre-processing carried out by the computer 124 may result in a plurality of optimally-exposed and normalized image data sets, one for each illumination spectra employed.
  • FIG. 1C illustrates an example specimen testing apparatus 150 capable of automatically processing multiple specimen containers 106 containing specimens 104.
  • the specimen containers 106 may be provided in one or more racks 152 at a loading area 154 prior to transportation to, and analysis by, one or more analyzers (e.g., first analyzer 156, second analyzer 158, and/or third analyzer 160) arranged about the specimen testing apparatus 150. More or fewer analyzers may be used.
  • the analyzers may be any combination of clinical chemistry analyzers and/or assaying instruments, or the like.
  • the computer 172 may operate to control movement of the carriers 120 to and from the loading area 154, motion about the track 118, motion to and from the first pre-processing station 168 as well as operation of the first pre-processing station 168 (e.g., centrifuge), motion to and from the quality check module 100 as well as operation of the quality check module 100, and motion to and from each analyzer 156, 158, 160 as well as operation of each analyzer 156, 158, 160 for carrying out the various types of testing (e.g., assay or clinical chemistry).
  • Computer 124 of FIGS. 1A and 1B may be part of or separate from computer 172 of FIG. 1C .
  • the pixel data for each of the multi-view, multi-spectral, multi-exposure images may be pre-processed in pre-process subsystem 334 as discussed above to provide a plurality of optimally-exposed and normalized image data sets (hereinafter "image data sets").
  • One or more image data sets of an image of a specimen container may be used by a hypothesizing network 338 to predict a specimen classification.
  • the hypothesizing network 338 may be a segmentation network and/or a classification network.
  • the apparatus 330A may output a verified classification (using output subsystem 344), which may be generated based on the output of the verification networks 342.
  • the verified output of output subsystem 344 may be based on positive results generated by the verification networks 342.
  • the verification networks 342 may verify or reject the prediction of the HIL classification generated by the hypothesizing network 338.
  • output subsystem 344 may output the HIL determination of the hypothesizing network 338 with a certainty greater than a predetermined level of certainty.
  • the verified output, and thus the output of the apparatus 330A, may indicate with a certainty of 98% that the HIL classification is correct. Other certainty levels may be used.
  • the verified output may be a signal indicating that the apparatus 330A was not able to make a determination of the HIL classification within a predetermined level of certainty. In some situations, the verified output may indicate that the specimen should be analyzed manually.
  • the hemolysis index verification network 352A may verify a plurality of individual predicted hemolysis classification indexes.
  • the hemolysis index verification network 352A may include an individual neural network, such as a CNN, to verify each predicted hemolysis classification index.
  • the icterus index verification network 352B may verify a plurality of individual predicted icterus classification indexes.
  • the icterus index verification network 352B may include an individual neural network, such as a CNN, to verify each predicted icterus classification index.
  • the lipemia index verification network 352C may verify a plurality of predicted lipemia classification indexes.
  • the lipemia index verification network may include an individual neural network, such as a CNN, to verify each lipemia classification index.
  • Multi-view images may be captured by the one or more image capture devices 110A-110C ( FIG. 1A ).
  • the image data or pixel data for each of the multi-view, multi-spectral, multi-exposure images may be pre-processed as discussed above to provide a plurality of optimally-exposed and/or normalized image data sets.
  • the computer 124 and/or 172 may pre-process the pixel data to generate image data sets.
  • the image data sets may be provided as input to the hypothesizing network 458, which may be or include a segmentation convolutional neural network (SCNN) 460.
  • the hypothesizing network 458 and/or the SCNN 460 may be programs running on the computer 124 and/or 172.
  • the extracted pixel index information can be further processed by the SCNN 460 to determine a final HILN classification and/or a final HILN classification index.
  • the classification index may include 21 serum classes, including an un-centrifuged class, a normal class, and 19 HIL classes/subclasses (indexes), as described in more detail below.
  • the classification and/or the classification index may include different, fewer or more classes and subclasses
  • the SCNN 460 may include a small container segmentation network (SCN) 464 at the front end of the DSSN 462.
  • the SCN 464 may be configured and operative to determine a container type and a container boundary information 466.
  • the container type and container boundary information 466 may be input via an additional input channel to the DSSN 462 and, in some embodiments, the SCNN 460 may provide, as an output, the determined container type and boundary 468.
  • the SCN 464 may have a network structure similar to that of the DSSN 462, but shallower (i.e., with far fewer layers).
  • an output of the SCNN 460 may be a predicted classification (or classification index) 470 that, in some embodiments, may include an un-centrifuged class 470U, a normal class 470N, a hemolysis class 470H, an icterus class 470I, and a lipemia class 470L (or corresponding class index).
  • the hemolysis class 470H may include sub-classes or indexes H0, H1, H2, H3, H4, and H5.
  • the icterus class 470I may include sub-classes or indexes I0, I1, I2, I3, I4, and I5.
  • the lipemia class 470L may include sub-classes L0, L1, L2, and L3.
  • Each of the hemolysis class 470H, icterus class 470I, and/or lipemia class 470L may have, in other embodiments, other numbers of sub-classes or indexes.
  • the index verification networks 352 depicted in FIG. 4 may include an un-centrifuged verification network 472 that verifies a prediction by the hypothesizing network 458 that the specimen 104 has not been separated into the serum or plasma portion 204SP and the settled blood portion 204SB.
  • the index verification networks 352 may include a plurality of individual verification networks, wherein each individual verification network may be trained to verify an individual classification index. For example, a first individual verification network may be trained solely to verify H2 and a second individual verification network may be trained solely to verify H3.
  • each of the individual verification networks may be a neural network, such as a convolutional neural network.
  • the individual verification networks may be DenseNet and/or ResNet networks.
  • Architecture 500 may include, for example, the following operational layers: two convolutional (CONV) layers 501 and 547; eleven dense block layers DB4 503, DB5 507, DB7 511, DB10 515, DB12 519, DB15 523, DB12 527, DB10 531, DB4 535, DB5 539, and DB4 543; five transition down layers TD 505, TD 509, TD 513, TD 517, and TD 521; five transition up layers TU 525, TU 529, TU 533, TU 537, and TU 541, and a fully connected layer 545 arranged as shown in FIG. 5 , wherein the classification index 470 is output. Other numbers, types and/or arrangements of layers may be used.
  • a second dense layer then receives the concatenated output as its input and outputs a number of pixel label maps, which are again concatenated to the previous pixel label maps. This may be repeated for each dense layer in the dense block layer. Other numbers and/or types of dense layers may be used.
  • the dropout probability may range from 0 to 1 and may be selected based on the experimental test runs that yield the best outcome.
  • the number of pixel label maps may be 112 at the output of layer TD 505, 192 at the output of TD 509, 304 at the output of TD 513, 464 at the output of TD 517, and 656 at the output of TD 521.
  • each transition up layer TU 525, TU 529, TU 533, TU 537, and TU 541 may include a 3x3 transposed convolutional layer with stride 2.
  • Other transition up layer parameters may be used.
  • the method 600 includes, at 602, capturing one or more images of the specimen, the one or more images including a serum or plasma portion (e.g., serum or plasma portion 204SP) of the specimen, the capturing generating pixel data.
  • the method 600 includes, at 604, processing pixel data of the one or more images of the specimen using a first network (e.g., SCNN 460) executing on a computer (e.g., computer 124 and/or 172) to predict a classification of the serum or plasma portion, wherein the classification comprises hemolysis, icterus, and lipemia.
  • the method includes, at 606, verifying the predicted classification using one or more verification networks (e.g., verification networks 342). A similar method may be performed for verifying a predicted classification index.
  • a max pooling layer is a processing step that may apply a filter to generate output activation maps containing the maximum pixel values appearing in the one or more activation maps received from a convolutional layer.
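
The multi-view, multi-spectral, multi-exposure capture sequence described in the bullets above (for each viewpoint, switch on a spectral light source and record several images at different exposure times) can be sketched as a simple control loop. This is illustrative only; `illuminate` and `capture` stand in for whatever light-panel and camera interfaces the quality check module actually exposes, and all names are assumptions:

```python
def capture_sequence(devices, spectra, exposure_times, illuminate, capture):
    """Round-robin capture: for each viewpoint (image capture device),
    cycle through the spectral light sources and record one image per
    exposure time. Returns a dict keyed by (viewpoint, spectrum)."""
    images = {}
    for view, device in enumerate(devices, start=1):
        for spectrum in spectra:
            illuminate(spectrum)  # e.g. switch on the R, G, or B LED panel
            images[(view, spectrum)] = [
                capture(device, t) for t in exposure_times
            ]
    return images
```

The text also describes the alternative ordering (all spectra at viewpoint 1, then viewpoints 2 and 3); swapping the two loops would produce the per-spectrum round robin instead.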
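
The exposure consolidation described above (per pixel location, keep the highest value that falls within the optimal intensity range, then normalize it by the exposure time that produced it) can be sketched as follows. The function name, the channels layout, and the fallback value of 0 for locations with no optimally-exposed pixel are illustrative assumptions, not details from the patent:

```python
import numpy as np

def consolidate_exposures(images, exposure_times, lo=180, hi=254):
    """Fuse a multi-exposure image stack into one normalized image.

    For each pixel location, select the highest pixel value that falls
    within the optimal range [lo, hi], then divide it by the exposure
    time of the image it came from."""
    stack = np.stack(images).astype(float)        # (n_exposures, H, W)
    times = np.asarray(exposure_times, dtype=float)
    ok = (stack >= lo) & (stack <= hi)            # optimally exposed mask
    masked = np.where(ok, stack, -1.0)            # non-optimal -> sentinel
    idx = masked.argmax(axis=0)                   # exposure w/ highest optimal value
    raw = np.take_along_axis(stack, idx[None, ...], axis=0)[0]
    fused = raw / times[idx]                      # normalize by exposure time
    fused[masked.max(axis=0) < 0] = 0.0           # no optimal exposure found
    return fused
```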
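
The hypothesize-and-verify flow running through this section (the hypothesizing network predicts an HILN classification; a verification network selected according to that class confirms or rejects it; low-certainty results are deferred, e.g., to manual analysis) might be orchestrated as below. The simple callables stand in for the trained CNNs, the single-letter class keys and the 98% threshold mirror examples in the text, and everything else is an assumption:

```python
def hypothesize_and_verify(image_data, hypothesizer, verifiers, threshold=0.98):
    """Predict a class, then confirm it with the class-specific
    verification network chosen according to the prediction."""
    predicted = hypothesizer(image_data)      # e.g. 'H2', 'I1', 'N', 'U'
    verifier = verifiers[predicted[0]]        # one verifier per class family
    confidence = verifier(image_data, predicted)
    if confidence >= threshold:
        return predicted, confidence          # verified classification
    return None, confidence                   # defer, e.g. to manual analysis
```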
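
The dense-block behavior described above (each dense layer receives the concatenation of the block input and all previous layer outputs, and the block output concatenates everything) can be illustrated independently of any deep-learning framework. The channels-first layout and the plain callables used for layers are assumptions for illustration:

```python
import numpy as np

def dense_block(x, layers):
    """Each dense layer consumes the channel-wise concatenation of the
    block input and all previous layer outputs; the block output is the
    concatenation of all of them."""
    features = [x]
    for layer in layers:
        out = layer(np.concatenate(features, axis=0))  # channels-first
        features.append(out)
    return np.concatenate(features, axis=0)
```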
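
The max pooling operation mentioned in the last bullet can be shown concretely. This is a generic stride-k max pool over a single activation map, not code from the patent:

```python
import numpy as np

def max_pool2d(activation, k=2):
    """Max pooling with a k x k window and stride k: each output value
    is the maximum of one non-overlapping k x k patch of the input."""
    h, w = activation.shape
    a = activation[: h - h % k, : w - w % k]   # trim so dims divide evenly
    return a.reshape(h // k, k, w // k, k).max(axis=(1, 3))
```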


Claims (14)

  1. A computer-implemented method of characterizing a specimen (104), wherein the specimen (104) is whole blood comprising a settled blood portion and a serum or plasma portion, the method comprising:
    capturing one or more images of the specimen (104), the one or more images including a serum or plasma portion (204SP) of the specimen (104), the capturing generating pixel data;
    the method further comprising:
    processing pixel data of the one or more images of the specimen (104) using a first neural network (338) executing on a computer (124/172) to predict a classification of the serum or plasma portion (204SP), wherein the first neural network outputs a classification hypothesis for the serum or plasma portion of the specimen, whereby the first neural network hypothesizes to which of multiple classes the portion belongs, wherein the classes of the classification comprise hemolysis, icterus, and lipemia; and
    verifying the predicted classification using one or more verification neural networks (342), wherein the verification neural network identifies whether the prediction of the first neural network is correct, and wherein the verifying uses a verification neural network selected in response to, and corresponding to, the respective class obtained by the first neural network.
  2. The method of claim 1, further comprising:
    verifying the predicted classification by a verification neural network (342A) trained on hemolysis, in response to the first neural network (338) predicting a hemolysis classification of the specimen (104);
    verifying the predicted classification by a verification neural network (342B) trained on icterus, in response to the first neural network (338) predicting an icterus classification of the specimen (104); and
    verifying the predicted classification by a verification neural network (342C) trained on lipemia, in response to the first neural network (338) predicting a lipemia classification of the specimen (104).
  3. The method of claim 1, wherein:
    the processing of pixel data comprises predicting a classification index for at least one of hemolysis, icterus, and lipemia; and
    the verifying of the predicted classification comprises verifying the predicted classification index using one or more verification neural networks (352) trained on the predicted classification index.
  4. The method of claim 1, wherein:
    the processing of the pixel data comprises predicting a hemolysis classification index; and
    the verifying of the predicted classification comprises verifying the predicted hemolysis classification index using a verification neural network (352A) trained on the predicted hemolysis classification index, and/or the processing of the pixel data comprises predicting an icterus classification index; and
    the verifying of the predicted classification comprises verifying the predicted icterus classification index using a verification neural network (352B) trained on the predicted icterus classification index, and/or
    the processing of the pixel data comprises predicting a lipemia classification index; and
    the verifying of the predicted classification comprises verifying the predicted lipemia classification index using a verification neural network (352C) trained on the predicted lipemia classification index.
  5. The method of claim 1, wherein the classification further comprises a normal classification, and wherein the method further comprises verifying the normal classification using a verification neural network (342D) trained on a normal classification.
  6. The method of claim 1, wherein the processing of pixel data of the one or more images of the specimen (104) using a first neural network (338) comprises processing pixel data of the one or more images of the specimen (104) using a segmentation neural network (460) and/or a classification neural network.
  7. The method of claim 1, wherein the processing of pixel data further comprises:
    identifying a serum or plasma portion (204SP) in the specimen (104) using a deep semantic segmentation neural network (462); and
    predicting a classification index of the serum or plasma portion (204SP) of the specimen (104) using the deep semantic segmentation neural network (462) based at least in part on the color represented by the pixel data of the serum or plasma portion (204SP).
  8. The method of claim 1, wherein the first neural network (338) comprises an architecture (500) having at least eleven dense block layers (DB4 503, DB5 507, DB7 511, DB10 515, DB12 519, DB15 523, DB12 527, DB10 531, DB4 535, DB5 539, and 543).
  9. The method of claim 1, further comprising generating a confidence level in response to the verifying of the predicted classification using one or more verification neural networks (342), and generating a signal in response to the confidence level being below a predetermined level.
  10. A quality check module (100), comprising:
    a plurality of image capture devices (110A-110C) operable to capture, from one or more viewpoints, one or more images of a specimen container (106) containing a serum or plasma portion (204SP) of a whole blood specimen (104) therein; and
    a computer (124, 172) coupled to the plurality of image capture devices (110A-110C), the computer (124, 172) configured and operable to:
    capture one or more images of the specimen (104), the one or more images including a serum or plasma portion (204SP) of the specimen (104), the capturing generating pixel data;
    process pixel data of the one or more images of the specimen (104) using a first neural network (338) executing on the computer (124, 172) to predict a classification of the serum or plasma portion (204SP), wherein the first neural network outputs a classification hypothesis for the serum or plasma portion of the specimen, whereby the first neural network hypothesizes to which of multiple classes the portion belongs, wherein the classes of the classification comprise hemolysis, icterus, and lipemia; and
    verify the predicted classification using one or more verification neural networks (342) executing on the computer (124, 172), wherein the verification neural network identifies whether the prediction of the first neural network is correct, and wherein the verifying uses a verification neural network selected in response to, and corresponding to, the respective class obtained by the first neural network.
  11. The quality check module (100) of claim 10, wherein the computer (124, 172) is further configured and operable to process pixel data to predict a normal classification of the serum or plasma portion (204SP), comprising outputting a hypothesis for the serum or plasma portion of the specimen as to whether its class is normal.
  12. The quality check module (100) of claim 10, wherein the computer (124, 172) is configured and operable to:
    process pixel data to predict a classification index, comprising outputting a hypothesis for the classification index selected from H0-H6, I0-I6, or L0-L4; and
    verify the predicted classification index using one or more verification neural networks (352) trained on the predicted classification index.
  13. The quality check module (100) of claim 10, wherein the computer (124, 172) is configured and operable to:
    verify the predicted classification by a verification neural network (342A) trained on hemolysis, in response to the first neural network (338) predicting a hemolysis classification for the specimen (104);
    verify the predicted classification by a verification neural network (342B) trained on icterus, in response to the first neural network (338) predicting an icterus classification for the specimen (104); and
    verify the predicted classification by a verification neural network (342C) trained on lipemia, in response to the first neural network (338) predicting a lipemia classification for the specimen (104).
  14. Qualitätsprüfungsmodul (100) nach Anspruch 10, wobei die ein oder mehreren neuronalen Verifikationsnetze (342) ein oder mehrere faltungsneuronale Netze umfassen.
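The hypothesize-and-verify flow recited in claims 10-14 can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the function names, the stand-in "networks", and the 0.5 threshold are all assumptions made for demonstration, standing in for the trained first neural network (338) and the per-class verification neural networks (342A-342C).

```python
from typing import Callable, Dict, Sequence

# Classes recited in the claims: hemolysis (H), icterus (I), lipemia (L),
# plus the optional "normal" class of claim 11.
CLASSES = ("hemolysis", "icterus", "lipemia", "normal")

ScoreFn = Callable[[Sequence[float]], Dict[str, float]]

def hypothesize(pixel_data: Sequence[float], first_net: ScoreFn) -> str:
    """First (hypothesizing) network: output a classification hypothesis
    by taking the highest-scoring class."""
    scores = first_net(pixel_data)
    return max(scores, key=scores.get)

def verify(pixel_data: Sequence[float], hypothesis: str,
           verifiers: Dict[str, Callable[[Sequence[float]], float]],
           threshold: float = 0.5):
    """Select the verification network trained on the hypothesized class
    (claim 13) and return (verified, confidence level)."""
    confidence = verifiers[hypothesis](pixel_data)
    return confidence >= threshold, confidence

# Stand-ins for trained networks (illustrative only, not real models):
def first_net(pixel_data):
    mean = sum(pixel_data) / len(pixel_data)  # crude color-intensity proxy
    return {"hemolysis": mean, "icterus": 1.0 - mean,
            "lipemia": 0.1, "normal": 0.2}

verifiers = {c: (lambda pixels, c=c: 0.9) for c in CLASSES}

pixel_data = [0.8, 0.9, 0.7]
hypothesis = hypothesize(pixel_data, first_net)        # "hemolysis"
verified, confidence = verify(pixel_data, hypothesis, verifiers)
# Claim 9: generate a signal when the confidence level falls below
# a predetermined level.
signal = confidence < 0.5
```

The per-class index variant of claim 12 follows the same pattern, with the hypothesis space widened from four classes to the fine-grained indices H0-H6, I0-I6, and L0-L4 and a verifier trained per predicted index.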
EP19862904.0A 2018-09-20 2019-09-19 Hypothesizing and verification networks and methods for specimen classification Active EP3853616B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862734007P 2018-09-20 2018-09-20
PCT/US2019/052014 WO2020061370A1 (en) 2018-09-20 2019-09-19 Hypothesizing and verification networks and methods for specimen classification

Publications (3)

Publication Number Publication Date
EP3853616A1 EP3853616A1 (de) 2021-07-28
EP3853616A4 EP3853616A4 (de) 2021-11-17
EP3853616B1 true EP3853616B1 (de) 2025-07-23

Family

ID=69887852

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19862904.0A 2018-09-20 2019-09-19 Hypothesizing and verification networks and methods for specimen classification Active EP3853616B1 (de)

Country Status (5)

Country Link
US (1) US12504435B2 (de)
EP (1) EP3853616B1 (de)
JP (1) JP7203206B2 (de)
CN (2) CN112689763B (de)
WO (1) WO2020061370A1 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11940451B2 (en) 2021-12-20 2024-03-26 Instrumentation Laboratory Co. Microfluidic image analysis system
CN115436319B (zh) * 2022-10-13 2024-08-20 Kunming University of Science and Technology Rapid detection method for genuine and counterfeit dried beef (niuganba) based on near-infrared spectroscopy, and application thereof
CN116704248B (zh) * 2023-06-07 2024-10-25 Nanjing University Serum sample image classification method based on multi-semantic imbalanced learning

Family Cites Families (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5597248A (en) 1979-01-19 1980-07-24 Toshiba Electric Equip Corp Cooler for ultraviolet-ray lamp
JP2885823B2 (ja) * 1989-04-11 1999-04-26 Toyota Central R&D Labs., Inc. Visual recognition device
JPH08315144A (ja) * 1995-05-16 1996-11-29 Hitachi Ltd Pattern classification device and pattern classification method therefor
JPH09120455A (ja) * 1995-10-26 1997-05-06 Meidensha Corp Feature identification method using a neural network
JPH09133687A (ja) 1995-11-13 1997-05-20 Meiji Denki Kogyo Kk Serum volume measuring device for blood collection tubes
WO1998029833A1 (en) * 1996-12-25 1998-07-09 Hitachi, Ltd. Pattern recognition apparatus and pattern recognition method
JPH10302067A (ja) * 1997-04-23 1998-11-13 Hitachi Ltd Pattern recognition device
JP4900013B2 (ja) * 2007-04-16 2012-03-21 Fujitsu Semiconductor Ltd Verification method and verification device
US8565507B2 (en) * 2008-05-23 2013-10-22 University Of Rochester Automated placental measurement
EP2414536B1 (de) * 2009-04-03 2014-06-18 Battelle Memorial Institute Fluorescence-assisted Raman identification of living organisms
JP5859439B2 (ja) * 2009-08-13 2016-02-10 Siemens Healthcare Diagnostics Inc. Method and apparatus for determining interferents and physical dimensions in liquid samples and containers analyzed by a clinical analyzer
BR112015024134A2 (pt) * 2013-03-19 2017-07-18 Cireca Theranostics Llc método para classificar a amostra biológica, sistema para classificar a amostra biológica e produto de programa de computador
JP6143584B2 (ja) 2013-07-04 2017-06-07 Hitachi High-Technologies Corp. Detection device and biological sample analyzer
WO2015154205A1 (en) * 2014-04-11 2015-10-15 Xiaoou Tang Methods and systems for verifying face images based on canonical images
US10695803B2 (en) 2015-02-13 2020-06-30 Siemens Healthcare Diagnostics Inc. Pipette cleaning methods and apparatus, neutralizing liquid vessels, and methods of reducing carryover
US11009467B2 (en) 2015-02-17 2021-05-18 Siemens Healthcare Diagnostics Inc. Model-based methods and apparatus for classifying an interferent in specimens
WO2016145547A1 (en) * 2015-03-13 2016-09-22 Xiaoou Tang Apparatus and system for vehicle classification and verification
CN105354611B (zh) * 2015-10-08 2018-01-09 Cheng Tao Optimal-quality image scanning method and system based on an artificial neural network
JP6712321B2 (ja) 2015-12-10 2020-06-17 St. Jude Medical, Cardiology Division, Inc. Vessel isolation ablation device
JP7146635B2 (ja) 2015-12-16 2022-10-04 Ventana Medical Systems, Inc. Autofocus method and system for digital imaging using multispectral trajectories
CN108369642A (zh) 2015-12-18 2018-08-03 The Regents of the University of California Interpretation and quantification of emergency features on head computed tomography
US10928310B2 (en) 2016-01-28 2021-02-23 Siemens Healthcare Diagnostics Inc. Methods and apparatus for imaging a specimen container and/or specimen using multiple exposures
EP3408640B1 (de) 2016-01-28 2021-01-06 Siemens Healthcare Diagnostics Inc. Methods and apparatus for identifying a specimen container from multiple lateral views
WO2017132171A1 (en) 2016-01-28 2017-08-03 Siemens Healthcare Diagnostics Inc. Methods and apparatus for characterizing a specimen container and specimen
EP3408652B1 (de) 2016-01-28 2020-12-16 Siemens Healthcare Diagnostics Inc. Methods and apparatus for classifying an artifact in a specimen
JP6870826B2 (ja) 2016-01-28 2021-05-12 Siemens Healthcare Diagnostics Inc. Methods and apparatus configured to quantify a specimen from multiple lateral views
JP6976257B2 (ja) 2016-01-28 2021-12-08 Siemens Healthcare Diagnostics Inc. Methods and apparatus for multi-view characterization
JP6791972B2 (ja) 2016-01-28 2020-11-25 Siemens Healthcare Diagnostics Inc. Methods and apparatus for detecting an interferent in a specimen
US9739783B1 (en) 2016-03-15 2017-08-22 Anixa Diagnostics Corporation Convolutional neural networks for cancer diagnosis
CN105825509A (zh) 2016-03-17 2016-08-03 University of Electronic Science and Technology of China Cerebrovascular segmentation method based on a 3D convolutional neural network
EP3443121A4 (de) * 2016-04-11 2020-04-22 Agency for Science, Technology and Research Hochdurchsatzverfahren zur genauen vorhersage von verbindungsinduzierter leberschädigung
JP2019515898A (ja) * 2016-04-22 2019-06-13 Innospec Limited Methods, compositions, and uses relating thereto
US10255522B2 (en) 2016-06-17 2019-04-09 Facebook, Inc. Generating object proposals using deep-learning models
US10387765B2 (en) 2016-06-23 2019-08-20 Siemens Healthcare Gmbh Image correction using a deep generative machine-learning model
US11151721B2 (en) 2016-07-08 2021-10-19 Avent, Inc. System and method for automatic detection, localization, and semantic segmentation of anatomical objects
CN109477848B (zh) 2016-07-25 2023-07-18 Siemens Healthcare Diagnostics Inc. Systems, methods, and apparatus for identifying a specimen container cap
CN106250866A (zh) * 2016-08-12 2016-12-21 Guangzhou Shiyuan Electronic Technology Co., Ltd. Neural-network-based image feature extraction modeling and image recognition method and apparatus
CN106372390B (zh) 2016-08-25 2019-04-02 Tang Yiping Self-service health cloud service system for lung cancer prevention based on a deep convolutional neural network
US9965863B2 (en) 2016-08-26 2018-05-08 Elekta, Inc. System and methods for image segmentation using convolutional neural network
CN106408562B (zh) 2016-09-22 2019-04-09 South China University of Technology Deep-learning-based retinal vessel segmentation method and system for fundus images
CN110023950B (zh) 2016-10-28 2023-08-08 Beckman Coulter Inc. Substance preparation evaluation system
EP3538839B1 (de) * 2016-11-14 2021-09-29 Siemens Healthcare Diagnostics Inc. Methods, apparatus, and quality check modules for detecting hemolysis, icterus, lipemia, or normality of a specimen
JP7012719B2 (ja) 2016-11-14 2022-02-14 Siemens Healthcare Diagnostics Inc. Methods and apparatus for characterizing a specimen using patterned illumination
WO2018105062A1 (ja) 2016-12-07 2018-06-14 Olympus Corporation Image processing device and image processing method
CN110573883B (zh) 2017-04-13 2023-05-30 Siemens Healthcare Diagnostics Inc. Methods and apparatus for determining label count during specimen characterization
WO2018191287A1 (en) 2017-04-13 2018-10-18 Siemens Healthcare Diagnostics Inc. Methods and apparatus for hiln characterization using convolutional neural network
US11538159B2 (en) 2017-04-13 2022-12-27 Siemens Healthcare Diagnostics Inc. Methods and apparatus for label compensation during specimen characterization
JP6875709B2 (ja) * 2017-06-09 2021-05-26 AI Medical Service Inc. Disease diagnosis support method based on endoscopic images of digestive organs, diagnosis support system, diagnosis support program, and computer-readable recording medium storing the diagnosis support program
EP3596697B1 (de) 2017-06-28 2021-03-17 Deepmind Technologies Limited Generalizable medical image analysis using segmentation and classification neural networks
US11333553B2 (en) 2017-07-19 2022-05-17 Siemens Healthcare Diagnostics Inc. Methods and apparatus for specimen characterization using hyperspectral imaging
WO2019023376A1 (en) 2017-07-28 2019-01-31 Siemens Healthcare Diagnostics Inc. METHODS AND APPARATUS FOR QUANTIFYING DEEP LEARNING VOLUME
CN107424159B (zh) * 2017-07-28 2020-02-07 Xidian University Image semantic segmentation method based on superpixel edges and fully convolutional networks
CN107909566A (zh) 2017-10-28 2018-04-13 Hangzhou Dianzi University Deep-learning-based image recognition method for skin cancer melanoma
JP7324757B2 (ja) 2018-01-10 2023-08-10 Siemens Healthcare Diagnostics Inc. Methods and apparatus for characterization of biofluid specimens using neural networks with reduced training
CN108364006B (zh) 2018-01-17 2022-03-08 超凡影像科技股份有限公司 Medical image classification device based on multimodal deep learning and construction method thereof
CN108596166B (zh) 2018-04-13 2021-10-26 South China Normal University Container number recognition method based on convolutional neural network classification
US11763461B2 (en) 2018-06-15 2023-09-19 Siemens Healthcare Diagnostics Inc. Specimen container characterization using a single deep neural network in an end-to-end training fashion
CN112424334B (zh) 2018-06-15 2024-08-20 Siemens Healthcare Diagnostics Inc. Methods and apparatus for fine-grained HIL index determination using advanced semantic segmentation and adversarial training
EP3853615B1 (de) * 2018-09-20 2024-01-17 Siemens Healthcare Diagnostics, Inc. Methods and apparatus for HILN determination with a deep adaptation network for both serum and plasma samples

Also Published As

Publication number Publication date
CN119104549A (zh) 2024-12-10
EP3853616A1 (de) 2021-07-28
CN112689763A (zh) 2021-04-20
WO2020061370A1 (en) 2020-03-26
JP7203206B2 (ja) 2023-01-12
JP2022501595A (ja) 2022-01-06
US20210333298A1 (en) 2021-10-28
EP3853616A4 (de) 2021-11-17
US12504435B2 (en) 2025-12-23
CN112689763B (zh) 2024-09-20

Similar Documents

Publication Publication Date Title
US11238318B2 (en) Methods and apparatus for HILN characterization using convolutional neural network
CN112639482B (zh) Specimen container characterization using a single deep neural network in an end-to-end training fashion
EP3610269B1 (de) Methods and apparatus for determining label count during specimen characterization
EP3807396B1 (de) Methods and apparatus for fine-grained HIL index determination with advanced semantic segmentation and adversarial training
EP3853615B1 (de) Methods and apparatus for HILN determination with a deep adaptation network for both serum and plasma samples
CN111556961A (zh) Methods and apparatus for biofluid specimen characterization using a neural network with reduced training
EP3853616B1 (de) Hypothesizing and verification networks and methods for specimen classification
HK40045684A (en) Hypothesizing and verification networks and methods for specimen classification
HK40042175A (en) Methods and apparatus for fine-grained hil index determination with advanced semantic segmentation and adversarial training
HK40041703A (en) Specimen container characterization using a single deep neural network in an end-to-end training fashion
HK40045654A (en) Methods and apparatus for hiln determination with a deep adaptation network for both serum and plasma samples
HK40013553A (en) Methods and apparatus for hiln characterization using convolutional neural network
HK40013553B (en) Methods and apparatus for hiln characterization using convolutional neural network

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210414

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: G01N0035000000

Ipc: G06T0007000000

Ref country code: DE

Ref legal event code: R079

Ref document number: 602019073029

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G01N0035000000

Ipc: G06T0007000000

A4 Supplementary search report drawn up and despatched

Effective date: 20211019

RIC1 Information provided on ipc code assigned before grant

Ipc: G01J 3/02 20060101ALI20211013BHEP

Ipc: G01N 35/00 20060101ALI20211013BHEP

Ipc: G06T 7/174 20170101ALI20211013BHEP

Ipc: G06T 7/136 20170101ALI20211013BHEP

Ipc: G06T 7/10 20170101ALI20211013BHEP

Ipc: G06T 7/00 20170101AFI20211013BHEP

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20250221

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602019073029

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251124

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1817223

Country of ref document: AT

Kind code of ref document: T

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251123

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20251120

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20251002

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251023

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251024

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20251023

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20250723