EP4210564A1 - Determining locations in reproductive cellular structures - Google Patents

Determining locations in reproductive cellular structures

Info

Publication number
EP4210564A1
Authority
EP
European Patent Office
Prior art keywords
cellular structure
location
clinical parameter
reproductive cellular
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21867751.6A
Other languages
German (de)
French (fr)
Other versions
EP4210564A4 (en)
Inventor
Hadi Shafiee
Charles Bormann
Manoj Kumar KANAKASABAPATHY
Prudhvi THIRUMALARAJU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Brigham and Womens Hospital Inc
General Hospital Corp
Original Assignee
Brigham and Womens Hospital Inc
General Hospital Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Brigham and Womens Hospital Inc and General Hospital Corp
Publication of EP4210564A1
Publication of EP4210564A4

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/698 Matching; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30044 Fetus; Embryo

Definitions

  • the present invention relates generally to the field of assisted fertility, and more particularly, to determining locations in reproductive cellular structures.
  • Infertility is an underestimated healthcare problem that affects over forty-eight million couples globally and is a cause of distress, depression, and discrimination.
  • ART: assisted reproductive technologies
  • IVF: in-vitro fertilization
  • ICM: inner cell mass
  • A system in accordance with an aspect of the present invention includes a processor and a non-transitory computer readable medium storing machine executable instructions for assigning a value representing a location of interest within a reproductive cellular structure.
  • the machine executable instructions include an imager interface that receives an image of the reproductive cellular structure from an associated imager.
  • a neural network determines a clinical parameter representing the location of interest within the reproductive cellular structure from the image of the reproductive cellular structure.
  • the clinical parameter is stored at the non-transitory computer readable medium.
  • In accordance with another aspect, a method is provided for determining a clinical parameter representing a location of interest within a reproductive cellular structure.
  • An image of the reproductive cellular structure is obtained and provided to a neural network to generate the clinical parameter.
  • the clinical parameter is stored on a non-transitory computer readable medium.
  • In accordance with a further aspect, a method is provided for planning an assisted fertility procedure.
  • An image of a reproductive cellular structure is obtained and provided to a neural network to generate a clinical parameter representing a location of interest within the reproductive cellular structure.
  • a procedure is performed on the reproductive cellular structure at a location determined according to the clinical parameter.
  • FIG. 1 illustrates one example of a system for determining a location of interest on a reproductive cellular structure
  • FIG. 2 illustrates another example of a system for determining a location of interest on a reproductive cellular structure
  • FIG. 3 illustrates a method for determining a clinical parameter representing the location of interest within a reproductive cellular structure
  • FIG. 4 illustrates a method for planning an assisted fertility procedure
  • FIG. 5 is a schematic block diagram illustrating an exemplary system of hardware components capable of implementing examples of the systems and methods disclosed herein.
  • Intracytoplasmic sperm injection is a procedure that includes alignment of metaphase II (MII) oocytes, selection and immobilization of sperm, and injection of sperm at a precise location that does not interfere with the meiotic spindle.
  • the spindle is located adjacent to the extruded polar body and cannot be visualized using bright field microscopy.
  • Assisted hatching (AH) is a procedure designed to enable embryo escape from the zona pellucida (ZP).
  • AH may increase the chance of pregnancy in older women with repeat IVF failure and in frozen embryo transfer cycles. This procedure is widely used on cleavage stage embryos to facilitate herniation and biopsy of trophectoderm cells for Preimplantation Genetic Testing. Blastomeres can be easily damaged if AH is performed too close to healthy cells.
  • a “reproductive cellular structure”, as used herein, is a single cell or multicellular structure involved in an assisted fertility procedure on a mammalian subject. Reproductive cellular structures can include gametes, such as oocytes, as well as fertilized embryos prior to implantation.
  • a “static observation,” as used herein, is an image or group of images of a reproductive cellular structure that represent a single point in the development of the reproductive cellular structure. Where multiple images are used in a static observation, no discernible change in the structure and appearance of the reproductive cellular structure will have taken place between images.
  • FIG. 1 illustrates one example of a system 100 for determining a location of interest on a reproductive cellular structure.
  • the system 100 includes a processor 102 and a non-transitory computer readable medium 110 that stores machine executable instructions for assigning a value representing a location of interest within a reproductive cellular structure.
  • the machine executable instructions include an imager interface 112 that receives an image of the reproductive cellular structure from an associated imager.
  • the imager interface 112 can receive the image from the imager via a bus or network connection and condition the image for analysis at a neural network 114.
  • the neural network 114 can be implemented on a cloud computing system, with the image transmitted to the server containing the neural network 114 via a network interface (not shown).
  • the neural network 114 determines, from the image of the reproductive cellular structure, a clinical parameter representing the location of interest within the reproductive cellular structure.
  • each possible value of the categorical clinical parameter represents a designated location within a representation of the reproductive cellular structure.
  • the representation can be substantially circular, and the various values for the clinical parameter can represent individual sectors of the circle.
  • the representation is divided into twelve thirty-degree sectors.
  • the representation is divided into an array of tiled polygons, with each polygon in the array being represented by one of the values for the clinical parameter.
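The sector-based representation described above can be illustrated with a short Python sketch. The function names here are hypothetical (not from the patent); the fragment simply maps an angular position on a roughly circular structure to the categorical sector value for a twelve-sector (thirty-degree) division:

```python
def sector_for_angle(angle_deg: float, n_sectors: int = 12) -> int:
    """Map an angular position to a categorical sector index:
    sector 0 covers [0, 30) degrees, sector 1 covers [30, 60), etc."""
    width = 360.0 / n_sectors
    return int((angle_deg % 360.0) // width)

def sector_bounds(index: int, n_sectors: int = 12) -> tuple:
    """Return the (start, end) angles in degrees covered by a sector."""
    width = 360.0 / n_sectors
    return (index * width, (index + 1) * width)
```

The same pattern extends to a polygon-tiled representation: each categorical value names one tile instead of one angular sector.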
  • the reproductive cellular structure is an oocyte and the clinical parameter represents a location of the polar body within the oocyte.
  • the reproductive cellular structure is an embryo and the clinical parameter represents a location on the zona pellucida that is furthest from a healthy blastomere.
  • the neural network 114 includes a plurality of nodes having a plurality of interconnections. Values from the image, for example luminance and/or chrominance values associated with the individual pixels, are provided to a plurality of input nodes. The input nodes each provide these input values to layers of one or more intermediate nodes. A given intermediate node receives one or more output values from previous nodes. The received values are weighted according to a series of weights established during the training of the classifier. An intermediate node translates its received values into a single output according to an activation function at the node.
  • the intermediate node can sum the received values and subject the sum to an activation function such as an identity function, a step function, a sigmoid function, a hyperbolic tangent, a rectified linear unit, a leaky rectified linear unit, a parametric rectified linear unit, a Gaussian error linear unit, the softplus function, an exponential linear unit, a scaled exponential linear unit, a Gaussian function, a sigmoid linear unit, a growing cosine unit, the Heaviside function, or the mish function.
  • a final layer of nodes provides the confidence values for the output classes of the ANN, with each node having an associated value representing a confidence for one of the associated output classes of the classifier.
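The weighted-sum, activation, and confidence-output behavior described above can be sketched numerically. The following NumPy fragment is a deliberately simplified stand-in (random weights, one hidden layer), not the patent's network; it passes pixel-derived values through a fully-connected layer and a softmax output producing twelve per-class confidence values:

```python
import numpy as np

def dense_layer(inputs, weights, biases, activation):
    """One fully-connected layer: weight the received values, sum them,
    and apply the activation function at each node."""
    return activation(inputs @ weights + biases)

relu = lambda z: np.maximum(0.0, z)  # rectified linear unit

def softmax(z):
    """Final layer: convert node outputs into per-class confidences."""
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
x = rng.random(4)                            # e.g. pixel luminance values
w1, b1 = rng.random((4, 8)), np.zeros(8)     # input -> hidden weights
w2, b2 = rng.random((8, 12)), np.zeros(12)   # hidden -> 12 output classes
hidden = dense_layer(x, w1, b1, relu)
confidences = softmax(hidden @ w2 + b2)      # one confidence per sector
```

Each entry of `confidences` corresponds to one output class, e.g. one sector of the circular representation.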
  • ANN classifiers are fully-connected and feedforward.
  • a convolutional neural network includes convolutional layers in which nodes from a previous layer are only connected to a subset of the nodes in the convolutional layer.
  • Recurrent neural networks are a class of neural networks in which connections between nodes form a directed graph along a temporal sequence. Unlike a feedforward network, recurrent neural networks can incorporate feedback from states caused by earlier inputs, such that an output of the recurrent neural network for a given input can be a function of not only the input but one or more previous inputs.
  • LSTM: Long Short-Term Memory
  • the neural network 114 is trained on a plurality of labeled images of the appropriate reproductive cellular structure.
  • By “labeled images” it is meant that the position of the location of interest within the image is known, for example via expert annotation, and that the clinical parameter associated with that position is provided to the neural network along with the image during the training process.
  • the weights associated with the interconnections among nodes in the neural network 114 are iteratively changed until the output of the network, when presented with a novel, unlabeled image, provides a clinical parameter representing the location of interest within the novel image.
  • This clinical parameter or a representation of the clinical parameter can be stored on the non-transitory computer readable medium 110 and/or provided to a user via an associated output device.
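The iterative weight-update process can be illustrated with a deliberately simplified stand-in: a single-layer softmax classifier trained by gradient descent on synthetic "labeled images" (flat pixel vectors with known sector labels). This is only a sketch of the training idea, not the patent's architecture or data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_sectors = 64, 12

# Toy labeled set: each "image" is a flat pixel vector, each label the
# known sector of the location of interest (standing in for expert
# annotation of real images).
X = rng.random((200, n_pixels))
true_w = rng.normal(size=(n_pixels, n_sectors))
y = (X @ true_w).argmax(axis=1)          # synthetic ground-truth labels

W = np.zeros((n_pixels, n_sectors))      # weights to be learned
for _ in range(500):                     # iterative weight updates
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)    # softmax confidences
    onehot = np.eye(n_sectors)[y]
    W -= 0.5 * X.T @ (p - onehot) / len(X)   # cross-entropy gradient step

train_acc = ((X @ W).argmax(axis=1) == y).mean()
```

After training, presenting a novel pixel vector to `X_new @ W` followed by `argmax` yields the predicted sector, analogous to the clinical parameter output.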
  • FIG. 2 illustrates another example of a system 200 for determining a location of interest on a reproductive cellular structure.
  • the system 200 generates a categorical clinical parameter representing a location on the reproductive cellular structure.
  • the system 200 includes an imager 202 that acquires an image of the reproductive cellular structure on at least one day of development.
  • the imager 202 can include one or more cameras, capable of producing images in the visible or infrared range, paired with appropriate optics to provide an image of a reproductive cellular structure.
  • the imager 202 can be implemented to capture images of an embryo at multiple days of development as part of a time-lapse embryo imaging system.
  • the imager 202 can be configured to generate a static observation of the reproductive cellular structure as a set of one or more images.
  • the imager 202 includes an attachment for a mobile device that operates with a camera of the mobile device to provide the images of the reproductive cellular structure.
  • the housing for the attachment can be 3-D printed using polylactic acid with dimensions of 82 x 34 x 48 mm.
  • An acrylic lens can be included in the housing to provide appropriate magnification for the embryo images.
  • the imager 202 can be implemented as a standalone system with an optical housing that is 3-D printed from polylactic acid and overall dimensions of 62 x 92 x 175 mm.
  • the housing contains an electronic circuit with a white light-emitting diode, a three-volt battery, and a single pole double-throw switch.
  • the embryo sample is transilluminated, with a 10x Plan-Achromatic objective lens for image magnification and a complementary metal-oxide-semiconductor (CMOS) image sensor for embryo image data acquisition.
  • the CMOS sensor can be connected to a single-board computer to process the captured images.
  • the imager 202 can be connected to a mobile device via a wireless connection (e.g., Wi-Fi, Bluetooth, or a similar connection) for data processing and visualization.
  • the one or more images obtained at the imager 202 are provided to an analysis system 210 comprising a processor 212, an output device 214, and a non-transitory computer readable medium 220 storing instructions executable by the processor.
  • the instructions are executable to provide an imager interface 222 that receives the image or images of the reproductive cellular structure.
  • the imager interface 222 can apply one or more image conditioning techniques, such as cropping and filtering, to better prepare the image for analysis.
  • the images are then provided to a neural network 224 that provides the categorical clinical parameter representing the desired location.
  • the neural network 224 can be a convolutional neural network, a feed-forward artificial neural network that includes convolutional layers, which effectively apply a convolution to the values at the preceding layer of the network to emphasize various sets of features within an image.
  • In a convolutional layer, each neuron is connected only to a proper subset of the neurons in the preceding layer, referred to as the receptive field of the neuron.
  • the convolutional neural network is implemented using the Xception architecture.
  • at least one chromatic value (e.g., a value for an RGB color channel, a YCrCb color channel, or a grayscale brightness) associated with each pixel is provided as an initial input to the convolutional neural network.
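Extracting a chromatic value per pixel, as described above, might look like the following sketch. The function name is hypothetical; the Rec. 601 weights used here are one common choice for computing the luminance (Y) channel of YCrCb from RGB:

```python
import numpy as np

def chromatic_inputs(rgb, mode="gray"):
    """Extract per-pixel chromatic values to feed the network.
    mode="rgb" keeps the three color channels; mode="gray" computes a
    single luminance (Y) value per pixel using the Rec. 601 weights."""
    rgb = np.asarray(rgb, dtype=np.float64) / 255.0   # scale to [0, 1]
    if mode == "rgb":
        return rgb
    return rgb @ np.array([0.299, 0.587, 0.114])      # Y of YCrCb

img = np.zeros((4, 4, 3), dtype=np.uint8)
img[..., 0] = 255                 # a pure-red test image
gray = chromatic_inputs(img)      # one brightness value per pixel
```

The resulting array (H x W for grayscale, H x W x 3 for RGB) is what would be presented to the input layer of the convolutional network.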
  • the neural network 224 can be implemented as a recurrent neural network.
  • In a recurrent neural network, the connections between nodes in the network are selected to form a directed graph along a sequence, allowing it to exhibit dynamic temporal behavior.
  • the neural network 224 is implemented and trained as a discriminative network in a generative adversarial model, in which a generative neural network and the discriminative network provide mutual feedback to one another, such that the generative neural network produces increasingly sophisticated samples for the discriminative network to attempt to classify.
  • some or all layers of the neural network can be trained via transfer learning from another system, with only some of the layers trained on the training images of the reproductive cellular structure.
  • a final layer of the neural network 224 can be implemented as a softmax layer to provide a classification result.
  • In response to a novel image, the neural network 224 generates a clinical parameter representing the portion of the image containing the location of interest.
  • the clinical parameter can be provided to a user at the output device 214 via a user interface 226.
  • the user interface 226 can include appropriate software instructions for receiving the output of the neural network 224 and presenting it at the output device 214.
  • the output device 214 can include a mobile device that communicates wirelessly with the analysis system 210.
  • the clinical parameter can be provided to the user as a representation of the reproductive cellular structure with the relevant section highlighted within the representation.
  • While, for purposes of simplicity of explanation, the methods of FIGS. 3 and 4 are shown and described as executing serially, it is to be understood and appreciated that the present invention is not limited by the illustrated order, as some aspects could, in accordance with the present invention, occur in different orders and/or concurrently with other aspects from that shown and described herein. Moreover, not all illustrated features may be required to implement a method in accordance with an aspect of the present invention.
  • FIG. 3 illustrates one method 300 for determining a clinical parameter representing the location of interest within a reproductive cellular structure.
  • the method 300 can be used to determine the location of a polar body within an oocyte, with the clinical parameter representing a portion of the oocyte containing the polar body.
  • the method can be used to determine a location on the zona pellucida that is furthest from a healthy blastomere, with the clinical parameter representing the appropriate location.
  • an image of the reproductive cellular structure is obtained. For example, an image can be captured at an appropriate imager and provided to a computer system for image processing.
  • the image of the reproductive cellular structure is provided to a neural network to generate the clinical parameter.
  • Each possible value of the clinical parameter represents a designated location within a representation of the reproductive cellular structure, such that the clinical parameter indicates the location of interest within the image.
  • the neural network can be implemented as any of a convolutional neural network, a recurrent neural network, and a discriminative classifier trained as part of a generative adversarial network.
  • the clinical parameter is stored on a non-transitory computer readable medium.
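Storing the clinical parameter can be as simple as serializing it with enough context to interpret it later (which structure it belongs to, how the circular representation was divided). A minimal sketch, assuming a JSON record with hypothetical field names:

```python
import json
import tempfile
from pathlib import Path

def store_clinical_parameter(path, structure_id, sector, n_sectors=12):
    """Persist the categorical clinical parameter alongside enough
    context to interpret it later."""
    record = {"structure_id": structure_id,
              "sector": sector,
              "n_sectors": n_sectors}
    Path(path).write_text(json.dumps(record))
    return record

out = Path(tempfile.mkdtemp()) / "parameter.json"
store_clinical_parameter(out, "oocyte-001", 4)
loaded = json.loads(out.read_text())
```

In practice the record could equally be written to a database or transmitted to a user-interface process; the key point is that the sector value alone is meaningless without the sector count it was computed against.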
  • FIG. 4 illustrates a method 400 for planning an assisted fertility procedure.
  • an image of the reproductive cellular structure is obtained.
  • an image can be captured at an appropriate imager and provided to a computer system for image processing.
  • the image of the reproductive cellular structure is provided to a neural network to generate a clinical parameter representing a location of interest within the reproductive cellular structure.
  • Each possible value of the clinical parameter represents a designated location within a representation of the reproductive cellular structure, such that the clinical parameter indicates the location of interest within the image.
  • the neural network can be implemented as any of a convolutional neural network, a recurrent neural network, and a discriminative classifier trained as part of a generative adversarial network.
  • a procedure can be performed on the reproductive cellular structure at a location determined according to the clinical parameter.
  • the method 400 can be used to determine the location of a polar body within an oocyte, with the clinical parameter representing a portion of the oocyte containing the polar body.
  • the location of the polar body can be used to determine an appropriate location for intracytoplasmic sperm injection.
  • the intracytoplasmic sperm injection can be performed at a location spaced ninety degrees from the polar body.
  • the method 400 can be used to determine a location on the zona pellucida that is furthest from a healthy blastomere to avoid damaging healthy blastomeres during laser assisted hatching.
  • the procedure can be performed by an automated robotic system based upon the location represented by the clinical parameter.
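The ninety-degree spacing from the polar body described above reduces, in a twelve-sector representation, to an offset of three sectors. A sketch of that arithmetic (the function name is hypothetical, and the offset is kept as a parameter since other procedures may use different spacings):

```python
def injection_sector(polar_body_sector: int, n_sectors: int = 12,
                     offset_deg: float = 90.0) -> int:
    """Sector located a fixed angular offset (default ninety degrees)
    away from the sector containing the polar body."""
    offset_sectors = round(offset_deg / (360.0 / n_sectors))
    return (polar_body_sector + offset_sectors) % n_sectors
```

For example, a polar body in sector 0 would place the injection site in sector 3; the modulo keeps the result valid when the offset wraps past sector 11.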
  • FIG. 5 is a schematic block diagram illustrating an exemplary system 500 of hardware components capable of implementing examples of the systems and methods disclosed in FIGS. 1-4, such as the systems illustrated in FIGS. 1 and 2.
  • the system 500 can include various systems and subsystems.
  • the system 500 can be any of a personal computer, a laptop computer, a workstation, a computer system, an appliance, an application-specific integrated circuit (ASIC), a server, a server blade center, or a server farm.
  • the system 500 can include a system bus 502, a processing unit 504, a system memory 506, memory devices 508 and 510, a communication interface 512 (e.g., a network interface), a communication link 514, a display 516 (e.g., a video screen), and an input device 518 (e.g., a keyboard and/or a mouse).
  • the system bus 502 can be in communication with the processing unit 504 and the system memory 506.
  • Additional memory devices 508 and 510, such as a hard disk drive, server, stand-alone database, or other non-volatile memory, can also be in communication with the system bus 502.
  • the system bus 502 interconnects the processing unit 504, the memory devices 506-510, the communication interface 512, the display 516, and the input device 518. In some examples, the system bus 502 also interconnects an additional port (not shown), such as a universal serial bus (USB) port.
  • the system 500 could be implemented in a computing cloud.
  • features of the system 500 such as the processing unit 504, the communication interface 512, and the memory devices 508 and 510 could be representative of a single instance of hardware or multiple instances of hardware with applications executing across multiple instances (i.e., distributed) of hardware (e.g., computers, routers, memory, processors, or a combination thereof).
  • the system 500 could be implemented on a single dedicated server.
  • the processing unit 504 can be a computing device and can include an application-specific integrated circuit (ASIC).
  • the processing unit 504 executes a set of instructions to implement the operations of examples disclosed herein.
  • the processing unit can include a processing core.
  • the memory devices 506, 508, and 510 can store data, programs, instructions, database queries in text or compiled form, and any other information that can be needed to operate a computer.
  • the memories 506, 508 and 510 can be implemented as computer-readable media (integrated or removable) such as a memory card, disk drive, compact disk (CD), or server accessible over a network.
  • the memories 506, 508 and 510 can comprise text, images, video, and/or audio, portions of which can be available in formats comprehensible to human beings.
  • the system 500 can access an external data source or query source through the communication interface 512, which can communicate with the system bus 502 and the communication link 514.
  • the system 500 can be used to implement one or more parts of an embryo evaluation system in accordance with the present invention.
  • Computer executable logic for implementing the embryo evaluation system resides on one or more of the system memory 506 and the memory devices 508 and 510 in accordance with certain examples.
  • the processing unit 504 executes one or more computer executable instructions originating from the system memory 506 and the memory devices 508 and 510. It will be appreciated that a computer readable medium can include multiple computer readable media each operatively connected to the processing unit.
  • the processing units can be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
  • the embodiments can be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart can describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations can be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process can correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
  • embodiments can be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof.
  • the program code or code segments to perform the necessary tasks can be stored in a machine readable medium such as a storage medium.
  • a code segment or machine-executable instruction can represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements.
  • a code segment can be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents.
  • Information, arguments, parameters, data, etc. can be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, ticket passing, network transmission, etc.
  • the methodologies can be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein.
  • Any machine-readable medium tangibly embodying instructions can be used in implementing the methodologies described herein.
  • software codes can be stored in a memory.
  • Memory can be implemented within the processor or external to the processor.
  • the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • the term “storage medium” can represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
  • computer readable medium includes, but is not limited to, portable or fixed storage devices, optical storage devices, wireless channels, and/or various other storage mediums capable of containing or carrying instruction(s) and/or data.
  • a “computer readable medium” or “machine readable medium” can include multiple media each operatively connected to a processing unit. In such a case, when it is stated that data is stored at the computer readable medium, it can refer to any of the interconnected media within the system.

Abstract

Systems and methods are provided for determining a clinical parameter representing a location of interest within a reproductive cellular structure. An image of the reproductive cellular structure is obtained and provided to a neural network to generate the clinical parameter. The clinical parameter is stored on a non-transitory computer readable medium.

Description

DETERMINING LOCATIONS IN REPRODUCTIVE CELLULAR STRUCTURES
Related Applications
[0001] The present application claims priority to each of U.S. Provisional Patent Application Serial No. 63/077,405 filed September 11, 2020 entitled ARTIFICIAL INTELLIGENCE-ENABLED SYSTEM TO AID IN ANEUPLOIDY SCREENING OF PREIMPLANTATION EMBRYOS and U.S. Provisional Patent Application Serial No. 63/077,398 filed September 11, 2020 entitled ARTIFICIAL INTELLIGENCE-ENABLED SYSTEM FOR ALIGNMENT OF OOCYTES AND PREIMPLANTATION EMBRYOS FOR INTRACYTOPLASMIC SPERM INJECTION (ICSI) AND ASSISTED HATCHING (AH) PROCEDURES. The entire content of each of these applications is incorporated herein by reference in its entirety for all purposes.
Statement on Government Rights
[0002] This invention was made with government support under one or more of grant numbers R01AI118502, R01AI138800, and R21HD092828, awarded by the National Institutes of Health. The government has certain rights in the invention.
Technical Field
[0003] The present invention relates generally to the field of assisted fertility, and more particularly, to determining locations in reproductive cellular structures.
Background of the Invention
[0004] Infertility is an underestimated healthcare problem that affects over forty-eight million couples globally and is a cause of distress, depression, and discrimination.
Although assisted reproductive technologies (ART) such as in-vitro fertilization (IVF) have alleviated the burden of infertility to an extent, they remain inefficient, with an average success rate of approximately twenty-six percent reported in the US in 2015. IVF remains an expensive solution, with a cost between $7,000 and $20,000 per ART cycle in the US, which is generally not covered by insurance. Further, many patients require multiple cycles of IVF to achieve pregnancy. Embryos are usually transferred to a patient's uterus during either the cleavage or the blastocyst stage of development. Embryos are described as being at the cleavage stage two or three days after fertilization. Embryos reach the blastocyst stage five or six days after fertilization. Blastocysts have fluid-filled cavities and two distinguishable cell types, the trophectoderm and the inner cell mass (ICM).
Summary of the Invention
[0005] In accordance with an aspect of the present invention, a system is provided. The system includes a processor and a non-transitory computer readable medium storing machine executable instructions for assigning a value representing a location of interest within a reproductive cellular structure. The machine executable instructions include an imager interface that receives an image of the reproductive cellular structure from an associated imager. A neural network determines a clinical parameter representing the location of interest within the reproductive cellular structure from the image of the reproductive cellular structure. The clinical parameter is stored at the non-transitory computer readable medium.
[0006] In accordance with another aspect of the present invention, a method is provided for determining a clinical parameter representing the location of interest within a reproductive cellular structure. An image of the reproductive cellular structure is obtained and provided to a neural network to generate the clinical parameter. The clinical parameter is stored on a non-transitory computer readable medium.
[0007] In accordance with yet another aspect of the present invention, a method is provided for planning an assisted fertility procedure. An image of a reproductive cellular structure is obtained and provided to a neural network to generate a clinical parameter representing a location of interest within the reproductive cellular structure. A procedure is performed on the reproductive cellular structure at a location determined according to the clinical parameter.
Brief Description of the Drawings
[0008] The foregoing and other features of the present invention will become apparent to those skilled in the art to which the present invention relates upon reading the following description with reference to the accompanying drawings, in which:
[0009] FIG. 1 illustrates one example of a system for determining a location of interest on a reproductive cellular structure;
[0010] FIG. 2 illustrates another example of a system for determining a location of interest on a reproductive cellular structure;
[0011] FIG. 3 illustrates a method for determining a clinical parameter representing the location of interest within a reproductive cellular structure;
[0012] FIG. 4 illustrates a method for planning an assisted fertility procedure; and
[0013] FIG. 5 is a schematic block diagram illustrating an exemplary system of hardware components capable of implementing examples of the systems and methods disclosed herein.
Detailed Description
[0014] The two most widely performed and technically challenging micromanipulation procedures conducted in a clinical embryology laboratory are intracytoplasmic sperm injection (ICSI) and assisted hatching (AH). Both procedures are manually performed by highly trained embryologists. Intracytoplasmic sperm injection is a procedure that includes alignment of metaphase II (MII) oocytes, selection and immobilization of sperm, and injection of sperm at a precise location that does not interfere with the mitotic spindle. The spindle is located adjacent to the extruded polar body and cannot be visualized using bright field microscopy. Assisted hatching is a procedure designed to enable embryo escape from the zona pellucida (ZP). Studies show that AH may increase the chance of pregnancy in older women with repeat IVF failure and in frozen embryo transfer cycles. This procedure is widely used on cleavage stage embryos to facilitate herniation and biopsy of trophectoderm cells for Preimplantation Genetic Testing. Blastomeres can be easily damaged if AH is performed too close to healthy cells.
[0015] A “reproductive cellular structure”, as used herein, is a single cell or multicellular structure involved in an assisted fertility procedure on a mammalian subject. Reproductive cellular structures can include gametes, such as oocytes, as well as fertilized embryos prior to implantation.
[0016] A “static observation,” as used herein, is an image or group of images of a reproductive cellular structure that represent a single point in the development of the reproductive cellular structure. Where multiple images are used in a static observation, no discernible change in the structure and appearance of the reproductive cellular structure will have taken place between images.
[0017] FIG. 1 illustrates one example of a system 100 for determining a location of interest on a reproductive cellular structure. The system 100 includes a processor 102 and a non-transitory computer readable medium 110 that stores machine executable instructions for assigning a value representing a location of interest within a reproductive cellular structure. The machine executable instructions include an imager interface 112 that receives an image of the reproductive cellular structure from an associated imager. For example, the imager interface 112 can receive the image from the imager via a bus or network connection and condition the image for analysis at a neural network 114. In one example, the neural network 114 can be implemented on a cloud computing system, with the image transmitted to the server containing the neural network 114 via a network interface (not shown).
[0018] The neural network 114 determines, from the image of the reproductive cellular structure, a clinical parameter representing the location of interest within the reproductive cellular structure. In one implementation, each possible value of the categorical clinical parameter represents a designated location within a representation of the reproductive cellular structure. In one example, the representation can be substantially circular, and the various values for the clinical parameter can represent individual sectors of the circle. In one implementation, the representation is divided into twelve thirty-degree sectors. In another example, the representation is divided into an array of tiled polygons, with each polygon in the array represented by one of the values for the clinical parameter. In one implementation, the reproductive cellular structure is an oocyte and the clinical parameter represents a location of the polar body within the oocyte. In another implementation, the reproductive cellular structure is an embryo and the clinical parameter represents a location on the zona pellucida that is furthest from a healthy blastomere.
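As a non-limiting illustrative sketch (not part of the original disclosure), the sector-based representation described above, in which a location within a substantially circular structure maps to one of twelve thirty-degree sectors, could be expressed as follows. The function name, the centroid-relative coordinates, and the counterclockwise angular convention are assumptions introduced for illustration only.

```python
import math

def location_to_sector(dx: float, dy: float, n_sectors: int = 12) -> int:
    """Map an offset (dx, dy) from the structure's centroid to a sector index.

    Sector 0 spans [0, 30) degrees measured counterclockwise from the
    positive x-axis; indices increase counterclockwise.
    """
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    return int(angle // (360.0 / n_sectors))

# A point directly above the centroid (90 degrees) falls in sector 3.
print(location_to_sector(0.0, 1.0))  # 3
```

The categorical clinical parameter would then simply be this sector index (or a label associated with it), with one class per sector.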
[0019] The neural network 114 includes a plurality of nodes having a plurality of interconnections. Values from the image, for example luminance and/or chrominance values associated with the individual pixels, are provided to a plurality of input nodes. The input nodes each provide these input values to layers of one or more intermediate nodes. A given intermediate node receives one or more output values from previous nodes. The received values are weighted according to a series of weights established during the training of the classifier. An intermediate node translates its received values into a single output according to an activation function at the node. For example, the intermediate node can sum the received values and subject the sum to an activation function such as an identity function, a step function, a sigmoid function, a hyperbolic tangent, a rectified linear unit, a leaky rectified linear unit, a parametric rectified linear unit, a Gaussian error linear unit, the softplus function, an exponential linear unit, a scaled exponential linear unit, a Gaussian function, a sigmoid linear unit, a growing cosine unit, the Heaviside function, or the mish function. A final layer of nodes provides the confidence values for the output classes of the ANN, with each node having an associated value representing a confidence for one of the associated output classes of the classifier.
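The node computation described above, a weighted sum followed by an activation function, can be sketched minimally as follows. This is an editor's illustration of the general principle, not code from the disclosure; the particular weights and the small set of activations shown are assumptions.

```python
import math

def neuron_output(inputs, weights, bias, activation="relu"):
    """One network node: weighted sum of inputs followed by a nonlinearity."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    if activation == "relu":      # rectified linear unit
        return max(0.0, z)
    if activation == "sigmoid":
        return 1.0 / (1.0 + math.exp(-z))
    if activation == "tanh":      # hyperbolic tangent
        return math.tanh(z)
    return z                      # identity function

# Weighted sum: 0.5*2.0 + (-1.0)*1.0 + 0.5 = 0.5; relu leaves it unchanged.
print(neuron_output([0.5, -1.0], [2.0, 1.0], 0.5))  # 0.5
```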
[0020] Many ANN classifiers are fully-connected and feedforward. A convolutional neural network, however, includes convolutional layers in which nodes from a previous layer are only connected to a subset of the nodes in the convolutional layer. Recurrent neural networks are a class of neural networks in which connections between nodes form a directed graph along a temporal sequence. Unlike a feedforward network, recurrent neural networks can incorporate feedback from states caused by earlier inputs, such that an output of the recurrent neural network for a given input can be a function of not only the input but one or more previous inputs. As an example, Long Short-Term Memory (LSTM) networks are a modified version of recurrent neural networks, which makes it easier to remember past data in memory.
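The recurrent behavior described above, in which the output for a given input depends on state carried over from earlier inputs, can be illustrated with a deliberately simplified single-unit recurrence. The fixed weights and tanh nonlinearity are assumptions for illustration; this is not a full LSTM and is not taken from the disclosure.

```python
import math

def rnn_step(x, h_prev, w_x=0.5, w_h=0.8, b=0.0):
    """One step of a simple recurrent unit: the output depends on the
    current input and on the hidden state from earlier inputs."""
    return math.tanh(w_x * x + w_h * h_prev + b)

# The same input value produces different outputs depending on history.
h = 0.0
outputs = []
for x in [1.0, 1.0, 1.0]:
    h = rnn_step(x, h)
    outputs.append(round(h, 4))
print(outputs)
```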
[0021] The neural network 114 is trained on a plurality of labeled images of the appropriate reproductive cellular structure. By “labeled images,” it is meant that the position of the location of interest within the image is known, for example, via expert annotation, and the clinical parameter associated with the position of the location of interest is provided to the neural network along with the image during the training process. During training, the weights associated with the interconnections among nodes in the neural network 114 are iteratively changed until an output of the trained network, when presented with a novel, unlabeled image, provides a clinical parameter representing the location of interest within the novel image. This clinical parameter or a representation of the clinical parameter can be stored on the non-transitory computer readable medium 110 and/or provided to a user via an associated output device.
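The iterative weight-update principle described above can be shown with a deliberately tiny stand-in: a single-weight logistic classifier fit by gradient descent on toy labeled data. This is an editor's sketch of supervised training in general, not the patent's training procedure, architecture, or data.

```python
import math

# Toy labeled data: one feature per example, class 1 when the feature is positive.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w, b, lr = 0.0, 0.0, 0.1

for _ in range(200):                                # iterative weight updates
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))    # predicted probability
        grad = p - y                                # cross-entropy gradient
        w -= lr * grad * x
        b -= lr * grad

# After training, all four labeled examples are classified correctly.
correct = sum((1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5) == bool(y)
              for x, y in data)
print(correct)  # 4
```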
[0022] FIG. 2 illustrates another example of a system 200 for determining a location of interest on a reproductive cellular structure. Specifically, the system 200 generates a categorical clinical parameter representing a location on the reproductive cellular structure. The system 200 includes an imager 202 that acquires an image of the reproductive cellular structure on at least one day of development. For example, the imager 202 can include one or more cameras, capable of producing images in the visible or infrared range, paired with appropriate optics to provide an image of a reproductive cellular structure. In one implementation, the imager 202 can be implemented to capture images of an embryo at multiple days of development as part of a time-lapse embryo imaging system. In another implementation, the imager 202 can be configured to generate a static observation of the reproductive cellular structure as a set of one or more images. In one implementation, the imager 202 includes an attachment for a mobile device that operates with a camera of the mobile device to provide the images of the reproductive cellular structure. The housing for the attachment can be 3-D printed using polylactic acid with dimensions of 82 x 34 x 48 mm. An acrylic lens can be included in the housing to provide appropriate magnification for the embryo images.
[0023] In another implementation, the imager 202 can be implemented as a standalone system with an optical housing that is 3-D printed from polylactic acid, with overall dimensions of 62 x 92 x 175 mm. The housing contains an electronic circuit with a white light-emitting diode, a three-volt battery, and a single pole double-throw switch. The embryo sample is transilluminated, with a 10x Plan-Achromatic objective lens for image magnification and a complementary metal-oxide-semiconductor (CMOS) image sensor for embryo image data acquisition. The CMOS sensor can be connected to a single-board computer to process the captured images. The imager 202 can be connected to a mobile device via a wireless connection (e.g., Wi-Fi, Bluetooth, or a similar connection) for data processing and visualization.
[0024] The one or more images obtained at the imager 202 are provided to an analysis system 210 comprising a processor 212, an output device 214, and a non-transitory computer readable medium 220 storing instructions executable by the processor. The instructions are executable to provide an imager interface 222 that receives the image or images of the reproductive cellular structure. The imager interface 222 can apply one or more image conditioning techniques, such as cropping and filtering, to better prepare the image for analysis. The images are then provided to a neural network 224 that provides the categorical clinical parameter representing the desired location.
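The image conditioning step mentioned above (cropping and scaling before analysis) could be sketched as follows for a grayscale pixel grid. The center-crop policy, the [0, 1] scaling, and the function name are assumptions for illustration, not the disclosure's specific conditioning pipeline.

```python
def condition_image(pixels, size):
    """Center-crop a 2-D grid of 8-bit brightness values to size x size
    and scale them to [0, 1] for use as network inputs."""
    h, w = len(pixels), len(pixels[0])
    top, left = (h - size) // 2, (w - size) // 2
    crop = [row[left:left + size] for row in pixels[top:top + size]]
    return [[v / 255.0 for v in row] for row in crop]

raw = [[0, 64, 128, 255]] * 4          # 4x4 dummy image
print(condition_image(raw, 2))         # central 2x2 patch, scaled
```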
[0025] In one implementation, the neural network 224 can be a convolutional neural network, which is a feed-forward artificial neural network that includes convolutional layers, which effectively apply a convolution to the values at the preceding layer of the network to emphasize various sets of features within an image. In a convolutional layer, each neuron is connected only to a proper subset of the neurons in the preceding layer, referred to as the receptive field of the neuron. In one example, the convolutional neural network is implemented using the Xception architecture. In one implementation, at least one chromatic value (e.g., a value for an RGB color channel, a YCrCb color channel, or a grayscale brightness) associated with each pixel is provided as an initial input to the convolutional neural network.
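The convolutional-layer idea described above, where each output value depends only on a local receptive field rather than the whole image, can be shown with a minimal "valid" 2-D convolution over nested lists. This is an editor's sketch; real implementations (e.g., the Xception architecture mentioned above) use many learned kernels, strides, and padding.

```python
def conv2d(image, kernel):
    """Valid 2-D convolution: each output value depends only on the
    receptive field covered by the kernel, not on the whole image."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

# A horizontal-difference kernel responds at the dark-to-bright boundary.
img = [[0, 0, 9, 9]] * 3
print(conv2d(img, [[-1, 1]]))  # [[0, 9, 0], [0, 9, 0], [0, 9, 0]]
```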
[0026] In another implementation, the neural network 224 can be implemented as a recurrent neural network. In a recurrent neural network, the connections between nodes in the network are selected to form a directed graph along a sequence, allowing it to exhibit dynamic temporal behavior. In another implementation, the neural network 224 is implemented and trained as a discriminative network in a generative adversarial model, in which a generative neural network and the discriminative network provide mutual feedback to one another, such that the generative neural network produces increasingly sophisticated samples for the discriminative network to attempt to classify. Regardless of the structure of the neural network 224, some or all layers of the neural network can be trained via transfer learning from another system, with only some of the layers trained on the training images of the reproductive cellular structure. A final layer of the neural network 224 can be implemented as a softmax layer to provide a classification result.
[0027] In response to a novel image, the neural network 224 generates a clinical parameter representing the portion of the image containing the location of interest. The clinical parameter can be provided to a user at the output device 214 via a user interface 226. For example, the user interface 226 can include appropriate software instructions for receiving the output of the neural network 224 and presenting it at the output device 214. In one implementation, the output device 214 can include a mobile device that communicates wirelessly with the analysis system 210. In one example, the clinical parameter can be provided to the user as a representation of the reproductive cellular structure with the section of the reproductive cellular structure highlighted within the representation.
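Converting the network's final-layer scores into a categorical clinical parameter with a confidence, as in the softmax output layer described above, could look like the following sketch. The twelve-sector class layout and the function name are illustrative assumptions.

```python
import math

def sector_from_scores(scores):
    """Apply softmax to final-layer scores and return the most likely
    sector index together with its confidence value."""
    exps = [math.exp(s - max(scores)) for s in scores]   # stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return best, probs[best]

scores = [0.1] * 12
scores[4] = 3.0                        # network strongly favors sector 4
sector, conf = sector_from_scores(scores)
print(sector)  # 4
```

The returned sector index is what a user interface could then highlight within a representation of the structure.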
[0028] In view of the foregoing structural and functional features described above, methods in accordance with various aspects of the present invention will be better appreciated with reference to FIGS. 3 and 4. While, for purposes of simplicity of explanation, the methods of FIGS. 3 and 4 are shown and described as executing serially, it is to be understood and appreciated that the present invention is not limited by the illustrated order, as some aspects could, in accordance with the present invention, occur in different orders and/or concurrently with other aspects shown and described herein. Moreover, not all illustrated features may be required to implement a method in accordance with an aspect of the present invention.
[0029] FIG. 3 illustrates one method 300 for determining a clinical parameter representing the location of interest within a reproductive cellular structure. In one example, the method 300 can be used to determine the location of a polar body within an oocyte, with the clinical parameter representing a portion of the oocyte containing the polar body. In another example, the method can be used to determine a location on the zona pellucida that is furthest from a healthy blastomere, with the clinical parameter representing the appropriate location. At 302, an image of the reproductive cellular structure is obtained. For example, an image can be captured at an appropriate imager and provided to a computer system for image processing.
[0030] At 304, the image of the reproductive cellular structure is provided to a neural network to generate the clinical parameter. Each possible value of the clinical parameter represents a designated location within a representation of the reproductive cellular structure, such that the clinical parameter indicates the location of interest within the image. The neural network can be implemented as any of a convolutional neural network, a recurrent neural network, and a discriminative classifier trained as part of a generative adversarial network. At 306, the clinical parameter is stored on a non-transitory computer readable medium.
[0031] FIG. 4 illustrates a method 400 for planning an assisted fertility procedure. At 402, an image of the reproductive cellular structure is obtained. For example, an image can be captured at an appropriate imager and provided to a computer system for image processing. At 404, the image of the reproductive cellular structure is provided to a neural network to generate a clinical parameter representing a location of interest within the reproductive cellular structure. Each possible value of the clinical parameter represents a designated location within a representation of the reproductive cellular structure, such that the clinical parameter indicates the location of interest within the image. The neural network can be implemented as any of a convolutional neural network, a recurrent neural network, and a discriminative classifier trained as part of a generative adversarial network.
[0032] At 406, a procedure can be performed on the reproductive cellular structure at a location determined according to the clinical parameter. In one example, the method 400 can be used to determine the location of a polar body within an oocyte, with the clinical parameter representing a portion of the oocyte containing the polar body. The location of the polar body can be used to determine an appropriate location for intracytoplasmic sperm injection. For example, the intracytoplasmic sperm injection can be performed at a location spaced ninety degrees from the polar body. In another example, the method 400 can be used to determine a location on the zona pellucida that is furthest from a healthy blastomere to avoid damaging healthy blastomeres during laser assisted hatching. In one implementation, the procedure can be performed by an automated robotic system based upon the location represented by the clinical parameter.
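The geometric relationship described above, performing the injection at a location spaced ninety degrees from the polar body, can be sketched in terms of the sector-based clinical parameter. The twelve-sector layout and function name are assumptions carried over for illustration; the actual procedure placement is determined clinically.

```python
def injection_sector(polar_body_sector, n_sectors=12, offset_degrees=90):
    """Sector spaced a fixed angle from the polar body's sector, e.g. for
    choosing an ICSI injection site away from the spindle region."""
    step = 360 // n_sectors                          # degrees per sector
    return (polar_body_sector + offset_degrees // step) % n_sectors

print(injection_sector(0))   # 3  (90 degrees = three 30-degree sectors away)
print(injection_sector(11))  # 2  (wraps around the circle)
```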
[0033] FIG. 5 is a schematic block diagram illustrating an exemplary system 500 of hardware components capable of implementing examples of the systems and methods disclosed in FIGS. 1-4, such as the systems illustrated in FIGS. 1 and 2. The system 500 can include various systems and subsystems. The system 500 can be any of a personal computer, a laptop computer, a workstation, a computer system, an appliance, an application-specific integrated circuit (ASIC), a server, a server blade center, or a server farm.
[0034] The system 500 can include a system bus 502, a processing unit 504, a system memory 506, memory devices 508 and 510, a communication interface 512 (e.g., a network interface), a communication link 514, a display 516 (e.g., a video screen), and an input device 518 (e.g., a keyboard and/or a mouse). The system bus 502 can be in communication with the processing unit 504 and the system memory 506. The additional memory devices 508 and 510, such as a hard disk drive, server, stand-alone database, or other non-volatile memory, can also be in communication with the system bus 502. The system bus 502 interconnects the processing unit 504, the memory devices 506-510, the communication interface 512, the display 516, and the input device 518. In some examples, the system bus 502 also interconnects an additional port (not shown), such as a universal serial bus (USB) port.
[0035] The system 500 could be implemented in a computing cloud. In such a situation, features of the system 500, such as the processing unit 504, the communication interface 512, and the memory devices 508 and 510, could be representative of a single instance of hardware or multiple instances of hardware, with applications executing across the multiple instances (i.e., distributed) of hardware (e.g., computers, routers, memory, processors, or a combination thereof). Alternatively, the system 500 could be implemented on a single dedicated server.
[0036] The processing unit 504 can be a computing device and can include an application-specific integrated circuit (ASIC). The processing unit 504 executes a set of instructions to implement the operations of examples disclosed herein. The processing unit can include a processing core.
[0037] The additional memory devices 506, 508, and 510 can store data, programs, instructions, database queries in text or compiled form, and any other information that can be needed to operate a computer. The memories 506, 508, and 510 can be implemented as computer-readable media (integrated or removable) such as a memory card, disk drive, compact disk (CD), or server accessible over a network. In certain examples, the memories 506, 508, and 510 can comprise text, images, video, and/or audio, portions of which can be available in formats comprehensible to human beings.
[0038] Additionally or alternatively, the system 500 can access an external data source or query source through the communication interface 512, which can communicate with the system bus 502 and the communication link 514.
[0039] In operation, the system 500 can be used to implement one or more parts of an embryo evaluation system in accordance with the present invention. Computer executable logic for implementing the embryo evaluation system resides on one or more of the system memory 506 and the memory devices 508 and 510 in accordance with certain examples. The processing unit 504 executes one or more computer executable instructions originating from the system memory 506 and the memory devices 508 and 510. It will be appreciated that a computer readable medium can include multiple computer readable media each operatively connected to the processing unit.
[0040] Specific details are given in the above description to provide a thorough understanding of the embodiments. However, it is understood that the embodiments can be practiced without these specific details. For example, circuits can be shown in block diagrams in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques can be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0041] Implementation of the techniques, blocks, steps, and means described above can be done in various ways. For example, these techniques, blocks, steps, and means can be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units can be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
[0042] Also, it is noted that the embodiments can be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart can describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations can be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process can correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
[0043] Furthermore, embodiments can be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks can be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction can represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment can be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. can be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, ticket passing, network transmission, etc.
[0044] For a firmware and/or software implementation, the methodologies can be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions can be used in implementing the methodologies described herein. For example, software codes can be stored in a memory. Memory can be implemented within the processor or external to the processor. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
[0045] Moreover, as disclosed herein, the term “storage medium” can represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices, and/or other machine readable mediums for storing information. The terms “computer readable medium” and “machine readable medium” include, but are not limited to, portable or fixed storage devices, optical storage devices, wireless channels, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data. It will be appreciated that a “computer readable medium” or “machine readable medium” can include multiple media each operatively connected to a processing unit. In such a case, when it is stated that data is stored at the computer readable medium, it can refer to any of the interconnected media within the system.
[0046] While the principles of the disclosure have been described above in connection with specific apparatuses and methods, it is to be clearly understood that this description is made only by way of example and not as limitation on the scope of the disclosure.

Claims

Having described the invention, we claim:
1. A system comprising: a processor; and a non-transitory computer readable medium storing machine executable instructions for assigning a value representing a location of interest within a reproductive cellular structure, the machine executable instructions comprising: an imager interface that receives an image of the reproductive cellular structure from an associated imager; and a neural network that determines, from the image of the reproductive cellular structure, a clinical parameter representing the location of interest within the reproductive cellular structure; wherein the clinical parameter is stored at the non-transitory computer readable medium.
2. The system of claim 1, further comprising an output device, the machine executable instructions comprising a user interface that displays the clinical parameter to a user at the output device.
3. The system of claim 1, wherein the reproductive cellular structure is an oocyte and the clinical parameter represents a location of the polar body within the oocyte.
4. The system of claim 1, wherein the reproductive cellular structure is an embryo and the clinical parameter represents a location on the zona pellucida that is furthest from a healthy blastomere.
5. The system of claim 1, wherein the neural network is trained on a set of images, each labelled with a class of a plurality of classes, with each of the plurality of classes representing a section of the reproductive cellular structure containing the location of interest, the clinical parameter comprising a categorical parameter representing one of the plurality of classes.
6. The system of claim 1, wherein the neural network is a convolutional neural network.
7. The system of claim 1, further comprising the imager, the imager comprising: a white light emitting diode; a complementary metal-oxide-semiconductor (CMOS) image sensor; and an objective lens connected to the CMOS image sensor.
8. The system of claim 1, further comprising the imager, the imager comprising a plastic housing containing an acrylic lens and configured to affix to a mobile device, such that the acrylic lens is aligned with a camera of the mobile device.
9. A method for determining a clinical parameter representing a location of interest within a reproductive cellular structure, the method comprising: obtaining an image of the reproductive cellular structure; providing the image of the reproductive cellular structure to a neural network to generate the clinical parameter; and storing the clinical parameter on a non-transitory computer readable medium.
10. The method of claim 9, further comprising performing a procedure on the reproductive cellular structure at a location determined according to the clinical parameter.
11. The method of claim 10, wherein the procedure on the reproductive cellular structure is one of intracytoplasmic sperm injection and laser assisted hatching.
12. The method of claim 9, wherein the reproductive cellular structure is an oocyte and the clinical parameter represents a location of the polar body within the oocyte.
13. The method of claim 9, wherein the reproductive cellular structure is an embryo and the clinical parameter represents a location on the zona pellucida that is furthest from a healthy blastomere.
14. The method of claim 9, wherein providing the image of the reproductive cellular structure to the neural network comprises providing the image of the reproductive cellular structure to a discriminative classifier trained as part of a generative adversarial network.
15. The method of claim 9, wherein providing the image of the reproductive cellular structure to the neural network comprises providing the image of the reproductive cellular structure to a recurrent neural network.
16. A method for planning an assisted fertility procedure, the method comprising: obtaining an image of a reproductive cellular structure; providing the image of the reproductive cellular structure to a neural network to generate a clinical parameter representing a location of interest within the reproductive cellular structure; and performing a procedure on the reproductive cellular structure at a location determined according to the clinical parameter.
17. The method of claim 16, wherein the reproductive cellular structure is an oocyte and the clinical parameter represents a location of the polar body within the oocyte.
18. The method of claim 17, wherein performing a procedure on the reproductive cellular structure at the location determined according to the clinical parameter comprises performing intracytoplasmic sperm injection at a location spaced ninety degrees from the polar body.
19. The method of claim 16, wherein the reproductive cellular structure is an embryo and the clinical parameter represents a location on the zona pellucida that is furthest from a healthy blastomere.
20. The method of claim 19, wherein performing a procedure on the reproductive cellular structure at the location determined according to the clinical parameter comprises performing laser assisted hatching at a location indicated by the clinical parameter.
EP21867751.6A 2020-09-11 2021-09-13 Determining locations in reproductive cellular structures Pending EP4210564A4 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063077398P 2020-09-11 2020-09-11
US202063077405P 2020-09-11 2020-09-11
PCT/US2021/050050 WO2022056370A1 (en) 2020-09-11 2021-09-13 Determining locations in reproductive cellular structures

Publications (2)

Publication Number Publication Date
EP4210564A1 true EP4210564A1 (en) 2023-07-19
EP4210564A4 EP4210564A4 (en) 2024-10-16

Family

ID=80630102

Family Applications (2)

Application Number Title Priority Date Filing Date
EP21867754.0A Pending EP4211606A4 (en) 2020-09-11 2021-09-13 Automated aneuploidy screening using arbitrated ensembles
EP21867751.6A Pending EP4210564A4 (en) 2020-09-11 2021-09-13 Determining locations in reproductive cellular structures

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP21867754.0A Pending EP4211606A4 (en) 2020-09-11 2021-09-13 Automated aneuploidy screening using arbitrated ensembles

Country Status (9)

Country Link
US (2) US20230326014A1 (en)
EP (2) EP4211606A4 (en)
JP (2) JP7523680B2 (en)
KR (2) KR20230075468A (en)
CN (1) CN116438585A (en)
AU (3) AU2021338858A1 (en)
BR (2) BR112023004565A2 (en)
CA (2) CA3192441A1 (en)
WO (2) WO2022056374A1 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014134550A1 (en) 2013-02-28 2014-09-04 Auxogyn, Inc. Apparatus, method, and system for image-based human embryo cell classification
US10902334B2 (en) 2016-01-28 2021-01-26 Gerard Letterie Automated image analysis to assess reproductive potential of human oocytes and pronuclear embryos
GB201806999D0 (en) * 2018-04-30 2018-06-13 Univ Birmingham Automated oocyte detection and orientation
US20210200986A1 (en) 2018-05-25 2021-07-01 Sony Corporation Control device, control method, and program
KR20210078488A (en) 2018-09-20 2021-06-28 에이아이브이에프 엘티디 Image feature detection
US11926809B2 (en) * 2018-09-28 2024-03-12 Brigham And Women's Hospital, Inc. Automated evaluation of sperm morphology
CN113260894B (en) 2018-12-28 2023-04-04 仪景通株式会社 Microscope system
WO2020157761A1 (en) * 2019-01-31 2020-08-06 Amnon Buxboim Automated evaluation of embryo implantation potential
US10646156B1 (en) * 2019-06-14 2020-05-12 Cycle Clarity, LLC Adaptive image processing in assisted reproductive imaging modalities

Also Published As

Publication number Publication date
AU2021339829B2 (en) 2024-06-13
US20230326165A1 (en) 2023-10-12
US20230326014A1 (en) 2023-10-12
BR112023004565A2 (en) 2023-04-04
BR112023004578A2 (en) 2023-04-11
AU2021338858A1 (en) 2023-05-18
JP2023541841A (en) 2023-10-04
EP4211606A1 (en) 2023-07-19
EP4211606A4 (en) 2024-10-09
KR20230084176A (en) 2023-06-12
JP7545575B2 (en) 2024-09-04
CA3192444A1 (en) 2022-03-17
KR20230075468A (en) 2023-05-31
AU2024219507A1 (en) 2024-09-26
WO2022056374A1 (en) 2022-03-17
WO2022056370A1 (en) 2022-03-17
AU2021339829A1 (en) 2023-05-25
CN116438585A (en) 2023-07-14
EP4210564A4 (en) 2024-10-16
CA3192441A1 (en) 2022-03-17
JP2023541165A (en) 2023-09-28
JP7523680B2 (en) 2024-07-26
AU2021339829A9 (en) 2024-02-08

Similar Documents

Publication Publication Date Title
CN111279421B (en) Automatic evaluation of human embryos
Louis et al. Review of computer vision application in in vitro fertilization: the application of deep learning-based computer vision technology in the world of IVF
AU2024200572A1 (en) Automated evaluation of quality assurance metrics for assisted reproduction procedures
Baručić et al. Automatic evaluation of human oocyte developmental potential from microscopy images
CN111401183A (en) Artificial intelligence-based cell body monitoring method, system, device and electronic equipment
US20230326165A1 (en) Determining locations in reproductive cellular structures
CN116456892A (en) Determination of position in germ cell structure
CN115036021A (en) Embryo development monitoring method based on space dynamics parameters
EP4178485A1 (en) Predicting embryo implantation probability
RU2810125C1 (en) Automated assessment of quality assurance indicators for assisted reproduction procedures
Eswaran et al. Deep Learning Algorithms for Timelapse Image Sequence-Based Automated Blastocyst Quality Detection
US20240037743A1 (en) Systems and methods for evaluating embryo viability using artificial intelligence
WO2022150914A1 (en) Systems and methods for non-invasive preimplantation embryo genetic screening

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230411

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230728

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: A61B0005000000

Ipc: G16H0050200000

A4 Supplementary search report drawn up and despatched

Effective date: 20240917

RIC1 Information provided on ipc code assigned before grant

Ipc: G06V 20/69 20220101ALI20240911BHEP

Ipc: G06V 10/82 20220101ALI20240911BHEP

Ipc: G06V 10/25 20220101ALI20240911BHEP

Ipc: G06T 7/70 20170101ALI20240911BHEP

Ipc: G16H 30/40 20180101ALI20240911BHEP

Ipc: G16H 50/70 20180101ALI20240911BHEP

Ipc: G16H 50/20 20180101AFI20240911BHEP