WO2013084123A2 - Selection of images for optical examination of the cervix - Google Patents
- Publication number
- WO2013084123A2 (PCT/IB2012/056855)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging system
- medical imaging
- cervix
- images
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
Definitions
- the invention relates to the field of optical examination of the cervix, particularly to the field of colposcopy.
- Cancer arising from the cervix is the number one cancer in women in many countries. About 30% of cancers in women are due to cervical cancer with more than
- colposcopic examination is routinely used as the second diagnostic step by gynecologists for identification of abnormal areas of the cervix.
- a colposcope is a low-power, stereoscopic, binocular field microscope with a powerful light source used for magnified visual examination of the uterine cervix to help in the diagnosis of cervical cancer.
- a colposcope is used to identify visible clues suggestive of abnormal tissue. It functions as a lighted binocular microscope to magnify the view of the cervix, vagina, and vulvar surface. Low power (2× to 6×) may be used to obtain a general impression of the surface architecture. Medium (8× to 15×) and high (15× to 25×) powers are utilized to evaluate the vagina and cervix. The higher powers are often necessary to identify certain vascular patterns that may indicate the presence of more advanced precancerous or cancerous lesions. Various light filters are available to highlight different aspects of the surface of the cervix.
- Acetic acid (usually 3-5%) is applied to the cervix by means of, e.g., cotton swabs, or spray. Areas with a high risk of neoplasia, or cancer, will appear as varying degrees of whiteness, because acetowhiteness correlates with higher nuclear density. The term "acetowhiteness" is used in contradistinction to areas of hyperkeratosis or leukoplakia which appear white before the application of acetic acid.
- the transformation zone is a critical area on the cervix where many precancerous and cancerous lesions most often arise. The ability to see the transformation zone and the entire extent of any visualized lesion determines whether an adequate colposcopic examination is attainable.
- Areas of the cervix which turn white after the application of acetic acid or have an abnormal vascular pattern are often considered for biopsy. If no lesions are visible, an iodine solution may be applied to the cervix to help highlight areas of abnormality.
- the colposcopist determines the areas with the highest degree of visible abnormality and may obtain biopsies from these areas using a long biopsy instrument. Most doctors and patients consider anesthesia unnecessary, however, some colposcopists now recommend and use a topical anesthetic such as lidocaine or a cervical block to diminish patient discomfort, particularly if many biopsy samples are taken.
- the invention provides for a medical imaging system, a computer program product and a method of operating a medical imaging system in the independent claims.
- Embodiments of the invention may provide a means of addressing the lack of qualified persons for interpreting a colposcope test by providing an automated system for selecting a diagnostic image.
- video data from various phases of the examination is classified and satisfactory images are selected using feature correspondence.
- the quality of the diagnostic image is evaluated to see if it is satisfactory for performing a diagnosis or not. Both embodiments may enable less well trained health care practitioners to obtain diagnostic images which are suitable for diagnosis and then may be subsequently passed on to better trained health care practitioners who are able to provide a diagnosis.
- Current colposcopy examinations are subjective and depend on the knowledge and experience of a gynaecologist for the interpretation. It is thus an object of the present invention to eliminate subjectivity from the process. It is another object of the present invention to increase usability of colposcopy by reducing the learning curve and assisting in diagnosis. It is another object of the present invention to provide better confidence to the user (gynaecologist). It is still another object of the present invention to provide a quantitative measure for the degree of cancer severity.
- a system for optical examination of the cervix comprising optical magnification means, illumination means, dispensing means for administration of at least one stimulation and/or contrasting agent, imaging means and image processing means.
- Said image processing means further comprises key frame extraction means, optionally, glare removal means, object detection means (also called “object detector”) and opacity change detection means.
- the system further comprises operator interfacing means for data input and data output.
- Said interfacing means is, for example, a display screen, a keyboard, a mouse, a touch screen, a touchpad, a joystick or the like.
- said stimulation and/or contrasting agents are selected from the group consisting of acetic acid and/or iodine solution, preferably Lugol's solution or Schiller's solution.
- the imaging means comprises either a digital imaging device, or a non-digital camera and a frame grabber.
- Said digital imaging device is preferably a digital camera.
- Said digital or non-digital camera comprises, preferably, a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) camera.
- the optical magnification means is a colposcope.
- a colposcope is a low-power, stereoscopic, binocular field microscope with a powerful light source, which is used for magnified visual examination of the uterine cervix to help in the diagnosis of cervical cancer.
- the system further comprises a computer workstation for controlling at least one of the means selected from the group consisting of an optical magnification means, illumination means, dispensing means for administration of stimulation agents, imaging means, imaging processing means, and/or operator interfacing means.
- a method of optical examination of the cervix of a patient comprising the steps of a) applying, at least once, a stimulating and/or contrasting agent to the cervix;
- Said patient is preferably a mammal, more preferably a primate, and most preferably a human. It is important to mention that the above mentioned steps do not necessarily have to be carried out in the given order.
- the stimulating and/or contrasting agent is at least one selected from the group consisting of opacity difference score detection agent and/or transformation zone detection agent.
- the opacity difference score detection agent has two different purposes: First, essential anatomical objects can be identified after stimulation and/or contrasting with said agent. Second, the opacity difference score detection agent serves to create opacity difference scores, which are indicative of neoplastic or cancerous processes, as the degree of acetowhiteness correlates with higher nuclear density in the respective tissues.
- the transformation zone detection agent has, preferably, a deep colour (iodine solution, for example, has a deep purple or brown colour, also termed "mahogany brown"), and serves as a contrasting agent to detect the transformation zone in the cervix, or the area, or shape, of the latter, respectively.
- Said opacity difference score detection agent is, preferably, acetic acid
- said transformation zone detection agent is, preferably, iodine solution, such as Lugol's solution or Schiller's solution.
- at least one image, or frame is acquired before and after application of the opacity difference score detecting agent, (for convenience, these images will be called “pre- and post-acetic acid images” in the following, although said terms will also be used in connection with other opacity difference score detection agents), and/or before and after application of the transformation zone detection agent, (for convenience, these images will be called “pre- and post-iodine solution images” in the following, although said terms will also be used in connection with other transformation zone detection agents).
- alternative stimulating and/or contrasting agents can be derived from the respective literature by routine work, as a matter of course.
- Alternative agents which can be used to detect areas of higher nuclear density are, for example
- An actual transformation zone is identified by subtracting, in the pre- and post-acetic acid image, the Os and columnar regions from the tentative transformation zone.
- In step b) of the above list, post-iodine solution images are processed to tentatively detect the transformation zone, based on the color changes that the affected cervical zone depicts on the application of the iodine solution.
- the post-iodine solution image obtained from the key frame extractor is segmented using color based K- means clustering into two clusters. The smaller of the two clusters is selected.
- the convex hull of this cluster is defined as the tentative transformation zone; (ii) in a second step, step c) of the above list, the tentative transformation zone is mapped to the pre- and post-acetic acid images, and then the Os and columnar epithelium regions detected according to step a) of the above listing are subtracted to define the actual transformation zone, see step d) of the above listing.
- the pre- and post-acetic acid images can be registered before identifying the transformation zone.
- the respective image analysis process is shown in Figs. 6 and 7.
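- The optional registration mentioned above could, for example, be done with standard feature-based alignment. The sketch below (Python with OpenCV assumed; ORB features, brute-force matching and a RANSAC homography are illustrative choices, not the patent's prescribed method) aligns a pre-acetic acid image to the post-acetic acid image before the transformation zone is identified:

```python
# Hedged sketch: align the pre-acetic acid image to the post-acetic acid image.
import cv2
import numpy as np

def register_pre_to_post(pre_bgr, post_bgr, max_features=1000):
    orb = cv2.ORB_create(nfeatures=max_features)
    kp1, des1 = orb.detectAndCompute(cv2.cvtColor(pre_bgr, cv2.COLOR_BGR2GRAY), None)
    kp2, des2 = orb.detectAndCompute(cv2.cvtColor(post_bgr, cv2.COLOR_BGR2GRAY), None)
    if des1 is None or des2 is None:
        return pre_bgr  # too little texture to register; fall back to the unregistered image
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    matches = sorted(matches, key=lambda m: m.distance)[:200]     # keep the best matches
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)          # robust to outlier matches
    h, w = post_bgr.shape[:2]
    return cv2.warpPerspective(pre_bgr, H, (w, h))                # pre image in post coordinates
```

Any registration that brings corresponding cervix pixels into alignment would serve the same purpose; the homography here is only a convenience.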
- the opacity difference score is generated by image processing of at least one pre-acetic acid image and one post-acetic acid image. For said processing, preferably only the actual transformation zone data of these images are used based on the prior identification of the actual transformation zone.
- the actual transformation zone (or, preferably, its shape and/or overall area), as determined according to the method of the present invention, and/or the opacity difference score as determined according to the method of the present invention, is indicative of a neoplastic and/or cancerous process.
- the actual transformation zone as determined after iodine solution staining, and mapping the latter to the post-acetic acid image, can also indicate a cancerous region.
- the opacity difference score as determined by image processing of at least one pre-acetic acid image and one post-acetic acid image serves to determine the degree of whiteness in the cancerous region, and thus the degree, or severity, of the cancer.
- the shape and/or the total area of the actual transformation zone can be indicative of cancer.
- the opacity difference score can thus be used to confirm, or revoke, the suspicious neoplastic or cancerous region.
- the acquisition of magnified images of the cervix, the identification of essential anatomical objects and/or the generation of opacity difference scores is performed with the help of imaging means and/or image processing means.
- a colposcope is used for this purpose.
- the acquisition of magnified images of the cervix comprises the steps of
- the identification of sequences of interrelated images and/or the identification of key frames comprises, preferably, the steps of:
- the identification of essential anatomical objects in the images comprises the steps of:
- the generation of opacity difference score comprises the steps of
- OpacityDifferenceScore = (1/N) × Σ_(i,j) ( I_post(i,j) − I_pre(i,j) ), where the sum runs over the N transformation-zone pixels (i,j) showing dominant opacity changes, and I_post and I_pre denote the pixel intensities in the post- and pre-acetic acid images.
- the method according to the invention further comprises at least one step selected from the group consisting of a PAP test and/or a molecular test for the Human Papilloma virus.
- the PAP test (named after Dr. George Papanicolaou, who developed it in the first half of the last century) is a screening test used in gynaecology to detect premalignant and malignant processes.
- a speculum is used to gather cells from the outer opening of the cervix of the uterus and the endocervix. The cells are examined histologically for abnormalities, particularly for neoplastic or cancerous processes of the cervix.
- HPV (human papillomavirus)
- HPV infection is the cause for nearly all cases of cervical cancer.
- the presence of viral nucleic acids in a sample taken from the cervix is indicative for HPV infection of the cervix, and is thus a measure for the risk of a patient to develop neoplastic or cancerous processes in the cervix.
- Modern molecular diagnostic techniques like PCR, allow for the quick detection of HPV in a sample.
- One such test is manufactured by Qiagen and is distributed as the digene HC2 HPV DNA test.
- the invention provides for a system for optical examination of the cervix, said system comprising optical magnification means, illumination means, dispensing means for administration of at least one stimulation and/or contrasting agent, imaging means, and image processing means, said image processing means further comprising
- optionally, glare removal means
- the system further comprises operator interfacing means for data input and data output.
- the optical magnification means is a colposcope.
- said system further comprises a computer workstation for controlling at least one of the means selected from the group consisting of optical magnification means, illumination means, dispensing means for administration of stimulation agents, imaging means, imaging processing means, and/or operator interfacing means.
- the invention provides for a method of optical examination of the cervix of a patient, said method comprising the steps of a) applying, at least once, a stimulating and/or contrasting agent to the cervix; and b) acquiring magnified images of the cervix before and after each application of a stimulation agent.
- the stimulating and/or contrasting agent is at least one selected from the group consisting of opacity difference score detection agent, and/or transformation zone detection agent.
- the opacity difference score detection agent is acetic acid, and/or the transformation zone detection agent is iodine solution.
- At least one image is acquired before and after application of the opacity difference score detecting agent ("pre- and post-acetic acid images") and/or before and after application of the transformation zone detection agent (“pre- and post-iodine solution images”).
- Os and columnar regions are identified in the post-iodine solution image, a tentative transformation zone is identified, said tentative transformation zone is mapped to the post-acetic acid image, and an actual transformation zone is identified by subtracting, in the pre- and post-acetic acid image, the Os and columnar regions from the tentative transformation zone.
- the opacity difference score is generated by image processing of at least one pre-acetic acid image and one post-acetic acid image.
- the actual transformation zone, as determined according to claim 9, and/or b) the opacity difference score as determined according to claim 10, is indicative of a neoplastic, or cancerous, process.
- OpacityDifferenceScore = (1/N) × Σ_(i,j) ( I_post(i,j) − I_pre(i,j) ), where the sum runs over the N transformation-zone pixels (i,j) showing dominant opacity changes, and I_post and I_pre denote the pixel intensities in the post- and pre-acetic acid images.
- said method further comprises at least one step selected from the group consisting of a PAP test and/or a molecular test for human Papilloma virus.
- the invention provides for a system according to an embodiment of the invention for carrying out a method according to an embodiment of the invention.
- the invention provides for a medical imaging system, a computer program product and a method of operating a medical imaging system in the independent claims. Embodiments are given in the dependent claims.
- the invention provides for a medical imaging system comprising a processor for controlling the medical imaging system.
- the medical imaging system further comprises a memory for storing machine-executable instructions for execution by the processor. Execution of the instructions causes the processor to receive image data.
- the image data comprises multiple images of a cervix.
- the image data may comprise multiple still images or the image data may comprise multiple images which form video data.
- Execution of the instructions further causes the processor to select a diagnostic image from the image data.
- execution of the instructions further causes the processor to forward the diagnostic image to a computer system. That is to say the medical imaging system receives image data, selects a diagnostic image and then forwards the diagnostic image to a computer system.
- the image data comprises video data.
- the video data comprises the multiple images.
- the video data comprises at least two image sequences.
- An image sequence as used herein is a collection of images which show the cervix in a particular state.
- Execution of the instructions causes the processor to identify each of the at least two image sequences.
- Execution of the instructions further causes the processor to select at least two diagnostic images from each of the at least two image sequences.
- the diagnostic image is one of the at least two diagnostic images. That is to say the diagnostic image is one of the images selected from an image sequence.
- the at least two diagnostic images from each of the at least two image sequences are any one of the following: a single image, a series of images selected at a predetermined interval, and a video clip.
- the at least two image sequences are identified by determining a number of feature correspondences of the video data using a feature identification algorithm.
- An alternative for the analysis of feature correspondences is a method based on color histograms. As used herein the determining of the number of feature correspondences is intended to be equivalent to using the method based on color histograms of the image.
- the at least two image sequences are further identified by identifying transitions between the image sequences by detecting transitions in the feature correspondences.
- the transition in the feature correspondences could have several different meanings. For instance, it could be a change in the number of feature correspondences identified from frame to frame. It could also mean using the strategy of a local minimum to determine the transition. Some embodiments may identify transitions between image sequences by analyzing the temporal variation in the number of features. One way is by detecting local minima in the number of feature correspondences measured over a temporal window of more than one frame. In another embodiment the identification of transitions between image sequences is performed by detecting local minima in the number of identifiable features, or the number of corresponding identifiable features, in adjacent frames.
- the feature identification algorithm is operable for mapping identical features between the adjacent frames using warping and translation.
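- As a rough illustration of how image sequences might be identified from feature correspondences, the sketch below (Python with OpenCV assumed; ORB matching, the window size and the mean-count threshold are illustrative assumptions, not the patent's parameters) counts matched features between consecutive frames and flags local minima over a temporal window as candidate transitions between sequences:

```python
# Hedged sketch: sequence boundaries from frame-to-frame feature correspondences.
import cv2
import numpy as np

def count_feature_correspondences(frame_a, frame_b):
    """Count matched ORB keypoints between two consecutive video frames."""
    orb = cv2.ORB_create(nfeatures=500)
    _, des_a = orb.detectAndCompute(cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY), None)
    _, des_b = orb.detectAndCompute(cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY), None)
    if des_a is None or des_b is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return len(matcher.match(des_a, des_b))

def sequence_boundaries(frames, window=5):
    """Indices where the match count is a local minimum over a temporal window,
    taken here as candidate transitions between image sequences."""
    counts = np.array([count_feature_correspondences(frames[i], frames[i + 1])
                       for i in range(len(frames) - 1)])
    boundaries = []
    for i in range(window, len(counts) - window):
        neighbourhood = counts[i - window:i + window + 1]
        if counts[i] == neighbourhood.min() and counts[i] < counts.mean():
            boundaries.append(i + 1)   # transition assumed between frame i and i+1
    return boundaries
```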
- the at least two image sequences are any one of the following: a cleaning sequence, a green filter sequence, an acetowhite sequence, an iodine sequence, a detailed regions sequence, and combinations thereof.
- the medical imaging system is a system for optical examination of the cervix.
- Said system comprises optical magnification means, illumination means, dispensing means for administration of at least one stimulation and/or contrasting agent, imaging means, and image processing means.
- the image processing means further comprises key frame extraction means, optionally, glare removal means, object detection means, and opacity change detection means.
- execution of the instructions further causes the processor to segment the diagnostic image to generate an image segmentation that identifies the old squamocolumnar junction.
- Execution of the instructions further causes the processor to determine a quality rating of the diagnostic image using the image segmentation.
- Execution of the instructions further causes the processor to reject the diagnostic image if the quality rating is below a predetermined level. Execution of the instructions further causes the processor to accept the diagnostic image if the quality rating is above the predetermined level.
- the quality rating indicates if the old squamocolumnar junction is completely within the diagnostic image. If a portion of the old squamocolumnar junction is outside of the diagnostic image then the diagnostic image is rejected. In some embodiments if the old squamocolumnar junction is too small within the diagnostic image the image may also be rejected.
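- A minimal sketch of such a quality rating, assuming the old squamocolumnar junction has already been segmented into a binary mask, is given below; the border margin and minimum-area fraction are illustrative thresholds, not values from the patent:

```python
# Hedged sketch: rate a diagnostic image by whether the segmented old
# squamocolumnar junction (SCJ) lies completely inside the frame and is not too small.
import numpy as np

def scj_quality_rating(scj_mask, margin=5, min_area_fraction=0.02):
    """scj_mask: boolean array, True where the old SCJ region was segmented."""
    h, w = scj_mask.shape
    ys, xs = np.nonzero(scj_mask)
    if len(xs) == 0:
        return 0.0   # nothing segmented -> reject
    touches_border = (xs.min() < margin or ys.min() < margin or
                      xs.max() >= w - margin or ys.max() >= h - margin)
    area_fraction = len(xs) / float(h * w)
    if touches_border or area_fraction < min_area_fraction:
        return 0.0
    return 1.0       # satisfactory

# usage: reject the image and request a replacement when, e.g., scj_quality_rating(mask) < 0.5
```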
- the image segmentation further identifies the location of an internal os, wherein the diagnostic image has a center.
- Execution of the instructions causes the processor to determine a cervix center using the segmentation of an internal os.
- the cervix center may be the internal os's center.
- Execution of the instructions further causes the processor to translate the diagnostic image to place the cervical center at the center of the image.
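- The centering step could be implemented as a simple image translation, for instance as in the following sketch (OpenCV assumed; the constant-border padding is an illustrative choice):

```python
# Hedged sketch: translate the diagnostic image so the cervix centre lands at the image centre.
import cv2
import numpy as np

def center_on_cervix(image, cervix_center):
    """image: HxWx3 array; cervix_center: (x, y) of the detected internal os / cervix centre."""
    h, w = image.shape[:2]
    dx = w / 2.0 - cervix_center[0]
    dy = h / 2.0 - cervix_center[1]
    shift = np.float32([[1, 0, dx], [0, 1, dy]])       # 2x3 affine translation matrix
    return cv2.warpAffine(image, shift, (w, h),
                          borderMode=cv2.BORDER_CONSTANT, borderValue=(0, 0, 0))
```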
- Execution of the instructions further causes the processor to generate a partitioned image which shows the diagnostic image partitioned into a predetermined number of segments. This embodiment may be beneficial because it places the medical image in a form which is more useful for a physician or healthcare provider.
- the image is partitioned into 12 pie-like segments.
- the image is partitioned into pie-like segments.
- the outer boundary of the segmentation is the cervical boundary.
- the image is further segmented to locate the columnar epithelium, transformation zone, and cervical boundary.
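- One possible sketch of the pie-like partitioning, assuming the image has already been centered on the cervix center, assigns each pixel to one of 12 angular sectors; the sector numbering convention (1-12, starting at 12 o'clock) is an assumption for illustration:

```python
# Hedged sketch: partition the centred image into 12 pie-like sectors around the cervix centre.
import numpy as np

def pie_partition(shape, center, n_sectors=12):
    """shape: (H, W); center: (x, y). Returns an HxW label map with sectors 1..n_sectors."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    angles = np.arctan2(yy - center[1], xx - center[0])     # -pi..pi, 0 at 3 o'clock
    angles = (angles + np.pi / 2) % (2 * np.pi)             # rotate so 0 is at 12 o'clock
    return (angles / (2 * np.pi) * n_sectors).astype(int) + 1

# usage: pixels of sector k can be selected with image[pie_partition(image.shape[:2], center) == k]
```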
- the partitioned image is sent to a mobile telephone device of an expert such as a physician or healthcare provider.
- the image segmentation further identifies the location of any one of the following: a columnar epithelium, a transformation zone, a cervical boundary, and combinations thereof. Execution of the instructions causes the processor to display at least a portion of the image segmentation in the partitioned image.
- embodiment may be beneficial because the location of any one of the aforementioned regions may help a physician or healthcare provider with performing a diagnosis. It may also enable unskilled or untrained persons to better interpret the partitioned image.
- the displayed image segmentation can include: the internal os, a columnar epithelium, a transformation zone, a cervical boundary, the new squamocolumnar junction, and/or the old squamocolumnar junction.
- execution of the instructions causes the processor to request a replacement image if the diagnostic image is rejected. This embodiment is advantageous because if an unskilled person is using the medical imaging system he or she can be prompted to acquire another diagnostic image.
- the medical imaging system comprises a cloud-based computing system.
- the cloud-based computing system generates the image segmentation. This embodiment may be advantageous when the image data is acquired using a portable or small handheld optical device.
- the medical imaging data can be forwarded to a cloud-based computing system for selecting the diagnostic image and performing any image processing.
- execution of the instructions causes the processor to send the diagnostic image to a mobile-based reporting application.
- a physician or healthcare provider may have a patient management or reporting application on a smart cell phone, computer or tablet.
- the medical imaging system further comprises a camera for acquiring the image data.
- This camera may be built into the medical imaging system or it may be an external camera which then transfers the image data to the medical imaging system.
- the invention provides for a computer program for execution by a processor for controlling a medical imaging system. Execution of the instructions causes the processor to receive image data. The image data comprises multiple images of a cervix. Execution of the instructions further causes the processor to select a diagnostic image from the image data.
- the invention provides a method of operating the medical imaging system. The method comprises receiving image data. The image data comprises multiple images of a cervix. The method further comprises selecting a diagnostic image from the image data.
- optical magnification means relates to a device, or algorithm, which is capable of magnifying an optical image, for example a magnification lens, a microscope, or a digital image processing system in which the magnification is carried out digitally, or electronically.
- the term "colposcope” refers to a low-power, stereoscopic, binocular field microscope with a powerful light source, which is used for magnified visual examination of the uterine cervix to help in the diagnosis of cervical cancer.
- Os relates to the external orifice of the cervix, which is an interior narrowing of the uterine cavity. It corresponds to a slight constriction known as the isthmus that can be observed on the surface of the uterus about midway between the apex and base.
- the term "columnar region” relates to a region of the epithelium of the cervix.
- the ectocervix is composed of non-keratinized stratified squamous epithelium.
- the endocervix (more proximal, within the uterus) is composed of simple columnar epithelium, i.e., the "columnar region".
- the term "transformation zone” relates to the area adjacent to the border of the endocervix and ectocervix. The transformation zone undergoes metaplasia numerous times during normal life. This metaplastic potential, however, increases the risk of cancer in this area - the transformation zone is the most common area for cervical cancer to occur.
- transformation zone relates to the transformation zone as provisionally detected after iodine solution staining.
- actual transformation zone relates to the transformation zone as determined after mapping the tentative transformation zone to the post-acetic acid image.
- the actual transformation zone is often also called “cancerous region” due to high risk of neoplastic, or cancerous, processes.
- dispenser means relates to a device which is useful for applying, in a controlled manner with respect to time, volume and position, at least one stimulation and/or contrasting agent to a given object.
- dispensing means is a syringe, a pipette or the like.
- frame grabber means a device which has the capability to convert the output of an analogue video frame imaging device or analogue scan converter into a digital image for further image processing.
- key frame extraction means relates to a device or an algorithm that can automatically identify at least one pre-acetic acid image, post-acetic acid image and post- Iodine solution image.
- glare removal means relates to a device, or an algorithm, which is capable of removing glare in a digital image, e.g., as described by Lange et al.
- image processing means (or image processor) relates to a digital image processing device, or software, which is capable of inputting, computing, and outputting digital image data.
- opacity change detection means (or opacity change detector) relates to a digital image processing device, or software, which is capable of detecting opacity changes in at least two corresponding images, e.g., as described in the present specification.
- object detection means (or object detector) relates to a digital image processing device, or software, which is capable of detecting and/or identifying objects in a digital image.
- a 'computer-readable storage medium' as used herein encompasses any tangible storage medium which may store instructions which are executable by a processor of a computing device.
- the computer-readable storage medium may be referred to as a computer-readable non-transitory storage medium.
- the computer-readable storage medium may also be referred to as a tangible computer readable medium.
- a computer-readable storage medium may also be able to store data which is able to be accessed by the processor of the computing device.
- Examples of computer-readable storage media include, but are not limited to: a floppy disk, punched tape, punch cards, a magnetic hard disk drive, a solid state hard disk, flash memory, a USB thumb drive, Random Access Memory (RAM), Read Only Memory (ROM), an optical disk, a magneto-optical disk, and the register file of the processor.
- Examples of optical disks include Compact Disks (CD) and Digital Versatile Disks (DVD), for example CD-ROM, CD-RW, CD-R, DVD-ROM, DVD- RW, or DVD-R disks.
- the term computer readable-storage medium also refers to various types of recording media capable of being accessed by the computer device via a network or communication link.
- data may be retrieved over a modem, over the internet, or over a local area network.
- References to a computer-readable storage medium should be interpreted as possibly being multiple computer-readable storage mediums.
- Various executable components of a program or programs may be stored in different locations.
- the computer-readable storage medium may for instance be multiple computer-readable storage medium within the same computer system.
- the computer-readable storage medium may also be computer-readable storage medium distributed amongst multiple computer systems or computing devices.
- Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to: RAM memory, registers, and register files. References to 'computer memory' or 'memory' should be interpreted as possibly being multiple memories.
- the memory may, for instance, be multiple memories within the same computer system.
- the memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
- Computer storage is any non-volatile computer-readable storage medium. Examples of computer storage include, but are not limited to: a hard disk drive, a USB thumb drive, a floppy drive, a smart card, a DVD, a CD-ROM, and a solid state hard drive. In some embodiments computer storage may also be computer memory or vice versa. References to 'computer storage' or 'storage' should be interpreted as possibly being multiple storage devices. The storage may for instance be multiple storage devices within the same computer system or computing device. The storage may also be multiple storages distributed amongst multiple computer systems or computing devices.
- a 'processor' as used herein encompasses an electronic component which is able to execute a program or machine executable instruction.
- References to the computing device comprising "a processor” should be interpreted as possibly containing more than one processor or processing core.
- the processor may for instance be a multi-core processor.
- a processor may also refer to a collection of processors within a single computer system or distributed amongst multiple computer systems.
- the term computing device should also be interpreted to possibly refer to a collection or network of computing devices each comprising a processor or processors. Many programs have their instructions performed by multiple processors that may be within the same computing device or which may even be distributed across multiple computing devices.
- a 'user interface' as used herein is an interface which allows a user or operator to interact with a computer or computer system.
- a 'user interface' may also be referred to as a 'human interface device'.
- a user interface may provide information or data to the operator and/or receive information or data from the operator.
- a user interface may enable input from an operator to be received by the computer and may provide output to the user from the computer.
- the user interface may allow an operator to control or manipulate a computer and the interface may allow the computer to indicate the effects of the operator's control or manipulation.
- the display of data or information on a display or a graphical user interface is an example of providing information to an operator.
- the receiving of data through a keyboard, mouse, trackball, touchpad, pointing stick, graphics tablet, joystick, gamepad, webcam, headset, gear sticks, steering wheel, pedals, wired glove, dance pad, remote control, one or more switches, one or more buttons, and accelerometer are all examples of user interface components which enable the receiving of information or data from an operator.
- a 'hardware interface' as used herein encompasses an interface which enables the processor of a computer system to interact with and/or control an external computing device and/or apparatus.
- a hardware interface may allow a processor to send control signals or instructions to an external computing device and/or apparatus.
- a hardware interface may also enable a processor to exchange data with an external computing device and/or apparatus. Examples of a hardware interface include, but are not limited to: a universal serial bus, IEEE 1394 port, parallel port, IEEE 1284 port, serial port, RS-232 port, IEEE-488 port, Bluetooth connection, Wireless local area network connection, TCP/IP connection, Ethernet connection, control voltage interface, MIDI interface, analog input interface, and digital input interface.
- a 'display' or 'display device' as used herein encompasses an output device or a user interface adapted for displaying images or data.
- a display may output visual, audio, and/or tactile data. Examples of a display include, but are not limited to: a computer monitor, a television screen, a touch screen, tactile electronic display, Braille screen, Cathode ray tube (CRT), Storage tube, Bistable display, Electronic paper, Vector display, Flat panel display, Vacuum fluorescent display (VF), Light-emitting diode (LED) displays, Electroluminescent display (ELD), Plasma display panels (PDP), Liquid crystal display (LCD), Organic light-emitting diode displays (OLED), a projector, and Head-mounted display.
- VF Vacuum fluorescent display
- LED Light-emitting diode
- ELD Electroluminescent display
- PDP Plasma display panels
- LCD Liquid crystal display
- OLED Organic light- emitting diode displays
- Fig. 1 shows a Philips Goldway Colposcope
- Fig.2 shows a flow diagram of the method according to the invention
- Fig.3 shows a flow diagram for the method to automate colposcope images
- Fig.4a shows a colposcopic image showing the process when acetic acid is applied to the cervix
- Fig.4b shows a raw colposcopic image
- Fig.4c shows the same image with glare pixels removed
- Fig.4d shows the objects in a colposcopic image identified with methods according to the invention, i.e., Cervix (1), Transformation zone (2), Columnar epithelium (3) and Os (4);
- Fig.5a shows a colposcopic image taken after acetic acid has been applied
- Fig.5b shows the Os and Columnar epithelium regions identified together;
- Fig.5c shows the Os and columnar regions separated;
- Fig.5d shows the Os and columnar epithelium demarked after removing the small and disjointed clusters
- Fig.6a shows a tentative transformation zone identified in post-iodine solution image (black line);
- Fig.6b shows the tentative transformation zone mapped to the post-acetic acid images
- Fig.6c shows the actual transformation zone
- Fig.7a shows the transformation zone in post-acetic acid image
- Fig.7b shows the clustering transformation zone to dominant and minor opacity changes
- Fig.7c shows pixels in post-acetic transformation zone with dominant opacity changes
- Fig.7d shows corresponding pixels in pre-acetic acid image
- Fig.8a shows a pre-acetic acid image
- Fig.8b shows a post-acetic acid image after 1 min
- Fig.8d shows a pre-acetic acid image
- Fig.8e shows a post-acetic acid image after 3 min
- Fig.8f shows an opacity image with opacity difference score of 43.28;
- Fig. 9 shows a flow diagram which illustrates a method according to an embodiment of the invention.
- Fig. 10 shows a flow diagram which illustrates a method according to a further embodiment of the invention.
- Fig. 11 shows a flow diagram which illustrates a method according to a further embodiment of the invention.
- Fig. 12 illustrates a medical imaging system according to an embodiment of the invention
- Fig. 13 shows a medical imaging system according to an alternative embodiment of the invention
- Fig. 14 shows an example of an unsatisfactory cervicography image
- Fig. 15 shows an example of a satisfactory cervicography image
- Fig. 16 illustrates the cervical structure
- Fig. 17 shows two separate workflows
- Fig. 18 illustrates a workflow according to an embodiment of the invention
- Fig. 19 shows an example of a medical imaging system according to a further embodiment of the invention
- Fig. 20 shows a segmented cervicography image 2000
- Fig. 21 shows the same image as shown in Fig. 20 divided into 12 pie-like sections
- Fig. 22 illustrates a user-interface which can be implemented in a computer or other computing device such as a smart or mobile phone or a tablet computer;
- Fig. 23 shows an acetic acid image where a cervix region is identified
- Fig. 24 shows a Lugol's iodine image wherein a cervix region is identified
- Fig. 25 illustrates the steps in a colposcopic examination
- Fig. 26 illustrates an example of feature-based tracking
- Fig. 27 shows a plot of the frame number versus the number of tracked features in a sequence of video images.
- a colposcope according to the invention (see Fig. 1) has a key frame extractor, pre-processor, object detector and opacity change detector. The key frame extractor takes as input the video that comes from the colposcope and which is displayed to the user.
- the images (pre-acetic acid, post-acetic acid and post-iodine solution) containing the cervical region at the same optical magnification are extracted (in the present application, the words frames and images are used interchangeably).
- glare caused by specular reflection in the images is removed, e.g., by a method described in Lange et al., according to which a feature image that provides a good glare-to-background ratio is extracted from the color image.
- the glare containing regions in the feature image are detected and extended to cover all pixels that have been affected by the glare.
- the glare in the affected regions is then removed by filling in an estimate of the underlying image features.
- the de-glared image is then sent to the object detector.
- the object detector processes the images to identify the Os, columnar epithelium and the transformation zone, based on segmentation techniques that use the color and intensity information.
- the key frame extractor takes as input the 2D video that comes out of the colposcope. It divides the videos into shots by identifying the shot boundaries and then extracting the relevant images from each shot.
- the types of images extracted are the pre-acetic acid images, post-acetic acid images and post-iodine solution images having the same magnification factor.
- a shot/view is defined as a sequence of interrelated images captured from the colposcope, which represents a continuous action in time and space.
- a "view boundary” is defined herein as the point of switching over from one view to another while performing the colposcopy.
- Classical shot boundary detection algorithms proposed in context of general purpose video applications can be used to detect shot boundaries. Color features have been used for shot detection.
- the key frames are from the following shots: pre-acetic acid, post-acetic acid and post-iodine solution.
- Each frame or image in a shot is described with a set of features based on color, texture, area of the cervix, etc.
- a score a, which is a function of the set of features used, is assigned to each frame. The frame with the maximum a is selected as the key frame in a shot.
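- One way such a frame score could look is sketched below (Python with OpenCV assumed); the particular features (sharpness, visible cervix area, glare fraction) and their weights are illustrative stand-ins for the color, texture and area features mentioned above, not the patent's exact definition of the score:

```python
# Hedged sketch: score every frame in a shot and keep the highest-scoring one as the key frame.
import cv2
import numpy as np

def frame_score(frame, cervix_mask=None, weights=(1.0, 100.0, -100.0)):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()                 # texture / focus proxy
    area = cervix_mask.mean() if cervix_mask is not None else 1.0     # visible cervix fraction
    glare = (gray > 240).mean()                                       # saturated (glare) fraction
    w_sharp, w_area, w_glare = weights
    return w_sharp * sharpness + w_area * area + w_glare * glare

def key_frame(shot_frames, masks=None):
    masks = masks if masks is not None else [None] * len(shot_frames)
    scores = [frame_score(f, m) for f, m in zip(shot_frames, masks)]
    return shot_frames[int(np.argmax(scores))]
```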
- the preprocessor takes as input the key images selected by the key frame extractor. Quality of cervical images is attributed to many factors including glare/glint/specular reflection. The glare eliminates the color information in affected pixels and thus makes image analysis algorithms unreliable. It is therefore essential to remove the glare from colposcope images which is done in the preprocessor.
- the glare regions are detected as small, saturated and high contrast regions.
- An image, T is opened in RGB color space.
- the G component of the RGB color space is used as feature space as it provides a good glare to background ratio. Local maxima for a histogram generated by the G component are identified that represent saturated values.
- a mask for the glare pixels is generated and applied on the image T to remove the glare pixels from it (see Fig. 4c).
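- The glare-removal step might be sketched as follows (OpenCV assumed); the saturation threshold, the dilation used to cover all affected pixels, and the use of cv2.inpaint to fill in an estimate of the underlying tissue are illustrative choices rather than the exact method of Lange et al.:

```python
# Hedged sketch: detect saturated glare regions in the G channel, mask and fill them in.
import cv2
import numpy as np

def remove_glare(image_bgr, saturation_threshold=240, dilate_px=3):
    g = image_bgr[:, :, 1]                                    # G channel: good glare-to-background ratio
    glare_mask = (g >= saturation_threshold).astype(np.uint8) * 255
    kernel = np.ones((2 * dilate_px + 1, 2 * dilate_px + 1), np.uint8)
    glare_mask = cv2.dilate(glare_mask, kernel)               # extend mask to all affected pixels
    # fill the glare pixels with an estimate of the underlying image content
    return cv2.inpaint(image_bgr, glare_mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
```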
- the object detector takes as input the key frame images from which the glare pixels have been filtered out.
- the object detector aims at identifying relevant objects of the cervix (1), like the Os (4), columnar epithelium (3) and the transformation zone (2), see Fig. 4d.
- The smallest cluster is labeled as the Os + columnar epithelium region.
- Detecting the transformation zone in the post-acetic acid images is a two-step approach.
- post-iodine solution images are processed to tentatively detect the transformation zone based on the color changes that the affected cervical zone depicts on the application of iodine solution.
- the post-iodine solution image obtained from the key frame extractor is segmented using color based K-means clustering into two clusters. The smaller of the two clusters is selected. The convex hull of this cluster is defined as the tentative transformation zone.
- This is followed by a second step where the tentative transformation zone is mapped to the pre- and post-acetic acid images and then the detected Os and columnar epithelium regions are subtracted to define the actual transformation zone.
- the pre- and post-acetic acid images can be registered before identifying the transformation zone.
- the respective image analysis process is shown in Fig. 6.
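- A compact sketch of this two-step detection is given below (OpenCV and numpy assumed). Treating the post-iodine and post-acetic acid images as already registered is a simplification, and the clustering parameters are illustrative:

```python
# Hedged sketch: (1) colour K-means on the post-iodine image, smaller cluster's convex
# hull = tentative transformation zone; (2) subtract the Os + columnar epithelium mask.
import cv2
import numpy as np

def tentative_transformation_zone(post_iodine_bgr):
    h, w = post_iodine_bgr.shape[:2]
    samples = post_iodine_bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, _ = cv2.kmeans(samples, 2, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
    labels = labels.reshape(h, w)
    smaller = 0 if (labels == 0).sum() < (labels == 1).sum() else 1
    mask = (labels == smaller).astype(np.uint8) * 255
    hull = cv2.convexHull(cv2.findNonZero(mask))              # convex hull = tentative TZ
    tz = np.zeros((h, w), np.uint8)
    cv2.fillConvexPoly(tz, hull, 255)
    return tz

def actual_transformation_zone(tentative_tz, os_columnar_mask):
    # subtract the detected Os + columnar epithelium region from the tentative zone
    return cv2.bitwise_and(tentative_tz, cv2.bitwise_not(os_columnar_mask))
```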
- This module takes as input the transformation zones identified in the pre- and post-acetic acid images and generates an opacity difference score for all post-acetic acid images with respect to the pre-acetic acid images.
- pixels in the transformation zone which show dominant opacity changes are identified and they are compared with their corresponding pixels in pre-acetic acid image.
- the steps to generate the opacity difference score are as follows:
- each post-acetic acid image is given an opacity difference score with respect to the pre-acetic acid image using the following formula: OpacityDifferenceScore = (1/N) × Σ_(i,j) ( I_post(i,j) − I_pre(i,j) ), where the sum runs over the transformation-zone pixels (i,j) showing dominant opacity changes, and I_post and I_pre denote the pixel intensities in the post- and pre-acetic acid images
- N: Number of pixels with dominant opacity changes
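- Following the reconstruction of the formula above, the score could be computed roughly as in the sketch below (OpenCV assumed); splitting the transformation-zone pixels into dominant and minor opacity changes with a one-dimensional K-means mirrors Fig. 7 but is an assumption, not the patent's exact procedure:

```python
# Hedged sketch: mean intensity increase over transformation-zone pixels with dominant opacity change.
import cv2
import numpy as np

def opacity_difference_score(pre_bgr, post_bgr, tz_mask):
    pre = cv2.cvtColor(pre_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    post = cv2.cvtColor(post_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diff = (post - pre)[tz_mask > 0]                           # intensity change inside the TZ
    if diff.size < 2:
        return 0.0
    # split TZ pixels into dominant / minor opacity change via 1-D K-means (2 clusters)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(diff.reshape(-1, 1), 2, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
    dominant = int(np.argmax(centers))                         # cluster with the larger mean change
    dominant_diffs = diff[labels.ravel() == dominant]
    return float(dominant_diffs.mean())                        # (1/N) * sum over the N dominant pixels
```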
- Fig. 9 shows a flow diagram which illustrates a method according to an embodiment of the invention.
- image data is received and the image data comprises multiple images of a cervix.
- a diagnostic image is selected from the image data.
- Fig. 10 shows a flow diagram which illustrates a method according to a further embodiment of the invention.
- image data comprising video data is received.
- the image data comprises multiple images of a cervix.
- image sequences within the video data are identified.
- at least one diagnostic image is selected from each image sequence.
- Fig. 11 shows a flow diagram which illustrates a method according to a further embodiment of the invention.
- image data is received.
- the image data contains multiple images of a cervix.
- a diagnostic image is selected from the image data.
- the diagnostic image is segmented to generate an image segmentation that identifies the old squamocolumnar junction.
- a quality rating is determined of the diagnostic image using the segmentation.
- Step 1108 is a decision box: is the quality rating of the diagnostic image sufficient? If yes, the method ends at step 1110. If not, the method proceeds to step 1112, in which a new diagnostic image is selected from the image data.
- Fig. 12 illustrates a medical imaging system 1200 according to an embodiment of the invention.
- the medical imaging system 1200 comprises a computer 1202.
- the computer 1202 has a processor 1204 connected to a user- interface 1206, computer storage 1208, and computer memory 1210.
- the computer storage 1208 is shown as containing image data 1212 that has been received. It may for example be received from a camera or via a network connection.
- the computer storage 1208 further comprises a diagnostic image 1214 selected from the image data 1212.
- the computer memory 1210 comprises an image selection module 1216.
- the image selection module 1216 comprises computer executable code which enables the processor 1204 to select the diagnostic image 1214 from the image data 1212.
- the computer 1202 could be replaced by a variety of equivalents.
- the examples include, but are not limited to: a tablet computer, a laptop computer, and a mobile telephone device.
- the processor 1204 may be represented by various processors distributed amongst various computing devices.
- Fig. 13 shows a medical imaging system 1300 according to an alternative embodiment of the invention.
- the medical imaging system 1300 comprises a computer system 1202 equivalent to the computer system 1202 in Fig. 12.
- the processor 1204 is connected to a hardware interface 1302.
- the hardware interface enables the processor 1204 to communicate with a camera 1304.
- the computer storage 1208 is shown as containing feature correspondences 1310 determined from the image data 1212.
- the image data 1212 comprises video data.
- the computer storage 1208 further contains image sequences 1312 identified from the video data.
- the computer storage 1208 further contains an image segmentation 1314 taken from the one or more diagnostic images 1214.
- the computer storage 1208 further contains a quality rating 1316 of each of the diagnostic images 1214.
- the computer storage 1208 further contains a cervix center location 1318 of the diagnostic image 1214.
- the computer storage 1208 further contains a translated diagnostic image 1320 such that the cervix center location 1318 is now in the center of the image.
- the computer storage 1208 further contains a partitioned image 1322 which is suitable to be displayed to a physician.
- the partitioned image 1322 may have the cervix center location in the center of the image and is divided into a number of pie-like sections.
- the computer memory 1210 is further shown as containing a feature correspondence identification module 1330.
- the feature correspondence identification module 1330 contains computer-executable code which enables the processor 1204 to calculate the feature correspondences 1310 from the image data 1212.
- the computer memory 1210 further contains an image sequence identification module 1332 which enables the processor 1204 to determine a set of image sequences 1312 from the image data 1212. Once the image sequences 1312 have been selected the image selection module 1216 selects at least one diagnostic image 1214 for each of the identified image sequences 1312.
- the computer memory 1210 further contains an image segmentation module 1334.
- the image segmentation module 1334 contains computer-executable code which enables the processor 1204 to segment the one or more diagnostic images 1214. This generates the image segmentation 1314.
- the computer memory 1210 is further shown as containing a quality rating module 1336.
- the quality rating module 1336 contains computer-executable code which generates a quality rating 1316 for each of the diagnostic images 1214 using the image segmentation 1314.
- the computer memory 1210 is further shown as containing a cervix center location module 1338.
- the cervix center location module 1338 contains computer- executable code which enables the processor 1204 to use the image segmentation 1314 to identify the cervix center location 1318.
- the computer memory 1210 further contains an image processing module 1340.
- the image processing module 1340 contains computer-executable code which enables the processor 1204 to modify the diagnostic image 1214 into the translated diagnostic image 1320 and then finally into the partitioned image 1322.
- Cervicography is a method for conducting the VIAM procedure in which a non-physician takes pictures of the cervix and submits them to a physician for interpretation.
- VIAM (visual inspection with acetic acid under magnification)
- One of the main concerns with cervicography is that unsatisfactory cervicographic images should be avoided.
- An unsatisfactory procedure is one in which the upper margin of the transformation zone is not visible and yet the images are transferred to the expert for interpretation, thus increasing the number of procedures with recall/revisit of patients. This also results in missed diagnoses.
- Cancer arising from the cervix is the commonest cancer in women in India. About 30% of cancers in women in India are due to cervical cancer, with more than 100,000 new cases diagnosed every year. Unfortunately, in India there are only 20 qualified
- Cervicography has been practiced for some time to address the issue of a very low specialist-to-patient ratio. It is a method for conducting the VIAM procedure in which a non-physician takes pictures of the cervix and submits them to a physician for interpretation. Cervicography has not been popular due to high false positives and negatives. This largely arises from unsatisfactory or technically uninterpretable cervicographic images being submitted for interpretation by experts. Unless the expert can examine the entire transformation zone in its full length, the cervicographic examination is termed inadequate or unsatisfactory (Fig. 14 shows an image of an unsatisfactory colposcopy and Fig. 15 shows an image of a satisfactory colposcopy).
- Fig. 14 shows an example of a cervicography image.
- the cervicography image 1400 shown in Fig. 14 is unsatisfactory and would not be useful for performing a diagnosis.
- Fig. 15 also shows a cervicography image 1500.
- the cervicography image 1500 shown in Fig. 15 is satisfactory and may be used by a physician for performing a diagnosis.
- Fig. 16 shows an image which illustrates the cervical structure 1600.
- the external os is labeled 1602.
- the columnar epithelium is labeled 1604.
- the transformation zone is labeled 1606.
- the original or old squamocolumnar junction is labeled 1608.
- the new squamocolumnar junction is labeled 1610.
- the cervix is the lowermost part of the uterus which extends into the vaginal canal.
- The ectocervix is the portion of the cervix visible through the vaginal canal; it is lined by squamous epithelium.
- Endocervix is the portion extending from the external Os which is the external orifice of the cervix. This is lined by columnar epithelium.
- the junction of these two types of epithelium is called the transformation zone.
- the transformation zone lies between the new squamo columnar junction (New SCJ) and the original squamocolumnar junction (Original SCJ).
- the transformation zone is the most common area for cervical cancer to occur.
- HRI estimates the annual consumer market for remote/mobile monitoring devices to be $7.7 billion to $43 billion. Remote monitoring also could reduce hospital spending, a goal of both government and private payers.
- Fig. 17 shows two separate workflows 1700, 1702 illustrated as flow diagrams.
- Workflow 1700 illustrates the current workflow.
- Workflow 1702 illustrates the application of an embodiment according to the invention. In both workflows 1700, 1702 a woman is screened for cervical cancer.
- In workflow 1700 the woman attends a primary care physician 1704. Next the woman undergoes examination in the form of visual inspection or
- In step 1708 the healthcare worker or general practitioner interprets the results of the examination performed in step 1706.
- In step 1710 it is mentioned that false positives are highly likely when step 1708 is performed.
- In step 1712 the woman is referred to a higher center for further cancer screening.
- Step 1714 notes that unnecessary referrals add to the burden on the healthcare system and the cost to the patient.
- Workflow 1702 is discussed next. This workflow starts with step 1720, in which the woman attends the primary care physician. Next the woman undergoes examination using VIAM. Next the images are captured under magnification in step 1724. In step 1726 an expert interprets the images captured under magnification. Embodiments of the invention may be used to send images to the expert remotely. Next, in step 1728, a referral is performed only if the expert finds the image to be abnormal. In step 1730 the benefit is that this reduces referrals and the unnecessary anxiety caused by false positives.
- Fig. 18 illustrates a workflow according to an embodiment of the invention.
- any trained medical professional can capture an image of the cervix at a 5-7x zoom with a digital camera or mobile phone with a macro lens before and after dispensing acetic acid and Lugol's iodine.
- these images are transferred to a hospital server or cloud.
- On the server or cloud, the images are segmented to obtain the os, columnar epithelium and cervix region. It is then checked whether the cervicography is satisfactory or not. If the cervicography is satisfactory then the cervix region is divided into 12 zones and sent to the expert in step 1803.
- the gynecologist or expert receives a full image of the cervix and the demarked zones one-by-one or the gynecologist could also select the zones which he or she is interested in. These for example could be sent to his or her mobile phone. This is illustrated in step 1804.
- the expert then annotates and marks the lesions in his or her mobile phone. This is illustrated as step 1805.
- In step 1806 a report is generated containing the annotated images with the diagnosis. The report is then sent back to the hospital server or cloud and then back to the medical professionals as illustrated in step 1806.
- Embodiments of the invention may have features that make up a system that captures images of the cervix after applying acetic acid and Lugol's iodine and transfers them to a cloud-based system that manages healthcare records.
- the cloud has an object detection module that segments the cervical images to identify the Os, columnar epithelium, transformation zone and cervical boundary. These segmentations are input to the adequacy determining module that classifies whether or not the image is adequate for interpretation by an expert. If the images are satisfactory then the image is partitioned into 12 regions, with the center of the Os as the center, in the partitioning module. The main image and some (an expert could also select a region interactively) or all of the 12 regions are sent to the mobile phone of the expert.
- Fig. 19 shows an example of a medical imaging system 1900 according to an embodiment of the invention.
- the medical imaging system 1900 is illustrated from a functional point of view.
- Within the cloud computer system 1904 are various software modules for processing the image.
- there is an adequacy determination module 1908 which uses a segmentation generated by an object detection module 1906 to determine whether the image is adequate or not.
- the image is processed by a partitioning module 1910 which partitions and/or centers the image.
- the cloud computing system 1904 sends the diagnostic image to a mobile and/or smart phone 1912.
- the image capture device for the cervix could be a digital camera with a macro lens or a mobile phone with a camera. Images are captured from the cervix after application of acetic acid and Lugol's Iodine.
- the mobile phone provides an application (browser based) that is used to connect securely to the hospital server/cloud. The images from the phone are transferred to the server with a unique identification of the patient concerned. These images are then analyzed on the server/cloud by the modules discussed next.
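- As a minimal sketch of this transfer step, and only as an illustration, the upload could look as follows. The endpoint path `/api/cervix/upload`, the `patient_id` field and the bearer-token authentication are assumptions and are not specified in this disclosure.

```python
import requests  # third-party HTTP library used for the illustrative upload


def upload_images(server_url, patient_id, image_paths, token):
    """Upload captured cervix images to the hospital server/cloud together
    with a unique identification of the patient. All endpoint and field
    names are illustrative assumptions, not part of the disclosure."""
    files = [("images", (path.split("/")[-1], open(path, "rb"), "image/jpeg"))
             for path in image_paths]
    response = requests.post(
        f"{server_url}/api/cervix/upload",            # assumed endpoint
        data={"patient_id": patient_id},               # assumed field name
        files=files,
        headers={"Authorization": f"Bearer {token}"},  # assumed auth scheme
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```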
- the object detector module takes as input the images (pre-acetic acid, post-acetic acid and post-Lugol's iodine) sent to the cloud for a patient, and detects objects like the Os, columnar epithelium, the cancerous tissues or transformation zone, and the cervix region.
- Fig. 20 shows a cervicography image 2000. Also shown is an image segmentation represented by various lines.
- the old squamocolumnar junction is labeled 2002.
- the new squamocolumnar junction is labeled 2004.
- the columnar epithelium is labeled 2006.
- the os boundary is labeled 2008.
- the os region is labeled 2010.
- the transformation zone is labeled 2012.
- the cervix is labeled 2014 and the cervix boundary is labeled 2016.
- An adequacy determination module may be used to evaluate the diagnostic image. As discussed earlier, unless the squamocolumnar junction is visible along its full length, a cervicographic image is not considered satisfactory. This module uses the concept of contour continuity in the neighboring pixels of the boundary of the transformation zone to decide whether the cervicographic image is satisfactory or not.
- the pseudocode of the algorithm is as follows:
- Let T be the transformation zone in an image I with dimensions A along the x axis and B along the y axis.
- Let TB be the boundary of the transformation zone, containing pixels (x1, y1), ..., (xn, yn), where x is the X coordinate and y is the Y coordinate of TB in the XY plane.
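- The body of the pseudocode is not reproduced above, so the following is only a minimal sketch, under assumptions, of how such a contour-continuity check could be implemented: the image is treated as satisfactory only when the boundary TB is a continuous contour that stays away from the image border (i.e. the squamocolumnar junction is visible to its full length). The margin and gap parameters are assumptions.

```python
import numpy as np


def is_adequate(tz_mask, border_margin=2, max_gap=2):
    """Sketch of the adequacy check on a binary transformation-zone mask T
    of shape (B, A): extract the boundary TB and require that (1) TB does
    not touch the image border and (2) TB is continuous, i.e. every boundary
    pixel has another boundary pixel within a small neighbourhood."""
    B, A = tz_mask.shape
    ys, xs = np.nonzero(tz_mask)
    if ys.size == 0:
        return False  # no transformation zone was segmented at all

    # Boundary pixels of T: inside T but adjacent to at least one background
    # pixel (or lying on the image border itself).
    boundary = []
    for y, x in zip(ys, xs):
        nb = tz_mask[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
        if nb.size < 9 or np.any(nb == 0):
            boundary.append((x, y))
    pts = np.array(boundary)
    if len(pts) < 2:
        return False

    # (1) TB must lie fully inside the field of view.
    if (pts[:, 0].min() < border_margin or pts[:, 1].min() < border_margin or
            pts[:, 0].max() >= A - border_margin or
            pts[:, 1].max() >= B - border_margin):
        return False

    # (2) Contour continuity: the nearest other boundary pixel must be within
    # max_gap (Chebyshev distance) for every boundary pixel.
    for p in pts:
        dist = np.abs(pts - p).max(axis=1)
        if np.sort(dist)[1] > max_gap:
            return False
    return True
```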
- the object detection module or adequacy determination module could be present in the cloud/server or locally in the mobile phone as an application to give immediate feedback to the medical professional.
- the adequacy determination module also displays some movie files that could train or guide a medical professional on how to obtain a satisfactory cervical image when the current cervicographic image is unsatisfactory.
- the image may be partitioned into 12 zones with the center of the cervix as the center, as shown in Fig. 21 below:
- Fig. 21 shows the same image 2000 as shown in Fig. 20. However, not all of the segmentations are shown on this image. Only the cervix boundary 2016 and the os boundary 2008 are shown. The region within the cervix boundary 2016 has been divided into 12 pie-like sections 2100. The image has also been manipulated such that a cervix center 2102 has been determined and moved to a central region of the image.
- the cervical image as well as each partitioned zone is sent to the expert for his/her opinion. Partitioning the image makes viewing it on a mobile phone with a small screen easier and helps the expert come to an accurate diagnosis.
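- A minimal sketch of such a partitioning module follows. It assumes the cervix mask and a centre point (e.g. the centre of the Os or of the cervix) are already available from the segmentation step; all names and the cropping helper are illustrative.

```python
import numpy as np


def partition_into_zones(cervix_mask, center, n_zones=12):
    """Divide the cervix region into n_zones pie-like angular sectors around
    the given centre (cy, cx). Returns a label image with values 1..n_zones
    inside the cervix region and 0 elsewhere."""
    h, w = cervix_mask.shape
    cy, cx = center
    ys, xs = np.mgrid[0:h, 0:w]
    angles = np.arctan2(ys - cy, xs - cx) % (2 * np.pi)  # angle of each pixel
    sectors = (angles / (2 * np.pi) * n_zones).astype(int) + 1
    return np.where(cervix_mask > 0, sectors, 0)


def crop_zone(image, zone_labels, zone_id):
    """Cut one zone out of the full image so it can be viewed comfortably on
    a small phone screen; pixels outside the zone are blanked."""
    mask = zone_labels == zone_id
    ys, xs = np.nonzero(mask)
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    zone = image[y0:y1, x0:x1].copy()
    zone[~mask[y0:y1, x0:x1]] = 0
    return zone
```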
- mCervix module: in mobile phones there may be an application that allows an expert to view images and the partitioned zones on a mobile/smart phone. There may also be an annotation tool. This tool would allow the doctor to view an image, mark the lesions, zoom into a specific part of the image and annotate the image.
- Report generation tool: this tool would generate a report with the essential information.
- One such template is as illustrated in Fig. 22.
- Fig. 22 illustrates a user-interface 2200 which can be implemented in a computer or other computing device such as a smart or mobile phone or a tablet computer.
- the user-interface shown in 2200 is an annotation tool which may be filled in by a physician and then used to generate a diagnosis that is sent back to his or her colleagues who originally took the image and sent it to the expert.
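- A minimal sketch of such a report generation tool is shown below; the field names are assumptions and do not reproduce the template of Fig. 22.

```python
import json
from datetime import datetime, timezone


def generate_report(patient_id, expert_name, diagnosis, zone_annotations):
    """Bundle the expert's zone-wise annotations and overall diagnosis into a
    structured report that can be sent back to the hospital server/cloud and
    on to the referring medical professional. Field names are illustrative."""
    report = {
        "patient_id": patient_id,
        "created_utc": datetime.now(timezone.utc).isoformat(),
        "expert": expert_name,
        "diagnosis": diagnosis,
        "zones": [{"zone": zone, "findings": findings}
                  for zone, findings in sorted(zone_annotations.items())],
    }
    return json.dumps(report, indent=2)
```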
- a mobile phone based system for cervicography comprising: a mobile phone with a camera, or a mobile phone and a digital camera; a server or cloud platform for processing the images; an object detection module and an adequacy determination module; and an mCervix module to annotate and generate a report.
- object detection module is configured for:
- adequacy determination module is configured for:
- the server/cloud comprises: an object detection module and an adequacy determination module; and a partitioning module that partitions the cervical image into 12 zones with the center of the Os as the center for partitioning.
- the cervix region may be automatically identified in pre- and post-acetic acid images using a dynamic thresholding scheme on the red component of the image in RGB color space, followed by K-means clustering. If the clustering identifies a region with a size of less than one-tenth of the size of the image, it is concluded that the cervix is not visible in the image. Otherwise the cluster is considered to be the cervix region.
- Otsu thresholding is used to differentiate the cervix region from non-cervix regions.
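- The following is a minimal sketch of the cervix-region step just described (red-component thresholding, K-means clustering and the one-tenth size check). The number of clusters and other parameter choices are assumptions.

```python
import cv2
import numpy as np


def detect_cervix_region(bgr_image, k=3):
    """Sketch of cervix-region detection: Otsu thresholding on the red
    component, K-means clustering of the thresholded pixels, and rejection
    of candidates smaller than one-tenth of the image."""
    red = bgr_image[:, :, 2]  # OpenCV stores images in BGR order

    # Dynamic (Otsu) threshold on the red channel to separate the cervix
    # from darker, non-cervix structures.
    _, red_mask = cv2.threshold(red, 0, 255,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # K-means on the colour values of the thresholded pixels; the largest
    # cluster is taken as the cervix candidate.
    ys, xs = np.nonzero(red_mask)
    samples = bgr_image[ys, xs].astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, _ = cv2.kmeans(samples, k, None, criteria, 3,
                              cv2.KMEANS_PP_CENTERS)
    largest = np.bincount(labels.ravel()).argmax()
    sel = labels.ravel() == largest

    mask = np.zeros(red.shape, dtype=np.uint8)
    mask[ys[sel], xs[sel]] = 255

    # Reject if the detected region covers less than one-tenth of the image.
    if np.count_nonzero(mask) < 0.1 * red.size:
        return None  # cervix not visible
    return mask
```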
- Fig. 23 shows an acetic acid image 2300. Within the acetic acid image 2300 a cervix region 2302 has been identified.
- Fig. 24 shows a Lugol's iodine image 2400. Within the Lugol's iodine image 2400 there is a cervix region 2402 identified.
- Colposcopic examination may be performed for the diagnosis of cervical cancer.
- the abnormal area in the cervix is checked for visual signs observed in cervical neoplasia.
- using a colposcope, a gynecologist captures 5-10 minutes of video data.
- Figure 1 depicts the different phases that can be identified in the colposcopic examination.
- the video data shows the successive steps of cleaning the cervix, observing the cervix with and without a green color filter, application of acetowhite acid, the temporal response of the tissue to the acid, the application of Lugol's iodine, and detailed (zoomed) analysis of regions of interest of the cervix. Especially the temporal response of the tissue to the acetowhite acid is crucial for the diagnosis.
- the video sequence is then analyzed in detail by a medical expert.
- Fig. 25 illustrates the steps in a colposcopic examination.
- normal saline is applied to douche and clean away any secretions.
- Images may be acquired at this point.
- a green filter 2504 is used and more images may be acquired. The green filter allows the inspection of the blood vessel pattern.
- a 3-5% acetic acid solution is applied. The change in the appearance of the cervix over time, as well as the duration and timing of the change, is noted as part of the examination.
- Lugol's iodine is applied and the partial and complete uptake of the iodine is examined.
- a biopsy is taken from any abnormal tissue regions.
- Video compression can significantly reduce the amount of data, but at the cost of a loss of detail.
- Video key-frame extraction is, for example, used to create a visual summary of a movie by identifying relatively large differences between successive frames.
- Video compression will not remove irrelevant frames and is only effective if some loss of image detail is allowed.
- image detail is relevant for at least parts of the recordings (e.g. for detection of the transformation zone or for the analysis of the mosaic pattern of the capillary vasculature).
- Video key-frame extraction techniques have been developed to provide very effective summaries for movies.
- the general idea behind key-frames is to capture an image for each part of the video sequence that shows a large difference with the previous part.
- a strong clue for key-frame extraction is the presence of "black frames" that typically separate different recordings/scenes in a movie.
- key-frame detection can be used to compress/summarize some parts of the colposcopy video data.
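- As an illustration only, a minimal sketch of difference-based key-frame selection is given below; it keeps a frame whenever its colour-histogram distance to the last kept frame exceeds a threshold. The sampling stride and threshold are assumptions.

```python
import cv2


def select_key_frames(video_path, diff_threshold=0.35, sample_stride=5):
    """Keep a frame when its HSV colour-histogram distance to the last kept
    frame exceeds diff_threshold, producing a compact summary of the video."""
    cap = cv2.VideoCapture(video_path)
    key_frames, last_hist, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % sample_stride == 0:
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            hist = cv2.calcHist([hsv], [0, 1], None, [32, 32],
                                [0, 180, 0, 256])
            cv2.normalize(hist, hist)
            if last_hist is None or \
               cv2.compareHist(last_hist, hist,
                               cv2.HISTCMP_BHATTACHARYYA) > diff_threshold:
                key_frames.append((idx, frame))
                last_hist = hist
        idx += 1
    cap.release()
    return key_frames
```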
- One example of small yet important changes of the image data occurs in the part of the video after the acetowhite acid has been applied. From this phase a sufficient amount of images must be sampled in order to allow a proper interpretation of the acetowhite kinetics.
- Embodiments of the invention may apply a computer to automatically identify the parts of the video data that are relevant for colposcopic analysis.
- This algorithm will remove the video data that is not crucial for the expert to come to the right diagnosis. In doing so, the amount of data that needs to be transmitted to the expert centre is significantly reduced.
- the data that is transmitted may be a limited number of images to capture each different phase of the applied protocol (cleaning, green filter, acetowhite acid, iodine, detailed regions) plus sets of images that were selected using application specific knowledge, e.g. to enable proper interpretation of the acetowhite dynamics.
- This approach may:
- a straightforward way to obtain the start and end points in the video for each relevant phase is to ask the user to manually annotate these frames.
- Automated approaches can identify these start and end locations by studying large differences between successive frames in the video sequence.
- Another clue can be obtained from monitoring inter-frame camera motion. Camera motion estimation is typically based on tracking of image features in successive frames. A large number of corresponding features between frames indicates good similarity. When the number of corresponding features drops, it indicates a special event, e.g. occlusion of the colposcope by the user for the purpose of cleaning the cervix or application of acetowhite acid/iodine.
- Identification/labeling of each different phase can again be done manually by the user. Automatic approaches will look for image features that characterize each different phase. For example, a large amount of white image data can be a clue for the cotton used for cleaning the cervix, a pinkish-white color can be a clue for the presence of acetowhite acid, and a yellow/orange color can be a strong clue for Lugol's iodine. Information about the protocol, about the duration of each phase, the zoom mode of the colposcope during each phase, and the observed motion pattern can also be exploited to identify each phase.
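- As an illustration of these colour clues only, the following sketch labels a frame with a coarse phase estimate; all hue/saturation ranges and thresholds are rough assumptions and would need tuning.

```python
import cv2
import numpy as np


def label_phase(bgr_frame):
    """Assign a rough phase label to a frame from simple colour statistics:
    lots of white suggests cotton/cleaning, a pinkish-white cast suggests
    the acetowhite phase, and yellow/orange suggests Lugol's iodine."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    n = h.size

    white = np.count_nonzero((s < 40) & (v > 200)) / n
    acetowhite = np.count_nonzero((s < 90) & (v > 150)
                                  & ((h < 15) | (h > 160))) / n
    iodine = np.count_nonzero((h > 10) & (h < 35) & (s > 100)) / n

    if white > 0.5:
        return "cleaning"
    if iodine > 0.3:
        return "lugols_iodine"
    if acetowhite > 0.3:
        return "acetowhite"
    return "unknown"
```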
- Some phases can be characterized with only one image.
- the cleaning phase or the phase where the cervix is imaged at a fixed magnification using the green filter can be summarized with a single frame.
- the frame can be selected based on analysis of the motion pattern (small motion indicating small chance of motion artifacts and large chance of proper focus) and/or on inspection of the image content (e.g. check if the Os region is visible and near the image centre).
- Interpretation of the acetowhite kinetics is preferably based on more than 1 or 2 frames. In the most conservative implementation all frames from this phase are considered relevant and transmitted to the expert centre. This part may be compressed using normal video compression techniques. In a more intelligent implementation the dynamics of the changes are accurately monitored and a subset of images is selected that still allows for a proper interpretation.
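- A minimal sketch of such a "more intelligent implementation" could monitor the mean brightness of the segmented cervix region and keep a frame only when the whiteness level has changed noticeably; the threshold below is an assumption.

```python
def select_acetowhite_subset(frames, roi_mask, min_delta=2.0):
    """Keep the frames from the acetowhite phase where the mean intensity of
    the (pre-segmented) cervix region has changed by at least min_delta grey
    levels since the last kept frame, so the acetowhite kinetics can still be
    reconstructed from the subset.

    frames   : iterable of grayscale numpy arrays
    roi_mask : binary numpy array marking the cervix region
    """
    kept, last_level = [], None
    for idx, gray in enumerate(frames):
        level = float(gray[roi_mask > 0].mean())  # mean intensity in the ROI
        if last_level is None or abs(level - last_level) >= min_delta:
            kept.append((idx, level))
            last_level = level
    return kept
```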
- the user will zoom in on the cervix and may scan the region of interest at somewhat larger magnification. This means that neighboring regions will be placed in the (reduced) field of view.
- a subset of good quality images can be selected to (when stitched together) cover the complete scan area.
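- A minimal sketch of this selection-and-stitching step follows; the sharpness measure (variance of the Laplacian), its threshold, and the use of OpenCV's generic scan stitcher are assumptions made for illustration.

```python
import cv2


def stitch_detail_frames(frames, sharpness_threshold=100.0):
    """Select sharp frames from the zoomed-in scan and stitch them into one
    mosaic that covers the complete scan area."""
    def sharpness(img):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        return cv2.Laplacian(gray, cv2.CV_64F).var()

    good = [f for f in frames if sharpness(f) > sharpness_threshold]
    if len(good) < 2:
        return None  # not enough good-quality frames to build a mosaic
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, mosaic = stitcher.stitch(good)
    return mosaic if status == cv2.Stitcher_OK else None
```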
- Fig. 26 illustrates an example of feature-based tracking.
- Image 2600 is at time t and image 2602 is at time t+1. Dots connected by a line show corresponding features.
- the boxes 2604 represent the same location in both images and are used to illustrate motion between frames 2600 and 2602.
- Fig. 27 shows a plot of the frame number 2700 versus the number of tracked features 2702 in a sequence of video images. This Fig. shows an example of the correlation between the number of tracked features and the transitions 2704 between different phases of the video.
- the blocked regions 2704 are regions where the number of tracked features decreases sharply and indicate transitions between image sequences or when an operation was performed on the cervix.
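- The following is a minimal sketch of this idea, combining ORB feature matching between consecutive frames with a simple drop detector; the feature count and drop ratio are assumptions, not values from the disclosure.

```python
import cv2
import numpy as np


def count_tracked_features(prev_bgr, curr_bgr, max_features=500):
    """Count ORB feature correspondences between two consecutive frames, as
    in the feature-based tracking illustrated in Figs. 26 and 27."""
    orb = cv2.ORB_create(max_features)
    g1 = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    k1, d1 = orb.detectAndCompute(g1, None)
    k2, d2 = orb.detectAndCompute(g2, None)
    if d1 is None or d2 is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return len(matcher.match(d1, d2))


def find_transitions(match_counts, drop_ratio=0.4):
    """Flag frames where the number of correspondences drops sharply below
    the running level, indicating a phase transition or an operation
    performed on the cervix."""
    counts = np.asarray(match_counts, dtype=float)
    median = np.median(counts[counts > 0]) if np.any(counts > 0) else 1.0
    return [i for i, c in enumerate(counts) if c < drop_ratio * median]
```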
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Heart & Thoracic Surgery (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Biophysics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Pathology (AREA)
- Quality & Reliability (AREA)
- Biomedical Technology (AREA)
- Theoretical Computer Science (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Endoscopes (AREA)
- Image Analysis (AREA)
- Medical Treatment And Welfare Office Work (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN4513CHN2014 IN2014CN04513A | 2011-12-05 | 2012-11-30 | |
CN201280059835.3A CN103975364B (zh) | 2011-12-05 | 2012-11-30 | Selection of images for optical examination of the cervix
RU2014127529A RU2633320C2 (ru) | 2011-12-05 | 2012-11-30 | Selection of images for optical examination of the cervix
BR112014013361A BR112014013361A2 (pt) | 2011-12-05 | 2012-11-30 | Medical imaging system, computer program product for execution by a processor to control a medical imaging system, and method of operating a medical imaging system
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161566904P | 2011-12-05 | 2011-12-05 | |
US61/566,904 | 2011-12-05 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2013084123A2 true WO2013084123A2 (en) | 2013-06-13 |
WO2013084123A3 WO2013084123A3 (en) | 2013-08-08 |
Family
ID=47501384
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2012/056855 WO2013084123A2 (en) | 2011-12-05 | 2012-11-30 | Selection of images for optical examination of the cervix |
Country Status (5)
Country | Link |
---|---|
CN (1) | CN103975364B
BR (1) | BR112014013361A2
IN (1) | IN2014CN04513A
RU (1) | RU2633320C2 (ru)
WO (1) | WO2013084123A2
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103750810A (zh) * | 2013-12-30 | 2014-04-30 | 深圳市理邦精密仪器股份有限公司 | Method and device for feature analysis of images acquired by an electronic colposcope
CN103767658A (zh) * | 2013-12-30 | 2014-05-07 | 深圳市理邦精密仪器股份有限公司 | Method and device for acquiring electronic colposcope images
CN103932665A (zh) * | 2014-03-17 | 2014-07-23 | 深圳市理邦精密仪器股份有限公司 | Method and device for displaying electronic colposcope images
CN105212886A (zh) * | 2015-02-28 | 2016-01-06 | 赵峰 | Digital electronic colposcope device
CN105476597A (zh) * | 2015-12-22 | 2016-04-13 | 佛山市南海区欧谱曼迪科技有限责任公司 | Multi-mode electronic colposcope system
WO2016124539A1 (en) * | 2015-02-04 | 2016-08-11 | Koninklijke Philips N.V. | A system and a method for labeling objects in medical images |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105520712B (zh) * | 2015-11-30 | 2019-04-16 | 广州三瑞医疗器械有限公司 | Intelligent colposcope image acquisition and evaluation method and device
CN107689041A (zh) * | 2017-08-10 | 2018-02-13 | 汕头大学 | Method, device and storage medium for image similarity comparison
CN111651132B (zh) * | 2020-06-02 | 2023-03-24 | 马鞍山芯乔科技有限公司 | Picture-in-picture synchronous display system based on visual inspection images
RU2758330C1 (ru) * | 2020-06-25 | 2021-10-28 | Владимир Андрианович Алёшкин | Method for determining the degree of cervical dysplasia
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2404600A1 (en) * | 2000-03-28 | 2001-10-04 | Board Of Regents, The University Of Texas System | Methods and apparatus for diagnostic multispectral digital imaging |
CN1236302C (zh) * | 2003-12-19 | 2006-01-11 | 武汉大学 | Multispectral cell smear automatic analyzer and method for analyzing cervical cells therewith
US7664300B2 (en) * | 2005-02-03 | 2010-02-16 | Sti Medical Systems, Llc | Uterine cervical cancer computer-aided-diagnosis (CAD) |
JP4800127B2 (ja) * | 2006-06-29 | 2011-10-26 | 富士フイルム株式会社 | Medical image segmentation apparatus and medical image segmentation program
CN101322644B (zh) * | 2008-06-13 | 2013-09-11 | 曾堃 | Portable device for diagnosing cervical precancerous lesions
KR101886712B1 (ko) * | 2009-10-14 | 2018-08-09 | 트라이스 이미징 인코퍼레이티드 | System and method for converting medical images and transmitting them to mobile devices and remote communication systems
RU107924U1 (ru) * | 2011-01-18 | 2011-09-10 | Общество с ограниченной ответственностью "ИНЛАЙФ" | Fluorescent colposcope
2012
- 2012-11-30 BR BR112014013361A patent/BR112014013361A2/pt not_active Application Discontinuation
- 2012-11-30 RU RU2014127529A patent/RU2633320C2/ru not_active IP Right Cessation
- 2012-11-30 WO PCT/IB2012/056855 patent/WO2013084123A2/en active Application Filing
- 2012-11-30 IN IN4513CHN2014 patent/IN2014CN04513A/en unknown
- 2012-11-30 CN CN201280059835.3A patent/CN103975364B/zh not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
None |
Also Published As
Publication number | Publication date |
---|---|
RU2633320C2 (ru) | 2017-10-11 |
RU2014127529A (ru) | 2016-02-10 |
CN103975364A (zh) | 2014-08-06 |
CN103975364B (zh) | 2017-04-12 |
WO2013084123A3 (en) | 2013-08-08 |
IN2014CN04513A | 2015-09-11
BR112014013361A2 (pt) | 2017-06-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013084123A2 (en) | Selection of images for optical examination of the cervix | |
EP2685881B1 (en) | Medical instrument for examining the cervix | |
AU2019431299B2 (en) | AI systems for detecting and sizing lesions | |
AU2019311336B2 (en) | Computer classification of biological tissue | |
CN114782307A (zh) | Deep learning-based auxiliary diagnosis system for rectal cancer staging on enhanced CT images | |
CN109190540A (zh) | Biopsy region prediction method, image recognition method, device and storage medium | |
US8718344B2 (en) | Image processing apparatus and medical image diagnosis apparatus | |
CN109117890A (zh) | Image classification method, device and storage medium | |
JP2022010202A (ja) | Medical image processing apparatus | |
WO2019088008A1 (ja) | Image processing device, image processing method, program, and endoscope system | |
WO2022120212A1 (en) | Apparatus and method for detecting cervical cancer | |
CN118315048A (zh) | Multimodal deep learning-based in vivo auxiliary diagnosis method and system for cervical canal endoscopy | |
Vijaya et al. | CanScan: non-invasive techniques for oral cancer detection | |
Cao et al. | Deep learning based lesion detection for mammograms | |
WO2013150419A1 (en) | Quality-check during medical imaging procedure | |
Pallavi et al. | Automated analysis of cervix images to grade the severity of cancer | |
Andrade et al. | Automatic Segmentation of the Cervical Region in Colposcopic Images. | |
Xue et al. | A unified set of analysis tools for uterine cervix image segmentation | |
CN116402752B (zh) | Thyroid cancer screening method, apparatus, device and storage medium | |
CN111553352B (zh) | DICOM image processing method and system | |
Andrade | A Portable System for Screening of Cervical Cancer | |
Li et al. | Computerized image analysis for acetic acid induced intraepithelial lesions | |
Rodriguez | Integration of Multimodal, Multiscale Imaging and Biomarker Data for Squamous Precancer Detection and Diagnosis | |
Keerthana et al. | Real-time detection of premalignant cervical lesion using Artificial Intelligence (AI) model in multispectral imaging system | |
Yang et al. | Multi-modal convolutional neural network-based thyroid cytology classification and diagnosis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12809854 Country of ref document: EP Kind code of ref document: A2 |
|
ENP | Entry into the national phase |
Ref document number: 2014127529 Country of ref document: RU Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112014013361 Country of ref document: BR |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12809854 Country of ref document: EP Kind code of ref document: A2 |
|
ENP | Entry into the national phase |
Ref document number: 112014013361 Country of ref document: BR Kind code of ref document: A2 Effective date: 20140602 |