WO2016124539A1 - System and method for labeling objects in medical images - Google Patents
System and method for labeling objects in medical images
- Publication number
- WO2016124539A1 (PCT/EP2016/052061; EP2016052061W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lesion
- tool
- biopsy
- sites
- image
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00039—Operational features of endoscopes provided with input arrangements for the user
- A61B1/0004—Operational features of endoscopes provided with input arrangements for the user for electronic operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H15/00—ICT specially adapted for medical reports, e.g. generation or transmission thereof
Definitions
- The invention relates to object labeling, and more particularly to a system and a method for labeling objects in medical images.
- Lesion annotation and biopsy marking are essential to the meaningful and successful completion of such a procedure.
- Lesion annotation marks the areas of abnormality and provides descriptions of those areas.
- Conventionally, a lesion tool is used to annotate the lesions and the observations thereof, and the biopsy sites are thereafter indicated using a biopsy tool.
- The biopsy sites are indicated manually by the user, based on the lesions marked in the lesion tool. The same lesion must therefore be indicated twice, once in each of the two tools, viz. the lesion tool and the biopsy tool. This requires the user to remember the markings of the lesion and to indicate it once again, at least in the biopsy tool. Moreover, annotating a lesion and providing its description are done independently, which makes these tasks cumbersome: the annotations and descriptions must be cross-referenced, and an appropriate description must be identified and adaptively provided for each marked lesion.
- Current devices provide only basic image capture, image browsing, and report generation.
- Although the images are digital, the labeling of objects in them, as described above, is performed largely manually.
- WO 2001/078607 identifies actual biopsy sites by comparing a sequence of images.
- A further object of the invention is to provide a method for labeling objects in images using the system of the invention.
- the invention provides a system for labeling objects in images.
- The system of the invention comprises a lesion tool for identifying and providing lesion sites from at least one of the images, and a biopsy tool for identifying the biopsy sites from the said at least one image and/or the said lesion sites. The lesion tool and the biopsy tool are integrated therein, along with a knowledge base and a Graphical User Interface (GUI), to identify the lesion sites and the biopsy sites.
- GUI Graphical User Interface
- The biopsy tool comprises a comparison module for comparing the said lesion sites with the said image and its image data.
- The biopsy sites are automatically identified by the biopsy tool and are confirmed in accordance with the identified lesion sites.
- The invention also provides a method for labeling objects in images, performed by the system of the invention.
- The method of the invention comprises identifying and providing lesion sites from at least one of the images by a lesion tool, and identifying biopsy sites from the said at least one image and/or the said lesion sites by the biopsy tool. The lesion tool and the biopsy tool are integrated therein, along with a knowledge base and a Graphical User Interface (GUI), to identify the lesion sites and the biopsy sites.
- Fig. 1 shows a system for labeling objects in images, in accordance with the invention
- Fig. 2a illustrates the method of the invention performed automatically
- Fig. 2b illustrates the method of the invention performed using manual inputs from the user; and Fig. 2c illustrates the method of the invention performed in real time.
- a system (100) for labeling objects in images is shown.
- The system (100) is intended to provide biopsy sites in the context of a colposcopy examination and procedure.
- The images of the region of interest are captured by the colposcopy hardware (101), which typically comprises a camera together with supporting elements that enable it to capture images and provide them for further analysis and processing.
- The images may be captured digitally, which allows further integration with other devices to process the captured images and the image data related thereto.
- Inputs from the user (102) may be obtained along with the images from the colposcopy hardware (101).
- Such inputs include, but are not limited to, at least one of: manual drawing of lesion or landmark regions; manual annotation of biopsy sites; annotation of properties of lesions and landmarks, such as position, size, grade, and opaqueness; confirmation of computer-recommended lesions, landmarks, or biopsy sites; and user modifications of any of the above.
- The interactions between the graphical representation of the images and the image data may be enabled by the GUI (103).
- the lesion tool (104) is provided to identify the lesion sites from at least one of the captured images.
- The graphical representation of the image and the image data are analyzed by the lesion tool (104), and the text description pertaining to the images under analysis is provided by the knowledge base (106).
- The process associated with the knowledge base (106) recognizes the observation terms the user most frequently uses, or uses a vocabulary the user has configured in advance.
- The process maps the terms the user uses to properties of the features. Based on this mapping, the automatic description generation can selectively translate the quantitative parameters into a qualitative description.
- Rule-based reasoning plays a role here. For instance, if the user usually describes the Transformation Zone (TZ) type, then the translated properties include not only the TZ type but also the properties the TZ type depends on, such as the Squamous Columnar Junction (SCJ) margins.
- TZ Transformation Zone
- SCJ Squamous Columnar Junction
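The vocabulary-driven, rule-based expansion described above can be sketched as follows. The dependency table and property names are illustrative assumptions, not taken from the patent:

```python
# Hypothetical dependency rules: a described property -> the properties
# it is derived from (names are illustrative, not from the patent).
DEPENDENCY_RULES = {
    "tz_type": ["scj_margins"],  # TZ type depends on the SCJ margins
    "lesion_grade": ["aceto_whiteness", "border_sharpness"],
}

def expand_properties(requested):
    """Return the requested properties plus every property they depend on."""
    result = []
    queue = list(requested)
    while queue:
        prop = queue.pop(0)
        if prop in result:
            continue
        result.append(prop)
        queue.extend(DEPENDENCY_RULES.get(prop, []))
    return result

# A user who usually describes TZ type also gets the SCJ margins translated:
print(expand_properties(["tz_type"]))  # ['tz_type', 'scj_margins']
```

The queue-based traversal also handles rules whose dependencies have dependencies of their own.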
- The knowledge base (106) may comprise, in general, at least one of: process data such as text descriptions; information on the medical procedure to which the image labeling pertains; images and their corresponding data; semantic data; procedural data; user inputs; etc. These process data may be stored and/or updated by the lesion tool (104), the biopsy tool (105), or the user (102).
- The knowledge base (106) may also store one or more of: margins and properties (e.g., size, opacity, thickness, border) of landmarks and lesions; the temporal change of aceto-whiteness; geographic properties of the detected features; the timestamp of acetic acid application; a taxonomy of the descriptions for cervical features; and a map of the user's vocabulary to the corresponding properties of the features.
- The lesion tool (104) detects lesion and landmark margins, and performs lesion mapping based on the four quadrants of a clock diagram of the image. The result is visualized via the GUI (103).
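One plausible way to realize the clock-diagram mapping is to convert an image coordinate into a clock hour and quadrant; this sketch is an illustrative assumption (image y coordinates grow downward), not the patent's actual implementation:

```python
import math

def clock_position(x, y, cx, cy):
    """Map an image point to a clock-face hour (1-12) around the centre (cx, cy).

    12 o'clock is straight up; image y coordinates grow downward.
    """
    angle = math.degrees(math.atan2(x - cx, cy - y)) % 360
    hour = round(angle / 30) % 12
    return 12 if hour == 0 else hour

def quadrant(hour):
    """Group a clock hour into one of the four quadrants used for lesion mapping."""
    if hour in (12, 1, 2):
        return "upper-right"
    if hour in (3, 4, 5):
        return "lower-right"
    if hour in (6, 7, 8):
        return "lower-left"
    return "upper-left"

# A lesion to the right of the centre maps to 3 o'clock, lower-right quadrant:
print(clock_position(80, 50, 50, 50), quadrant(3))  # 3 lower-right
```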
- the GUI (103) collects user input, such as confirmed margins, which in turn are fed back to the lesion tool (104).
- the detected lesions / landmark margins, and their properties, with or without user input, are gathered in the lesion tool (104).
- The lesion tool (104) translates quantitative feature measurements, which include lesion and landmark segmentation, size measurement, and lesion mapping, into a qualitative description. Provided with an output from the knowledge base (106), a final description tailored to the user's vocabulary is generated.
- Lesions and their properties may be obtained either by manual input from the user (102) or from the lesion tool (104), which automatically recognizes lesions.
- the biopsy tool (105) accepts the aforementioned input and uses the knowledge-based rules to decide recommended biopsy sites and their properties.
- The knowledge-based rules make decisions based on the type and grade of lesions and any other description provided by the user (102). For instance, if the user annotates a lesion 'aceto-white' at 3 o'clock and marks it as dense aceto-white, then the biopsy tool provides an indication of a biopsy site at 3 o'clock accordingly. On the other hand, if the user annotates 'polyp' or 'transformation zone', the biopsy tool does not note down any marks, because these findings do not require a biopsy according to the knowledge rules.
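A minimal sketch of such knowledge-based rules follows. The rule table and annotation fields are illustrative assumptions; for brevity the rules key on the finding type only, whereas the rules described above also weigh grade and other user-provided descriptions:

```python
# Illustrative rule table: finding type -> does it warrant a biopsy?
BIOPSY_RULES = {
    "aceto-white": True,
    "polyp": False,
    "transformation zone": False,
}

def recommend_biopsy_sites(annotations):
    """Return the clock positions of findings whose type warrants a biopsy."""
    return [
        a["position"]
        for a in annotations
        if BIOPSY_RULES.get(a["type"], False)
    ]

annotations = [
    {"type": "aceto-white", "position": 3, "grade": "dense"},
    {"type": "polyp", "position": 6},
]
print(recommend_biopsy_sites(annotations))  # [3]
```

Unknown finding types default to no recommendation, leaving the decision to the user.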
- The biopsy tool (105) can be presented together with the lesion tool (104), either integrated into one clock diagram or as two clock diagrams of the image next to each other. Alternatively, it can be presented next to the video during the biopsy procedure, wherein lesion properties can be automatically computed by the lesion tool (104). It then serves not only as navigation guidance but also as a reminder.
- The biopsy tool (105) computes the difference between the recommended biopsy sites and the operated ones. Knowing the difference, the system (100) reminds the user of potentially overlooked sites. The difference can be computed by comparing the recommended sites either with the user's manual input in the biopsy tool, or with the actual biopsy sites automatically recognized by the biopsy tool (105).
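Over clock positions, this comparison reduces to a set difference, as in this minimal sketch (function name and representation are assumptions for illustration):

```python
def overlooked_sites(recommended, operated):
    """Clock positions that were recommended for biopsy but not operated on."""
    return sorted(set(recommended) - set(operated))

# The system can then remind the user of the potentially overlooked site:
print(overlooked_sites([3, 6, 9], [3, 9]))  # [6]
```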
- the lesion tool (104) and the biopsy tool (105) may co-exist with each other in the system (100) or may be integrated therein.
- The images and the data corresponding thereto may be converted into semantic data by the lesion tool (104) and/or the biopsy tool (105), to provide more information about the images and their association with the process and the process data.
- Fig. 2a illustrates one embodiment of the method of the invention wherein the detection of biopsy sites is performed automatically.
- The user performs the colposcopy examination (201), and the system is adapted to detect the findings in the images, including their position, type, and grade.
- The user provides the annotation for the lesions (203), and the findings and properties are provided to the system.
- the system uses the knowledge base to decide on the recommended biopsy sites (204).
- The biopsy sites recommended by the system and those provided by the user are compared and confirmed (205).
- The system uses the knowledge base to generate a description of the findings and biopsy sites, and updates the knowledge base as well (206).
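Taken together, the automatic flow of Fig. 2a can be sketched as a simple pipeline. Every function below is an illustrative stand-in for the numbered step it names, not the patent's actual implementation:

```python
def detect_findings(image):
    # Stand-in for automatic detection of finding position, type and grade (202).
    return [{"type": "aceto-white", "position": 3, "grade": "dense"}]

def recommend_sites(findings):
    # Stand-in for the knowledge-base decision on recommended biopsy sites (204).
    return [f["position"] for f in findings if f["type"] == "aceto-white"]

def confirm_sites(recommended, user_sites):
    # Compare system and user sites and keep the confirmed union (205).
    return sorted(set(recommended) | set(user_sites))

findings = detect_findings("frame-001")
confirmed = confirm_sites(recommend_sites(findings), user_sites=[6])
print(confirmed)  # [3, 6]
```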
- In Fig. 2b, one embodiment of the method of the invention is shown, wherein the detection of biopsy sites is performed using manual inputs from the user.
- The user performs the colposcopy examination (201) and provides the markings, findings, and other related information such as type, grade, etc. (203).
- These findings and other property-related information are provided to the system, and the knowledge base decides on the biopsy sites that need to be recommended (204).
- The recommended biopsy sites provided by the system and those identified by the user based on the findings and properties are compared by the user (205), and confirmation of such biopsy sites is made thereupon.
- Fig. 2c shows the method of the invention performed in real time, as one embodiment in accordance with the method of the invention.
- The user performs the colposcopy examination (201), and the system automatically detects the findings and the other properties related thereto (202). These findings and properties are provided to the system.
- the system uses the knowledge base to decide on the biopsy sites to be recommended (204).
- The system provides the recommended biopsy sites to the user (205a); upon receiving them, the user confirms the sites during report generation, or may add further sites as the case may be (205b).
- The invention as described above provides an improved system and method for labeling objects in images.
- The lesion tool and the biopsy tool as described herein may be integrated, and the recommendations of the biopsy sites are made available automatically. This improves the reliability and efficiency of the system and of the method for performing the colposcopy procedure and examination.
- The invention can equally be applied to colposcopy, sonography, endoscopy, cystoscopy, etc.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Public Health (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Optics & Photonics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Signal Processing (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
The present invention concerns a system for labeling objects in images. The system of the invention comprises a lesion tool for identifying and providing lesion sites from at least one image, and a biopsy tool for identifying the biopsy sites from the said at least one image and/or the said lesion sites; the lesion tool and the biopsy tool are integrated therein, along with a knowledge base and a Graphical User Interface (GUI), to identify the lesion sites and the biopsy sites. The lesion sites detected by the lesion tool are automatically compared with the captured images by the biopsy tool. The invention also concerns a method for labeling objects in images, performed by the system of the invention.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN549/CHE/2015 | 2015-02-04 | ||
IN549CH2015 | 2015-02-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016124539A1 true WO2016124539A1 (fr) | 2016-08-11 |
Family
ID=55436066
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2016/052061 WO2016124539A1 (fr) | 2015-02-04 | 2016-02-01 | System and method for labeling objects in medical images |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016124539A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112614572A (zh) * | 2020-12-28 | 2021-04-06 | 深圳开立生物医疗科技股份有限公司 | Lesion marking method and apparatus, image processing device, and medical system |
US11195313B2 (en) | 2016-10-14 | 2021-12-07 | International Business Machines Corporation | Cross-modality neural network transform for semi-automatic medical image annotation |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001078607A1 (fr) | 2000-04-18 | 2001-10-25 | Litton Systems, Inc. | Improved visualization of an in vivo breast biopsy site for medical documentation |
US20050059894A1 (en) * | 2003-09-16 | 2005-03-17 | Haishan Zeng | Automated endoscopy device, diagnostic method, and uses |
US20070237378A1 (en) * | 2005-07-08 | 2007-10-11 | Bruce Reiner | Multi-input reporting and editing tool |
US20080260218A1 (en) * | 2005-04-04 | 2008-10-23 | Yoav Smith | Medical Imaging Method and System |
US20090046905A1 (en) * | 2005-02-03 | 2009-02-19 | Holger Lange | Uterine cervical cancer computer-aided-diagnosis (CAD) |
US20100145720A1 (en) * | 2008-12-05 | 2010-06-10 | Bruce Reiner | Method of extracting real-time structured data and performing data analysis and decision support in medical reporting |
WO2013084123A2 (fr) * | 2011-12-05 | 2013-06-13 | Koninklijke Philips Electronics N.V. | Selection of images for optical examination of the uterine cervix |
-
2016
- 2016-02-01 WO PCT/EP2016/052061 patent/WO2016124539A1/fr active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230033601A1 (en) | Dynamic self-learning medical image method and system | |
US9805469B2 (en) | Marking and tracking an area of interest during endoscopy | |
KR101346539B1 (ko) | 얼굴들을 상관시킴으로써 디지털 이미지들을 구조화하기 | |
Ali et al. | A systematic review of automated melanoma detection in dermatoscopic images and its ground truth data | |
AU2014237346B2 (en) | System and method for reviewing and analyzing cytological specimens | |
JP2017534117A (ja) | 最適化された解剖学的関心構造ラベリング | |
US20150080652A1 (en) | Lesion detection and image stabilization using portion of field of view | |
US11094403B2 (en) | Method and apparatus for collecting test data from use of a disposable test kit | |
US11106724B2 (en) | Matching result display device, matching result display method, program, and recording medium | |
KR102531400B1 (ko) | 인공 지능 기반 대장 내시경 영상 진단 보조 시스템 및 방법 | |
US20230190404A1 (en) | Systems and methods for capturing, displaying, and manipulating medical images and videos | |
KR20150049585A (ko) | 용종 검출 장치 및 그 동작방법 | |
CN117618021A (zh) | 超声成像系统及相关的工作流系统和方法 | |
WO2016124539A1 (fr) | Système et procédé pour étiqueter des objets dans des images médicales | |
Hsieh et al. | An overview of deep learning algorithms and water exchange in colonoscopy in improving adenoma detection | |
US20230172425A1 (en) | Information processing method, electronic device, and computer storage medium | |
US20080031504A1 (en) | Optimized user interactions using archived data in medical applications | |
CN111289510B (zh) | 体外诊断仪、图像切换方法和可读存储介质 | |
JP2010057727A (ja) | 医用画像読影システム | |
JP2016202722A (ja) | 医用画像表示装置及びプログラム | |
CN111192679B (zh) | 一种影像数据异常的处理方法、装置及存储介质 | |
JP2004201722A (ja) | 超音波診断装置 | |
Marinescu et al. | Endobronchial optical coherence tomography for the diagnosis of fibrotic interstitial lung disease: a light at the end of the tunnel? | |
US20220399118A1 (en) | Generation device and generation method | |
JP6418792B2 (ja) | 情報処理方法、情報処理装置及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16706316 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16706316 Country of ref document: EP Kind code of ref document: A1 |