CN113256820B - Digital developing method for mandibular surface lesion based on edge detection - Google Patents

Digital developing method for mandibular surface lesion based on edge detection

Info

Publication number
CN113256820B
CN113256820B (application number CN202110556033.4A)
Authority
CN
China
Prior art keywords
teeth
model
mandible
tooth
tumor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110556033.4A
Other languages
Chinese (zh)
Other versions
CN113256820A (en)
Inventor
何炳蔚
施家峰
钟鸿川
闫茂松
邹澄圣
陈斌赫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Qishan Lake Medical Technology Co ltd
Unnamed Fujian Investment Group Co ltd
Original Assignee
Fuzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN202110556033.4A
Publication of CN113256820A
Application granted
Publication of CN113256820B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G16H30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using metadata automatically derived from the content
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/13: Edge detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Software Systems (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a digital visualization method for mandibular surface lesions based on edge detection, which comprises the following steps: step S1, acquiring medical image data of a patient and constructing a mandible model, a tooth model and a tumor model; step S2, unifying the coordinate systems of the mandible model, the tooth model and the tumor model; step S3, identifying the tooth model, extracting the edge contour feature sets of the teeth and storing them in a database; step S4, the terminal performs recognition by being aligned with the maxillofacial teeth, extracts the edge contour features of the teeth at the current position, matches them against the edge contour feature set with the highest similarity in the database, and automatically projects the images of the corresponding teeth, mandible and tumor onto the real teeth. The invention digitally visualizes the maxillofacial lesion by means of edge detection of the teeth, can display the lesion to the doctor intuitively through a screen, provides the doctor with visual planning, and reduces the dependence on prior knowledge.

Description

Digital developing method for mandibular surface lesion based on edge detection
Technical Field
The invention relates to the field of augmented reality, and in particular to a digital visualization method for mandibular surface lesions based on edge detection.
Background
Augmented reality is a computer simulation technology that combines virtual reality, computer vision, computer networking, human-computer interaction and other techniques. Perception of the real world is augmented by integrating virtual objects into image sequences acquired from various cameras. Most existing tracking and registration techniques perform target recognition on the basis of feature points; feature point extraction is not limited to planar or regular objects, and virtual-real fusion is completed by estimating the pose through feature point extraction, matching and tracking. Augmented reality is changing the practice of healthcare by providing powerful and intuitive ways of exploring and interacting with digital medical data, and by integrating those data into the physical world to create a natural, interactive virtual experience.
In surgery, preoperative planning has a decisive effect on the outcome of the operation. Sound preoperative planning allows intraoperative emergencies to be anticipated and improves the success rate of the operation. Preoperative planning requires the location of the lesion to be determined from gray-scale images obtained by computed tomography (CT) or magnetic resonance imaging (MRI). Determining the position and size of a lesion within the three-dimensional space of the human body from two-dimensional images, while choosing a surgical path that avoids important organs as far as possible, places heavy demands on the surgeon's ability to read CT and MRI images. During the operation, the surgeon can usually see only the organ surfaces exposed in the field of view; to operate precisely, the surgeon must repeatedly recall the complex human anatomy and the preoperatively planned surgical path from memory. If the lesion cannot be located accurately, too little cancerous tissue or too much normal tissue may be resected, leading to residual tumor or excessive damage to organ function and greatly reducing the effect of the operation.
Disclosure of Invention
In view of this, the present invention provides a digital visualization method for mandibular surface lesions based on edge detection, which digitally visualizes the maxillofacial lesion by means of edge detection of the teeth, so that the lesion can be displayed to the doctor intuitively through a screen, the doctor is provided with visual planning, and the dependence on prior knowledge is reduced.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a mandibular surface lesion digital visualization method based on edge detection comprises the following steps:
step S1, acquiring medical image data of a patient, and respectively constructing a mandible model, a tooth model and a tumor model;
step S2, unifying the coordinate systems of the mandible model, the tooth model and the tumor model;
step S3, identifying the tooth model, extracting the edge contour feature set of the tooth and storing the edge contour feature set in a database;
step S4, the terminal performs recognition by being aligned with the maxillofacial teeth, extracts the edge contour features of the teeth at the current position, matches them against the edge contour feature set with the highest similarity in the database, and automatically projects the images of the corresponding teeth, mandible and tumor onto the real teeth.
Further, the step S1 is specifically:
acquiring DICOM medical image data of a human mandibular surface lesion; for the acquired DICOM medical image data, extracting the CT value range of the tumor by setting upper and lower thresholds, and reconstructing a three-dimensional virtual model of the tumor;
extracting the CT value range of the jaw bone and the teeth by setting upper and lower thresholds, removing the pixel points at the junction of the maxilla and the mandible, and extracting the mandible and the teeth from the jaw bone by a region growing algorithm;
extracting the CT value range of the teeth by setting upper and lower thresholds; because the mandible overlaps the tooth threshold range, the pixel points at the junction with the mandible are removed, the teeth are extracted from the mandible by a region growing algorithm, and a three-dimensional virtual model of the teeth is reconstructed as the virtual model used for feature extraction;
subtracting the pixel points of the tooth model from the extracted mandible-and-teeth model through a Boolean operation, retaining the mandible without the teeth, and reconstructing a three-dimensional virtual model of the mandible.
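For illustration only, the segmentation and Boolean subtraction described above can be sketched as follows. The patent does not name a specific toolkit, so SimpleITK stands in for the medical image processing software; the file path, seed coordinates and Hounsfield windows are placeholders chosen by the operator (only the lower jaw-bone bound of 555 echoes the embodiment described later), not values prescribed by the patent.

    import SimpleITK as sitk

    # Read the CT series (directory name is illustrative).
    reader = sitk.ImageSeriesReader()
    reader.SetFileNames(reader.GetGDCMSeriesFileNames("ct_series/"))
    ct = reader.Execute()

    def grow_from_seed(image, seed, lower, upper):
        # Region growing: keep only voxels connected to `seed` whose values lie in [lower, upper].
        return sitk.ConnectedThreshold(image, seedList=[seed], lower=lower, upper=upper)

    # Tumor: plain window thresholding between operator-chosen lower and upper limits.
    tumor_mask = sitk.BinaryThreshold(ct, lowerThreshold=40, upperThreshold=120,
                                      insideValue=1, outsideValue=0)

    # Jaw bone and teeth: threshold window grown from a seed placed on the mandible, so that
    # the maxilla (disconnected once the junction voxels are erased) is left out.
    mandible_teeth_mask = grow_from_seed(ct, seed=(256, 256, 40), lower=555, upper=2850)

    # Teeth only: grown from a seed on a visible crown inside a narrower (enamel/dentine) window.
    teeth_mask = grow_from_seed(ct, seed=(250, 240, 60), lower=1500, upper=2850)

    # Boolean subtraction: mandible without teeth = (mandible + teeth) AND NOT teeth.
    mandible_mask = sitk.And(mandible_teeth_mask, sitk.Not(teeth_mask))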
Further, the step S2 is specifically: importing the three model files into three-dimensional modeling software, unifying the coordinate systems of the three models under the same coordinate system, and combining them into one group.
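As a rough sketch of this grouping step (the patent does not name the modeling software), the trimesh library is used below as a stand-in; the file names are hypothetical, and GLB is exported only because trimesh has no FBX writer, whereas the embodiment described later exports FBX.

    import numpy as np
    import trimesh

    # Load the three models reconstructed in step S1 (file names are illustrative).
    meshes = {name: trimesh.load(f"{name}.stl") for name in ("teeth", "mandible", "tumor")}

    scene = trimesh.Scene()
    for name, mesh in meshes.items():
        scene.add_geometry(mesh, node_name=name)   # all three share the scene's world frame

    # Move the common origin to the centroid of the group so that, in the AR system,
    # pose alignment only has to be done once for the whole group.
    T = np.eye(4)
    T[:3, 3] = -scene.centroid
    scene.apply_transform(T)

    scene.export("jaw_group.glb")   # stand-in for the grouped export described in the embodiment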
Further, the step S3 is specifically:
taking the virtual three-dimensional model of the teeth as the virtual model for feature extraction, rendering an image of the virtual tooth model at a given viewing angle, extracting the edge contour feature set at that viewing angle through an edge detection algorithm, and storing the edge contour feature set in the feature database of the system;
repeating the operation a plurality of times to obtain a complete edge contour feature set covering a plurality of viewing directions.
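A minimal sketch of this feature-set construction, assuming each viewpoint of the tooth model has already been rendered off-screen to a grayscale image; the patent does not name the edge detection algorithm, so Canny (OpenCV) is used as a stand-in, and the pose list, file names and in-memory database are placeholders.

    import cv2
    import numpy as np

    feature_db = []                  # in-memory stand-in for the system's feature database
    camera_poses = [np.eye(4)]       # illustrative viewing poses; real use samples many views

    def extract_edge_contours(gray):
        # Edge contour feature set of one rendered view: Canny edges -> contour point sets.
        edges = cv2.Canny(gray, 50, 150)
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return contours

    for view_id, pose in enumerate(camera_poses):
        # render_<id>.png is assumed to be an off-screen render of the tooth model at `pose`.
        gray = cv2.imread(f"render_{view_id}.png", cv2.IMREAD_GRAYSCALE)
        if gray is None:
            continue
        feature_db.append({"pose": pose, "contours": extract_edge_contours(gray)})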
Further, the step S4 is specifically:
creating an augmented reality system, adding an augmented reality camera and connecting a database to the augmented reality system;
adding the three-dimensional model group of the teeth, the mandible and the tumor to the augmented reality system, assigning materials of different colors to the teeth, the mandible and the tumor, and setting a relatively low opacity;
importing the augmented reality system into an augmented reality device terminal, identifying the tooth region through the device terminal, and extracting the edge contour feature set of the teeth;
matching the extracted feature set against the database, finding the data with the highest similarity to the current contour feature set in the system library to obtain the corresponding lesion information, and automatically projecting the lesion information onto the teeth; the user can directly observe the position of the lesion on the maxillofacial surface through the screen.
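The patent does not specify how the "highest similarity" is computed; purely as an illustration, the sketch below uses Hu-moment shape matching (cv2.matchShapes) over the feature database built in step S3, with the dominant contour of the live camera frame standing in for the tooth contour at the current position.

    import cv2

    def best_match(live_gray, feature_db):
        # Return the database entry whose stored contours are most similar to the live view.
        edges = cv2.Canny(live_gray, 50, 150)
        live_contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not live_contours:
            return None
        live = max(live_contours, key=cv2.contourArea)        # dominant tooth contour

        best_entry, best_score = None, float("inf")
        for entry in feature_db:
            for stored in entry["contours"]:
                score = cv2.matchShapes(live, stored, cv2.CONTOURS_MATCH_I1, 0.0)
                if score < best_score:                        # lower score = more similar
                    best_entry, best_score = entry, score
        return best_entry   # entry["pose"] tells the renderer where to project the model group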
A digital visualization system for mandibular surface lesions based on edge detection comprises a database and an augmented reality device terminal; the database is used for storing the edge contour feature sets of the patient's teeth and the corresponding three-dimensional model group of the teeth, the mandible and the tumor; the augmented reality device terminal is used for identifying the edge contour features of the teeth and, through augmented reality, projecting the corresponding models onto the real teeth.
Compared with the prior art, the invention has the following beneficial effects:
the invention carries out digital development on the maxillofacial lesion by means of edge detection of teeth, can visually display the condition of the lesion to a doctor through a screen, provides visual planning for the doctor and reduces the dependence on prior knowledge.
Drawings
FIG. 1 is a flow chart of a method in an embodiment of the invention;
FIG. 2 is a prompt diagram for contour alignment in an embodiment of the present invention;
FIG. 3 shows the virtual model projected after the mandibular teeth are recognized in an embodiment of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings and the embodiments.
Referring to FIG. 1, the present invention provides a digital visualization method for mandibular surface lesions based on edge detection, comprising the following steps:
Step S1: medical image data of the patient are obtained by computed tomography (CT) and imported into medical image processing software, and the required tissue models are created according to the different threshold ranges that different human tissues occupy in the CT images.
Preferably, the threshold value of the human jaw bone is selected from 555-. Under normal conditions the maxilla and the mandible are connected by contiguous pixels in the CT image; to obtain a complete mandible, the pixels connecting the two, generally at the mandibular condyle, must be removed, and the mandible is then retained by separating the upper and lower regions with region growing. When the teeth are extracted, part of the jaw bone may be included during threshold selection, so the pixels at the junction between the jaw bone and the teeth must be removed; the junction can be selected on the model according to the visible part of the teeth. The teeth often show artifacts in the CT image, which interfere with tooth selection, so the artifact pixels must be removed manually. The threshold difference between the tumor and the soft tissue in the CT image is small, so some soft tissue may be selected when the tumor model is obtained; the soft tissue pixels around the tumor are removed manually, the tumor is separated from the soft tissue by region growing to obtain a complete tumor model, and finally the three models are exported in STL file format.
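A small sketch of the surface reconstruction and STL export at the end of this step, assuming the binary masks produced by the segmentation sketch in step S1 above (tumor_mask, mandible_mask, teeth_mask) are available; scikit-image marching cubes and trimesh stand in for the export functions of the medical image processing software.

    import numpy as np
    import SimpleITK as sitk
    import trimesh
    from skimage import measure

    def mask_to_stl(mask_img, path):
        # Surface-reconstruct a binary SimpleITK mask and write it out as an STL file.
        spacing = mask_img.GetSpacing()[::-1]                 # (z, y, x) order for numpy arrays
        volume = sitk.GetArrayFromImage(mask_img).astype(np.uint8)
        verts, faces, _, _ = measure.marching_cubes(volume, level=0.5, spacing=spacing)
        trimesh.Trimesh(vertices=verts, faces=faces).export(path)

    # The three masks come from the segmentation sketch above (step S1).
    for name, mask in (("tumor", tumor_mask), ("mandible", mandible_mask), ("teeth", teeth_mask)):
        mask_to_stl(mask, f"{name}.stl")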
In step S2, the three model files are imported into three-dimensional modeling software, the coordinate systems of the three models are unified under the same coordinate system, and the models are combined into one group, so that repeated coordinate alignment caused by differing coordinate-system center points is avoided when the models are used in the augmented reality system. The grouped models are exported in FBX file format, with millimeters as the unit.
Step S3: edge detection is performed on the virtual model of the teeth, feature point information of the tooth edge positions is extracted, and the contour feature point information of the current model, together with the corresponding pose coordinates, is stored in the augmented reality system database. The relative position of the current virtual tooth model on the screen is selected, and the edge feature information is extracted to serve as the prompt diagram for edge contour alignment in the augmented reality system.
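Purely as an illustration of the on-screen alignment prompt, a stored reference contour can be blended over the live camera frame; OpenCV drawing stands in here for the AR system's own user interface, and the frame and reference contour are assumed to come from the sketches above.

    import cv2

    def draw_alignment_prompt(frame_bgr, reference_contour, color=(0, 255, 0)):
        # Blend the stored reference tooth contour into the live camera frame as a prompt.
        overlay = frame_bgr.copy()
        cv2.drawContours(overlay, [reference_contour], -1, color, thickness=2)
        return cv2.addWeighted(overlay, 0.6, frame_bgr, 0.4, 0)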
and step S4, adding an augmented reality camera, a feature information base and a model group synthesized by the tooth model, the mandible model and the tumor model which are manufactured in the step S2 into the augmented reality system. The augmented reality system is led into augmented reality glasses or a mobile phone, a program is opened, a user can align the maxillofacial teeth through a prompt graph of contour alignment on a screen to identify, the edge contour features of the teeth at the current position are extracted, the edge contour feature set with the highest similarity in the system is matched, corresponding position information is provided, and images of preset teeth, mandibles and tumors are automatically projected and matched onto the real teeth. The user can directly observe the position information of the lesion part on the maxillofacial surface through the screen.
Preferably, in this embodiment, the colors and opacities of the model group are adjusted to RGBA (255, 255, 255, 70) for the teeth and the mandible, and RGBA (50, 70, 210, 100) for the tumor, where RGBA denotes the red, green, blue and alpha (opacity) channels.
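As a rough illustration of what those alpha values mean on screen (the actual blending is done by the AR engine's material system), a constant-opacity composite of a model render over the camera frame can be sketched as follows; masking the blend to the rendered pixels is omitted for brevity.

    import numpy as np

    def composite(frame_bgr, render_bgr, alpha_255):
        # Alpha-blend a model render over the camera frame with a constant opacity (0-255).
        a = alpha_255 / 255.0
        blended = a * render_bgr.astype(np.float32) + (1.0 - a) * frame_bgr.astype(np.float32)
        return blended.astype(np.uint8)

    # Teeth/mandible at alpha 70, tumor at alpha 100, composited over the live frame in turn:
    # frame = composite(composite(frame, bone_render, 70), tumor_render, 100)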
The above description is only a preferred embodiment of the present invention, and all equivalent changes and modifications made in accordance with the claims of the present invention shall fall within the scope of the present invention.

Claims (2)

1. A digital visualization method for mandibular surface lesions based on edge detection, characterized by comprising the following steps:
step S1, acquiring medical image data of a patient, and respectively constructing a mandible model, a tooth model and a tumor model;
step S2, unifying the coordinate systems of the mandible model, the tooth model and the tumor model;
step S3, identifying the tooth model, extracting the edge contour feature set of the tooth and storing the edge contour feature set in a database;
step S4, the terminal performs recognition by being aligned with the maxillofacial teeth, extracts the edge contour features of the teeth at the current position, matches them against the edge contour feature set with the highest similarity in the database, and automatically projects the images of the corresponding teeth, mandible and tumor onto the real teeth;
the step S1 specifically includes:
acquiring DICOM medical image data of a human mandibular surface lesion; for the acquired DICOM medical image data, extracting the CT value range of the tumor by setting upper and lower thresholds, and reconstructing a three-dimensional virtual model of the tumor;
extracting the CT value range of the jaw bone and the teeth by setting upper and lower thresholds, removing the pixel points at the junction of the maxilla and the mandible, and extracting the mandible and the teeth from the jaw bone by a region growing algorithm;
extracting the CT value range of the teeth by setting upper and lower thresholds; because the mandible overlaps the tooth threshold range, removing the pixel points at the junction with the mandible, extracting the teeth from the mandible by a region growing algorithm, and reconstructing a three-dimensional virtual model of the teeth as the virtual model used for feature extraction;
subtracting the pixel points of the tooth model from the extracted mandible-and-teeth model through a Boolean operation, retaining the mandible model without the teeth, and reconstructing a three-dimensional virtual model of the mandible;
the step S2 specifically includes: importing the three model files into three-dimensional modeling software, unifying the coordinate systems of the three models under the same coordinate system, and combining the models into one group;
the step S3 specifically includes:
taking the virtual three-dimensional model of the teeth as the virtual model for feature extraction, rendering an image of the virtual tooth model at a given viewing angle, extracting the edge contour feature set at that viewing angle through an edge detection algorithm, and storing the edge contour feature set in the feature database of the system;
repeating the operation a plurality of times to obtain a complete edge contour feature set covering a plurality of viewing directions;
the step S4 specifically includes:
creating an augmented reality system, adding an augmented reality camera and connecting a database to the augmented reality system;
adding the three-dimensional model group of the teeth, the mandible and the tumor to the augmented reality system, assigning materials of different colors to the teeth, the mandible and the tumor, and adjusting the opacity to a value below a threshold;
importing the augmented reality system into an augmented reality device terminal, identifying the tooth region through the augmented reality device terminal, and extracting the edge contour feature set of the teeth;
matching the extracted feature set against the database, finding the data with the highest similarity to the current contour feature set in the system library to obtain the corresponding lesion information, and automatically projecting the lesion information onto the teeth; the user directly observes the position of the lesion on the maxillofacial surface through the screen.
2. A system for implementing the digital visualization method for mandibular surface lesions based on edge detection according to claim 1, comprising a database and an augmented reality device terminal; the database is used for storing the edge contour feature sets of the patient's teeth and the corresponding three-dimensional model group of the teeth, the mandible and the tumor; the augmented reality device terminal is used for identifying the edge contour features of the teeth and, through augmented reality, projecting the corresponding models onto the real teeth.
CN202110556033.4A 2021-05-21 2021-05-21 Digital developing method for mandibular surface lesion based on edge detection Active CN113256820B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110556033.4A CN113256820B (en) 2021-05-21 2021-05-21 Digital developing method for mandibular surface lesion based on edge detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110556033.4A CN113256820B (en) 2021-05-21 2021-05-21 Digital developing method for mandibular surface lesion based on edge detection

Publications (2)

Publication Number Publication Date
CN113256820A CN113256820A (en) 2021-08-13
CN113256820B true CN113256820B (en) 2022-07-01

Family

ID=77183531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110556033.4A Active CN113256820B (en) 2021-05-21 2021-05-21 Digital developing method for mandibular surface lesion based on edge detection

Country Status (1)

Country Link
CN (1) CN113256820B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115645044A (en) * 2022-11-04 2023-01-31 福州大学 Oral implant image superposition method based on no-marker

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110147647A (en) * 2019-06-15 2019-08-20 广西科技大学 Tooth jawbone 3-dimensional digital modeling method based on remporomandibular joint stability
US10460839B1 (en) * 2018-11-29 2019-10-29 Richard Ricci Data mining of dental images
CN112102291A (en) * 2020-09-15 2020-12-18 北京大学口腔医学院 Method for obtaining reference data of middle-of-surface defect target by anatomical feature point matching
CN112785691A (en) * 2021-01-29 2021-05-11 北京大学口腔医学院 Mandible defect reconstruction method, device electronic equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10460839B1 (en) * 2018-11-29 2019-10-29 Richard Ricci Data mining of dental images
CN110147647A (en) * 2019-06-15 2019-08-20 广西科技大学 Tooth jawbone 3-dimensional digital modeling method based on remporomandibular joint stability
CN112102291A (en) * 2020-09-15 2020-12-18 北京大学口腔医学院 Method for obtaining reference data of middle-of-surface defect target by anatomical feature point matching
CN112785691A (en) * 2021-01-29 2021-05-11 北京大学口腔医学院 Mandible defect reconstruction method, device electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Development and Assessment of a Three-Dimensional Tooth Morphology Quiz for Dental Students; Lone, Mutahira et al.; Anatomical Sciences Education; 2019-06-04; full text *
3D-printed solid glioma model based on multimodal image fusion; Liu Yuqing et al.; Journal of Trauma and Emergency (Electronic Edition); 2019-04-30; full text *

Also Published As

Publication number Publication date
CN113256820A (en) 2021-08-13

Similar Documents

Publication Publication Date Title
US10143522B2 (en) Systems and methods for planning hair transplantation
US10357316B2 (en) Systems and methods for planning hair transplantation
US10258427B2 (en) Mixed reality imaging apparatus and surgical suite
KR102018565B1 (en) Method, apparatus and program for constructing surgical simulation information
CN112087985B (en) Simulated orthodontic treatment via real-time enhanced visualization
EP0741540B1 (en) Imaging device and method
Jiang et al. Registration technology of augmented reality in oral medicine: A review
CN107072625A (en) Treatment procedure planning system and method
CN110547869B (en) Preoperative auxiliary planning device based on virtual reality
US20240127559A1 (en) Methods for medical image visualization
CN113256820B (en) Digital developing method for mandibular surface lesion based on edge detection
CN117316393B (en) Method, apparatus, device, medium and program product for precision adjustment
CN115645044A (en) Oral implant image superposition method based on no-marker
CN117557724A (en) Head presentation method and system for brain surgery patient based on pose estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230308

Address after: Room 501-3, Floor 5, Building 14, Phase I, Innovation Park, No. 3, Keji East Road, High-tech Zone, Fuzhou, Fujian 350100

Patentee after: Fujian Qishan Lake Medical Technology Co.,Ltd.

Address before: 350100 Room 501, Floor 5, Building 14, Phase I, "Haixi High-tech Industrial Park", High-tech Zone, Fuzhou City, Fujian Province (located at No. 3, Keji East Road, Shangjie Town, Minhou County)

Patentee before: Unnamed (Fujian) Investment Group Co.,Ltd.

Effective date of registration: 20230308

Address after: 350100 Room 501, Floor 5, Building 14, Phase I, "Haixi High-tech Industrial Park", High-tech Zone, Fuzhou City, Fujian Province (located at No. 3, Keji East Road, Shangjie Town, Minhou County)

Patentee after: Unnamed (Fujian) Investment Group Co.,Ltd.

Address before: Fuzhou University, No.2, wulongjiang North Avenue, Fuzhou University Town, Minhou County, Fuzhou City, Fujian Province

Patentee before: FUZHOU University

TR01 Transfer of patent right