WO2021125851A1 - 개인 맞춤형 유방 보형물을 모델링하는 방법 및 프로그램 - Google Patents
개인 맞춤형 유방 보형물을 모델링하는 방법 및 프로그램 Download PDFInfo
- Publication number
- WO2021125851A1 (application PCT/KR2020/018592)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- breast
- modeling
- data
- breast implant
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/02—Prostheses implantable into the body
- A61F2/12—Mammary prostheses and implants
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/337—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2240/00—Manufacturing or designing of prostheses classified in groups A61F2/00 - A61F2/26 or A61F2/82 or A61F9/00 or A61F11/00 or subgroups thereof
- A61F2240/001—Designing or manufacturing processes
- A61F2240/002—Designing or making customized prostheses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10088—Magnetic resonance imaging [MRI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30068—Mammography; Breast
Definitions
- the present invention relates to a method, program and system for modeling a personalized breast implant.
- artificial breast implants are used for the reconstruction of breast defects caused by diseases and accidents, such as breast cancer, and for cosmetic and plastic surgery for deformities, and demand is increasing with the rising incidence and survival rate of breast cancer.
- there was a problem in that complications occurred when the interface of the implant did not match the internal interface of the breast, or when the shape and volume of the implant did not match those of the excised breast.
- An object of the present invention is to provide a method for modeling a personalized breast implant by merging medical image data and body scan data of a specific patient.
- a method for modeling a personalized breast implant for solving the problems of the present invention includes: acquiring medical image data and body scan data; obtaining first image data by 3D modeling the medical image data; obtaining second image data by 3D modeling the body scan data; generating third image data by merging the first image data and the second image data; and generating breast implant appearance information based on the third image data.
- the generating of the appearance information may be performed using a machine learning model, and the training data for the machine learning model may be composed of the medical image data, the first image data, the body scan data, the second image data, and the third image data.
- the merging is performed through image registration, and the image registration may include at least one of a feature element matching technique and a template-based matching technique.
- the third image data may be generated differently for the left breast and the right breast.
- the first image data may include information on the chest muscle
- the second image data may include information on the shape, size, and volume of the chest.
- a breast implant modeling apparatus for solving the problems of the present invention includes: a medical image acquisition unit for acquiring medical image data; a body image acquisition unit for acquiring body scan data; and a processor for 3D modeling the medical image data to obtain first image data, 3D modeling the body scan data to obtain second image data, merging the first image data and the second image data to generate third image data, and generating information about the appearance of the breast implant based on the third image data.
- according to the present invention, in order to produce a personalized breast implant for a patient, 3D image information of the patient's chest muscle region is acquired from existing medical image data without additional medical imaging, and, together with a body scan taken simply in a standing position, a personalized breast implant can be modeled.
- with the personalized breast implant modeled by the present invention, the appearance of the breast in the patient's daily life before surgery can be reproduced even after mastectomy. By accurately matching the chest muscle interface of the specific patient and matching the volume, the occurrence of complications such as skin perforation and seroma due to friction between the implant and the cut surface of the patient's breast can be minimized.
- FIG. 1 is a block diagram of a breast implant modeling apparatus according to an embodiment of the present invention.
- FIG. 2 is an exemplary view of manufacturing a breast implant according to an embodiment of the present invention.
- FIG. 3 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- FIG. 4 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- FIG. 5 is a flowchart illustrating a breast implant modeling process according to an embodiment of the present invention.
- FIG. 6 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- FIG. 7 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- FIG. 8 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- FIG. 9 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- FIG. 10 is an exemplary view showing a breast implant according to an embodiment of the present invention.
- the term “unit” refers to a software or hardware element, such as an FPGA or ASIC, and a “unit” performs certain roles. However, “unit” is not limited to software or hardware. A “unit” may be configured to reside on an addressable storage medium and to execute on one or more processors. Thus, by way of example, a “unit” includes elements such as software elements, object-oriented software elements, class elements, and task elements, as well as processes, functions, properties, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided within the elements and “units” may be combined into a smaller number of elements and “units” or further separated into additional elements and “units”.
- all “units” of the present specification may be controlled by at least one processor, and at least one processor may perform operations performed by the “units” of the present disclosure.
- Embodiments of the present disclosure may be described in terms of a function or a block performing a function.
- Blocks, which may be referred to as 'units' or 'modules' in the present disclosure, may be physically implemented by analog or digital circuitry such as logic gates, integrated circuits, microprocessors, microcontrollers, memories, passive electronic components, active electronic components, optical components, or hardwired circuits, and may optionally be driven by firmware and software.
- Embodiments of the present disclosure may be implemented using at least one software program running on at least one hardware device and may perform a network management function to control an element.
- spatially relative terms such as “below”, “beneath”, “lower”, “above”, and “upper” can be used to easily describe the relationship between one component and other components.
- a spatially relative term should be understood to include different orientations of a component during use or operation in addition to the orientations shown in the drawings. For example, when a component shown in the drawings is turned over, a component described as “below” or “beneath” another component is placed “above” the other component. Accordingly, the exemplary term “below” may include both the downward and upward directions. Components may also be oriented in other directions, and spatially relative terms may be interpreted according to orientation.
- 'medical image data' refers to image data such as magnetic resonance imaging (MRI), computed tomography (CT), positron emission tomography (PET), ultrasonography, or mammography, and, without being limited thereto, includes all image data of a patient's body taken for medical purposes.
- 'body scan data' includes all image data scanned outside the patient's body.
- 'first image data' is image data obtained by segmenting the muscle region from 3D-modeled medical image data of the breast region.
- 'second image data' is image data obtained by 3D modeling the breast region of the body scan data.
- 'third image data' is image data obtained by merging the first image data and the second image data.
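The relationships among the four kinds of data defined above can be sketched as a minimal data model. This is purely illustrative: the class and field names below are hypothetical and do not appear in the patent.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np

@dataclass
class ModelingData:
    """Hypothetical container for the image data defined above."""
    medical_image: np.ndarray                   # e.g. stacked MRI/CT slices, shape (D, H, W)
    body_scan: np.ndarray                       # e.g. surface point cloud, shape (N, 3)
    first_image: Optional[np.ndarray] = None    # muscle region segmented from the medical data
    second_image: Optional[np.ndarray] = None   # 3D-modeled breast region of the body scan
    third_image: Optional[np.ndarray] = None    # merged result of first and second

data = ModelingData(
    medical_image=np.zeros((8, 64, 64)),
    body_scan=np.zeros((1000, 3)),
)
print(data.third_image is None)   # → True (not yet merged)
```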
- FIG. 1 is a block diagram of a breast implant modeling apparatus according to an embodiment of the present invention.
- the breast implant modeling apparatus 100 may include a medical image acquisition unit 110, a body image acquisition unit 120, a memory 130, and a processor 140.
- the breast implant modeling apparatus 100 may be, for example, an electronic device such as a computer, an Ultra Mobile PC (UMPC), a workstation, a net-book, a Personal Digital Assistant (PDA), a portable computer, a web tablet, a wireless phone, a mobile phone, a smart phone, or a portable multimedia player (PMP).
- the electronic device may perform overall service operations such as, for example, configuring a service screen, inputting data, transmitting/receiving data, and storing data, under the control of the application.
- the breast implant modeling apparatus 100 may acquire medical image data through the medical image acquisition unit 110 .
- the medical image data may include an MRI or CT image.
- the breast implant modeling apparatus 100 may acquire body scan data obtained by scanning the body through the body image acquisition unit 120 .
- the body scan data may include all image data obtained by scanning the outside of the patient's body.
- the memory 130 of the present invention is a local storage medium capable of storing medical image data, body scan data, and first image data, second image data, and third image data extracted by the processor 140 . If necessary, the processor 140 may use data stored in the memory 130 . In addition, the memory 130 of the present invention may store instructions for operating the processor 140 .
- the memory 130 of the present invention should retain data even when power supplied to the breast implant modeling apparatus is cut off, and may be provided as a writable non-volatile memory (writable ROM) to reflect changes. That is, the memory 130 may be provided as a flash memory, an EPROM, or an EEPROM. Although, for convenience of explanation, all instruction information is described as being stored in one memory 130, the present invention is not limited thereto, and the breast implant modeling apparatus 100 may include a plurality of memories.
- the processor 140 may obtain (or extract) the first image data by 3D modeling the medical image data and segmenting the muscle region from the 3D image data.
- the muscle region may include a muscle region coupled with a breast region including mammary glands or fat.
- the processor 140 may 3D model the body scan data and obtain second image data.
- the second image data obtained by the processor 140 is a 3D model of image data scanned in a standing posture of the patient and may include a breast area among body parts. Accordingly, the second image data may include the overall shape, size, and volume of the breast.
- since the body scan data is already three-dimensional, a separate 3D modeling process may be omitted.
- the processor 140 may obtain the third image data by merging the first image data and the second image data.
- the third image data may be data including a muscle region combined with a breast region including mammary glands or fat, and the overall shape, size, and volume of the breast.
- the processor 140 may generate information about the appearance of the breast implant based on the third image data.
- the above describes the processor as acquiring the first image data and the second image data, but the present invention is not limited thereto: the first image data may be acquired by the medical image acquisition unit 110, the second image data may be acquired by the body image acquisition unit 120, and the processor 140 may generate the third image data from the first image data and the second image data to model a personalized breast implant.
- FIG. 2 is an exemplary view of manufacturing a breast implant according to an embodiment of the present invention.
- based on a personalized breast implant model, such as the breast implant model 210, the personalized breast implant 230 may be manufactured through the production process 220.
- FIG. 3 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- medical imaging including MRI is performed to manufacture the implants.
- medical imaging is performed to precisely check the extent of the lesion and establish a surgical plan.
- the MRI is performed in a lying position as shown in the figure 310, which makes it difficult to check the shape of the breast in a normal situation (eg, a situation in which the patient is standing).
- when an electronic device such as the breast implant modeling apparatus 100 performs 3D modeling for manufacturing a breast implant based only on medical image data including MRI imaging, image data including breast internal information, such as the chest muscle, is easy to secure; however, it is difficult to reproduce the breast in everyday situations because most medical image data are taken while the patient is lying down. Therefore, manufacturing a personalized breast implant in this way requires a large number of additional images, with considerable imaging time and cost, making the imaging conditions difficult.
- conversely, when modeling the implant based only on body scan data, as shown in Figure 320, the body can be photographed quickly and easily in a standing position, and it is easy to reproduce the breast in everyday situations.
- the downside is that the internal structure cannot be known: the overall shape, size, and volume of the breast can be determined from body scan data, but the location and structure of the internal muscles cannot, making it difficult to manufacture a personalized breast implant.
- the breast of a woman's body is composed of the outermost skin of the body and the mammary glands or fat covering the bones and muscles.
- Each person has a different breast shape, size, or internal structure.
- the shape, size, or internal structure of a person's left breast and right breast may be different. According to embodiments of the present invention, it may be possible to provide personalized breast implants for breasts having different shapes for each person.
- the figure 340 may be a state in which a general commercial breast implant is inserted.
- commercial breast implants not only fail to properly reflect the muscle region, such as the pectoralis major muscle line, but also fail to properly reflect the shape of the breast, so there is a problem in that an implant suitable for the patient's breast cannot be provided.
- in contrast, according to embodiments of the present invention, implants can be provided that take into consideration the muscle region combined with the breast region, including the mammary gland or fat, as well as the overall shape, size, and volume of the breast, as shown in Figure 350.
- FIG. 4 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- the MRI image 410 may include an external shape of a breast, an internal muscle region, and a mammary gland or fat.
- the 3D modeling image 420 based on the MRI image 410 does not accurately reflect the appearance of the breast as shown.
- the 3D modeling image 430 based on the body scan data may reflect the appearance of the breast in a daily environment. Therefore, as described above, a process is required that merges the first image data, obtained by segmenting the muscle region from the medical image data, and the second image data, obtained by 3D modeling the breast region of the body scan data.
- FIG. 5 is a flowchart illustrating a breast implant modeling process according to an embodiment of the present invention.
- Each step of the breast implant modeling method of the present invention may be performed by various types of electronic devices including the medical image acquisition unit 110, the body image acquisition unit 120, the memory 130, and the processor 140.
- the embodiments described for the breast implant modeling apparatus 100 are applicable, at least in part or in whole, to the breast implant modeling method, and conversely, the embodiments described for the breast implant modeling method are applicable, at least in part or in whole, to the breast implant modeling apparatus 100. Also, the breast implant modeling method according to the disclosed embodiments is not limited to being performed by the breast implant modeling apparatus 100 disclosed herein, and may be performed by various types of electronic devices.
- the processor 140 may acquire medical image data and body scan data through the medical image acquisition unit 110 and the body image acquisition unit 120 [S100].
- the medical image data may include an MRI or CT scan image.
- the body scan data may include all image data scanned outside the patient's body.
- the processor 140 may 3D model the medical image data and obtain (or extract) the first image data by segmenting the muscle region from the 3D image data [S200].
- the processor 140 may perform 3D modeling by segmenting the breast region in the patient's medical image data, and may segment the muscle region from the modeled 3D image data.
- the image segmentation or 3D modeling method is not limited.
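Since the segmentation method is left open, here is one hedged illustration only: a muscle region could be roughly separated from a 3D medical-image volume by windowing on voxel intensity. The function name, toy volume, and intensity window below are assumptions, not the patent's method.

```python
import numpy as np

def segment_muscle_region(volume, low, high):
    """Return a boolean mask of voxels whose intensity lies in [low, high].

    `volume` is a 3D array of image intensities (e.g. stacked MRI slices);
    the [low, high] window is a hypothetical range assumed to correspond
    to muscle tissue.
    """
    return (volume >= low) & (volume <= high)

# Toy 3D volume: a bright "muscle" slab embedded in a darker background.
vol = np.zeros((4, 8, 8))
vol[1:3, 2:6, 2:6] = 150.0                 # muscle-like intensities
mask = segment_muscle_region(vol, 100.0, 200.0)
print(mask.sum())                          # → 32 voxels (2 * 4 * 4)
```

In practice a real pipeline would add steps such as noise filtering and connected-component selection, but the thresholding step above conveys the idea.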
- the processor 140 may 3D model the body scan data and obtain the second image data [S300].
- the second image data may mean a breast region of the body scan data.
- the body scan data is data obtained by 3D scanning, a separate 3D modeling process may be omitted.
- the processor 140 may generate the third image data by merging the first image data and the second image data [S400].
- a process of merging the first image data and the second image data will be described later in detail with reference to FIG. 8 .
- the processor 140 may generate information about the appearance of the breast implant based on the third image data [S500].
- since the first image data includes information about the muscles inside the breast and the second image data includes information about the shape, size, and volume of the breast, the processor 140 may model a personalized breast implant that contacts the muscle (specifically, the pectoralis major muscle line) and corresponds to the shape of the breast.
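To make step S500 concrete, here is one hedged voxel-level reading (not a procedure claimed by the patent): once the first and second image data are registered onto a common grid, the implant volume can be taken as the breast volume from the body scan minus the muscle region, so that the implant's rear face conforms to the pectoral-muscle interface. All array names and shapes are hypothetical.

```python
import numpy as np

# Hypothetical voxelized volumes on a common grid (after registration):
# `outer` marks everything inside the scanned breast surface (second
# image data); `muscle` marks the chest-muscle region (first image data).
outer = np.zeros((4, 8, 8), dtype=bool)
outer[:, 1:7, 1:7] = True                  # breast volume from the body scan
muscle = np.zeros((4, 8, 8), dtype=bool)
muscle[:, 1:3, 1:7] = True                 # muscle slab at the chest wall

# The implant occupies the breast volume minus the muscle region,
# so its back surface follows the pectoral-muscle interface.
implant = outer & ~muscle
print(implant.sum())                       # → 96 voxels (144 - 48)
```

A real appearance model would then extract a smooth surface mesh from such a mask; the boolean difference is only the core idea.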
- FIG. 6 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- the processor 140 3D models the medical image data and divides the muscle region 610 from the 3D image data to obtain the first image data.
- the muscle region 610 may include the pectoralis major muscle line within the breast.
- the breast implant modeled according to embodiments of the present invention may come into direct contact with the muscle region 610 . Through this, the patient can be provided with the breast implant precisely matched to the pectoral muscle interface.
- the processor 140 may 3D model the body scan data and obtain second image data for the chest region 620 .
- the second image data may include information on the shape, size, and volume of the chest.
- the processor 140 may merge the first image data and the second image data to generate the third image data as shown in FIG. 630 .
- the processor 140 may generate information about the appearance of the breast implant based on the third image data.
- a personalized breast implant 640a may be provided to the patient based on the generated appearance information.
- the breast implant may come into direct contact with the pectoral muscles 640b and 640c inside the breast.
- FIG. 7 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- FIG. 8 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- the processor 140 generates third image data by merging the first image data and the second image data. Specifically, the processor 140 may generate the third image data through image registration.
- the image registration refers to a technique for obtaining the cross-sectional shape of a region of interest from images acquired with different imaging devices and transforming the images into a single reference coordinate system so that they overlap.
- image registration may use a feature-element matching technique, which extracts and matches major feature points of an image, or a template-based matching (template-based registration) technique, which determines the region with the highest similarity by comparing a predetermined region in an image with a specified template.
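A minimal sketch of the template-based matching technique described above, assuming plain normalized cross-correlation over a 2D image (the patent does not fix a similarity measure; the function name and toy data are illustrative):

```python
import numpy as np

def best_match(image, template):
    """Slide `template` over `image` and return the top-left offset of
    the window with the highest normalized cross-correlation."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            win = image[y:y+th, x:x+tw]
            w = win - win.mean()
            denom = np.linalg.norm(w) * t_norm
            score = (w * t).sum() / denom if denom > 0 else -np.inf
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos

tpl = np.ones((3, 3))
tpl[1, 1] = 2.0                            # a distinctive 3x3 pattern
img = np.zeros((10, 10))
img[4:7, 5:8] = tpl                        # embed the pattern at (4, 5)
print(best_match(img, tpl))                # → (4, 5)
```

Production code would typically use an optimized routine (e.g. OpenCV's `matchTemplate`) rather than this double loop, but the scoring logic is the same.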
- the feature element matching technique may consist of four steps: feature extraction, feature matching, transformation model estimation, and image registration.
- matching may use an intensity-based method such as cross-correlation (CC), mutual information (MI), or least-squares matching (LSM), or a feature-based method such as the Scale-Invariant Feature Transform (SIFT) or Speeded Up Robust Features (SURF).
- the axes of the first image data 810 and the second image data 820 are aligned.
- the axis of the image data refers to x, y, and z axes in a three-dimensional space.
- the feature point includes a specific point in which a position in a three-dimensional space does not change according to a change in a patient's state (eg, breathing).
- the feature point extraction may be implemented by an artificial intelligence algorithm including machine learning or deep learning.
- the sizes and positions of the first image data 810 and the second image data 820 are matched based on the distances or positions between the plurality of feature points, and the data are then merged to generate the third image data.
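One way to realize "matching positions based on feature points", sketched under the assumption that point correspondences between the two data sets are already known, is the Kabsch algorithm for rigid alignment. The patent does not name a specific algorithm; a uniform scale factor could be estimated similarly to match sizes.

```python
import numpy as np

def rigid_align(src, dst):
    """Find rotation R and translation t minimizing ||R @ p + t - q||
    over matched 3D feature points (Kabsch algorithm). `src` and `dst`
    are (N, 3) arrays of corresponding feature points from the first
    and second image data, respectively.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)    # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])             # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical matched feature points: dst is src rotated 90° about z
# and shifted; rigid_align should recover that transform exactly.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]])
dst = src @ Rz.T + np.array([2.0, 3.0, 4.0])
R, t = rigid_align(src, dst)
aligned = src @ R.T + t
print(np.allclose(aligned, dst))           # → True
```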
- FIG. 9 is an exemplary view for explaining a breast implant modeling process according to an embodiment of the present invention.
- the breast includes a first chest region 910, mainly composed of mammary glands and fat, and a second chest region 920, containing the pectoralis major muscle.
- the patient lies down to receive a medical image (e.g., MRI).
- the shape of the first chest region 910 varies greatly depending on posture, but the shape of the second chest region 920 does not change significantly depending on posture (standing posture and lying posture). Therefore, it is possible to model the part in contact with the muscle region in the implant by using the information included in the first image data. Also, modeling of the size and shape of the implant may be possible using information included in the second image data 930 obtained from the body scan data.
- FIG. 10 is an exemplary view showing a breast implant according to an embodiment of the present invention.
- both the right breast implant 1010a and the left breast implant 1010b may be modeled.
- the patient can thus be provided with a personalized breast implant. By accurately matching the pectoral muscle interface and the volume, the implant minimizes complications such as skin perforation and seroma caused by friction between the implant and the resected surface of the patient's breast.
- the deep neural network (DNN) of the present invention may include a system or network that constructs one or more layers in one or more computers and performs a determination based on a plurality of data.
- the deep neural network may be implemented as a set of layers including a convolutional pooling layer, a locally-connected layer, and a fully-connected layer.
- the convolutional pooling layer or local connection layer may be configured to extract features in an image.
- the fully connected layer may determine a correlation between features of an image.
- the overall structure of the deep neural network of the present invention may be formed such that a locally-connected layer is connected to the convolutional pooling layer, and a fully-connected layer is connected to the locally-connected layer.
- the deep neural network may include various judgment criteria (ie, parameters), and may add new judgment criteria (ie, parameters) through input image analysis.
- the deep neural network may have a convolutional neural network structure suitable for image analysis, in which a feature extraction layer that learns by itself the features with the greatest discriminative power from the given image data and a prediction layer that learns a prediction model achieving the highest prediction performance based on the extracted features are configured in an integrated structure.
- the feature extraction layer may be formed in a structure that alternates convolution layers, which create feature maps by applying a plurality of filters to each region of the image, and pooling layers, which spatially pool the feature maps so that the extracted features become invariant to changes in position or rotation. Through this, features at various levels can be extracted, from low-level features such as points, lines, and planes to complex and meaningful high-level features.
- the convolutional layer obtains a feature map by applying a nonlinear activation function to the dot product of the filter and the local receptive field for each patch of the input image.
- CNNs are characterized by using filters with sparse connectivity and shared weights. Such a connection structure reduces the number of parameters to be learned, makes learning through the backpropagation algorithm efficient, and consequently improves prediction performance.
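The convolution-plus-activation step and the shared-weights property described above can be sketched as follows. This is a generic illustration, not the patent's model; the edge filter, the test image, and the function name are all made up for the example.

```python
import numpy as np

def conv2d_relu(image, kernel):
    """Valid 2D convolution followed by ReLU, as in a convolutional layer.

    The same kernel (shared weights) slides over every patch, so the layer
    has only kernel.size learnable parameters regardless of image size.
    """
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw]     # local receptive field
            out[i, j] = np.sum(patch * kernel)    # dot product with the filter
    return np.maximum(out, 0.0)                   # nonlinear activation (ReLU)

# A vertical-edge filter responds only where intensity changes horizontally.
image = np.zeros((5, 5))
image[:, 3:] = 1.0                      # step edge between columns 2 and 3
edge_filter = np.array([[-1., 1.],
                        [-1., 1.]])

fmap = conv2d_relu(image, edge_filter)
print(fmap.shape)        # → (4, 4)
print(fmap.max())        # → 2.0
```

Note the parameter economy: this layer has 4 weights no matter how large the image is, whereas a fully-connected layer mapping a 5×5 input to a 4×4 output would need 25×16 weights.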
- the pooling layer (or sub-sampling layer) generates a new feature map by using local information of the feature map obtained from the previous convolutional layer.
- the feature map newly created by the pooling layer is reduced to a size smaller than the original feature map.
- representative pooling methods include max pooling, which selects the maximum value of the corresponding region in the feature map, and average pooling, which computes the average value of the corresponding region.
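The two pooling methods just described can be shown side by side on a small feature map. A minimal NumPy sketch — the `pool2d` helper and the sample values are illustrative, not part of the patent.

```python
import numpy as np

def pool2d(feature_map, size=2, mode="max"):
    """Non-overlapping pooling that shrinks a feature map by `size` per axis."""
    H, W = feature_map.shape
    h, w = H // size, W // size
    # Group the map into size x size blocks, then reduce each block.
    blocks = feature_map[:h * size, :w * size].reshape(h, size, w, size)
    if mode == "max":
        return blocks.max(axis=(1, 3))     # max pooling
    return blocks.mean(axis=(1, 3))        # average pooling

fmap = np.array([[1., 2., 5., 6.],
                 [3., 4., 7., 8.],
                 [0., 0., 1., 1.],
                 [0., 4., 1., 1.]])

print(pool2d(fmap, 2, "max"))
# → [[4. 8.]
#    [4. 1.]]
print(pool2d(fmap, 2, "mean"))
# → [[2.5 6.5]
#    [1.  1. ]]
```

Either reduction halves each spatial dimension, which is exactly the size reduction the pooling layer provides, and small shifts of a pattern inside a block leave the pooled output unchanged.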
- the feature map of the pooling layer may be less affected by the position of an arbitrary structure or pattern in the input image than the feature map of the previous layer.
- the pooling layer can extract features that are more robust to local changes such as noise or distortion in the input image or the previous feature map, and these features can play an important role in classification performance.
- another role of the pooling layer is to let the upper layers of the deep structure reflect features of a wider area: as the feature extraction layers are stacked, the lower layers reflect local features, and progressively more abstract features reflecting the entire image can be generated toward the upper layers.
- the features finally extracted through the repetition of the convolution and pooling layers may be connected, in the form of a fully-connected layer, to a classification model such as a multi-layer perceptron (MLP) or a support vector machine (SVM), and used for classification model training and prediction.
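The hand-off from pooled features to a fully-connected classification head can be sketched in a few lines. The weights below are random placeholders, not trained values, and the function name is invented for the example.

```python
import numpy as np

def fully_connected_softmax(features, weights, bias):
    """Fully-connected layer mapping extracted features to class probabilities."""
    logits = features @ weights + bias
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    return exp / exp.sum()

# Flattened features from the last pooling stage feed a 2-class head
# (purely illustrative weights, not values learned from data).
rng = np.random.default_rng(1)
features = np.array([4., 8., 4., 1.])     # e.g., a flattened 2x2 feature map
weights = rng.normal(size=(4, 2))
bias = np.zeros(2)

probs = fully_connected_softmax(features, weights, bias)
print(round(probs.sum(), 6))   # → 1.0  (softmax output is a distribution)
print(probs.shape)             # → (2,)
```

In a trained network the weights of this layer encode the correlations between extracted features that the passage above attributes to the fully-connected layer.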
- training data for machine learning may be generated based on a U-Net segmentation model.
- the U-Net segmentation model is based on an end-to-end fully convolutional network (FCN), and may form a U-shaped architecture with level-by-level skip connections by connecting a contracting path and a symmetric expansive path.
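The contracting path, expansive path, and skip connection can be traced at the shape level. This sketch only shows how tensor shapes flow through one level of a U-shaped architecture; real U-Net stages use learned convolutions, and the helper names here are invented.

```python
import numpy as np

def downsample(x):
    """Contracting-path step: halve the spatial size (2x2 average pool)."""
    h, w, c = x.shape
    return x[:h - h % 2, :w - w % 2, :].reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def upsample(x):
    """Expansive-path step: double the spatial size (nearest neighbour)."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

x = np.random.rand(64, 64, 1)   # input "image"
skip = x                        # feature map saved for the skip connection
down = downsample(x)            # 32x32 on the contracting path
up = upsample(down)             # back to 64x64 on the expansive path
merged = np.concatenate([up, skip], axis=2)   # skip connection concatenated

print(down.shape)     # → (32, 32, 1)
print(merged.shape)   # → (64, 64, 2)
```

The concatenation is the level-by-level skip connection: fine spatial detail from the contracting path is reattached to the upsampled features, which is what lets a U-shaped network produce sharp segmentation masks.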
- training data for the machine learning model may be composed of medical image data, first image data, body scan data, second image data, and third image data.
- the processor 140 may perform the process of modeling the personalized breast implant described above with reference to FIGS. 5 and 8 by using the machine learning model learned through the training data.
- various embodiments of the present invention may be implemented as software including one or more instructions stored in a storage medium (e.g., a memory) readable by a machine (e.g., the breast implant modeling apparatus 100 or a computer).
- the processor (e.g., the processor 140) of the device may call at least one of the one or more instructions stored in the storage medium and execute it, which enables the device to perform at least one function according to the called instruction.
- the one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- the term 'non-transitory storage medium' only means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave); the term does not distinguish between data stored semi-permanently and data stored temporarily in the storage medium.
- for example, the 'non-transitory storage medium' may include a buffer in which data is temporarily stored.
- the method according to various embodiments disclosed in the present specification may be provided as included in a computer program product.
- Computer program products may be traded between sellers and buyers as commodities.
- the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed online (e.g., downloaded or uploaded) through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
- in the case of online distribution, at least a portion of the computer program product (e.g., a downloadable app) may be at least temporarily stored in, or temporarily generated in, a machine-readable storage medium such as the memory of a manufacturer's server, a server of an application store, or a relay server.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Cardiology (AREA)
- Transplantation (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Robotics (AREA)
- Vascular Medicine (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
Description
Claims (10)
- A personalized breast implant modeling method performed by a computer, the method comprising: acquiring medical image data and body scan data; obtaining first image data by 3D-modeling the medical image data; obtaining second image data by 3D-modeling the body scan data; generating third image data by merging the first image data and the second image data; and generating breast implant contour information based on the third image data.
- The method of claim 1, wherein generating the contour information is performed using a machine learning model, and training data for the machine learning model consists of the medical image data, the first image data, the body scan data, the second image data, and the third image data.
- The method of claim 1, wherein the merging is performed through image registration, and the image registration includes at least one of a feature element matching technique and a template-based matching technique.
- The method of claim 1, wherein the third image data is generated differently for the left breast and the right breast.
- The method of claim 1, wherein the first image data includes information on the pectoral muscle, and the second image data includes information on the shape, size, and volume of the breast.
- A personalized breast implant modeling apparatus comprising: a medical image acquisition unit that acquires medical image data; a body image acquisition unit that acquires body scan data; and a processor that obtains first image data by 3D-modeling the medical image data, obtains second image data by 3D-modeling the body scan data, generates third image data by merging the first image data and the second image data, and generates breast implant contour information based on the third image data.
- The apparatus of claim 6, wherein the merging is performed through image registration, and the image registration includes at least one of a feature element matching technique and a template-based matching technique.
- The apparatus of claim 6, wherein the third image data is generated differently for the left breast and the right breast.
- The apparatus of claim 6, wherein the first image data includes information on the pectoral muscle, and the second image data includes information on the shape, size, and volume of the breast.
- A computer-readable recording medium storing a program for implementing the breast implant modeling method of claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/844,577 US20220395329A1 (en) | 2019-12-20 | 2022-06-20 | Method and program for modeling personalized breast implant |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2019-0171953 | 2019-12-20 | ||
KR20190171953 | 2019-12-20 | ||
KR1020200176976A KR20210080232A (ko) | 2019-12-20 | 2020-12-17 | Method and program for modeling a personalized breast implant |
KR10-2020-0176976 | 2020-12-17 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/844,577 Continuation US20220395329A1 (en) | 2019-12-20 | 2022-06-20 | Method and program for modeling personalized breast implant |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021125851A1 true WO2021125851A1 (ko) | 2021-06-24 |
Family
ID=76477904
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2020/018592 WO2021125851A1 (ko) | 2019-12-20 | 2020-12-17 | Method and program for modeling a personalized breast implant |
Country Status (3)
Country | Link |
---|---|
US (1) | US20220395329A1 (ko) |
KR (1) | KR102535865B1 (ko) |
WO (1) | WO2021125851A1 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4235572A1 (en) * | 2022-02-28 | 2023-08-30 | Imagoworks Inc. | Automated registration method of 3d facial scan data and 3d volumetric medical image data using deep learning and computer readable medium having program for performing the method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7058439B2 (en) * | 2002-05-03 | 2006-06-06 | Contourmed, Inc. | Methods of forming prostheses |
JP2015054089A (ja) * | 2013-09-12 | 2015-03-23 | 株式会社エクシールコーポレーション | Method for producing a forming cup for breast reconstruction, and forming cup for breast reconstruction |
KR20190046465A (ko) * | 2017-10-26 | 2019-05-07 | 영남대학교 산학협력단 | Method for manufacturing a customized breast implant, manufacturing system performing the same, computer program and computer-readable recording medium therefor, customized breast implant, and customized corrective brassiere |
KR20190134864A (ko) * | 2018-04-27 | 2019-12-05 | 한림대학교 산학협력단 | Breast implant manufacturing system using a 3D printer |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB201302194D0 (en) * | 2013-02-07 | 2013-03-27 | Crisalix Sa | 3D platform for aesthetic simulation |
-
2020
- 2020-12-17 WO PCT/KR2020/018592 patent/WO2021125851A1/ko active Application Filing
-
2022
- 2022-06-20 US US17/844,577 patent/US20220395329A1/en active Pending
-
2023
- 2023-01-11 KR KR1020230004346A patent/KR102535865B1/ko active IP Right Grant
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7058439B2 (en) * | 2002-05-03 | 2006-06-06 | Contourmed, Inc. | Methods of forming prostheses |
JP2015054089A (ja) * | 2013-09-12 | 2015-03-23 | 株式会社エクシールコーポレーション | Method for producing a forming cup for breast reconstruction, and forming cup for breast reconstruction |
KR20190046465A (ko) * | 2017-10-26 | 2019-05-07 | 영남대학교 산학협력단 | Method for manufacturing a customized breast implant, manufacturing system performing the same, computer program and computer-readable recording medium therefor, customized breast implant, and customized corrective brassiere |
KR20190134864A (ko) * | 2018-04-27 | 2019-12-05 | 한림대학교 산학협력단 | Breast implant manufacturing system using a 3D printer |
Non-Patent Citations (2)
Title |
---|
JEONG YOUNG JIN, CHOI DONG HUN, ANNA SEO: "Development of segmentation/modeling algorithm using medical imaging (MRI) and 3D scanning data for 3D printing artificial implants for breast reconstruction in breast defects", THE KOREAN INSTITUTE OF INFORMATION SCIENTISTS AND ENGINEERS, 1 December 2018 (2018-12-01), pages 1267 - 1269, XP055823280 * |
WANG LIQIAN; CUI XIAOYU; XUE JINQI; ZHU XUDONG; CHEN GUANGLEI; GU XI; QIAO XINBO; LIU CAIGANG: "Breast-Shape Classification and Implant Construction Method for Unilateral Breast Reconstruction", IEEE ACCESS, vol. 7, 16 October 2019 (2019-10-16), pages 157506 - 157512, XP011754574, DOI: 10.1109/ACCESS.2019.2947744 * |
Also Published As
Publication number | Publication date |
---|---|
KR20230012091A (ko) | 2023-01-25 |
US20220395329A1 (en) | 2022-12-15 |
KR102535865B1 (ko) | 2023-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6947759B2 (ja) | System and method for automatically detecting, localizing, and semantically segmenting anatomical objects | |
CN109523522B (zh) | Endoscopic image processing method, apparatus, system, and storage medium | |
Li et al. | Computer-aided detection of bleeding regions for capsule endoscopy images | |
CN107563434B (zh) | Brain MRI image classification method and apparatus based on a three-dimensional convolutional neural network | |
WO2016003258A1 (ko) | Method for generating a face model for dental procedure simulation | |
WO2021125851A1 (ko) | Method and program for modeling a personalized breast implant | |
WO2020122606A1 (ko) | Method and apparatus for measuring organ volume using an artificial neural network | |
CN116071401B (zh) | Deep-learning-based virtual CT image generation method and apparatus | |
CN113646653A (zh) | Brain image processing | |
CN112749593A (zh) | Medical imaging system, method for identifying the position of a detection object, and storage medium | |
Bessa et al. | 3D digital breast cancer models with multimodal fusion algorithms | |
CN111462139A (zh) | Medical image display method and apparatus, computer device, and readable storage medium | |
Reda et al. | Automatic pre-to intra-operative CT registration for image-guided cochlear implant surgery | |
CN111862118B (zh) | Training method, staging method, and staging system for pressure ulcer staging | |
CN113822323A (zh) | Brain scan image recognition processing method, apparatus, device, and storage medium | |
WO2021125889A1 (ko) | Apparatus and method for three-dimensional modeling of an organ through image segmentation | |
KR20210080232A (ko) | Method and program for modeling a personalized breast implant | |
WO2021206517A1 (ko) | Intraoperative vascular navigation method and system | |
Dréan et al. | Inter-individual organ-driven CT registration for dose mapping in prostate cancer radiotherapy | |
CN113822917A (zh) | A precise registration method for liver cancer radiomics images | |
WO2020101428A1 (ko) | Lesion area detection device, lesion area detection method, and computer program | |
WO2019168310A1 (ko) | Apparatus and method for spatial normalization of medical images using deep learning | |
Lee et al. | Utilizing Mask RCNN for Monitoring Postoperative Free Flap: Circulatory Compromise Detection Based on Visible-Light and Infrared Images | |
WO2023058837A1 (ko) | Method for detecting the diaphragm from chest images and apparatus therefor | |
WO2021251777A1 (ko) | Whole-body CT scan 3D modeling method and system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20903950 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20903950 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19/12/2022) |