WO2014200017A1 - Osteotomy support system, information processing device, image processing method, and image processing program - Google Patents
Osteotomy support system, information processing device, image processing method, and image processing program
- Publication number: WO2014200017A1 (application PCT/JP2014/065448)
- Authority: WO (WIPO/PCT)
- Prior art keywords: bone, osteotomy, data, image, marker
- Prior art date
Classifications
- G06T7/0014: Biomedical image inspection using an image reference approach
- A61B17/151, A61B17/152: Guides for corrective osteotomy; for removing a wedge-shaped piece of bone
- A61B34/10: Computer-aided planning, simulation or modelling of surgical operations
- G06T17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
- G06T19/00: Manipulating 3D models or images for computer graphics
- A61B2034/101: Computer-aided simulation of surgical operations
- A61B2034/105: Modelling of the patient, e.g. for ligaments or bones
- A61B2034/107: Visualisation of planned trajectories or target regions
- G06T2200/04: Indexing scheme for image data processing involving 3D image data
- G06T2207/10081: Image acquisition modality; computed x-ray tomography [CT]
- G06T2207/30008: Biomedical image processing; bone
- G06T2207/30204: Marker
- G06T2210/41: Medical
- G06T2219/008: Cut plane or projection plane definition
Definitions
- the present invention relates to a technique for supporting a surgical operation by image processing.
- Patent Document 1 discloses a technique for determining an osteotomy position from an inverted image of a healthy bone and an image of a target bone.
- Patent Document 2 discloses a technique for generating an artificial bone model for compensation based on a determined osteotomy position.
- Non-Patent Document 1 describes software for generating three-dimensional bone surface model (STL: Stereolithography) data from DICOM (Digital Imaging and Communications in Medicine) data, a standardized format for medical images such as CT (Computed Tomography) and MRI (Magnetic Resonance Imaging) images, and Non-Patent Document 2 describes software that simulates bone joint surgery in advance using the three-dimensional bone surface model (STL) data.
- Bone model 3D data creation software (BoneViewer), company report, Orthree Inc. (http://www.orthree.jp/pdf/case_bv.pdf); bone joint surgery simulation software (BoneSimulator), company report, Orthree Inc. (http://www.orthree.jp/pdf/case_bs.pdf)
- An object of the present invention is to provide a technique for solving the above-described problems.
- According to one aspect, an osteotomy support system comprises: storage means for storing three-dimensional shape data of a surgical target bone in association with position data of a marker; osteotomy plane determination means for determining, based on the three-dimensional shape data of the surgical target bone, the position and orientation of an osteotomy plane indicating the plane along which the surgical target bone is to be cut; and osteotomy plane display means for displaying the determined osteotomy plane based on an image obtained by imaging the marker.
- According to another aspect, an information processing apparatus used in the above osteotomy support system comprises: storage means for storing three-dimensional shape data of the surgical target bone in association with position data of the marker; and osteotomy plane determination means for determining, based on the three-dimensional shape data of the surgical target bone, the position and orientation of an osteotomy plane indicating the plane along which the surgical target bone is to be cut.
- According to another aspect, an image processing method used in the above osteotomy support system includes: a storage step of storing three-dimensional shape data of the surgical target bone and position data of the marker in a storage unit in association with each other; and an osteotomy plane determination step of determining, based on the three-dimensional shape data of the surgical target bone, the position and orientation of an osteotomy plane indicating the plane along which the surgical target bone is to be cut.
- According to another aspect, an image processing program used in the above osteotomy support system causes a computer to execute: a storage step of storing three-dimensional shape data of the surgical target bone and position data of the marker in a storage unit in association with each other; and an osteotomy plane determination step of determining, based on the three-dimensional shape data of the surgical target bone, the position and orientation of an osteotomy plane indicating the plane along which the surgical target bone is to be cut.
- With this configuration, a predetermined osteotomy surface can be accurately indicated to the operating doctor.
- In this specification, "movement" refers to rotational movement and/or translational (parallel) movement of a three-dimensional image expressed on a two-dimensional display screen.
- the osteotomy support system 100 is a system for supporting osteotomy surgery by image processing.
- the osteotomy support system 100 includes a storage unit 101, an osteotomy surface determination unit 102, and an osteotomy surface display unit 103.
- the storage unit 101 stores the three-dimensional shape data 111 of the surgical target bone in association with the position data 112 of the marker fixed to the surgical target bone.
- the osteotomy plane determination unit 102 determines the position and orientation of the osteotomy plane 121 indicating a plane for cutting the surgery target bone based on the three-dimensional shape data 111 of the surgery target bone.
- The osteotomy surface display unit 103 displays the determined osteotomy surface 121 based on an image obtained by imaging the marker 131 fixed to the surgical target bone.
- According to this embodiment, a predetermined osteotomy position can be accurately indicated to the operating doctor.
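- The following is a minimal sketch, not taken from the patent, of how the association held by the storage unit 101 and used by the osteotomy plane determination unit 102 and the osteotomy surface display unit 103 might be represented; all class, field, and function names are illustrative assumptions.

```python
# Minimal sketch of the data association used by the osteotomy support system 100:
# the storage unit keeps the 3D shape data of the surgical target bone together with
# the position data of the marker fixed to it, and the osteotomy plane is stored in
# the same bone coordinate system. Names are illustrative assumptions.
from dataclasses import dataclass
import numpy as np


@dataclass
class OsteotomyPlane:
    origin: np.ndarray   # a point on the plane, in bone coordinates (mm)
    normal: np.ndarray   # unit normal of the cutting plane, in bone coordinates


@dataclass
class SurgicalBoneRecord:
    bone_vertices: np.ndarray        # (N, 3) vertices of the bone surface model (e.g. from STL)
    marker_pose_in_bone: np.ndarray  # 4x4 homogeneous transform: marker frame -> bone frame
    osteotomy_plane: OsteotomyPlane  # determined by the osteotomy plane determination unit


def plane_in_marker_frame(record: SurgicalBoneRecord) -> OsteotomyPlane:
    """Express the stored osteotomy plane relative to the marker, so that imaging the
    marker later is enough to place the plane for display."""
    bone_to_marker = np.linalg.inv(record.marker_pose_in_bone)
    R, t = bone_to_marker[:3, :3], bone_to_marker[:3, 3]
    return OsteotomyPlane(origin=R @ record.osteotomy_plane.origin + t,
                          normal=R @ record.osteotomy_plane.normal)
```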
- The surgical operation support system according to the present embodiment generates three-dimensional data of a first target bone (one of the surgical target bones, which serves as the reference for placement of the surgical target bone) and of a reference bone that serves as a reference for the shape after healing, and stores them in association with a first marker (for example, a two-dimensional code) fixed to the first target bone. It also generates three-dimensional data of the second target bone (the other surgical target bone) and stores it in association with a second marker (for example, a two-dimensional code) fixed to the second target bone.
- During the operation, the osteotomy plane, the first target bone, and the second target bone are displayed based on the imaged first and second markers and the stored three-dimensional data. Then, by determining whether or not the second target bone appropriately overlaps the reference bone, an appropriate arrangement of the surgical target bone is determined. This process assists the doctor in determining the appropriate placement of the surgical target bone.
- the surgical operation support system broadly includes a preoperative preparation data generation system and an intraoperative image processing system.
- the preoperative preparation data generation system is a system for generating and storing data used during an operation by generating and displaying three-dimensional data of a first target bone, a second target bone, and a reference bone before an operation.
- the intraoperative image processing system is a system that generates and displays a target bone image and a reference bone image based on marker imaging, and assists in determining the placement of a surgical target bone.
- the preoperative preparation data generation system and the intraoperative image processing system may be configured as one integrated system.
- FIG. 2 is a diagram for explaining the outline of the entire surgical operation according to this embodiment.
- FIG. 2 shows an example of osteotomy correction surgery for a diseased bone (surgical bone) that has undergone deformation healing.
- the osteotomy correction operation includes a preparation stage 201, a surgical target bone positioning stage 202, and a surgical target bone position fixing stage 203.
- a distal radius deformity healing operation will be described as an example, but the present invention is not limited to this.
- the present invention is similarly applied to deformation healing of other parts or other bones, or fracture treatment.
- In the preparation stage 201, pins are fixed to the surgical target bone at a predetermined interval (for example, at intervals of 1 cm or 2 cm).
- The length of the pin varies depending on the affected part and the bone, but it may be about 5 cm to 10 cm, a length at which a marker can be attached outside the forearm and easily imaged.
- Next, CT (Computed Tomography) imaging is performed, and three-dimensional data of the surgical target bone with the pins attached is generated and stored. Further, the positions and orientations of the markers to be fixed later are set with respect to the pins 211 and 212, and the marker position data, the three-dimensional data of the surgical target bone, and the three-dimensional data of the reference bone are stored in association with each other.
- The position and orientation of the marker to be attached may be defined in various ways. For example, the relationship between the plane generated by the two pins and the position and orientation of the marker may be set in advance, or may be selected from a plurality of predefined relationships (for example, parallel, perpendicular, or at 45 degrees with respect to the plane generated by the two pins). Alternatively, three-dimensional shape data of one or more jigs used to fix the marker to the pins, or of the pins themselves, may be prepared, and the position of the marker may be defined by attaching the jig, in the three-dimensional space, to the three-dimensional data of the pins acquired by CT imaging.
- The marker position data is data indicating the relative position between the marker and the surgical target bone; it suffices to store the position of the marker as seen from the origin of the three-dimensional space containing the three-dimensional data of the surgical target bone. Since the osteotomy surface data also includes its relative position with respect to the three-dimensional data of the surgical target bone, the relative relationship between the position of the marker and the position of the osteotomy surface in the same three-dimensional space is thereby defined. That is, by referring to this database, the position, size, and orientation at which the osteotomy plane should be displayed can be determined from the position, size, and orientation of the marker in the captured image.
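- As a hedged illustration of the relation just described, the sketch below composes the marker pose estimated from the captured image with the stored marker-to-plane relation to obtain the pose at which the osteotomy plane should be drawn; the transform convention and names are assumptions, not the patent's implementation.

```python
# Because the osteotomy surface is stored relative to the marker in the same 3D space
# as the bone data, its display pose in camera coordinates follows from the marker pose
# detected in the captured image by a single transform composition.
import numpy as np


def compose(T_a_b: np.ndarray, T_b_c: np.ndarray) -> np.ndarray:
    """Compose 4x4 homogeneous transforms: frame c -> frame a."""
    return T_a_b @ T_b_c


def osteotomy_plane_in_camera(T_camera_marker: np.ndarray,
                              T_marker_plane: np.ndarray) -> np.ndarray:
    """T_camera_marker: marker pose estimated from the image (position, size, orientation).
    T_marker_plane: relative pose of the osteotomy plane w.r.t. the marker, stored preoperatively.
    Returns the pose at which the osteotomy plane should be drawn over the camera image."""
    return compose(T_camera_marker, T_marker_plane)
```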
- the affected part is incised and the bone is cut at the position indicated by the system.
- the markers 221 and 222 are photographed in the positioning stage 202 of the bone to be operated.
- the positions, sizes, and orientations of the markers 221, 222 are recognized from the captured images, and the positions, sizes, and orientations of the surgical target bones are derived by referring to the database.
- the operation target bones 223 and 224 and the reference bone 225 having the derived positions, sizes, and orientations are displayed.
- When the forearm is moved, the appearance of the marker 221 in the captured image changes, and the display position, size, and inclination of the surgical target bone 223 in the display image change accordingly.
- The three-dimensional shape data of the reference bone 225 is stored in advance together with its relative relationship to the position, size, and inclination of the marker 222, so that the reference bone 225 is displayed at a predetermined position when the marker 222 is imaged.
- When the doctor finds a position where the surgical target bone 223 overlaps the reference bone 225, the doctor proceeds to the surgical target bone position fixing stage 203.
- There, the pins 211 and 212 are fixed by a fixing tool 231.
- In this example, the pins 211 and 212 protrude outside the wound, but the present invention is not limited to this.
- For example, a short pin (1 to 2 cm) may be used, and a new long pin may be connected to the short pin during the operation (positioning stage 202), with the markers 221 and 222 attached to it.
- Alternatively, a pin may be virtually inserted into bone CG data generated by CT imaging of the bone alone (without inserting a pin), and the bone may then be opened during the operation so that an actual pin is inserted at the position indicated by the CG data.
- In that case, the position of the marker is determined using the CG data of the bone with the virtual pin, a 3D printer generates a mold (a guide with pin holes) that fits the affected bone, and an actual pin may be inserted through the mold at the same position as the pin in the CG data.
- Alternatively, a marker may be attached to the mold itself while the mold is fitted to the bone.
- Alternatively, feature points of the bone imaged with a digital camera may be identified and superimposed on the CG data with the pin, so that the actual pin is inserted at the same position and in the same direction as in the CG data.
- FIG. 3 is a diagram showing a configuration of the preoperative preparation data generation system 320.
- the preoperative preparation data generation system 320 includes an information processing device 324 for generating a reference image and a CT scanner 321 for acquiring a tomographic image of the patient 322, which are connected by a network 323. Furthermore, as an option, an STL data generation server 325 that generates three-dimensional bone surface data (STL data) from the tomographic image data may be provided.
- the network may be a wide area network or a LAN.
- the CT scanner 321 acquires a tomographic image of the affected part of the patient 322 and a part serving as a reference of the affected part.
- In this example, a tomographic image of the right forearm, in which four pins are inserted and fixed to the surgical target bone, and a tomographic image of the left forearm on the healthy side are acquired.
- the tomographic image data is sent to the information processing apparatus 324 via the network 323 and converted into three-dimensional data by the information processing apparatus 324.
- the STL data generation server 325 may perform conversion from tomographic image data to three-dimensional data.
- the biological data used in the present embodiment is not limited to data acquired by CT / MRI, and the three-dimensional data is not limited to STL data.
- FIG. 4 is a diagram for explaining an outline of preoperative preparation data generation processing using the information processing device 324.
- Images 401 to 406 are CG (Computer Graphics) images displayed on the display screen of the information processing device 324, and each correspond to each stage of preoperative preparation data generation processing.
- In the first stage, three-dimensional data 411 of the unaffected bone on the healthy side, which is located at a position symmetrical to the surgical target bone of the forearm 213, is generated by internal imaging such as CT or MRI. The data 411 is then inverted to generate mirror image data, yielding three-dimensional data 412 of a reference bone (hereinafter referred to as the reference bone image 412) having the same shape as the surgical target bone (at least partially overlapping it).
- the reference bone image 412 is not limited to the mirror image data of the unaffected bone on the healthy side.
- Other bones of the patient having a similar shape, the corresponding bone of another patient, or a bone model created by CG may be used.
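- As a rough illustration of the mirroring step (not the patent's implementation), the following sketch flips the healthy-side bone surface across an assumed mirror plane to obtain reference-bone data; the array layout and axis choice are assumptions.

```python
# Mirroring the healthy-side bone surface, given as an (M, 3, 3) array of triangle
# vertices (e.g. loaded from STL), yields reference-bone data of the same shape as
# the pre-deformation surgical target bone.
import numpy as np


def mirror_bone(triangles: np.ndarray, axis: int = 0) -> np.ndarray:
    """triangles: (M, 3, 3) array, M triangles x 3 vertices x (x, y, z)."""
    mirrored = triangles.copy()
    mirrored[:, :, axis] *= -1.0          # reflect every vertex across the chosen plane
    mirrored = mirrored[:, ::-1, :]       # reverse vertex order to keep outward-facing normals
    return mirrored
```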
- In the second stage, three-dimensional data 421 of the surgical target bone (affected bone), generated by internal imaging of the surgical target bone of the forearm 213 by CT scan or the like, is displayed (hereinafter referred to as the surgical target bone 421). Since the surgical target bone 421 is generated from STL data imaged with the pins 211 and 212 fixed, the three-dimensional data also includes the pins 211 and 212. The reference bone image 412 and the surgical target bone 421 are then compared on the display screen, and the state of the surgical target bone 421 is confirmed.
- the surgical target bone 421 is operated on the image 403 while referring to the image. That is, the operation target bone 421 is moved or rotated with respect to the reference bone image 412 to overlap each end of the reference bone image 412 with each end of the operation target bone 421.
- the lower ends of the operation target bone 421 and the reference bone image 412 are overlapped to determine the osteotomy surface of the operation target bone 421.
- With the shape (feature points) of the joint (the lower end or the upper end in the figure) as a reference, the distortion and bending of the surgical target bone 421 are recognized: moving gradually upward from the lower end, the surgical target bone progressively separates from the reference bone, and the branch position where this separation starts is defined as the osteotomy image 431.
- the osteotomy image 431 is a rectangular plane having a predetermined shape and size, but the present invention is not limited to this.
- a surface including a curved surface may be used according to the purpose of osteotomy.
- The doctor may evaluate and determine the osteotomy image 431 while observing how the reference bone image 412 and the surgical target bone 421 are superimposed, but the optimal osteotomy plane may also be obtained by calculation. For example, after overlapping the lower ends, the non-overlapping volume per unit length in the axial direction of the surgical target bone 421 and the reference bone image 412 is calculated sequentially from the lower end, and a plane connecting the highest points at which the non-overlapping volume does not exceed a predetermined value may be used as the osteotomy image 431. Alternatively, the surface of the reference bone image 412 may be subdivided into unit areas, and the osteotomy image 431 may be generated automatically by connecting the positions at which the perpendicular distance to the surface of the surgical target bone 421 exceeds a predetermined value for each unit area.
- Alternatively, as in the image 404, when the two target bone images 441 and 442 generated by the osteotomy are superimposed on the upper and lower ends of the reference bone image 412, the osteotomy plane may be determined so that the total volume protruding from the reference bone image 412 (or the total distance between the surfaces) is minimized.
- Alternatively, the osteotomy surface may be determined so that the gap (defect region) 443 sandwiched between the separated target bone image 441 and target bone image 442 is minimized.
- In any case, the optimal position and angle of the osteotomy plane can be determined by repeatedly simulating planes of every orientation as osteotomy plane candidates while shifting the position along the axial direction of the surgical target bone 421 by a unit distance (for example, 1 mm).
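- The sketch below is a simplified, assumption-laden version of the calculation described above: it presumes the surgical target bone and the reference bone have already been aligned at their lower ends and voxelised onto a common boolean grid, and it scans upward to find the first axial level at which the per-slice non-overlapping volume exceeds a threshold, as a candidate osteotomy level.

```python
# Per-slice non-overlap between the aligned surgical target bone and the reference bone,
# scanned from the lower end upward (z = bone axis, voxel edge = `voxel_mm`).
from typing import Optional
import numpy as np


def candidate_osteotomy_level(target_vox: np.ndarray,
                              reference_vox: np.ndarray,
                              voxel_mm: float,
                              max_nonoverlap_mm3: float) -> Optional[int]:
    """target_vox, reference_vox: boolean arrays of shape (nx, ny, nz), True inside the bone."""
    voxel_volume_mm3 = voxel_mm ** 3
    nonoverlap = np.logical_xor(target_vox, reference_vox)      # voxels covered by only one bone
    per_slice_mm3 = nonoverlap.sum(axis=(0, 1)) * voxel_volume_mm3
    for z in range(target_vox.shape[2]):                        # from the lower end upward
        if per_slice_mm3[z] > max_nonoverlap_mm3:
            return z                                             # first level where the bones diverge
    return None                                                  # never diverges beyond the threshold
```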
- the osteotomy image 431 is determined in this way, in the fourth stage, three-dimensional data of two target bone images 441 and 442 obtained by separating the surgery target bone 421 by the osteotomy image 431 is generated and stored. That is, the superimposed target bone image 442 and reference bone image 412 are stored as one set in association with the marker 222 attached to the pin 212. Then, as shown in an image 404, the target position of the target bone image 441 with respect to the target bone image 442 or the reference bone image 412 is stored in association with the position data of the marker 221 attached to the pin 211. Thereby, if the position and inclination of the marker 221 can be recognized in the real space, the position and inclination of the target bone image 441 can be estimated.
- the position, shape, and inclination data of the osteotomy image 431 are stored in association with the position data of the marker 221 or the marker 222.
- The position and orientation of the marker 221 with respect to the pin 211 and of the marker 222 with respect to the pin 212 may be fixed in advance, but in the present embodiment they can be selected from a plurality of attachment types (for example, four).
- the first marker attachment type is a type in which a marker is attached in parallel to a pin plane formed by two pins.
- the second marker attachment type is a type in which a marker is attached to a plane parallel to the axial direction of the pin and perpendicular to the pin plane.
- the third marker attachment type is a type in which the marker is attached to a plane that is parallel to the axial direction of the pin and forms 45 degrees with the pin plane.
- the fourth marker attachment type is a type in which a marker is attached to a plane that is parallel to the axial direction of the pin and forms 135 degrees with the pin plane.
- the relative positional relationship between the marker and the surgical target bone or reference bone to be displayed may be changed.
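- Purely as an illustration of how the selectable attachment types could be encoded (names and conventions assumed), the sketch below maps each type to the normal of the marker mounting plane, obtained by rotating the pin-plane normal about the pin axis by 0, 90, 45, or 135 degrees.

```python
# Each attachment type places the marker plane parallel to the pin axis and at a fixed
# angle to the plane generated by the two pins, so the marker-to-bone relation can be
# looked up from the selected type.
import numpy as np

ATTACHMENT_ANGLES_DEG = {1: 0.0, 2: 90.0, 3: 45.0, 4: 135.0}


def marker_normal_for_type(pin_axis: np.ndarray,
                           pin_plane_normal: np.ndarray,
                           attachment_type: int) -> np.ndarray:
    """Rotate the pin-plane normal about the pin axis by the angle of the chosen type,
    giving the normal of the plane on which the marker is mounted."""
    angle = np.deg2rad(ATTACHMENT_ANGLES_DEG[attachment_type])
    k = pin_axis / np.linalg.norm(pin_axis)
    v = pin_plane_normal / np.linalg.norm(pin_plane_normal)
    # Rodrigues' rotation formula
    return (v * np.cos(angle)
            + np.cross(k, v) * np.sin(angle)
            + k * np.dot(k, v) * (1.0 - np.cos(angle)))
```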
- Thereby, based on the positions, sizes, and orientations of the markers imaged at the time of surgery, the target bone image 441 and the reference bone image 412, the target bone image 442, and the osteotomy image 431 can all be displayed.
- The gap 443 between the target bone image 441 and the target bone image 442 represents the shape of the supplementary bone needed at the time of surgery; therefore, the three-dimensional shape of that bone can also be acquired at this point.
- a combination of the target bone images 441 and 442 determined as the target arrangement in the image 404 may be displayed as a unit.
- Alternatively, the positions of the pins 211 and 212, which are the support members of the first and second markers 221 and 222, in the state where both target bone images 441 and 442 are superimposed on the reference bone image 412 may be stored as target relative position data, and the target position of the pin 212 carrying the second marker 222 may be displayed based on the stored target relative position data.
- In the above description, osteotomy correction surgery of an affected bone (surgical target bone) has been considered, in which target bones exist on both sides of the osteotomy plane, but the present invention is not limited to this.
- For example, in artificial joint replacement, an accurate cut can be made by displaying, using the AR technique described above, the osteotomy surfaces (for example, three surfaces) for generating the surface to which the artificial joint is to be attached.
- three-dimensional CG models of artificial joints of three sizes of S, M, and L are prepared.
- The relative positional relationship between the adhesion surface of the artificial joint model and the marker may be stored, and a blade model may be displayed by AR along that adhesion surface during the operation.
- a blade indicating the osteotomy surface may be attached to the 3D model of the artificial joint. In this case, one marker is sufficient.
- a marker may be attached to an actual blade to recognize its position and instruct the movement of the blade to the target position.
- FIG. 5 is a diagram showing a schematic configuration of an intraoperative image processing system 500 according to the present embodiment.
- the intraoperative image processing system 500 includes a tablet computer 501 as an information processing device and a display device 502.
- the tablet computer 501 includes a display 511 and a camera 512 (digital camera).
- Tablet computer 501 is fixed at a position where display 511 faces doctor 503 and camera 512 faces markers 221 and 222.
- the tablet computer 501 stores the three-dimensional data of the surgical target bone in advance, and recognizes the position and direction of the surgical target bone from the images of the markers 221 and 222.
- the tablet computer 501 displays an ideal osteotomy image at the recognized position on the display 511. Thereby, the doctor 503 can grasp the osteotomy surface at a glance.
- the doctor 503 grasps the forearm 213 of the patient 322 and twists or extends it, the positions of the markers 221 and 222 change accordingly, so that the operation target bone 421 in the display 511 also moves or rotates. .
- the target position of the surgical target bone is determined.
- the pins 211 and 212 are fixed using the fixing tool 231 at the determined position.
- FIG. 6A is a screen transition diagram for explaining the outline of the osteotomy operation and positioning operation of the surgical target bone during the operation. Prior to the operation, the markers 221 and 222 are fixed to the pins 211 and 212.
- the osteotomy surface image 431 is displayed three-dimensionally on the display 511, and the target bone is cut at an appropriate position.
- In FIG. 6A, portions indicated by thick lines are images captured by the camera 512, and portions indicated by thin lines are CG images generated from the three-dimensional data.
- the doctor inserts an osteotomy blade into the affected area according to the osteotomy image 431 and separates the affected bone that has undergone deformation healing.
- the patient's forearm is moved while referring to the coordinate space image 622 and the divided display images 623 to 626 to operate the target bone image 441 with respect to the target bone image 442.
- target bone images 441 and 442 having positions, sizes, and directions corresponding to the positions, sizes, and directions of the markers 221, 222 obtained by photographing are displayed.
- The image 622 displays the angles of the observation point with respect to the X, Y, and Z axes in the three-dimensional space and extracts the relative positions of the reference bone image 412 and the target bone images 441 and 442 in that space; by moving the observation point, the target bone images 441 and 442 can be rotated on the screen.
- Images 623 to 626 are divided display images displayed on one screen, and the image 623 is the same as the image 621 and is a superimposed image of a captured image and a CG image.
- the image 624 is obtained by extracting only the CG image from the image 623, and here, a reference bone with a pin and a target bone are displayed.
- The image 625 shows the reference bone image 412 and the target bone images 441 and 442 as viewed from the axial direction of the bone, at 90 degrees to the camera 512, and the image 626 shows them as viewed from the pin insertion direction, also at 90 degrees to the camera 512.
- the images 624 to 626 are three display images with the three axial directions in the three-dimensional space as observation points. The doctor determines an appropriate arrangement of the target bone images 441 and 442 while observing these display screens.
- the image 627 shows a state in which the target bone image 441 is superimposed on the reference bone image 412. In this state, the pins 211 and 212 attached to the target bone images 441 and 442 are fixed with a fixing tool.
- FIG. 6B is a diagram for explaining the outline of the intraoperative osteotomy support processing according to the present embodiment.
- The display screen 601 shows the affected part (the left forearm) 213, containing the surgical target bone, imaged by the camera 512, together with the markers 221 and 222 fixed to the respective separated surgical target bones. An incision portion 611 for cutting the bone and a holding device 612 that holds the incision portion 611 open for the osteotomy are also displayed.
- the display screen 601 displays target bone images 441 and 442 that are generated in advance and stored in the storage unit based on the positions, sizes, and orientations of the markers 221 and 222, superimposed on the captured image. Further, in order to assist osteotomy, an osteotomy image 431 selected in advance and stored in the storage unit is superimposed on the surgical target bone image and displayed at the osteotomy position at the osteotomy angle.
- the display screen 602 is a screen when the osteotomy image 431 is matched with the depth direction of the display screen 602 by moving the patient's forearm or camera position. If the osteotomy tool 628 is applied to the bone along the osteotomy surface image 431 displayed on the display screen 602 and osteotomy is performed, a very accurate osteotomy can be realized.
- FIG. 7 is a block diagram illustrating a functional configuration of the information processing apparatus 324.
- In this example, CT data is used as the tomographic image data, and STL data is used as the three-dimensional bone surface model data.
- Each functional unit of the information processing device 324 is realized by a CPU executing a program, using a memory, to process image data.
- the CT data acquisition unit 711 acquires CT data (DICOM) from the CT scanner 321 as an image of the patient 322.
- the CT database 712 stores the CT data acquired by the CT data acquisition unit 711 in a searchable manner.
- the bone shape data generation unit 713 generates STL data as 3D bone surface model data from CT data.
- the STL data DB 714 accumulates the searchable STL data generated by the bone shape data generation unit 713.
- the display unit / operation unit 715 includes a display, a touch panel, and the like.
- The display unit/operation unit 715 functions as a bone image display unit that three-dimensionally displays a bone image based on the STL data generated by the bone shape data generation unit 713, and three-dimensionally moves (rotates and translates) the bone image in accordance with instructions from the doctor. In this example, the image of the patient's surgical target bone and the image of the healthy-side bone are displayed so that they can be superimposed simultaneously. Osteotomy position information for the surgical target bone can also be input through the display unit/operation unit 715.
- a plurality of partial bones (first target bone / second target bone) obtained by cutting and separating the surgical target bone at the osteotomy position can be independently displayed in three dimensions (rotation and movement).
- the reference bone data generation unit 716 generates reference bone data by horizontally flipping the three-dimensional data of the healthy bone.
- The three-dimensional data generation unit 717 generates three-dimensional reference data by superimposing, in a virtual three-dimensional space, the three-dimensional shape data of the first target bone separated based on the osteotomy position information and that of the reference bone, and stores the generated data in the preoperative preparation data DB 719. The three-dimensional data generation unit 718 generates three-dimensional shape data of the second target bone, which is also stored in the preoperative preparation data DB 719.
- The superposition of the target bone and the reference bone may be performed based on the doctor's operation, or the three-dimensional data generation units 717 and 718 may perform it automatically based on the bone shape (in particular, the shape of the joint portion).
- the preoperative preparation data DB 719 accumulates the three-dimensional data generated by the three-dimensional data generation units 717 and 718 so as to be searchable using STL data.
- the STL data stored in the preoperative preparation data DB 719 is used in the intraoperative image processing system 500.
- FIG. 8 is a diagram showing a configuration of the STL data DB 714 according to the present embodiment.
- The STL data DB 714 stores STL data representing the three-dimensional bone surface models in the present embodiment so that they can be searched.
- the STL data DB 714 stores a CT data acquisition date 802, a patient name 803, an affected part 804, a symptom 805, and CT data 806 in association with the image ID 801.
- the STL data DB 714 stores STL data 807 generated from the CT data 806 and an STL data generation source 808 when the STL data is generated externally.
- FIG. 9 is a diagram showing a configuration of the preoperative preparation data DB 719 according to the present embodiment.
- The preoperative preparation data DB 719 stores STL data representing the three-dimensional bone images in the present embodiment so that they can be searched.
- the preoperative preparation data DB 719 stores an affected area 902, a symptom 903, three-dimensional data 904 associated with the first marker, and three-dimensional data 905 associated with the second marker in association with the patient name 901.
- the three-dimensional data 904 includes three-dimensional data of the first target bone, three-dimensional position data of the first marker support device, and three-dimensional data of the reference bone.
- the three-dimensional data 905 includes three-dimensional data of the second target bone and three-dimensional position data of the second marker support device.
- the three-dimensional data 904 and 905 are stored in a format in which the display bone image can be moved and rotated in a three-dimensional space.
- FIG. 10A is a flowchart showing a processing procedure of the entire surgical operation support system including the preoperative preparation data generation system 320 and the intraoperative image processing system 500.
- In step S1001, the preoperative preparation data generation system 320 acquires a tomographic image (for example, a CT image) of the surgical target bone with the pins fixed and a tomographic image of the healthy bone, and generates the respective three-dimensional data.
- In step S1003, while the generated three-dimensional shape data is displayed, an appropriate arrangement of the osteotomy image 431 and of the bones after osteotomy is determined, and their position data are stored.
- In step S1005, the intraoperative image processing system 500 captures an image of the markers fixed to the surgical target bone.
- In step S1007, the intraoperative image processing system 500 generates an osteotomy image that changes in accordance with the movement of the markers and displays it overlaid on the captured image of the affected part.
- the doctor applies a blade to the bone according to the osteotomy image while looking at the display screen, and cuts the bone. Further, a bone image of the first target bone and the reference bone and a bone image of the second target bone are generated and displayed. The doctor moves the forearm while looking at the display screen.
- In step S1009, the intraoperative image processing system 500 confirms that the two target bones of the forearm are arranged so that the bone image of the second target bone matches the bone image of the reference bone. If they do not match, the intraoperative image processing system 500 returns to step S1005 and continues the processing until the target bones are arranged at the matching position.
- FIG. 10B is a flowchart illustrating a procedure of the osteotomy surface generation process (S1003) of FIG. 10A.
- In step S1021, the information processing apparatus 324 reads the three-dimensional shape data of the surgical target bone and the reference bone from the preoperative preparation data DB 719 and performs image display processing.
- In step S1023, an osteotomy surface is temporarily determined, and three-dimensional shape data of the first and second target bones separated by the osteotomy surface is generated.
- The temporary determination of the osteotomy surface may be made by a position instruction from the doctor, or an appropriate position may be determined by the system (for example, a position 3 cm from the end).
- That is, in step S1023, the information processing device 324 separates the target bone based on the three-dimensional data of the osteotomy surface, generates shape data of the separated bones (first and second target bones), and stores each set of separated bone data and osteotomy surface data.
- In step S1025, the end of the first target bone opposite to its osteotomy surface and the end of the second target bone opposite to its osteotomy surface are superimposed on the two ends of the reference bone. At this time, not only the position but also the angle and direction are matched to the reference bone.
- In step S1027, the osteotomy plane is evaluated in that state. There are various methods for evaluating the osteotomy surface.
- Any one of the following methods, or a combination of several of them, can be selected.
- For example, after overlapping the feature points at both ends, the evaluation may be based on the gap (defect site) between the two target bones and on the volume by which the target bones fail to overlap the reference bone (in both cases, the smaller the better).
- In step S1027, if an evaluation satisfying the predetermined threshold is obtained, the process proceeds to step S1029.
- Otherwise, the process returns to step S1021, and the position and inclination of the osteotomy plane are reset. The osteotomy surface is then changed sequentially, and the process is repeated until the evaluation value satisfies the predetermined threshold, so that an appropriate osteotomy surface is found. For example, as described above, the osteotomy position is shifted upward from the lowest point of the surgical target bone 421 along the axial direction by a unit distance (for example, 1 mm), and planes of every orientation are repeatedly simulated as osteotomy surface candidates, so that the optimum position and angle of the osteotomy surface can be obtained.
- Conversely, the osteotomy plane may be changed downward from the uppermost point of the surgical target bone 421, or the optimal osteotomy plane may be searched for within a range set by the doctor.
- In step S1029, the determined osteotomy surface and the marker attachment information (the position, angle, size, and marker type predetermined for when the marker is attached) are associated with each other and registered in the preoperative preparation data DB 719.
- FIG. 11 is a block diagram showing a functional configuration of the tablet computer 501 in the intraoperative image processing system 500 according to the present embodiment.
- Each functional unit of the tablet computer 501 is realized by executing a program while using a memory by a CPU (not shown).
- Although the tablet computer 501 is used in the present embodiment, the present invention is not limited to this; any portable information processing terminal including a display and a camera may be used. The camera and the display unit/operation unit may also be separate from the information processing apparatus and perform data communication with each other.
- the camera 512 images the affected part of the patient 322 in the operating room.
- the imaging range of the camera 512 includes markers 221 and 222 that are fixed at two locations on the surgical target bone of the forearm 213 of the patient 322.
- the marker analysis unit 1111 refers to the marker DB 1112 and analyzes the type of image to be displayed and the position and orientation at which the image is to be displayed from the marker image captured by the camera 512.
- the preoperative preparation data 1119 is the same as the data stored in the preoperative preparation data DB 719 shown in FIG.
- The preoperative preparation data 1119 may be copied from the information processing apparatus 324 illustrated in FIG. 7 to the tablet computer 501 by communication or via a storage medium, or may be acquired by directly accessing the preoperative preparation data DB 719 in the information processing apparatus 324 from the tablet computer 501.
- The CG image generation unit 1114 generates the CG image to be displayed based on the three-dimensional positions and orientations of the markers acquired from the marker analysis unit 1111 and on the three-dimensional data of the target bones and the reference bone included in the preoperative preparation data 1119.
- The CG image generation unit 1114 functions as first bone image generation means that generates a bone image of the first target bone and a bone image of the reference bone from the three-dimensional data of the first target bone and of the reference bone, based on the position, size, and orientation of the captured first marker.
- The CG image generation unit 1114 also functions as second bone image generation means that generates a bone image of the second target bone from the three-dimensional data of the second target bone, based on the position, size, and orientation of the captured second marker.
- The display image generation unit 1115 superimposes the surgical target bone images and the reference bone image generated by the CG image generation unit 1114 on the image of the affected part of the forearm 213 of the patient 322 captured by the camera 512, and generates display image data for the display. Using this image data, the target bone images and the reference bone image are superimposed simultaneously on the affected-part image on the display 511. It is also possible to display an image with the observation point moved, or to display images from a plurality of observation points simultaneously. That is, the display image generation unit 1115 displays the bone image of the first target bone, the bone image of the reference bone, and the bone image of the second target bone so that the doctor can search for the positions of the first and second markers at which the second target bone overlaps the reference bone. In this display, the display image generation unit 1115 changes the relative position between the bone image of the first target bone and the bone image of the second target bone in accordance with the change in the relative position between the first marker and the second marker.
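- A hedged sketch of the overlay step follows: bone vertices already converted into camera coordinates are projected with the camera intrinsics and drawn over the captured affected-part image. The OpenCV calls are standard; drawing the model as projected points is a simplification of the actual CG rendering, and all names are assumptions.

```python
# Project 3D bone vertices (camera coordinates) onto the captured frame and draw them.
import cv2
import numpy as np


def overlay_bone(frame_bgr: np.ndarray,
                 bone_vertices_cam: np.ndarray,   # (N, 3) vertices already in camera coordinates
                 camera_matrix: np.ndarray,
                 dist_coeffs: np.ndarray,
                 color=(0, 255, 0)) -> np.ndarray:
    rvec = np.zeros(3)                            # identity rotation: points are in camera frame
    tvec = np.zeros(3)
    pts_2d, _ = cv2.projectPoints(bone_vertices_cam.astype(np.float64),
                                  rvec, tvec, camera_matrix, dist_coeffs)
    out = frame_bgr.copy()
    h, w = out.shape[:2]
    for x, y in pts_2d.reshape(-1, 2):
        if 0 <= x < w and 0 <= y < h:
            cv2.circle(out, (int(x), int(y)), 1, color, -1)
    return out
```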
- FIG. 12 is a diagram showing a configuration of the marker DB 1112 according to the present embodiment.
- the marker DB 1112 is used by the marker analysis unit 1111 to analyze the three-dimensional position and orientation of the marker (that is, the position and orientation of two pairs of pins) from the image data captured by the camera 512.
- the marker DB 1112 stores two-dimensional code matrix data 1202 in association with the marker ID 1201.
- The matrix data 1202 is, for example, binary or multi-valued bit data arranged in two-dimensional coordinates and indicating black/white or colors; the three-dimensional position and orientation of the marker can be recognized from changes in the coordinate values.
- the two-dimensional code is not limited to this.
- the marker DB 1112 stores a marker shape 1203 and a marker size 1204.
- FIG. 13 is a diagram illustrating a configuration of a marker analysis table 1301 used by the marker analysis unit 1111.
- The marker analysis table 1301 is used to obtain, from the marker image captured by the camera 512, the two-dimensional data on the marker, the position, size, and orientation of the marker, or the three-dimensional data of the marker support device, and to generate three-dimensional display data of the target bone images and the reference bone image.
- The marker analysis table 1301 stores the marker two-dimensional code frame 1311 extracted from the captured image, the marker two-dimensional code matrix data 1312, and the marker ID 1313 determined from the matrix data 1312, as well as the marker position, size, and orientation 1314 and the three-dimensional position and orientation 1315 of the marker calculated from them. From the three-dimensional position and orientation 1315 of the marker, the position, size, and orientation at which the three-dimensional data of the target bone should be displayed on the display can be determined.
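- As an illustration of the marker analysis (detection of the two-dimensional code itself is assumed to be done elsewhere, e.g. by a QR/ArUco-style detector), the sketch below estimates the marker's three-dimensional position and orientation from the four detected corner points and the physical marker size using a perspective-n-point solution; the camera intrinsics are assumed to be known.

```python
# Estimate the marker pose in camera coordinates from the detected 2D code corners.
import cv2
import numpy as np


def estimate_marker_pose(corners_px: np.ndarray,       # (4, 2) image corners, ordered
                         marker_size_mm: float,
                         camera_matrix: np.ndarray,     # (3, 3) intrinsics
                         dist_coeffs: np.ndarray):      # distortion coefficients
    half = marker_size_mm / 2.0
    # Corner coordinates in the marker's own frame (z = 0 plane), same order as corners_px.
    object_pts = np.array([[-half,  half, 0.0],
                           [ half,  half, 0.0],
                           [ half, -half, 0.0],
                           [-half, -half, 0.0]], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(object_pts, corners_px.astype(np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)                          # rotation marker -> camera
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T                                            # 4x4 pose of the marker in camera coordinates
```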
- (3D data generation tables) FIGS. 14 and 15 are diagrams illustrating the configuration of the intraoperative image generation tables 1401 and 1501 used by the CG image generation unit 1114.
- The intraoperative image generation table 1401 stores, in association with the first target bone and reference bone ID 1411, the analyzed three-dimensional position data 1412 of the first marker and the three-dimensional position data 1413 of the first marker stored in the preoperative preparation data DB 719. Using the conversion vector that converts the three-dimensional position data 1413 of the first marker into the three-dimensional position data 1412, the three-dimensional data of the first target bone stored in the preoperative preparation data DB 719 is coordinate-converted, and the resulting three-dimensional data 1414 of the first target bone for display is stored.
- the intraoperative image generation table 1401 stores three-dimensional data 1416 indicating the position, shape, inclination, and the like of the osteotomy plane in association with the first target bone and the reference bone ID 1411.
- Similarly, the intraoperative image generation table 1501 stores, in association with the second target bone ID 1511, the analyzed three-dimensional position data 1512 of the second marker and the three-dimensional position data 1513 of the second marker stored in the preoperative preparation data DB 719. Using the conversion vector that converts the three-dimensional position data 1513 of the second marker into the three-dimensional position data 1512, the three-dimensional data of the second target bone stored in the preoperative preparation data DB 719 is coordinate-converted, and the resulting three-dimensional data 1514 of the second target bone for display is stored.
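- The following sketch restates the coordinate conversion of tables 1401 and 1501 with 4x4 homogeneous transforms standing in for the stored "three-dimensional position data"; it is an assumption-based illustration, not the patent's data format.

```python
# Apply the transform that carries the preoperative marker pose onto the analysed
# intraoperative marker pose to the preoperative bone vertices, giving display data.
import numpy as np


def convert_bone_for_display(bone_vertices_preop: np.ndarray,  # (N, 3), preoperative coordinates
                             T_preop_marker: np.ndarray,       # marker pose stored preoperatively (4x4)
                             T_intraop_marker: np.ndarray) -> np.ndarray:  # analysed marker pose (4x4)
    T_conv = T_intraop_marker @ np.linalg.inv(T_preop_marker)   # preop frame -> intraop frame
    homog = np.hstack([bone_vertices_preop, np.ones((len(bone_vertices_preop), 1))])
    return (T_conv @ homog.T).T[:, :3]                          # vertices for display
```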
- FIG. 16 is a flowchart showing the processing procedure of the tablet computer 501 according to the present embodiment. This flowchart is executed by the CPU of the tablet computer 501 as an intraoperative image generation program while using the RAM, and realizes the functional components shown in FIG. 11.
- In step S1601, the tablet computer 501 images the affected area (the forearm in this example) and acquires image data containing the two markers and the affected-area image.
- In step S1603, the tablet computer 501 extracts frames including two-dimensional codes from the image data of the affected area.
- the frame including the two-dimensional code is a rectangle, but it may be a circle or other shapes.
- In step S1605, the tablet computer 501 acquires the two-dimensional code matrix within the frame.
- The tablet computer 501 then identifies the marker by comparing the acquired two-dimensional code matrix with the front-view two-dimensional codes stored in the marker DB 1112.
- the tablet computer 501 analyzes the marker coordinate system (position and orientation in the three-dimensional space) in consideration of the position, size, and orientation of the marker.
- Based on the analyzed position and orientation of each marker, the tablet computer 501 calculates the three-dimensional data of the first marker fixed to the first target bone and the three-dimensional data of the second marker fixed to the second target bone.
- the tablet computer 501 calculates the three-dimensional data of the osteotomy plane based on the three-dimensional data stored as the preoperative preparation data 1119.
- the tablet computer 501 displays the affected part captured image and the generated image of the osteotomy plane in an overlapping manner.
- The information processing apparatus according to the present embodiment differs from that of the second embodiment in that the shape of the defect site is generated and the data is accumulated. Since the other configurations and operations are the same as those of the second embodiment, the same configurations and operations are denoted by the same reference numerals and detailed description thereof is omitted. According to the present embodiment, an accurate implant can be generated based on the three-dimensional data of the defect site.
- FIG. 17 is a block diagram showing a functional configuration of the information processing apparatus 1700 according to the present embodiment.
- A defect site shape generation unit 1721 is provided, like the osteotomy plane determination unit 720, and exchanges data with the three-dimensional data generation unit 718.
- The defect site shape generation unit 1721 converts into three-dimensional shape data the shape of the gap (the gap 443 in the image 404 of FIG. 4) that remains when the surgical target bones (first and second target bones), separated in two at the osteotomy plane determined by the osteotomy plane determination unit 720, are superimposed on both ends of the reference bone. The three-dimensional shape data of the gap is then stored in the preoperative preparation data DB 719 as defect site shape data.
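- A simplified sketch of the defect-site extraction is given below; it assumes the reference bone and the two repositioned target bones have been voxelised onto a common boolean grid, whereas the actual system works on surface (STL) data.

```python
# The defect (gap) is the part of the reference bone covered by neither target bone.
import numpy as np


def defect_voxels(reference_vox: np.ndarray,
                  target1_vox: np.ndarray,
                  target2_vox: np.ndarray) -> np.ndarray:
    """All inputs: boolean (nx, ny, nz) grids in the same coordinate system."""
    covered = np.logical_or(target1_vox, target2_vox)
    return np.logical_and(reference_vox, np.logical_not(covered))   # voxels of the missing part
```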
- FIG. 18 is a flowchart showing the procedure of the osteotomy surface generation process (S1003) according to the present embodiment.
- In step S1801, using the determined osteotomy data, the shape data of the defect site is generated as described above and stored in the preoperative preparation data DB 719.
- The information processing apparatus according to the present embodiment differs from that of the second embodiment in that osteotomy position candidates are selected and presented. Since the other configurations and operations are the same as those of the second or third embodiment, the same configurations and operations are denoted by the same reference numerals and detailed description thereof is omitted.
- FIG. 19 is a block diagram showing a functional configuration of the information processing apparatus 1900 according to this embodiment.
- the same functional components as those in FIG. 7 are denoted by the same reference numerals, and description thereof is omitted.
- an osteotomy position candidate selection unit 1921 is added.
- the osteotomy position candidate selection unit 1921 displays at least one appropriate osteotomy surface and presents it to the doctor as an osteotomy position candidate.
- As described above, an appropriate osteotomy surface is determined and evaluated based on the degree of superimposition obtained when the target bones separated at the osteotomy surface determined by the osteotomy plane determination unit 720 are superimposed on the reference bone image, and on the size and shape of the defect site.
- Regarding the degree of superimposition, the degree of coincidence is regarded as high when the cumulative error, the maximum error, or the number of errors equal to or greater than a predetermined threshold is small.
- The error of an important part of the bone (for example, a joint portion) may be weighted.
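- As an illustrative sketch (names assumed) of the evaluation just described, the function below summarises per-point distance errors by their weighted cumulative value, maximum, and count above a threshold; smaller values indicate a higher degree of coincidence, and weights above 1 emphasise important regions such as joints.

```python
# Weighted summary statistics for the superimposition errors between the separated
# target bones and the reference bone image.
import numpy as np


def superimposition_score(errors_mm: np.ndarray,
                          weights: np.ndarray = None,
                          threshold_mm: float = 1.0) -> dict:
    if weights is None:
        weights = np.ones_like(errors_mm)
    weighted = errors_mm * weights
    return {
        "cumulative_error": float(weighted.sum()),
        "max_error": float(weighted.max()),
        "count_over_threshold": int((weighted >= threshold_mm).sum()),
    }
```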
- the doctor may manually input the range in advance in consideration of the limited range of the osteotomy position from the symptoms of the target bone.
- the osteotomy position candidate selection unit 1921 may automatically limit the range based on the information on the bone to be operated and the symptom, and then start the selection of the osteotomy position candidate.
- the osteotomy position candidate selected by the osteotomy position candidate selection unit 1921 may be reconfirmed by the doctor to adjust the position.
- Alternatively, the osteotomy position may be moved automatically between the osteotomy position candidates while the superposition state is displayed, and the osteotomy position may be determined based on the doctor's observation.
- FIG. 20 is a flowchart showing a processing procedure of the information processing apparatus 1900 according to this embodiment.
- steps similar to those in FIG. 8 are denoted by the same step numbers and description thereof is omitted.
- In step S2001, the information processing apparatus 1900 performs an osteotomy simulation within the osteotomy range corresponding to the surgical target bone and its symptoms, thereby generating osteotomy plane candidates that satisfy an optimal condition (for example, the minimum defect site size).
- a plurality of osteotomy surface candidates may be generated instead of only one.
- In step S2003, the doctor selects one osteotomy surface from the presented osteotomy surface candidates.
- In this way, the osteotomy plane can be determined more easily and accurately.
- The osteotomy support system according to the present embodiment differs from the above embodiments in that, when generating the preoperative preparation data, no actual marker is placed on the target bone; instead, a virtual three-dimensional marker is generated on the screen and then produced by a three-dimensional printer. Since the other configurations and operations are the same as those in the above embodiments, the same configurations and operations are denoted by the same reference numerals and detailed description thereof is omitted.
- FIG. 21A is a diagram illustrating an overview of preoperative preparation data generation processing using the information processing apparatus according to the present embodiment.
- The same reference numbers are assigned to display images and display elements similar to those in FIG. 4.
- Each image is a CG image displayed on the display screen, and corresponds to each stage of preoperative preparation data generation processing.
- The surgical target bone 421 is generated from the captured STL data, and the three-dimensional markers 2111 and 2112 planned and drawn on the three-dimensional data are displayed virtually (in FIG. 21A, they are shown by broken lines). Then, the reference bone image 412 and the surgical target bone 421 are compared on the display screen, and the state of the surgical target bone 421 is confirmed.
- the surgical target bone 421 is operated on the image 403 while referring to the image. That is, the operation target bone 421 is moved or rotated with respect to the reference bone image 412 to overlap each end of the reference bone image 412 with each end of the operation target bone 421.
- the lower ends of the operation target bone 421 and the reference bone image 412 are overlapped to determine the osteotomy surface of the operation target bone 421.
- The superimposition is based on the shape (feature points) of the joint (the lower end or the upper end in the figure).
- Moving gradually upward from the lower end, the distortion and bending of the surgical target bone 421 with respect to the reference bone are recognized, and the position at which the surgical target bone starts to diverge from the reference bone is set as the osteotomy image 431.
- the osteotomy image 431 is a rectangular plane having a predetermined shape and size, but the present invention is not limited to this.
- a surface including a curved surface may be used according to the purpose of osteotomy.
- The doctor may evaluate and determine the osteotomy image 431 while observing how the reference bone image 412 and the surgical target bone 421 are superimposed, but the optimal osteotomy surface may also be obtained by calculation. For example, after the lower ends are overlapped, the non-overlapping volume per unit length along the axial direction of the surgical target bone 421 and the reference bone image 412 is calculated sequentially from the lower end, and a plane connecting the highest points at which the non-overlapping volume does not exceed a predetermined value may be used as the osteotomy image 431. Alternatively, the surface of the reference bone image 412 may be subdivided into unit areas, and the osteotomy image 431 may be generated automatically by connecting, for each unit area, the positions at which the vertical distance to the surface of the surgical target bone 421 exceeds a predetermined value.
- The osteotomy plane may also be determined so that the total volume of the surgical target bone protruding from the reference bone image 412 (or the total distance between the surfaces) is minimized.
- The target bone image 441 and the target bone image 442 are thereby separated.
- The osteotomy surface may be determined so that the gap (defect region) 443 sandwiched between the separated bones is minimized.
- The optimal position and angle of the osteotomy plane can be determined by repeatedly simulating planes of every orientation as osteotomy plane candidates while shifting the position along the axial direction of the surgical target bone 421 by a unit distance (for example, 1 mm), as sketched below.
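To make the volume-based criterion concrete, here is a hedged sketch of scoring one candidate osteotomy plane on voxelized bone data. It assumes the bones are given as boolean occupancy grids on the same voxel lattice and that `voxel_volume` is the volume of one voxel; none of these names or representations come from the patent.

```python
import numpy as np

def non_overlap_volume(target_voxels, reference_voxels, voxel_volume=1.0):
    """Volume of the target bone that does not overlap the reference bone
    (both given as boolean occupancy grids on the same lattice)."""
    return float(np.logical_and(target_voxels, ~reference_voxels).sum()) * voxel_volume

def score_cut_plane(target_voxels, reference_voxels, coords, point, normal,
                    voxel_volume=1.0):
    """Keep only the target-bone voxels on the distal side of the candidate
    plane (point, normal) and measure how much of that fragment protrudes
    from the reference bone. Smaller is better.

    coords : (N, 3) array of world coordinates of every voxel center,
             flattened in the same order as target_voxels.ravel().
    """
    side = (coords - point) @ normal > 0.0           # distal half-space
    kept = target_voxels.ravel() & side              # fragment kept after the cut
    protruding = kept & ~reference_voxels.ravel()    # voxels outside the reference bone
    return float(protruding.sum()) * voxel_volume
```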
- When the osteotomy image 431 has been determined in this way, in the fourth stage, three-dimensional data of the two target bone images 441 and 442 obtained by separating the surgical target bone 421 along the osteotomy image 431 are generated and stored. That is, the target bone image 442 and the reference bone image 412 are stored as one set in association with the virtually drawn three-dimensional marker 2112. Then, as shown in the image 2104, the target position of the target bone image 441 with respect to the target bone image 442 or the reference bone image 412 is stored in association with the position data of the virtually drawn three-dimensional marker 2111. Note that the base blocks of the virtually drawn three-dimensional markers 2111 and 2112 are designed so that their surfaces match characteristic parts of the target bone.
- Because the 3D markers produced by the 3D printer reproduce the base block shapes of the virtually drawn 3D markers 2111 and 2112, the 3D-printed markers can accurately indicate the position and orientation of the target bone. That is, if the position and inclination in real space of a 3D marker produced by the 3D printer can be recognized, the position and inclination of the target bone image 441 can be estimated.
- the position, shape and inclination data of the osteotomy image 431 are stored in association with the position data of the drawn three-dimensional marker 2111 or three-dimensional marker 2112.
- the position and orientation of the two-dimensional marker to be attached to the three-dimensional marker 2111, and the position and orientation of the two-dimensional marker to be attached to the three-dimensional marker 2112 are determined in advance.
- Based on the position, size, and orientation of the three-dimensional marker that is placed on the target bone and imaged at the time of surgery, the target bone image 441, the reference bone image 412, the target bone image 442, and the osteotomy image 431 can be displayed.
- the gap 443 between the target bone image 441 and the target bone image 442 represents the shape of the joint bone necessary at the time of surgery. Therefore, at this time, the three-dimensional shape of the joint bone required at the time of surgery can also be acquired.
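One way to obtain the three-dimensional shape of the gap 443 (the joint bone needed at the time of surgery) is to subtract both separated target-bone fragments, placed at their target positions, from the reference bone volume. The voxel-grid sketch below is an assumed representation for illustration, not the patent's data format.

```python
import numpy as np

def defect_region(reference_voxels, fragment_a, fragment_b):
    """Voxels of the reference bone covered by neither separated fragment
    once both fragments are placed at their target positions.
    All three inputs are boolean occupancy grids on the same lattice."""
    occupied = np.logical_or(fragment_a, fragment_b)
    return np.logical_and(reference_voxels, ~occupied)

# The resulting boolean grid can then be meshed (e.g. by marching cubes) to
# obtain the 3D shape of the bone graft / artificial bone to prepare before surgery.
```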
- a combination of the target bone images 441 and 442 determined as the target arrangement in the image 2104 may be displayed as a unit.
- the positions of the three-dimensional markers 2111, 2112 in a state where both the target bone images 441, 442 are superimposed on the reference bone image 412 may be stored in the storage unit as target relative position data.
- The above description has dealt with corrective osteotomy of an affected (surgical target) bone, in which target bones on both sides of the osteotomy plane are considered, but the present invention is not limited to this.
- For example, an accurate bone cut can be achieved by using the AR technique described above to display the osteotomy surfaces (for example, three surfaces) for generating the surface to which an artificial joint is to be attached.
- three-dimensional CG models of artificial joints of three sizes of S, M, and L are prepared.
- The relative positional relationship between the attachment surface of the artificial joint model and the three-dimensional marker may be stored, and the blade model may be AR-displayed in alignment with the attachment surface during the operation.
- A blade indicating the osteotomy surface may be attached to the 3D model of the artificial joint; in this case, one three-dimensional marker is sufficient. It is also possible to attach a three-dimensional marker to an actual blade, recognize its position, and guide the blade to the target position.
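The stored relative positional relationship between the marker and the attachment surface (or blade) can be represented as a rigid transform and composed at surgery time with the detected marker pose. The 4x4 homogeneous-matrix sketch below is an assumed representation for illustration, not the patent's implementation; the units in the example are likewise assumptions.

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def blade_pose_in_camera(marker_pose_cam, blade_pose_marker):
    """Pose of the blade (or attachment surface) in camera coordinates, given
    the detected marker pose and the preoperatively stored pose of the blade
    relative to the marker."""
    return marker_pose_cam @ blade_pose_marker

# Example: marker detected 300 mm in front of the camera, blade stored 20 mm
# along the marker's x axis (millimeter units are an assumption).
marker_pose_cam = pose_matrix(np.eye(3), [0.0, 0.0, 300.0])
blade_pose_marker = pose_matrix(np.eye(3), [20.0, 0.0, 0.0])
print(blade_pose_in_camera(marker_pose_cam, blade_pose_marker))
```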
- FIG. 21B is a diagram illustrating an overview of the intraoperative osteotomy support processing according to the present embodiment.
- the same reference numerals are given to the same elements as in FIG. 6B.
- The display screen 2107 displays the affected part (a left forearm portion) 213, including the radius that is the surgical target bone, captured by the camera, and three-dimensional markers 2121 and 2122, each having a base block fixed to the surgical target bone during surgery. In addition, an incision portion 611 for cutting the bone and a holding device 612 that holds the incision portion 611 open for the bone cutting procedure are displayed.
- On the display screen 2107, the target bone images 441 and 442 generated in advance and stored in the storage unit are displayed superimposed on the visually observed affected part, based on the positions, sizes, and orientations of the three-dimensional markers 2121 and 2122. Further, to assist the osteotomy, the osteotomy image 431 selected in advance and stored in the storage unit is displayed at the osteotomy position, superimposed on the surgical target bone image.
- The display screen 2108 shows the state in which the osteotomy image 431 has been aligned with the depth direction of the display screen 2107 by moving the patient's forearm or the camera position. If the osteotomy instrument 628 is applied to the bone along the osteotomy image 431 displayed on the display screen 2108 and the bone is cut, a highly accurate osteotomy can be realized.
- FIG. 22 is a flowchart showing a processing procedure of the entire osteotomy support system including the preoperative preparation data generation system and the intraoperative image processing system according to the present embodiment.
- the osteotomy support system performs CT imaging of the affected area of the patient.
- the osteotomy support system performs three-dimensional modeling based on, for example, STL data.
- the osteotomy support system performs preoperative planning while displaying the three-dimensional data. For example, a three-dimensional marker is generated on the screen, and data for producing the three-dimensional marker is generated.
- The three-dimensional marker is associated, in three-dimensional coordinates, with the surgical target bone and with the osteotomy surface, bone holes, implants, and the like required for the procedures performed during the operation.
- the osteotomy support system creates a 3D marker having a base block that matches the target bone using a 3D printer based on the data of the 3D marker.
- In step S2209, the osteotomy support system inputs the intraoperative application processing program and each piece of data associated with the three-dimensional marker.
- In step S2211, the osteotomy support system executes osteotomy support based on the intraoperative application processing program and each piece of data associated with the three-dimensional marker; a minimal sketch of this loop is shown below.
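Step S2211 is essentially a render loop: capture a camera frame, estimate the 3D-printed marker's pose, transform the stored preoperative 3D data accordingly, and overlay it. The sketch below shows only that control flow; `detect_marker_pose`, `transform_model`, and `render_overlay` are hypothetical callables supplied by the application, not functions from the patent or from any particular library.

```python
def osteotomy_support_loop(camera, prep_data, detect_marker_pose,
                           transform_model, render_overlay):
    """Minimal intraoperative AR loop (control flow only).

    camera             : object with a read() method returning (ok, frame)
    prep_data          : dict mapping marker id -> list of preoperative 3D models
    detect_marker_pose : frame -> {marker_id: 4x4 pose}   (hypothetical)
    transform_model    : (model, pose) -> posed model     (hypothetical)
    render_overlay     : (frame, [models]) -> None        (hypothetical)
    """
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        poses = detect_marker_pose(frame)
        overlays = []
        for marker_id, pose in poses.items():
            for model in prep_data.get(marker_id, []):   # bone, osteotomy plane, ...
                overlays.append(transform_model(model, pose))
        render_overlay(frame, overlays)
```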
- FIG. 23 is a diagram showing a functional configuration of the preoperative preparation data generation system 2300 according to the present embodiment.
- the same functional components as those in FIG. 7 are denoted by the same reference numerals, and description thereof is omitted.
- the bone image data generation unit 2311 is a functional configuration unit including the reference bone data generation unit 716 and the three-dimensional data generation units 717 and 718 in FIG.
- the three-dimensional marker data generation unit 2312 generates three-dimensional data of a three-dimensional marker generated based on the three-dimensional marker information input to the display unit / operation unit 2315.
- the osteotomy data generation unit 2313 generates three-dimensional data of the osteotomy surface generated based on the osteotomy position information input to the display unit / operation unit 2315.
- The preoperative preparation data DB 2319 stores, in addition to the data of the preoperative preparation data DB 719 of FIG. 7, the three-dimensional data of the three-dimensional marker, together with the association between the three-dimensional data of the three-dimensional marker and the target bone, the osteotomy surface, and the like.
- the 3D printer 2320 produces a 3D marker based on 3D printer data generated from the 3D data of the 3D marker.
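For reference, 3D-printer input is commonly an STL mesh; a minimal ASCII STL writer over a list of triangles might look like the sketch below. How the marker's triangles are generated from the planned base-block shape is outside the scope of this sketch, and the function name is an assumption.

```python
import numpy as np

def write_ascii_stl(path, triangles, name="marker"):
    """Write triangles (each a (3, 3) array of vertex coordinates) as ASCII STL."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for tri in triangles:
            v0, v1, v2 = (np.asarray(v, dtype=float) for v in tri)
            n = np.cross(v1 - v0, v2 - v0)             # facet normal
            norm = np.linalg.norm(n)
            n = n / norm if norm > 0 else n
            f.write(f"  facet normal {n[0]:e} {n[1]:e} {n[2]:e}\n")
            f.write("    outer loop\n")
            for v in (v0, v1, v2):
                f.write(f"      vertex {v[0]:e} {v[1]:e} {v[2]:e}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")
```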
- FIG. 24 is a diagram showing a configuration of the preoperative preparation data DB 2319 according to the present embodiment.
- FIG. 24 shows the configuration of the preparation data planned by the surgical technique specific to this embodiment. FIG. 24 also includes the configuration illustrated in FIG. 9.
- the preoperative preparation data DB 2319 stores the affected part 2402 and the surgical procedure 2403 in association with the patient name 2401.
- In this example, the affected part is the right arm, and the surgical procedure is for distal radius malunion (deformity healing).
- the planning item 2404 necessary for the affected part 2402 and the surgical procedure 2403 and the necessary three-dimensional data are stored in association with the three-dimensional marker.
- The three-dimensional data of the three-dimensional marker produced by the 3D printer 2320 and the corresponding three-dimensional data of the osteotomy surface are stored.
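As an illustration only, one possible in-memory shape for a record of the preoperative preparation data DB 2319 is sketched below; the field names are assumptions and do not reproduce the actual schema of FIG. 24.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class PreoperativePlanRecord:
    """Hypothetical record mirroring the kind of data described for DB 2319."""
    patient_name: str                                # cf. 2401
    affected_part: str                               # cf. 2402, e.g. "right arm"
    procedure: str                                   # cf. 2403
    planning_items: List[str] = field(default_factory=list)        # cf. 2404
    marker_mesh: Dict[str, Any] = field(default_factory=dict)      # 3D marker data
    osteotomy_surface: Dict[str, Any] = field(default_factory=dict)  # plane position/shape
```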
- FIG. 25 is a flowchart showing the procedure of the osteotomy surface generation process according to the present embodiment. This flowchart is executed by the CPU of the information processing apparatus 2310 using the RAM, and realizes the functional configuration unit of FIG.
- In step S2501, the information processing apparatus 2310 acquires a CT image of the patient's target bone and, if necessary, of the healthy-side bone.
- In step S2503, the information processing apparatus 2310 generates STL data from the CT image data. When the generation of STL data is requested externally, the generated STL data is acquired.
- In step S2507, the information processing apparatus 2310 acquires the osteotomy position information and the shape and installation position information of the three-dimensional marker.
- The information processing apparatus 2310 then generates the three-dimensional data of the osteotomy surface, the three-dimensional marker data, and the like, in association with the three-dimensional STL bone data.
- In step S2511, the information processing apparatus 2310 associates the generated pieces of three-dimensional data with one another and stores them in the preoperative preparation data DB 2319.
- In step S2513, the information processing apparatus 2310 outputs the 3D marker data for the 3D printer.
- FIG. 26 is a block diagram showing a functional configuration of the tablet computer 2610 in the intraoperative image processing system 2600 according to the present embodiment.
- the same reference numerals are assigned to the same functional components as those in FIG. 3, FIG. 5, or FIG.
- the preoperative preparation data DB 2319 stores the same preparation data generated by the preoperative preparation data generation system 2300 in FIG.
- The CG image generation unit 2614 converts the three-dimensional data, such as the surgical target bone and the osteotomy surface, from the preoperative preparation data DB 2319 according to the position and orientation of the three-dimensional marker obtained from the marker analysis unit 1111, and generates a CG image to be superimposed on the visually observed surgical site.
- the display image generation unit 2615 converts the image generated by the CG image generation unit 2614 into a display image to be displayed on the display 511, the external monitor 2620, or the HMD 2630. In this embodiment, it is desirable to use an optical see-through HMD.
- The processing procedure of the tablet computer 2610 in the intraoperative image processing system 2600 is similar to the processing procedure of FIG. 16, except that the markers are produced by the 3D printer 2320 and are three-dimensional markers placed on each target bone during surgery; illustration and description are therefore omitted.
- According to this embodiment, because the 3D marker produced by the 3D printer is installed so as to match the shape of the bone, bone cutting can be supported without installing a marker by drilling a hole in the patient's surgical target bone before or during the operation.
- The osteotomy support system according to the present embodiment differs from the above embodiments in that three-dimensional data of the surface of the surgical site of the target bone is used as the marker, and in that during intraoperative image processing the three-dimensional data of the target bone is acquired by a depth sensor to support the osteotomy. Since the other configurations and operations are the same as those in the above embodiments, the same configurations and operations are denoted by the same reference numerals, and detailed description thereof is omitted.
- the preoperative preparation data in the present embodiment is similar to that in the above embodiment except that it does not include additional marker information because a three-dimensional surface image of the surgical target bone is used as a marker, and the description thereof will be omitted.
- In this embodiment, the case where the HMD and the depth sensor are integrated will be described. When the HMD and the depth sensor are separate, however, their relative position must be determined by using a position sensor (such as GPS) or by attaching a marker to the depth sensor.
- FIG. 27 is a block diagram illustrating a functional configuration of the intraoperative image processing system 2700 according to the present embodiment.
- the intraoperative image processing system 2700 includes a depth sensor & HMD 2720 and an information processing device 2710.
- the depth sensor & HMD 2720 includes a depth sensor and an optical see-through HMD.
- the depth sensor and the HMD may be different from each other, but are preferably integrated.
- the depth sensor includes an infrared projector 2721 and an infrared camera 2722, and acquires a depth image (distance image) of the surgical site during surgery.
- the distance image is equivalent to a three-dimensional image of the surface.
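Converting the depth (distance) image into the equivalent surface point cloud is standard pinhole back-projection; the sketch below assumes known camera intrinsics (fx, fy, cx, cy) and depth along the optical axis, which are assumptions for illustration rather than values from the patent.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (H x W, depth along the optical axis) into
    an (N, 3) point cloud in the depth camera's coordinate system."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth.astype(float)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                  # drop invalid (zero-depth) pixels
```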
- A binocular HMD display unit 2723 (right-eye part 2723a / left-eye part 2723b) displays the three-dimensional osteotomy plane 431 superimposed at the corresponding three-dimensional position on the affected part of the patient 322 being viewed.
- The communication unit 2711 of the information processing device 2710 controls transmission and reception of data to and from the depth sensor & HMD 2720.
- the depth sensor image receiving unit 2712 receives a depth image (distance image).
- The bone surface image collation unit 2713 collates the depth image (distance image), used as the marker, with the characteristic surface image of the target bone image in the preoperative preparation data 719.
- The CG image generation unit 1714 performs three-dimensional coordinate conversion on the three-dimensional data of the preoperative preparation data 719 according to the change in position and orientation obtained by the bone surface collation from the bone surface image collation unit 2713, and generates a CG image. Since the preoperative preparation data 719 of this embodiment does not require a separate marker, either no marker position data is stored in the structure of FIG. 7, or detailed three-dimensional data of the characteristic surface shape of the surgical site of the surgical target bone is stored as the marker position data.
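The collation of the sensed bone surface with the stored characteristic surface amounts to rigid point-cloud registration. Below is a minimal ICP-style sketch (nearest neighbours plus Kabsch/SVD alignment) for illustration; it is not the patent's algorithm, and convergence handling is deliberately simplistic.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(model_points, sensed_points, iterations=30):
    """Align the stored bone-surface model to the depth-sensor point cloud."""
    R_total, t_total = np.eye(3), np.zeros(3)
    moved = model_points.copy()
    tree = cKDTree(sensed_points)
    for _ in range(iterations):
        _, idx = tree.query(moved)           # closest sensed point per model point
        R, t = best_rigid_transform(moved, sensed_points[idx])
        moved = moved @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total                  # pose of the bone model in sensor space
```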
- the eye coordinate system estimation unit 2715 estimates the eye coordinate system based on the line of sight and visual field of the doctor wearing the HMD from the depth sensor image data.
- the right-eye HMD display data generation unit 2716 refers to the eye coordinate system information from the eye coordinate system estimation unit 2715 and converts the display image data of the 3D camera coordinate system to the display data for the right eye of the 2D HMD screen coordinate system.
- the left-eye HMD display data generation unit 2717 refers to the eye coordinate system information from the eye coordinate system estimation unit 2715 and converts the display image data of the 3D camera coordinate system to the left eye of the 2D HMD screen coordinate system. Convert to display data.
- The display position of the converted display data in the two-dimensional HMD screen coordinate system is adjusted so that the three-dimensional target bone image and the reference bone image overlap the forearm 213 of the affected part seen through the display unit 2723 of the HMD.
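Each per-eye conversion is essentially a perspective projection of the 3D display data onto the 2D HMD screen. The pinhole-style sketch below, with a per-eye horizontal offset standing in for the eye baseline, is an assumption for illustration rather than the actual HMD calibration used.

```python
import numpy as np

def project_to_hmd_screen(points_cam, fx, fy, cx, cy, eye_offset_x=0.0):
    """Project 3D points (N, 3) given in the eye/camera coordinate system onto
    2D HMD screen coordinates for one eye. eye_offset_x shifts the viewpoint by
    half the interpupillary distance (positive for one eye, negative for the other)."""
    p = points_cam.astype(float)
    p = p.copy()
    p[:, 0] -= eye_offset_x                  # move into this eye's viewpoint
    z = p[:, 2]
    u = fx * p[:, 0] / z + cx
    v = fy * p[:, 1] / z + cy
    return np.stack([u, v], axis=-1)

# right_eye = project_to_hmd_screen(points, fx, fy, cx, cy, eye_offset_x=+32.0)
# left_eye  = project_to_hmd_screen(points, fx, fy, cx, cy, eye_offset_x=-32.0)
```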
- The image transmission unit 2718 transmits the display image data in the two-dimensional HMD screen coordinate system to the HMD display unit 2723 via the communication unit 2711.
- The display unit 2723 of the depth sensor & HMD 2720 displays the display image from the right-eye HMD display data generation unit 2716 on the right-eye screen 2723a, and displays the display image from the left-eye HMD display data generation unit 2717 on the left-eye screen 2723b.
- FIG. 28 is a diagram showing a data table 2800 used in the bone surface image matching unit 2713 according to the present embodiment.
- The data table 2800 is used to compare the depth image (distance image) acquired by the depth sensor from the surface of the surgical target bone of the patient's affected part with the surgical target bone stored as the preoperative preparation data 719, and to determine the current position and orientation of the surgical target bone.
- The data table 2800 stores, in association with the depth sensor image 2801 acquired by the depth sensor, the collated three-dimensional bone data 2802 and the real-space position and orientation 2803 of the target bone determined from the collation result.
- The data table 2800 also stores the three-dimensional bone data 2804 obtained by three-dimensional coordinate transformation corresponding to the real-space position and orientation 2803 of the target bone, and three-dimensional data 2805 such as the positions and orientations of the osteotomy surface and of each instrument.
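A possible in-memory counterpart of one row of data table 2800 is sketched below; the field names map to the reference numerals in the text but are otherwise assumptions, not the actual table layout of FIG. 28.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class BoneCollationRow:
    """One collation result per captured depth frame (cf. table 2800)."""
    depth_image: np.ndarray           # depth sensor image (2801)
    matched_bone_model: np.ndarray    # collated 3D bone data (2802), e.g. a point cloud
    bone_rotation: np.ndarray         # real-space orientation of the bone (part of 2803)
    bone_translation: np.ndarray      # real-space position of the bone (part of 2803)
    posed_bone_model: np.ndarray      # bone data after coordinate transformation (2804)
    instrument_and_plane_poses: dict  # osteotomy plane / instrument poses (2805)
```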
- According to this embodiment, osteotomy can be supported without separately creating a marker as in the above embodiments.
- The description above has focused on support for intraoperative bone cutting.
- However, the present invention can likewise be applied to support for other procedures that particularly require instrument operation, such as drilling other bone holes, shaving bone into a predetermined surface shape, and implant placement in procedures requiring replacement with an implant, and has the same effects.
- The present invention may be applied to a system composed of a plurality of devices, or to a single device. Furthermore, the present invention is also applicable when an image processing program that implements the functions of the embodiments is supplied directly or remotely to a system or an apparatus. Therefore, a control program installed in a computer to realize the functions of the present invention, a medium storing the control program, and a WWW (World Wide Web) server from which the control program is downloaded are also included in the scope of the present invention. In particular, at least a non-transitory computer-readable medium storing a control program for causing a computer to execute the processing steps included in the above-described embodiments is included in the scope of the present invention.
Abstract
Description
The osteotomy support system comprises:
storage means for storing three-dimensional shape data of a surgical target bone in association with position data of a marker;
osteotomy surface determination means for determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut; and
osteotomy surface display means for displaying the determined osteotomy surface based on an image obtained by imaging the marker.
An information processing apparatus used in the above osteotomy support system comprises:
storage means for storing three-dimensional shape data of a surgical target bone in association with position data of a marker; and
osteotomy surface determination means for determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut.
An image processing method used in the above osteotomy support system includes:
a storage step of storing, in storage means, three-dimensional shape data of a surgical target bone in association with position data of a marker; and
an osteotomy surface determination step of determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut.
An image processing program used in the above osteotomy support system causes a computer to execute:
a storage step of storing, in storage means, three-dimensional shape data of a surgical target bone in association with position data of a marker; and
an osteotomy surface determination step of determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut.
An osteotomy support system 100 as a first embodiment of the present invention will be described with reference to FIG. 1. The osteotomy support system 100 is a system for supporting osteotomy surgery by image processing.
Next, a surgical operation support system according to a second embodiment of the present invention will be described. The surgical operation support system according to this embodiment generates, in advance, three-dimensional data of a first target bone, which is one of the surgical target bones and serves as a reference for the arrangement of the surgical target bones, and three-dimensional data of a reference bone that serves as a reference for the shape after healing, and stores them in association with a first marker (for example, a two-dimensional code) fixed to the first target bone. It also generates three-dimensional data of the second target bone, the other of the surgical target bones, and stores it in association with a second marker (for example, a two-dimensional code) fixed to the second target bone. During surgery, AR (Augmented Reality) technology is used to determine the three-dimensional positions of the osteotomy surface, the first target bone, and the second target bone from the imaged first and second markers, and these are displayed based on the stored three-dimensional data. Then, by determining whether the second target bone and the reference bone are properly overlapped, the appropriate arrangement of the surgical target bones is determined. Such processing supports the doctor in deciding the appropriate arrangement of the surgical target bones.
FIG. 2 is a diagram for explaining an overview of the entire surgical operation according to this embodiment. FIG. 2 shows an example of corrective osteotomy of a malunited affected bone (surgical target bone). The corrective osteotomy includes a preparation stage 201, a positioning stage 202 of the surgical target bone, and a position fixing stage 203 of the surgical target bone. In this embodiment, corrective surgery for a malunited distal radius is described as an example, but the present invention is not limited to this. It is similarly applicable to malunion of other sites or other bones, or to fracture treatment surgery.
In FIG. 2, the pins 211 and 212 protrude outside the wound, but the present invention is not limited to this. For example, short pins (1 to 2 cm) whose tips remain inside the wound may be used, and during surgery (positioning stage 202) long pins may be newly connected to the short pins, after which the markers 221 and 222 are attached. Alternatively, a CT scan of the bone alone may be taken without inserting pins, pins may be inserted virtually into the CG data of the bone generated from the scan, and the wound may then be opened during surgery so that actual pins are inserted at the positions defined in the CG data. In that case, the marker positions may be determined using the CG data of the bone with the virtual pins, a mold (a template with pin holes) that fits exactly onto the bone of the affected part may be produced with a 3D printer, and the actual pins may be inserted through that mold at the same positions as the pins in the CG data. Furthermore, a marker may be attached to such a mold itself while the mold is fitted onto the bone. Alternatively, feature points of the bone captured by a digital camera may be identified and superimposed on the CG data with pins, so that the pins are inserted at the same positions and in the same directions as in the CG data. These methods reduce the burden on the patient and the risk of infection associated with inserting pins and then taking a CT scan.
FIG. 3 is a diagram showing the configuration of the preoperative preparation data generation system 320.
FIG. 4 is a diagram for explaining an overview of the preoperative preparation data generation processing using the information processing apparatus 324. Images 401 to 406 are CG (Computer Graphics) images displayed on the display screen of the information processing apparatus 324, and each corresponds to a stage of the preoperative preparation data generation processing.
FIG. 5 is a diagram showing a schematic configuration of the intraoperative image processing system 500 according to this embodiment. The intraoperative image processing system 500 includes a tablet computer 501 as an information processing apparatus and a display device 502. The tablet computer 501 includes a display 511 and a camera 512 (digital camera).
FIG. 6A is a screen transition diagram for explaining an overview of the bone cutting and positioning work on the surgical target bone during surgery. Before surgery, the markers 221 and 222 are fixed to the pins 211 and 212.
FIG. 6B is a diagram for explaining an overview of the intraoperative osteotomy support processing according to this embodiment.
FIG. 7 is a block diagram showing the functional configuration of the information processing apparatus 324. In FIG. 7, CT data is shown as the tomographic image data and STL data is shown as the three-dimensional bone surface model data, but the present invention is not limited to these. Each functional unit of the information processing apparatus 324 is realized by the CPU executing a program while using a memory to process the image data.
FIG. 8 is a diagram showing the configuration of the STL data DB 714 according to this embodiment. The STL data DB 714 stores, in a searchable manner, STL data representing the three-dimensional bone surface models in this embodiment.
FIG. 9 is a diagram showing the configuration of the preoperative preparation data DB 719 according to this embodiment. The preoperative preparation data DB 719 stores, in a searchable manner, STL data representing the three-dimensional bone images in this embodiment.
FIG. 10A is a flowchart showing the processing procedure of the entire surgical operation support system including the preoperative preparation data generation system 320 and the intraoperative image processing system 500.
FIG. 10B is a flowchart showing the procedure of the osteotomy surface generation processing (S1003) in FIG. 10A.
(1) Evaluation by the doctor's visual inspection.
(2) Evaluation by the non-overlapping volume of the reference bone and the target bone after the feature points at both ends are superimposed (smaller is better).
(3) Evaluation by the average vertical distance between the surfaces after the feature points at both ends are superimposed (smaller is better).
(4) Evaluation by the volume of the target bone protruding from the reference bone after the feature points at both ends are superimposed (smaller is better).
(5) Evaluation by the volume of overlap between the reference bone and the gap (defect site) between the separated target bones, after the feature points at both ends are superimposed (smaller is better).
In step S1027, if an evaluation satisfying a predetermined threshold is obtained, the process proceeds to step S1029. If no such evaluation is obtained, the process returns to step S1021, and the position and inclination of the osteotomy surface are set again. The osteotomy surface is thus changed sequentially and the process is repeated until the evaluation value satisfies the predetermined threshold, so that an appropriate osteotomy surface is found. For example, as described above, at least the optimal position and angle of the osteotomy surface can be obtained by repeatedly simulating planes of every orientation as osteotomy surface candidates while shifting the osteotomy position upward from the lowest point of the surgical target bone 421 along the axial direction by a unit distance (for example, 1 mm). For the radius, for example, planes at 5-degree increments between 60 degrees and -60 degrees relative to the plane perpendicular to the bone axis, combined with 360 degrees of azimuth in 5-degree increments (for example, 300 × 24 × 72 ≈ 500,000 planes), are repeatedly simulated as osteotomy surface candidates; a sketch of such an exhaustive search is given below. Of course, the osteotomy surface may instead be varied downward from the highest point of the surgical target bone 421. Alternatively, the optimal osteotomy surface may be searched for within a range set by the doctor.
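As a purely illustrative sketch of the exhaustive search described above (not the patent's code), the candidate planes can be enumerated as a grid of tilt and azimuth angles at 5-degree steps combined with 1 mm shifts along the bone axis, each candidate being scored by a user-supplied evaluation function implementing one of the volume-based criteria listed above; all function and parameter names here are assumptions.

```python
import numpy as np

def enumerate_cut_planes(axis_point, axis_dir, length_mm,
                         tilt_range=(-60, 61, 5), azimuth_range=(0, 360, 5),
                         step_mm=1.0):
    """Yield (point, normal) candidate osteotomy planes along the bone axis."""
    axis_point = np.asarray(axis_point, dtype=float)
    axis_dir = np.asarray(axis_dir, dtype=float)
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    # Two unit vectors orthogonal to the bone axis.
    u = np.cross(axis_dir, [1.0, 0.0, 0.0])
    if np.linalg.norm(u) < 1e-6:
        u = np.cross(axis_dir, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(axis_dir, u)
    for d in np.arange(0.0, length_mm, step_mm):          # shift along the axis
        point = axis_point + d * axis_dir
        for tilt in np.deg2rad(np.arange(*tilt_range)):    # tilt from the cross-section
            for az in np.deg2rad(np.arange(*azimuth_range)):  # azimuth of the tilt
                lateral = np.cos(az) * u + np.sin(az) * v
                normal = np.cos(tilt) * axis_dir + np.sin(tilt) * lateral
                yield point, normal

def best_cut_plane(candidates, score):
    """Return the candidate plane with the smallest score(point, normal)."""
    return min(candidates, key=lambda c: score(*c))
```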
FIG. 11 is a block diagram showing the functional configuration of the tablet computer 501 in the intraoperative image processing system 500 according to this embodiment. Each functional unit of the tablet computer 501 is realized by a CPU (not shown) executing a program while using a memory. Although a tablet computer 501 is used in this embodiment, the present invention is not limited to this; any portable information processing terminal including a display and a camera may be used. The camera and the display unit / operation unit may also be separated from the information processing apparatus and communicate data with each other.
FIG. 12 is a diagram showing the configuration of the marker DB 1112 according to this embodiment. The marker DB 1112 is used by the marker analysis unit 1111 to analyze the three-dimensional positions and orientations of the markers (that is, the positions and orientations of the paired pins) from the image data captured by the camera 512.
FIG. 13 is a diagram showing the configuration of the marker analysis table 1301 used by the marker analysis unit 1111. The marker analysis table 1301 is used to obtain, from the marker images captured by the camera 512, the two-dimensional data on the markers, the positions, sizes, and orientations of the markers, or the three-dimensional data of the marker support instruments, for generating three-dimensional display data of the target bone images and the reference bone image.
FIGS. 14 and 15 are diagrams showing the configurations of the intraoperative image generation tables 1401 and 1501 used by the CG image generation unit 1114. The intraoperative image generation table 1401 stores, in association with the first target bone and reference bone ID 1411, the analyzed three-dimensional position data 1412 of the first marker and the three-dimensional position data 1413 of the first marker stored in the preoperative preparation data DB 719. Then, using a transformation vector that converts the three-dimensional position data 1413 of the first marker into the three-dimensional position data 1412, the three-dimensional data of the first target bone stored in the preoperative preparation data DB 719 is coordinate-transformed, and the resulting three-dimensional data 1414 of the first target bone for display is stored. The three-dimensional data of the reference bone stored in the preoperative preparation data DB 719 is also coordinate-transformed with the same transformation vector to generate and store three-dimensional data 1415 of the reference bone for display. The intraoperative image generation table 1401 further stores three-dimensional data 1416 indicating the position, shape, inclination, and the like of the osteotomy surface in association with the first target bone and reference bone ID 1411.
FIG. 16 is a flowchart showing the processing procedure of the tablet computer 501 according to this embodiment. This flowchart is executed by the CPU of the tablet computer 501 as the intraoperative image generation program while using the RAM, and realizes the functional configuration units of FIG. 11.
Next, an information processing apparatus according to a third embodiment of the present invention will be described. The information processing apparatus according to this embodiment differs from the second embodiment in that it generates the shape of the defect site and accumulates the data. Since the other configurations and operations are the same as those in the second embodiment, the same configurations and operations are denoted by the same reference numerals and detailed description thereof is omitted. According to this embodiment, an accurate implant can be produced based on the three-dimensional data of the defect site.
Next, an information processing apparatus according to a fourth embodiment of the present invention will be described. The information processing apparatus according to this embodiment differs from the second embodiment in that it selects and notifies candidates for the osteotomy position. Since the other configurations and operations are the same as those in the second or third embodiment, the same configurations and operations are denoted by the same reference numerals and detailed description thereof is omitted.
FIG. 19 is a block diagram showing the functional configuration of the information processing apparatus 1900 according to this embodiment. In FIG. 19, the same functional configuration units as those in FIG. 7 are denoted by the same reference numerals, and description thereof is omitted.
FIG. 20 is a flowchart showing the processing procedure of the information processing apparatus 1900 according to this embodiment. In FIG. 20, steps similar to those in FIG. 8 are denoted by the same step numbers, and description thereof is omitted.
Next, an osteotomy support system according to a fifth embodiment of the present invention will be described. The osteotomy support system according to this embodiment differs from the second to fourth embodiments in that, when the preoperative preparation data is generated, a virtual three-dimensional marker is generated on the screen without placing an actual marker on the target bone, and the marker is produced by a three-dimensional printer. Since the other configurations and operations are the same as those in the above embodiments, the same configurations and operations are denoted by the same reference numerals and detailed description thereof is omitted.
The overview of the processing of the osteotomy support system of this embodiment will be described with reference to FIGS. 21A and 21B.
FIG. 21A is a diagram for explaining an overview of the preoperative preparation data generation processing using the information processing apparatus according to this embodiment. In FIG. 21A, display images or display elements similar to those in FIG. 4 are denoted by the same reference numerals. Each image is a CG image displayed on the display screen, and corresponds to a stage of the preoperative preparation data generation processing.
FIG. 21B is a diagram for explaining an overview of the intraoperative osteotomy support processing according to this embodiment. In FIG. 21B, elements similar to those in FIG. 6B are denoted by the same reference numerals.
FIG. 22 is a flowchart showing the processing procedure of the entire osteotomy support system including the preoperative preparation data generation system and the intraoperative image processing system according to this embodiment.
FIG. 23 is a diagram showing the functional configuration of the preoperative preparation data generation system 2300 according to this embodiment. In FIG. 23, the same functional configuration units as those in FIG. 7 are denoted by the same reference numerals, and description thereof is omitted.
FIG. 24 is a diagram showing the configuration of the preoperative preparation data DB 2319 according to this embodiment. FIG. 24 shows the configuration of the preparation data planned by the surgical technique specific to this embodiment. FIG. 24 also includes the configuration illustrated in FIG. 9.
FIG. 25 is a flowchart showing the procedure of the osteotomy surface generation processing according to this embodiment. This flowchart is executed by the CPU of the information processing apparatus 2310 while using the RAM, and realizes the functional configuration units of FIG. 23.
FIG. 26 is a block diagram showing the functional configuration of the tablet computer 2610 in the intraoperative image processing system 2600 according to this embodiment. In FIG. 26, the same functional configuration units as those in FIG. 3, FIG. 5, or FIG. 11 are denoted by the same reference numerals, and description thereof is omitted.
Next, an osteotomy support system according to a sixth embodiment of the present invention will be described. The osteotomy support system according to this embodiment differs from the second to fifth embodiments in that three-dimensional data of the surface of the surgical site of the target bone is used as the marker, and in that during intraoperative image processing the three-dimensional data of the target bone is acquired by a depth sensor to support the osteotomy. Since the other configurations and operations are the same as those in the above embodiments, the same configurations and operations are denoted by the same reference numerals and detailed description thereof is omitted.
FIG. 27 is a block diagram showing the functional configuration of the intraoperative image processing system 2700 according to this embodiment. The intraoperative image processing system 2700 includes a depth sensor & HMD 2720 and an information processing apparatus 2710.
FIG. 28 is a diagram showing the data table 2800 used in the bone surface image collation unit 2713 according to this embodiment. The data table 2800 is used to collate the depth image (distance image) acquired by the depth sensor from the surface of the surgical target bone of the patient's affected part with the surgical target bone stored as the preoperative preparation data 719, and to determine the current position and orientation of the surgical target bone.
The data table 2800 stores, in association with the depth sensor image 2801 acquired by the depth sensor, the collated three-dimensional bone data 2802 and the real-space position and orientation 2803 of the target bone determined from the collation result. The data table 2800 also stores the three-dimensional bone data 2804 obtained by three-dimensional coordinate transformation corresponding to the real-space position and orientation 2803 of the target bone, and three-dimensional data 2805 such as the positions and orientations of the osteotomy surface and of each instrument.
Although the description of this embodiment has focused on support for intraoperative bone cutting, the present invention can also be applied to support for other processes that particularly require instrument operation, such as support for drilling other bone holes, support for shaving bone into a predetermined surface shape, and implant placement support in procedures requiring replacement with an implant, with the same effects.
Claims (13)
- An osteotomy support system comprising: storage means for storing three-dimensional shape data of a surgical target bone in association with position data of a marker; osteotomy surface determination means for determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut; and osteotomy surface display means for displaying the determined osteotomy surface based on an image obtained by imaging the marker.
- The osteotomy support system according to claim 1, further comprising: marker generation means for virtually generating, on a screen, a three-dimensional marker to be placed on a characteristic portion of the surgical target bone; and a three-dimensional printer that produces the three-dimensional marker.
- The osteotomy support system according to claim 1, wherein the storage means stores, as the marker, three-dimensional data of the surface of the site of the surgical target bone on which surgery is performed, and the imaging means is a depth sensor that captures the three-dimensional data of the surface of the site of the surgical target bone on which surgery is performed.
- The osteotomy support system according to claim 1, wherein the storage means stores the three-dimensional shape data of the surgical target bone in association with position data of a marker fixed to the surgical target bone, the osteotomy surface determination means determines, based on the three-dimensional shape data of the surgical target bone, the position and orientation of the osteotomy surface indicating the plane along which the surgical target bone is cut, and the osteotomy surface display means displays the determined osteotomy surface based on an image obtained by imaging the marker fixed to the surgical target bone.
- The osteotomy support system according to claim 4, wherein the storage means stores three-dimensional shape data of the osteotomy surface in association with the position data of the marker fixed to the surgical target bone, and the osteotomy surface display means displays the determined osteotomy surface at a position, size, and orientation corresponding to the position, size, and orientation of the marker, based on an image obtained by imaging the marker fixed to the surgical target bone.
- The osteotomy support system according to claim 4 or 5, wherein the storage means further stores three-dimensional shape data of separated bones of the surgical target bone cut and separated along the determined osteotomy surface, and the osteotomy surface display means further displays the surgical target bone to be cut along the determined osteotomy surface, based on the three-dimensional shape data of the surgical target bone stored in the storage means.
- The osteotomy support system according to any one of claims 4 to 6, further comprising: bone image display means for generating and displaying a surgical target bone image based on the three-dimensional shape data of the surgical target bone; and input means for receiving, from a user, designation of the position and orientation of the osteotomy surface with respect to the displayed surgical target bone image, wherein the osteotomy surface determination means determines the position and orientation of the osteotomy surface based on the user's input.
- The osteotomy support system according to claim 7, wherein the storage means further stores three-dimensional shape data of a reference bone that serves as a reference for the shape of the surgical target bone after healing, and the osteotomy surface determination means determines the position and orientation of the osteotomy surface according to the degree of overlap between the three-dimensional shape data of the separated bones generated by cutting the surgical target bone along the osteotomy surface and the three-dimensional shape data of the reference bone.
- The osteotomy support system according to claim 8, wherein the osteotomy surface determination means determines the osteotomy surface at the position and orientation at which, when the three-dimensional shape data of the separated bones generated by cutting the surgical target bone along the osteotomy surface and the three-dimensional shape data of the reference bone are superimposed with reference to an end of the reference bone, the volume of the surgical target bone protruding beyond the three-dimensional shape data of the reference bone is minimized.
- The osteotomy support system according to claim 8 or 9, further comprising defect site shape generation means for generating, as three-dimensional shape data of a defect site, three-dimensional shape data of the portion of the reference bone sandwiched between the separated bones when the three-dimensional shape data of the separated bones generated by cutting along the osteotomy surface is superimposed on the three-dimensional shape data of the reference bone with reference to an end of the reference bone.
- An information processing apparatus used in the osteotomy support system according to any one of claims 1 to 10, comprising: storage means for storing three-dimensional shape data of a surgical target bone in association with position data of a marker; and osteotomy surface determination means for determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut.
- An image processing method used in the osteotomy support system according to any one of claims 1 to 10, comprising: a storage step of storing, in storage means, three-dimensional shape data of a surgical target bone in association with position data of a marker; and an osteotomy surface determination step of determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut.
- An image processing program used in the osteotomy support system according to any one of claims 1 to 10, the program causing a computer to execute: a storage step of storing, in storage means, three-dimensional shape data of a surgical target bone in association with position data of a marker; and an osteotomy surface determination step of determining, based on the three-dimensional shape data of the surgical target bone, a position and an orientation of an osteotomy surface indicating a plane along which the surgical target bone is cut.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015522826A JP6122495B2 (ja) | 2013-06-11 | 2014-06-11 | 骨切支援システム、情報処理装置、画像処理方法、および画像処理プログラム |
US14/898,021 US10467752B2 (en) | 2013-06-11 | 2014-06-11 | Bone cutting support system, information processing apparatus, image processing method, and image processing program |
US16/596,223 US11302005B2 (en) | 2013-06-11 | 2019-10-08 | Bone cutting support system, information processing apparatus, image processing method, and image processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-123210 | 2013-06-11 | ||
JP2013123210 | 2013-06-11 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/898,021 A-371-Of-International US10467752B2 (en) | 2013-06-11 | 2014-06-11 | Bone cutting support system, information processing apparatus, image processing method, and image processing program |
US16/596,223 Continuation US11302005B2 (en) | 2013-06-11 | 2019-10-08 | Bone cutting support system, information processing apparatus, image processing method, and image processing program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014200017A1 true WO2014200017A1 (ja) | 2014-12-18 |
Family
ID=52022308
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/065448 WO2014200017A1 (ja) | 2013-06-11 | 2014-06-11 | 骨切支援システム、情報処理装置、画像処理方法、および画像処理プログラム |
Country Status (3)
Country | Link |
---|---|
US (2) | US10467752B2 (ja) |
JP (1) | JP6122495B2 (ja) |
WO (1) | WO2014200017A1 (ja) |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9622805B2 (en) | 2015-08-14 | 2017-04-18 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
JP2017076295A (ja) * | 2015-10-16 | 2017-04-20 | 富士フイルム株式会社 | 拡張現実提供システム及び方法、情報処理装置、並びにプログラム |
US9687250B2 (en) | 2015-01-07 | 2017-06-27 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
WO2017160651A1 (en) | 2016-03-12 | 2017-09-21 | Lang Philipp K | Devices and methods for surgery |
JP2018047240A (ja) * | 2016-09-16 | 2018-03-29 | インテリジェンスファクトリー合同会社 | 手術支援端末及びプログラム |
JP6392470B1 (ja) * | 2017-03-03 | 2018-09-19 | 誠 五島 | 取付位置確認部材、骨切断補助キット、及び位置検出プログラム |
US10342590B2 (en) | 2015-08-14 | 2019-07-09 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing fulcrum |
JPWO2018078723A1 (ja) * | 2016-10-25 | 2019-09-05 | 株式会社 レキシー | 手術支援システム |
US10512470B1 (en) | 2016-08-26 | 2019-12-24 | Treace Medical Concepts, Inc. | Osteotomy procedure for correcting bone misalignment |
US10524808B1 (en) | 2016-11-11 | 2020-01-07 | Treace Medical Concepts, Inc. | Devices and techniques for performing an osteotomy procedure on a first metatarsal to correct a bone misalignment |
US10555757B2 (en) | 2014-07-15 | 2020-02-11 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US10575862B2 (en) | 2015-09-18 | 2020-03-03 | Treace Medical Concepts, Inc. | Joint spacer systems and methods |
CN110970134A (zh) * | 2019-11-05 | 2020-04-07 | 华中科技大学 | 一种骨外科手术模拟方法及其应用 |
US10653467B2 (en) | 2015-05-06 | 2020-05-19 | Treace Medical Concepts, Inc. | Intra-osseous plate system and method |
US10849663B2 (en) | 2015-07-14 | 2020-12-01 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US10849631B2 (en) | 2015-02-18 | 2020-12-01 | Treace Medical Concepts, Inc. | Pivotable bone cutting guide useful for bone realignment and compression techniques |
US10874446B2 (en) | 2015-07-14 | 2020-12-29 | Treace Medical Concepts, Inc. | Bone positioning guide |
US10939939B1 (en) | 2017-02-26 | 2021-03-09 | Treace Medical Concepts, Inc. | Fulcrum for tarsal-metatarsal joint procedure |
CN113133802A (zh) * | 2021-04-20 | 2021-07-20 | 四川大学 | 一种基于机器学习的骨手术线自动定点方法 |
US20210275251A1 (en) * | 2019-12-30 | 2021-09-09 | Ethicon Llc | Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto |
JP2022517132A (ja) * | 2019-01-23 | 2022-03-04 | プロプリオ インコーポレイテッド | 手術部位のメディエイテッドリアリティビューに対するリアルタイム手術画像への術前スキャン画像のアライメント |
US11278337B2 (en) | 2015-08-14 | 2022-03-22 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing fulcrum |
JP2022116157A (ja) * | 2016-08-16 | 2022-08-09 | インサイト メディカル システムズ インコーポレイテッド | 医療処置における感覚増強のためのシステム |
US11583323B2 (en) | 2018-07-12 | 2023-02-21 | Treace Medical Concepts, Inc. | Multi-diameter bone pin for installing and aligning bone fixation plate while minimizing bone damage |
US11596443B2 (en) | 2018-07-11 | 2023-03-07 | Treace Medical Concepts, Inc. | Compressor-distractor for angularly realigning bone portions |
US11607250B2 (en) | 2019-02-13 | 2023-03-21 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing compressor-distractor and instrument providing sliding surface |
US11622797B2 (en) | 2020-01-31 | 2023-04-11 | Treace Medical Concepts, Inc. | Metatarsophalangeal joint preparation and metatarsal realignment for fusion |
US11627954B2 (en) | 2019-08-07 | 2023-04-18 | Treace Medical Concepts, Inc. | Bi-planar instrument for bone cutting and joint realignment procedure |
USD1011524S1 (en) | 2022-02-23 | 2024-01-16 | Treace Medical Concepts, Inc. | Compressor-distractor for the foot |
US11889998B1 (en) | 2019-09-12 | 2024-02-06 | Treace Medical Concepts, Inc. | Surgical pin positioning lock |
US11890039B1 (en) | 2019-09-13 | 2024-02-06 | Treace Medical Concepts, Inc. | Multi-diameter K-wire for orthopedic applications |
US11986251B2 (en) | 2019-09-13 | 2024-05-21 | Treace Medical Concepts, Inc. | Patient-specific osteotomy instrumentation |
US12004789B2 (en) | 2020-05-19 | 2024-06-11 | Treace Medical Concepts, Inc. | Devices and techniques for treating metatarsus adductus |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2955481B1 (fr) | 2010-01-27 | 2013-06-14 | Tornier Sa | Dispositif et procede de caracterisation glenoidienne d'une omoplate a protheser ou a resurfacer |
GB201008281D0 (en) | 2010-05-19 | 2010-06-30 | Nikonovas Arkadijus | Indirect analysis and manipulation of objects |
JP6138566B2 (ja) * | 2013-04-24 | 2017-05-31 | 川崎重工業株式会社 | 部品取付作業支援システムおよび部品取付方法 |
WO2015052586A2 (en) | 2013-10-10 | 2015-04-16 | Imascap | Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants |
KR101570856B1 (ko) * | 2014-05-16 | 2015-11-24 | 큐렉소 주식회사 | 조직 위치 검출 방법 및 이를 이용하는 장치 |
GB2534359A (en) * | 2015-01-15 | 2016-07-27 | Corin Ltd | System and method for patient implant alignment |
GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
US10092361B2 (en) | 2015-09-11 | 2018-10-09 | AOD Holdings, LLC | Intraoperative systems and methods for determining and providing for display a virtual image overlaid onto a visual image of a bone |
DE102016212240A1 (de) * | 2016-07-05 | 2018-01-11 | Siemens Aktiengesellschaft | Verfahren zur Interaktion eines Bedieners mit einem Modell eines technischen Systems |
US10835318B2 (en) * | 2016-08-25 | 2020-11-17 | DePuy Synthes Products, Inc. | Orthopedic fixation control and manipulation |
WO2019045144A1 (ko) * | 2017-08-31 | 2019-03-07 | (주)레벨소프트 | 의료용 항법 장치를 위한 의료 영상 처리 장치 및 의료 영상 처리 방법 |
US20190125288A1 (en) * | 2017-10-30 | 2019-05-02 | Leucadia Therapeutics, LLC | Diagnosis and prognosis of medical conditions using cribriform plate morphology |
KR20200084025A (ko) * | 2017-11-30 | 2020-07-09 | 씽크 써지컬, 인크. | 엔드 이펙터 공구 경로 외부에서 뼈에 하드웨어를 장착하기 위한 시스템 및 방법 |
EP3787543A4 (en) | 2018-05-02 | 2022-01-19 | Augmedics Ltd. | REGISTRATION OF A REFERENCE MARK FOR AN AUGMENTED REALITY SYSTEM |
WO2019245852A1 (en) | 2018-06-19 | 2019-12-26 | Tornier, Inc. | Virtual checklists for orthopedic surgery |
US20200015924A1 (en) | 2018-07-16 | 2020-01-16 | Ethicon Llc | Robotic light projection tools |
US20210322101A1 (en) * | 2018-08-30 | 2021-10-21 | Brainlab Ag | Automated pre-operative assessment of implant placement in human bone |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
FR3092748A1 (fr) * | 2019-02-18 | 2020-08-21 | Sylorus Robotics | Procédés et systèmes de traitement d’images |
US11439436B2 (en) | 2019-03-18 | 2022-09-13 | Synthes Gmbh | Orthopedic fixation strut swapping |
US11304757B2 (en) | 2019-03-28 | 2022-04-19 | Synthes Gmbh | Orthopedic fixation control and visualization |
US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
US11382712B2 (en) | 2019-12-22 | 2022-07-12 | Augmedics Ltd. | Mirroring in image guided surgery |
US11832996B2 (en) | 2019-12-30 | 2023-12-05 | Cilag Gmbh International | Analyzing surgical trends by a surgical system |
US12053223B2 (en) | 2019-12-30 | 2024-08-06 | Cilag Gmbh International | Adaptive surgical system control according to surgical smoke particulate characteristics |
US11744667B2 (en) | 2019-12-30 | 2023-09-05 | Cilag Gmbh International | Adaptive visualization by a surgical system |
US11896442B2 (en) | 2019-12-30 | 2024-02-13 | Cilag Gmbh International | Surgical systems for proposing and corroborating organ portion removals |
US12002571B2 (en) | 2019-12-30 | 2024-06-04 | Cilag Gmbh International | Dynamic surgical visualization systems |
US11776144B2 (en) | 2019-12-30 | 2023-10-03 | Cilag Gmbh International | System and method for determining, adjusting, and managing resection margin about a subject tissue |
US11284963B2 (en) | 2019-12-30 | 2022-03-29 | Cilag Gmbh International | Method of using imaging devices in surgery |
US11334997B2 (en) | 2020-04-03 | 2022-05-17 | Synthes Gmbh | Hinge detection for orthopedic fixation |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
WO2024057210A1 (en) | 2022-09-13 | 2024-03-21 | Augmedics Ltd. | Augmented reality eyewear for image-guided medical intervention |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001507614A (ja) * | 1997-10-28 | 2001-06-12 | カール−ツアイス−スチフツング | 骨セグメントナビゲーションシステム |
JP2009172124A (ja) * | 2008-01-24 | 2009-08-06 | Osaka Univ | 手術ナビゲーションシステム、画像表示方法、コンピュータプログラム及び記録媒体 |
Family Cites Families (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4841975A (en) * | 1987-04-15 | 1989-06-27 | Cemax, Inc. | Preoperative planning of bone cuts and joint replacement using radiant energy scan imaging |
US6430434B1 (en) * | 1998-12-14 | 2002-08-06 | Integrated Surgical Systems, Inc. | Method for determining the location and orientation of a bone for computer-assisted orthopedic procedures using intraoperatively attached markers |
US6491699B1 (en) * | 1999-04-20 | 2002-12-10 | Surgical Navigation Technologies, Inc. | Instrument guidance method and system for image guided surgery |
US6701174B1 (en) * | 2000-04-07 | 2004-03-02 | Carnegie Mellon University | Computer-aided bone distraction |
US20040068187A1 (en) * | 2000-04-07 | 2004-04-08 | Krause Norman M. | Computer-aided orthopedic surgery |
US6711432B1 (en) * | 2000-10-23 | 2004-03-23 | Carnegie Mellon University | Computer-aided orthopedic surgery |
US20050113846A1 (en) * | 2001-02-27 | 2005-05-26 | Carson Christopher P. | Surgical navigation systems and processes for unicompartmental knee arthroplasty |
US7001346B2 (en) * | 2001-11-14 | 2006-02-21 | Michael R. White | Apparatus and methods for making intraoperative orthopedic measurements |
ATE533420T1 (de) * | 2002-04-30 | 2011-12-15 | Orthosoft Inc | Berechnung der femur-resektion bei knieoperationen |
WO2003096920A1 (de) | 2002-05-21 | 2003-11-27 | Plus Endoprothetik Ag | Anordnung zur ermittlung funktionsbestimmender geometrischer grössen eines gelenkes eines wirbeltiers |
GB2393625C (en) * | 2002-09-26 | 2004-08-18 | Meridian Tech Ltd | Orthopaedic surgery planning |
WO2004071314A1 (ja) | 2003-02-12 | 2004-08-26 | Tsuyoshi Murase | 罹患骨切断補助部材及び矯正位置判断補助部材 |
US7104997B2 (en) * | 2003-06-19 | 2006-09-12 | Lionberger Jr David R | Cutting guide apparatus and surgical method for use in knee arthroplasty |
US7392076B2 (en) * | 2003-11-04 | 2008-06-24 | Stryker Leibinger Gmbh & Co. Kg | System and method of registering image data to intra-operatively digitized landmarks |
US20050234332A1 (en) * | 2004-01-16 | 2005-10-20 | Murphy Stephen B | Method of computer-assisted ligament balancing and component placement in total knee arthroplasty |
GB0404345D0 (en) * | 2004-02-27 | 2004-03-31 | Depuy Int Ltd | Surgical jig and methods of use |
US20080269596A1 (en) * | 2004-03-10 | 2008-10-30 | Ian Revie | Orthpaedic Monitoring Systems, Methods, Implants and Instruments |
US20050234465A1 (en) * | 2004-03-31 | 2005-10-20 | Mccombs Daniel L | Guided saw with pins |
WO2006033377A1 (ja) * | 2004-09-24 | 2006-03-30 | Hitachi Medical Corporation | 医用画像表示装置及び方法並びにプログラム |
US20070073136A1 (en) * | 2005-09-15 | 2007-03-29 | Robert Metzger | Bone milling with image guided surgery |
EP2001411B1 (en) * | 2006-03-17 | 2013-05-01 | Zimmer, Inc. | Methods of predetermining the contour of a resected bone surface and assessing the fit of a prosthesis on the bone |
US20070279435A1 (en) * | 2006-06-02 | 2007-12-06 | Hern Ng | Method and system for selective visualization and interaction with 3D image data |
US8934961B2 (en) * | 2007-05-18 | 2015-01-13 | Biomet Manufacturing, Llc | Trackable diagnostic scope apparatus and methods of use |
US20080319491A1 (en) * | 2007-06-19 | 2008-12-25 | Ryan Schoenefeld | Patient-matched surgical component and methods of use |
CA2882265C (en) * | 2007-08-17 | 2017-01-03 | Zimmer, Inc. | Implant design analysis suite |
US8617171B2 (en) * | 2007-12-18 | 2013-12-31 | Otismed Corporation | Preoperatively planning an arthroplasty procedure and generating a corresponding patient specific arthroplasty resection guide |
US8078440B2 (en) * | 2008-09-19 | 2011-12-13 | Smith & Nephew, Inc. | Operatively tuning implants for increased performance |
US9099015B2 (en) * | 2009-05-12 | 2015-08-04 | Edda Technology, Inc. | System, method, apparatus, and computer program for interactive pre-operative assessment involving safety margins and cutting planes in rendered 3D space |
JP5410525B2 (ja) | 2009-07-15 | 2014-02-05 | 剛 村瀬 | 補填用人工骨モデル、補填用人工骨の形成方法、並びに医療用シミュレーションシステム |
AU2011239570A1 (en) * | 2010-04-14 | 2012-11-01 | Smith & Nephew, Inc. | Systems and methods for patient- based computer assisted surgical procedures |
WO2011134083A1 (en) * | 2010-04-28 | 2011-11-03 | Ryerson University | System and methods for intraoperative guidance feedback |
US9706948B2 (en) * | 2010-05-06 | 2017-07-18 | Sachin Bhandari | Inertial sensor based surgical navigation system for knee replacement surgery |
KR101818682B1 (ko) * | 2010-07-08 | 2018-01-16 | 신세스 게엠바하 | 개선된 골 마커 및 맞춤형 임플란트 |
US8953865B2 (en) * | 2011-03-03 | 2015-02-10 | Hitachi Medical Corporation | Medical image processing device and medical image processing method |
EP2737854A1 (en) * | 2011-07-28 | 2014-06-04 | Panasonic Healthcare Co., Ltd. | Cutting simulation device and cutting simulation program |
US9225538B2 (en) | 2011-09-01 | 2015-12-29 | Microsoft Technology Licensing, Llc | Stateless application notifications |
WO2013033566A1 (en) * | 2011-09-02 | 2013-03-07 | Stryker Corporation | Surgical instrument including a cutting accessory extending from a housing and actuators that establish the position of the cutting accessory relative to the housing |
KR101711358B1 (ko) * | 2012-05-02 | 2017-02-28 | 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 | 의학적 애플리케이션들에서 증강 현실을 위한 동적 모델을 이용한 4 차원 이미지 등록 |
JP6008635B2 (ja) * | 2012-07-24 | 2016-10-19 | 富士フイルム株式会社 | 手術支援装置、方法およびプログラム |
US10687896B2 (en) * | 2013-06-11 | 2020-06-23 | Agency For Science, Technology And Research | Computer-aided planning of liver surgery |
WO2015081025A1 (en) * | 2013-11-29 | 2015-06-04 | The Johns Hopkins University | Cranial reference mount |
US10258256B2 (en) * | 2014-12-09 | 2019-04-16 | TechMah Medical | Bone reconstruction and orthopedic implants |
-
2014
- 2014-06-11 US US14/898,021 patent/US10467752B2/en active Active
- 2014-06-11 JP JP2015522826A patent/JP6122495B2/ja active Active
- 2014-06-11 WO PCT/JP2014/065448 patent/WO2014200017A1/ja active Application Filing
-
2019
- 2019-10-08 US US16/596,223 patent/US11302005B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001507614A (ja) * | 1997-10-28 | 2001-06-12 | カール−ツアイス−スチフツング | 骨セグメントナビゲーションシステム |
JP2009172124A (ja) * | 2008-01-24 | 2009-08-06 | Osaka Univ | 手術ナビゲーションシステム、画像表示方法、コンピュータプログラム及び記録媒体 |
Cited By (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10555757B2 (en) | 2014-07-15 | 2020-02-11 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US10945764B2 (en) | 2014-07-15 | 2021-03-16 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US11523845B2 (en) | 2014-07-15 | 2022-12-13 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US11147590B2 (en) | 2014-07-15 | 2021-10-19 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US11771467B2 (en) | 2014-07-15 | 2023-10-03 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US11497528B2 (en) | 2014-07-15 | 2022-11-15 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US11937849B2 (en) | 2014-07-15 | 2024-03-26 | Treace Medical Concepts, Inc. | Bone positioning and cutting system and method |
US9687250B2 (en) | 2015-01-07 | 2017-06-27 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US10888335B2 (en) | 2015-01-07 | 2021-01-12 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US10603046B2 (en) | 2015-01-07 | 2020-03-31 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US11786257B2 (en) | 2015-01-07 | 2023-10-17 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US10561426B1 (en) | 2015-01-07 | 2020-02-18 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US11844533B2 (en) | 2015-02-18 | 2023-12-19 | Treace Medical Concepts, Inc. | Pivotable bone cutting guide useful for bone realignment and compression techniques |
US10849631B2 (en) | 2015-02-18 | 2020-12-01 | Treace Medical Concepts, Inc. | Pivotable bone cutting guide useful for bone realignment and compression techniques |
US11426219B2 (en) | 2015-05-06 | 2022-08-30 | Treace Medical Concepts, Inc. | Intra-osseous plate system and method |
US11969193B2 (en) | 2015-05-06 | 2024-04-30 | Treace Medical Concepts, Inc. | Intra-osseous plate system and method |
US10653467B2 (en) | 2015-05-06 | 2020-05-19 | Treace Medical Concepts, Inc. | Intra-osseous plate system and method |
US11950819B2 (en) | 2015-07-14 | 2024-04-09 | Treace Medical Concepts, Inc. | Bone positioning guide |
US9936994B2 (en) | 2015-07-14 | 2018-04-10 | Treace Medical Concepts, Inc. | Bone positioning guide |
US11602386B2 (en) | 2015-07-14 | 2023-03-14 | Treace Medical Concepts, Inc. | Bone positioning guide |
US10849663B2 (en) | 2015-07-14 | 2020-12-01 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US10335220B2 (en) | 2015-07-14 | 2019-07-02 | Treace Medical Concepts, Inc. | Bone positioning guide |
US11116558B2 (en) | 2015-07-14 | 2021-09-14 | Treace Medical Concepts, Inc. | Bone positioning guide |
US10874446B2 (en) | 2015-07-14 | 2020-12-29 | Treace Medical Concepts, Inc. | Bone positioning guide |
US11963703B2 (en) | 2015-07-14 | 2024-04-23 | Treace Medical Concepts, Inc. | Bone cutting guide systems and methods |
US11185359B2 (en) | 2015-07-14 | 2021-11-30 | Treace Medical Concepts, Inc. | Bone positioning guide |
US12102368B2 (en) | 2015-07-14 | 2024-10-01 | Treace Medical Concepts, Inc. | Bone positioning guide |
US11690659B2 (en) | 2015-08-14 | 2023-07-04 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing fulcrum |
US11039873B2 (en) | 2015-08-14 | 2021-06-22 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
US11602387B2 (en) | 2015-08-14 | 2023-03-14 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
US9622805B2 (en) | 2015-08-14 | 2017-04-18 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
US10849670B2 (en) | 2015-08-14 | 2020-12-01 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
US11413081B2 (en) | 2015-08-14 | 2022-08-16 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing fulcrum |
US11911085B2 (en) | 2015-08-14 | 2024-02-27 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
US11213333B2 (en) | 2015-08-14 | 2022-01-04 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
US10342590B2 (en) | 2015-08-14 | 2019-07-09 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing fulcrum |
US10045807B2 (en) | 2015-08-14 | 2018-08-14 | Treace Medical Concepts, Inc. | Bone positioning and preparing guide systems and methods |
US11278337B2 (en) | 2015-08-14 | 2022-03-22 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing fulcrum |
US11648019B2 (en) | 2015-09-18 | 2023-05-16 | Treace Medical Concepts, Inc. | Joint spacer systems and methods |
US11771443B2 (en) | 2015-09-18 | 2023-10-03 | Treace Medical Concepts, Inc. | Joint spacer systems and methods |
US10575862B2 (en) | 2015-09-18 | 2020-03-03 | Treace Medical Concepts, Inc. | Joint spacer systems and methods |
JP2017076295A (ja) * | 2015-10-16 | 2017-04-20 | 富士フイルム株式会社 | 拡張現実提供システム及び方法、情報処理装置、並びにプログラム |
WO2017160651A1 (en) | 2016-03-12 | 2017-09-21 | Lang Philipp K | Devices and methods for surgery |
JP2022116157A (ja) * | 2016-08-16 | 2022-08-09 | インサイト メディカル システムズ インコーポレイテッド | 医療処置における感覚増強のためのシステム |
US10512470B1 (en) | 2016-08-26 | 2019-12-24 | Treace Medical Concepts, Inc. | Osteotomy procedure for correcting bone misalignment |
US11931047B2 (en) | 2016-08-26 | 2024-03-19 | Treace Medical Concepts, Inc. | Osteotomy procedure for correcting bone misalignment |
US11076863B1 (en) | 2016-08-26 | 2021-08-03 | Treace Medical Concepts, Inc. | Osteotomy procedure for correcting bone misalignment |
JP7012302B2 (ja) | 2016-09-16 | 2022-01-28 | インテリジェンスファクトリー合同会社 | 手術支援端末及びプログラム |
JP2018047240A (ja) * | 2016-09-16 | 2018-03-29 | インテリジェンスファクトリー合同会社 | 手術支援端末及びプログラム |
JPWO2018078723A1 (ja) * | 2016-10-25 | 2019-09-05 | 株式会社 レキシー | 手術支援システム |
US10524808B1 (en) | 2016-11-11 | 2020-01-07 | Treace Medical Concepts, Inc. | Devices and techniques for performing an osteotomy procedure on a first metatarsal to correct a bone misalignment |
US11364037B2 (en) | 2016-11-11 | 2022-06-21 | Treace Medical Concepts, Inc. | Techniques for performing an osteotomy procedure on bone to correct a bone misalignment |
US10582936B1 (en) | 2016-11-11 | 2020-03-10 | Treace Medical Concepts, Inc. | Devices and techniques for performing an osteotomy procedure on a first metatarsal to correct a bone misalignment |
US10939939B1 (en) | 2017-02-26 | 2021-03-09 | Treace Medical Concepts, Inc. | Fulcrum for tarsal-metatarsal joint procedure |
JP6392470B1 (ja) * | 2017-03-03 | 2018-09-19 | 誠 五島 | 取付位置確認部材、骨切断補助キット、及び位置検出プログラム |
US11596443B2 (en) | 2018-07-11 | 2023-03-07 | Treace Medical Concepts, Inc. | Compressor-distractor for angularly realigning bone portions |
US11583323B2 (en) | 2018-07-12 | 2023-02-21 | Treace Medical Concepts, Inc. | Multi-diameter bone pin for installing and aligning bone fixation plate while minimizing bone damage |
US11376096B2 (en) | 2019-01-23 | 2022-07-05 | Proprio, Inc. | Aligning pre-operative scan images to real-time operative images for a mediated-reality view of a surgical site |
JP7138802B2 (ja) | 2019-01-23 | 2022-09-16 | プロプリオ インコーポレイテッド | 手術部位のメディエイテッドリアリティビューに対するリアルタイム手術画像への術前スキャン画像のアライメント |
US11998401B2 (en) | 2019-01-23 | 2024-06-04 | Proprio, Inc. | Aligning pre-operative scan images to real-time operative images for a mediated-reality view of a surgical site |
JP2022517132A (ja) * | 2019-01-23 | 2022-03-04 | プロプリオ インコーポレイテッド | 手術部位のメディエイテッドリアリティビューに対するリアルタイム手術画像への術前スキャン画像のアライメント |
US11607250B2 (en) | 2019-02-13 | 2023-03-21 | Treace Medical Concepts, Inc. | Tarsal-metatarsal joint procedure utilizing compressor-distractor and instrument providing sliding surface |
US11627954B2 (en) | 2019-08-07 | 2023-04-18 | Treace Medical Concepts, Inc. | Bi-planar instrument for bone cutting and joint realignment procedure |
US11889998B1 (en) | 2019-09-12 | 2024-02-06 | Treace Medical Concepts, Inc. | Surgical pin positioning lock |
US11890039B1 (en) | 2019-09-13 | 2024-02-06 | Treace Medical Concepts, Inc. | Multi-diameter K-wire for orthopedic applications |
US11986251B2 (en) | 2019-09-13 | 2024-05-21 | Treace Medical Concepts, Inc. | Patient-specific osteotomy instrumentation |
CN110970134B (zh) * | 2019-11-05 | 2023-08-25 | 华中科技大学 | 一种骨外科手术模拟方法及其应用 |
CN110970134A (zh) * | 2019-11-05 | 2020-04-07 | 华中科技大学 | 一种骨外科手术模拟方法及其应用 |
US11813120B2 (en) * | 2019-12-30 | 2023-11-14 | Cilag Gmbh International | Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto |
US20210275251A1 (en) * | 2019-12-30 | 2021-09-09 | Ethicon Llc | Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto |
US11622797B2 (en) | 2020-01-31 | 2023-04-11 | Treace Medical Concepts, Inc. | Metatarsophalangeal joint preparation and metatarsal realignment for fusion |
US12004789B2 (en) | 2020-05-19 | 2024-06-11 | Treace Medical Concepts, Inc. | Devices and techniques for treating metatarsus adductus |
CN113133802A (zh) * | 2021-04-20 | 2021-07-20 | 四川大学 | 一种基于机器学习的骨手术线自动定点方法 |
USD1011524S1 (en) | 2022-02-23 | 2024-01-16 | Treace Medical Concepts, Inc. | Compressor-distractor for the foot |
Also Published As
Publication number | Publication date |
---|---|
US20160125603A1 (en) | 2016-05-05 |
US10467752B2 (en) | 2019-11-05 |
JPWO2014200017A1 (ja) | 2017-02-23 |
US20200043168A1 (en) | 2020-02-06 |
US11302005B2 (en) | 2022-04-12 |
JP6122495B2 (ja) | 2017-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6122495B2 (ja) | 骨切支援システム、情報処理装置、画像処理方法、および画像処理プログラム | |
JP6257728B2 (ja) | 外科手術支援システム、外科手術支援システムの作動方法、情報処理プログラムおよび情報処理装置 | |
JP7342069B2 (ja) | 蛍光透視法および追跡センサを使用する股関節の外科手術ナビゲーション | |
US11723724B2 (en) | Ultra-wideband positioning for wireless ultrasound tracking and communication | |
EP3273854B1 (en) | Systems for computer-aided surgery using intra-operative video acquired by a free moving camera | |
JP6463038B2 (ja) | 画像位置合せ装置、方法およびプログラム | |
US9375133B2 (en) | Endoscopic observation support system | |
JP6510301B2 (ja) | 医療支援システム、医療支援方法、画像処理装置およびその制御方法と制御プログラム | |
WO2022073342A1 (zh) | 手术机器人及其运动错误检测方法、检测装置 | |
CN108629845B (zh) | 手术导航装置、设备、系统和可读存储介质 | |
US20230114385A1 (en) | Mri-based augmented reality assisted real-time surgery simulation and navigation | |
US10078906B2 (en) | Device and method for image registration, and non-transitory recording medium | |
Klemm | Intraoperative Planning and Execution of Arbitrary Orthopedic Interventions Using Handheld Robotics and Augmented Reality | |
CN116568219A (zh) | 自动导航数字手术显微镜 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14810280 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015522826 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14898021 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14810280 Country of ref document: EP Kind code of ref document: A1 |