WO2020024638A1 - Surgical navigation device - Google Patents

Surgical navigation device

Info

Publication number
WO2020024638A1
WO2020024638A1 (PCT/CN2019/085685)
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
surgical instrument
spatial position
visual
display unit
Prior art date
Application number
PCT/CN2019/085685
Other languages
English (en)
French (fr)
Inventor
王利峰
沈晨
Original Assignee
雅客智慧(北京)科技有限公司
Priority date
Filing date
Publication date
Application filed by 雅客智慧(北京)科技有限公司
Publication of WO2020024638A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/372 Details of monitor hardware

Definitions

  • the embodiments of the present application relate to the field of medical technology, and in particular, to a surgical navigation device.
  • the graphic workstation displays the position of the surgical tool and the lesion.
  • by viewing the image on the graphic workstation, the doctor can see lesions that are hidden by other tissue, thereby improving surgical accuracy.
  • the current surgical navigation device captures the positions and postures of optical marker points with a vision camera and then displays the corresponding virtual surgical instrument and a three-dimensional model of the affected area on a screen.
  • the doctor needs to observe the screen while operating the surgical instrument, so that the virtual surgical instrument accurately reaches the planned surgical position and misoperation that would injure the patient is avoided.
  • this surgical navigation device requires a high degree of attention from the doctor, whose line of sight must be switched between the screen and the patient during the operation.
  • the rapid changes in the external environment and the continuous adjustment of the focal length of the eye will cause delay and discomfort. In severe cases, it may lead to medical accidents.
  • the embodiment of the present application provides a surgical navigation device to solve the problem that the doctor's line of sight has to switch between the screen of the navigation device and the patient during an operation, which may cause delay and discomfort and even lead to a medical accident.
  • An embodiment of the present application provides a surgical navigation device including a surgical instrument, a positioning unit, and a display unit; wherein the display unit is mounted on the surgical instrument;
  • the positioning unit is configured to capture the spatial position and posture of the surgical instrument and the surgical area
  • the display unit is configured to display the virtual surgical scene and to update the positions and postures of the virtual surgical instrument and the virtual surgical area in the virtual surgical scene according to the spatial positions and postures of the surgical instrument and the surgical area.
  • An embodiment of the present application provides a surgical navigation device in which the display unit is mounted on the surgical instrument, so that the display unit and the surgical instrument share the same orientation. This saves the doctor from frequently redirecting the line of sight, lets the information of the surgical navigation device be fed back to the doctor more intuitively to guide the operation in the real environment, keeps the doctor's attention focused on the surgical manipulation, and improves the safety of the operation.
  • FIG. 1 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of a connection unit according to an embodiment of the present application.
  • FIG. 1 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application. As shown in FIG. 1, the device includes a surgical instrument 101, a positioning unit 102, and a display unit 103.
  • the display unit 103 is installed on the surgical instrument 101; the positioning unit 102 is configured to capture the spatial positions and postures of the surgical instrument 101 and the surgical area; the display unit 103 is configured to display a virtual surgical scene and to update the positions and postures of the virtual surgical instrument and the virtual surgical area in the virtual surgical scene according to the spatial positions and postures of the surgical instrument 101 and the surgical area.
  • the surgical instrument 101 refers to a medical instrument used in clinical surgery, and the surgical instrument 101 used in clinical surgery of different specialties is different, which is not specifically limited in the embodiment of the present application.
  • the surgical area refers to the relevant area that requires medical treatment in clinical surgery.
  • the positioning unit 102 is configured to capture the spatial position and posture of the surgical instrument 101 and the surgical region, and specifically includes the spatial position and posture of the surgical instrument 101 and the spatial position and posture of the surgical region.
  • the spatial position of the surgical instrument 101 is configured to indicate the overall spatial positioning information of the surgical instrument 101.
  • the posture of the surgical instrument 101 refers to the posture of the surgical instrument 101 determined by the relative spatial position of each marked point on the surgical instrument 101.
  • the spatial position of the surgical area is configured to indicate the overall spatial positioning information of the surgical area.
  • the posture of the surgical area refers to the posture of the surgical area determined by the relative spatial position of each marked point on the surgical area.
  • the positioning unit 102 has various implementation forms.
  • the positioning unit 102 may be an optical positioning and tracking system, a positioning and tracking system based on a visible light sensor, or a positioning and tracking system based on an electromagnetic sensor; the embodiments of the present application do not specifically limit this.
  • after acquiring the above information, the positioning unit 102 sends it to the display unit 103.
  • the display unit 103 can display a virtual surgery scene according to preset configuration information.
  • the virtual surgery scene here is a three-dimensional model including a virtual surgery instrument and a virtual surgery area.
  • after the display unit 103 receives the spatial positions and postures of the surgical instrument 101 and the surgical area sent by the positioning unit 102, it updates the spatial position and posture of the virtual surgical instrument in the virtual surgery scene based on the spatial position and posture of the surgical instrument 101, and updates the spatial position and posture of the virtual surgical area in the virtual surgery scene based on the spatial position and posture of the surgical area.
  • the display unit 103 may be a micro display device such as a smart phone or a tablet computer, which is not specifically limited in this embodiment of the present application.
  • the display unit 103 and the surgical instrument 101 may have an integrated structure or a detachable structure.
  • the display unit 103 is installed on the surgical instrument 101 so that the display unit 103 and the surgical instrument 101 have the same orientation. This saves the doctor from frequently adjusting the line of sight, lets the information of the surgical navigation device be fed back more intuitively to guide the surgery in the real environment, and keeps the doctor's attention focused on the surgical operation, improving the safety of the operation.
  • FIG. 2 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application.
  • in a surgical navigation device, the positioning unit 102 includes a first visual mark 201, a second visual mark 202, and a visual sensor 203; the first visual mark 201 is installed on the surgical instrument 101, and the second visual mark 202 is installed on the surgical area.
  • the visual sensor 203 is configured to capture the spatial position of the first visual mark 201 and to acquire from it the spatial position and posture of the surgical instrument 101 in the visual coordinate system; it is also configured to capture the spatial position of the second visual mark 202 and to acquire from it the spatial position and posture of the surgical area in the visual coordinate system.
  • the first visual mark 201 comprises a plurality of visual marks installed on the surgical instrument 101.
  • the second visual mark 202 comprises a plurality of visual marks installed on the surgical area.
  • the embodiment of the present application does not specifically limit the number of the first visual marks 201 or the second visual marks 202.
  • the spatial position of the first visual mark 201 is configured to indicate the spatial position and posture of the surgical instrument 101
  • the spatial position of the second visual mark 202 is configured to indicate the spatial position and posture of the surgical area.
  • the surgical navigation device further includes a connection unit configured to mount the display unit 103 on the surgical instrument 101; the connection unit includes a screen clamping mechanism 301 and an instrument clamping mechanism 302, where the screen clamping mechanism 301 is configured to clamp the display unit 103 and the instrument clamping mechanism 302 is configured to clamp the surgical instrument 101.
  • the connection unit is configured to provide a fixed connection between the display unit 103 and the surgical instrument 101.
  • the screen clamping mechanism 301 is configured to clamp the display unit 103
  • the instrument clamping mechanism 302 is configured to clamp the surgical instrument 101, so that the display unit 103 and the surgical instrument 101 are fixedly connected through the connection unit; the display unit 103 therefore always keeps the same orientation as the surgical instrument 101, which saves the doctor from frequently adjusting the line of sight.
  • the first visual mark 201 is mounted on the connection unit.
  • when the first visual mark 201 is installed on the connection unit, the relative position and posture relationship between the first visual mark 201 and the surgical instrument 101 remains fixed, so the spatial position of the first visual mark 201 can likewise be configured to indicate the spatial position and posture of the surgical instrument 101.
  • FIG. 3 is a schematic structural diagram of a connection unit according to an embodiment of the present application.
  • in a surgical navigation device, the screen clamping mechanism 301 includes a screen bracket 303 and a tightening knob 304.
  • the wall of the screen bracket 303 is provided with a threaded through hole, and the tightening knob 304 is installed on the screen bracket 303 through the threaded through hole.
  • after the display unit 103 is placed into the screen clamping mechanism 301, it is clamped by tightening the knob 304, so that the display unit 103 is fixed in the screen clamping mechanism 301.
  • the number and positions of the tightening knobs 304 are not specifically limited. It should be noted that, because the screen clamping mechanism 301 fixes the display unit 103 through the screen bracket 303 and the tightening knobs 304, the screen clamping mechanism 301 can accommodate display units 103 of different sizes.
  • the instrument holding mechanism 302 is a ring-shaped structure, and the instrument holding mechanism 302 is held on the outer contour surface of the surgical instrument 101.
  • in a surgical navigation device, the visual sensor 203 obtains the spatial position and attitude of the surgical instrument 101 in the visual coordinate system through a binocular camera model according to the spatial position of the first visual mark 201, and obtains the spatial position and attitude of the surgical area in the visual coordinate system through the binocular camera model according to the spatial position of the second visual mark 202.
  • the binocular camera model is a parallel binocular stereo vision model.
  • analysis of the pinhole imaging model shows that, in order to uniquely determine three-dimensional coordinates from image point coordinates, two or more cameras must be used jointly.
  • the imaging of a 3D scene through two cameras separated by a certain distance is binocular imaging, also known as stereo vision imaging.
  • the optical axes of the two cameras are first focused on the object of interest. The intersection of the two optical axes is called the convergence point, and the distance from this point to the center of the baseline is called the convergence distance.
  • when the convergence distance is finite, the binocular stereo vision system is referred to as a converged binocular stereo vision model; when the convergence distance is infinite, the binocular stereo vision system is called a parallel binocular stereo vision model.
  • the positioning unit 102 is an optical positioning and tracking system in the near-infrared band.
  • near-infrared band optical positioning and tracking systems, such as the Polaris series of the Canadian company NDI, are mostly used in fields with high robustness requirements, such as surgical navigation and robot positioning.
  • the near-infrared band optical positioning and tracking system can be divided into an active optical positioning system and a passive optical positioning system.
  • Active optical positioning systems use marked points capable of autonomously emitting near-infrared wavelength light signals as feature points for optical positioning tracking, without the need for a near-infrared lighting system to provide a light source environment.
  • the biggest advantage of the active optical positioning and tracking system is that through the preset light emitting mode (such as the strobe mode), the feature points can be more accurately identified in the image processing process, and the planar image coordinates are determined.
  • active light targets require additional power support.
  • the passive optical positioning and tracking system uses an optical reflection method to attach a near-infrared wavelength reflective material to the marking ball to enhance the reflection ability of the marking ball to the illumination near-infrared light, thereby achieving the purpose of clearly distinguishing the marking ball from the surrounding environment in the image.
  • the first visual mark 201 and the second visual mark 202 in the embodiment of the present application may be marker points capable of autonomously emitting light signals of near-infrared wavelengths, or marker balls coated with near-infrared-reflective material, which is not specifically limited in the embodiments of the present application.
  • a surgical navigation device, the positioning unit 102 and the display unit 103 are wirelessly connected. Specifically, information can be transmitted between the positioning unit 102 and the display unit 103 through wireless connection methods such as 2G / 3G / 4G, WIFI, Bluetooth, or ZigBee, which is not specifically limited in this embodiment of the present application.
  • when the surgical navigation device is applied to dental implant surgery, the second visual mark 202 is mounted on the patient's jawbone.
  • the visual sensor 203 can detect a change in the position of the second visual mark 202 installed on the patient's jawbone and then update the spatial position and posture of the patient's jawbone in the visual coordinate system, thereby updating the virtual surgery scene.
  • Dental implant surgery is a precision operation performed under local anesthesia in a small space. The doctor needs to hold an implant handpiece (i.e., the surgical instrument) to place implants compatible with human bone into the alveolar bone of the edentulous area. After a period of time, once the artificial tooth root is in close contact with the alveolar bone, a dental crown is made on the artificial tooth root.
  • the key factor affecting the outcome of the operation is the accuracy of implant placement; because of the non-direct-view environment of the oral cavity, the small operating space, and the doctor's possible lack of experience, the failure rate of the operation is high.
  • in current navigation devices used for dental implant surgery, after the positioning unit captures the positions and postures of the optical marker points, the corresponding virtual surgical instrument and the three-dimensional model of the patient's jaw are displayed on the screen.
  • the doctor needs to observe the screen while operating the surgical instrument during the dental implantation process, so that the virtual surgical instrument can accurately reach the planned implantation position, while avoiding injury to the patient due to misoperation.
  • This method requires a high degree of attention from the doctor during the operation.
  • the line of sight must be switched between the screen and the patient. Rapid changes in the external environment and constant adjustment of the focal length of the eye will cause delay and discomfort. In severe cases, it may lead to medical accidents.
  • this example provides a navigation device for dental implant surgery.
  • a virtual surgery scene is displayed on the display unit 103 mounted on the implant handpiece, so that the doctor can more effectively and intuitively observe the internal anatomy of the patient's mouth and adjust the operation in time according to the actual situation.
  • the positioning unit 102 of a navigation device for dental implant surgery is mainly composed of a visual sensor 203 and a visual marker.
  • when a visual mark is within the field of view of the visual sensor 203, the sensor can calculate the spatial position and attitude of the visual mark in the visual coordinate system according to the binocular camera model.
  • a first visual mark 201 is installed on the implant handpiece, and the virtual implant handpiece (the virtual surgical instrument 101) in the virtual surgery scene is updated according to the actual position and posture of the first visual mark 201.
  • a set of second visual marks 202 also needs to be fixed on the patient's jawbone.
  • the patient's jaw anatomy structure map in the virtual surgery scene is updated according to the position and posture of the second visual mark 202.
  • the connection unit is used to connect the display unit 103 to the implant handpiece; its upper end, namely the screen clamping mechanism 301, is connected to the first visual mark 201.
  • after the display unit 103 (for example, a mobile phone or a tablet) is placed into the screen bracket 303 of the screen clamping mechanism 301, it is fixed by the tightening knob 304 at the lower end of the screen bracket 303.
  • the main body of the screen bracket 303 is a rectangular structure, and can be adapted to display units 103 of different sizes within a certain size range.
  • the lower end of the connection unit is an instrument clamping mechanism 302.
  • the instrument clamping mechanism 302 is a ring-shaped structure and is configured to be clamped on the outer contour surface of the planting mobile phone. After clamping, the relative position and attitude relationship between the first visual mark 201 and the planting mobile phone is fixed. After calibrating this relative position and attitude relationship, it can be applied to dental implant navigation surgery. After the visual sensor 203 detects the position of the first visual mark 201, the position and posture of the planted mobile phone in the visual coordinate system can be calculated indirectly according to the previously calibrated relative position and attitude relationship. For example, the position and posture of the end of the instrument (needle tip) that the doctor cares about most can be updated in real time.

Abstract

An embodiment of the present application provides a surgical navigation device comprising a surgical instrument, a positioning unit, and a display unit, wherein the display unit is mounted on the surgical instrument. The positioning unit is configured to capture the spatial positions and postures of the surgical instrument and the surgical area, and the display unit is configured to display a virtual surgical scene and to update the positions and postures of the virtual surgical instrument and the virtual surgical area in the virtual surgical scene according to the spatial positions and postures of the surgical instrument and the surgical area. By mounting the display unit on the surgical instrument, the display unit and the surgical instrument share the same orientation, so the doctor does not need to frequently redirect the line of sight; the information of the surgical navigation device is fed back to the doctor more intuitively to guide the operation in the real environment, the doctor's attention stays focused on the surgical manipulation, and the safety of the operation is improved.

Description

Surgical navigation device
Cross-reference
This application claims the benefit of Chinese Patent Application No. 201810871774X, entitled "Surgical navigation device" and filed on August 2, 2018, which is incorporated herein by reference in its entirety.
Technical field
The embodiments of the present application relate to the field of medical technology, and in particular to a surgical navigation device.
Background
As an important application of computer-assisted surgery, surgical navigation devices have begun to be used in modern surgical operations. A graphic workstation displays the position of the surgical tool and the lesion, and by viewing the image on the graphic workstation the doctor can see lesions hidden by other tissue, thereby improving surgical accuracy.
In current surgical navigation devices, after a vision camera captures the positions and postures of optical marker points, the corresponding virtual surgical instrument and a three-dimensional model of the affected area are displayed on a screen. During the operation the doctor needs to watch the screen while manipulating the surgical instrument, so that the virtual surgical instrument accurately reaches the planned surgical position and misoperation that would injure the patient is avoided.
However, such a surgical navigation device demands a high level of attention from the doctor: the line of sight has to switch between the screen and the patient, and the rapid change of the external environment together with the constant refocusing of the eyes causes delay and discomfort. In severe cases this may lead to medical accidents.
Summary
The embodiments of the present application provide a surgical navigation device to solve the problem that the doctor's line of sight has to switch between the screen of the surgical navigation device and the patient during an operation, which may cause delay and discomfort and even lead to medical accidents.
An embodiment of the present application provides a surgical navigation device comprising a surgical instrument, a positioning unit, and a display unit, wherein the display unit is mounted on the surgical instrument;
the positioning unit is configured to capture the spatial positions and postures of the surgical instrument and the surgical area, and the display unit is configured to display a virtual surgical scene and to update the positions and postures of the virtual surgical instrument and the virtual surgical area in the virtual surgical scene according to the spatial positions and postures of the surgical instrument and the surgical area.
In the surgical navigation device provided by the embodiments of the present application, the display unit is mounted on the surgical instrument so that the display unit and the surgical instrument share the same orientation. This saves the doctor from frequently redirecting the line of sight, allows the information of the surgical navigation device to be fed back to the doctor more intuitively to guide the operation in the real environment, keeps the doctor's attention focused on the surgical manipulation, and improves the safety of the operation.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a connection unit according to an embodiment of the present application.
Description of reference numerals:
101 - surgical instrument;        102 - positioning unit;              103 - display unit;
201 - first visual mark;          202 - second visual mark;            203 - visual sensor;
301 - screen clamping mechanism;  302 - instrument clamping mechanism; 303 - screen bracket;
304 - tightening knob.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
Existing surgical navigation devices demand a high level of attention from the doctor: the line of sight has to switch between the screen and the patient, and the rapid change of the external environment together with the constant refocusing of the eyes causes delay and discomfort, which in severe cases may lead to medical accidents. In view of this, an embodiment of the present application proposes a surgical navigation device. FIG. 1 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application. As shown in FIG. 1, the device includes a surgical instrument 101, a positioning unit 102, and a display unit 103, wherein the display unit 103 is mounted on the surgical instrument 101. The positioning unit 102 is configured to capture the spatial positions and postures of the surgical instrument 101 and the surgical area, and the display unit 103 is configured to display a virtual surgical scene and to update the positions and postures of the virtual surgical instrument and the virtual surgical area in the virtual surgical scene according to the spatial positions and postures of the surgical instrument 101 and the surgical area.
Here, the surgical instrument 101 refers to a medical instrument used in clinical surgery; the surgical instruments 101 used in different surgical specialties differ, and the embodiments of the present application do not specifically limit this. The surgical area refers to the area that requires medical treatment during the clinical operation. The positioning unit 102 is configured to capture the spatial positions and postures of the surgical instrument 101 and the surgical area, which specifically include the spatial position and posture of the surgical instrument 101 and the spatial position and posture of the surgical area. The spatial position of the surgical instrument 101 indicates the overall spatial localization of the surgical instrument 101, and the posture of the surgical instrument 101 refers to the posture determined from the relative spatial positions of the marker points on the surgical instrument 101; likewise, the spatial position of the surgical area indicates the overall spatial localization of the surgical area, and the posture of the surgical area refers to the posture determined from the relative spatial positions of the marker points on the surgical area. The positioning unit 102 can be implemented in various forms: it may be an optical positioning and tracking system, a positioning and tracking system based on visible-light sensors, or a positioning and tracking system based on electromagnetic sensors, among others, which is not specifically limited in the embodiments of the present application.
After acquiring the above information, the positioning unit 102 sends it to the display unit 103. The display unit 103 can display a virtual surgical scene according to preset configuration information; the virtual surgical scene here is a three-dimensional model containing a virtual surgical instrument and a virtual surgical area. After receiving the spatial positions and postures of the surgical instrument 101 and the surgical area sent by the positioning unit 102, the display unit 103 updates the spatial position and posture of the virtual surgical instrument in the virtual surgical scene based on the spatial position and posture of the surgical instrument 101, and updates the spatial position and posture of the virtual surgical area in the virtual surgical scene based on the spatial position and posture of the surgical area. The display unit 103 may be a compact display device such as a smartphone or a tablet computer, which is not specifically limited in the embodiments of the present application. In addition, the display unit 103 and the surgical instrument 101 may form an integrated structure or a detachable structure.
To solve the problem that current surgical navigation devices require the doctor to watch a screen while manipulating the surgical instrument 101, with the back-and-forth switching easily causing delay, discomfort, and even medical accidents, the display unit 103 is mounted on the surgical instrument 101 so that the display unit 103 and the surgical instrument 101 share the same orientation. This saves the doctor from frequently redirecting the line of sight, allows the information of the surgical navigation device to be fed back to the doctor more intuitively to guide the operation in the real environment, keeps the doctor's attention focused on the surgical manipulation, and improves the safety of the operation.
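The update logic described above can be pictured as a small loop in which the positioning unit streams measurements and the display unit overwrites its virtual models before re-rendering. The following sketch is purely illustrative: the Pose and VirtualScene classes and the positioning_unit.read() generator are assumptions of this example, not elements disclosed in the application.

    # Minimal sketch of the pose-update loop; all names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Pose:
        position: tuple      # (x, y, z) in the visual coordinate system
        orientation: tuple   # unit quaternion (w, x, y, z)

    class VirtualScene:
        """Holds the 3D models shown on the display unit (103)."""
        def __init__(self):
            identity = Pose((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
            self.instrument_pose = identity
            self.area_pose = identity

        def update(self, instrument_pose, area_pose):
            # Overwrite the virtual surgical instrument and virtual surgical
            # area with the latest measurements from the positioning unit (102).
            self.instrument_pose = instrument_pose
            self.area_pose = area_pose

    def navigation_loop(positioning_unit, scene, render):
        """positioning_unit.read() is assumed to yield (instrument_pose, area_pose) pairs."""
        for instrument_pose, area_pose in positioning_unit.read():
            scene.update(instrument_pose, area_pose)
            render(scene)   # redraw the virtual surgical scene on the display unit

    scene = VirtualScene()
    scene.update(Pose((102.4, 55.1, 230.7), (0.98, 0.0, 0.17, 0.0)),
                 Pose((95.0, 60.2, 250.3), (1.0, 0.0, 0.0, 0.0)))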
Based on the above embodiment, FIG. 2 is a schematic structural diagram of a surgical navigation device according to an embodiment of the present application. As shown in FIG. 2, the positioning unit 102 includes a first visual mark 201, a second visual mark 202, and a visual sensor 203. The first visual mark 201 is mounted on the surgical instrument 101, and the second visual mark 202 is mounted on the surgical area. The visual sensor 203 is configured to capture the spatial position of the first visual mark 201 and to obtain, from it, the spatial position and posture of the surgical instrument 101 in the visual coordinate system; it is also configured to capture the spatial position of the second visual mark 202 and to obtain, from it, the spatial position and posture of the surgical area in the visual coordinate system.
Specifically, the first visual mark 201 consists of several visual marks mounted on the surgical instrument 101, and likewise the second visual mark 202 consists of several visual marks mounted on the surgical area; the embodiments of the present application do not specifically limit the numbers of the first visual marks 201 and the second visual marks 202. Here, the spatial position of the first visual mark 201 indicates the spatial position and posture of the surgical instrument 101, and the spatial position of the second visual mark 202 indicates the spatial position and posture of the surgical area.
Based on any of the above embodiments, the surgical navigation device further includes a connection unit configured to mount the display unit 103 on the surgical instrument 101. The connection unit includes a screen clamping mechanism 301 and an instrument clamping mechanism 302, wherein the screen clamping mechanism 301 is configured to clamp the display unit 103 and the instrument clamping mechanism 302 is configured to clamp the surgical instrument 101.
Specifically, the connection unit provides a fixed connection between the display unit 103 and the surgical instrument 101. The screen clamping mechanism 301 clamps the display unit 103 and the instrument clamping mechanism 302 clamps the surgical instrument 101, so that the display unit 103 and the surgical instrument 101 are rigidly connected through the connection unit; the display unit 103 therefore always keeps the same orientation as the surgical instrument 101, and the doctor does not need to frequently redirect the line of sight.
Based on any of the above embodiments, the first visual mark 201 is mounted on the connection unit.
Specifically, when the display unit 103 and the surgical instrument 101 are rigidly connected by the connection unit, mounting the first visual mark 201 on the connection unit still guarantees that the relative position and posture relationship between the first visual mark 201 and the surgical instrument 101 is fixed, so the spatial position of the first visual mark 201 can likewise be configured to indicate the spatial position and posture of the surgical instrument 101.
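In practice, the pose of a rigid body carrying several markers is commonly recovered by fitting a rigid transform between the markers' known layout in the instrument's own frame and their measured 3D positions. The application does not specify a fitting method; the sketch below uses the standard SVD-based (Kabsch) solution as one possible realization, with invented marker coordinates.

    # Illustrative only: recover (R, t) such that measured ~= R @ model + t.
    import numpy as np

    def fit_rigid_transform(model_pts, measured_pts):
        """model_pts, measured_pts: (N, 3) arrays of corresponding marker points."""
        cm = model_pts.mean(axis=0)
        cs = measured_pts.mean(axis=0)
        H = (model_pts - cm).T @ (measured_pts - cs)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:       # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = cs - R @ cm
        return R, t

    # Three markers rigidly attached to the instrument (coordinates in mm, made up).
    model = np.array([[0.0, 0.0, 0.0], [30.0, 0.0, 0.0], [0.0, 40.0, 0.0]])
    angle = np.deg2rad(25.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    measured = (R_true @ model.T).T + np.array([100.0, 50.0, 200.0])
    R, t = fit_rigid_transform(model, measured)   # R ~= R_true, t ~= [100, 50, 200]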
Based on any of the above embodiments, FIG. 3 is a schematic structural diagram of a connection unit according to an embodiment of the present application. As shown in FIG. 3, the screen clamping mechanism 301 includes a screen bracket 303; threaded through holes are provided in the walls of the screen bracket 303, and tightening knobs 304 are mounted on the screen bracket 303 through the threaded through holes.
Specifically, after the display unit 103 is placed into the screen clamping mechanism 301, tightening the knobs 304 clamps the display unit 103 so that it is fixed inside the screen clamping mechanism 301. The embodiments of the present application do not specifically limit the number or positions of the tightening knobs 304. It should be noted that, because the screen clamping mechanism 301 fixes the display unit 103 by means of the screen bracket 303 and the tightening knobs 304, it can accommodate display units 103 of different sizes.
Based on any of the above embodiments, with reference to FIG. 3, the instrument clamping mechanism 302 is a snap-ring structure that clamps onto the outer contour surface of the surgical instrument 101.
Based on any of the above embodiments, the visual sensor 203 obtains the spatial position and posture of the surgical instrument 101 in the visual coordinate system from the spatial position of the first visual mark 201 through a binocular camera model, and obtains the spatial position and posture of the surgical area in the visual coordinate system from the spatial position of the second visual mark 202 through the binocular camera model.
Specifically, the binocular camera model here is the parallel binocular stereo vision model. Analysis of the pinhole imaging model shows that, in order to determine three-dimensional coordinates uniquely from image point coordinates, two or more cameras must be used jointly. Imaging a 3D scene with two cameras separated by a certain distance is binocular imaging, also known as stereo vision imaging. When photographing a scene, the optical axes of the two cameras are first converged on the object of interest; the intersection of the two optical axes is called the convergence point, and the distance from this point to the center of the baseline is called the convergence distance. When the convergence distance is finite, the binocular stereo vision system is called a converged binocular stereo vision model; when the convergence distance is infinite, the binocular stereo vision system is called a parallel binocular stereo vision model.
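For the parallel binocular stereo vision model, depth follows from similar triangles: with focal length f (in pixels), baseline b, and disparity d = x_left - x_right, the depth of the point is Z = f * b / d. The sketch below applies this textbook relation; the numbers are invented for illustration and are not taken from the application.

    # Depth from disparity in the parallel binocular stereo model: Z = f * b / d.
    def triangulate_parallel(x_left, x_right, y, f, baseline, cx, cy):
        """x_left/x_right: pixel column of the same marker in the two images;
        f: focal length in pixels; baseline: camera separation; (cx, cy): principal point."""
        d = x_left - x_right                  # disparity
        if d <= 0:
            raise ValueError("non-positive disparity: point cannot be triangulated")
        Z = f * baseline / d                  # depth along the optical axis
        X = (x_left - cx) * Z / f             # lateral offset in the left camera frame
        Y = (y - cy) * Z / f
        return X, Y, Z

    # Example: f = 800 px, baseline = 120 mm, marker seen at columns 652 and 612.
    print(triangulate_parallel(652.0, 612.0, 400.0, f=800.0, baseline=120.0, cx=640.0, cy=360.0))
    # -> (36.0, 120.0, 2400.0), i.e. the marker is about 2.4 m in front of the cameras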
Based on any of the above embodiments, the positioning unit 102 is an optical positioning and tracking system operating in the near-infrared band.
Specifically, near-infrared optical positioning and tracking systems, such as the Polaris series of the Canadian company NDI, are mostly used in fields with high robustness requirements, such as surgical navigation and robot positioning. Depending on how the targets work, near-infrared optical positioning and tracking systems can be divided into active optical positioning systems and passive optical positioning systems.
An active optical positioning system uses marker points that autonomously emit near-infrared light signals as the feature points for optical positioning and tracking, so no near-infrared illumination system is needed to provide a light-source environment. The biggest advantage of an active optical positioning and tracking system is that, with a preset light-emitting pattern (such as a strobe pattern), the feature points can be identified more accurately during image processing and their planar image coordinates determined. However, actively emitting targets require an additional power supply.
A passive optical positioning and tracking system uses optical reflection: a material reflective at near-infrared wavelengths is attached to the marker balls to enhance their reflection of the illuminating near-infrared light, so that the marker balls are clearly distinguished from the surrounding environment in the image. The first visual mark 201 and the second visual mark 202 in the embodiments of the present application may be marker points that autonomously emit near-infrared light signals, or marker balls coated with near-infrared-reflective material, which is not specifically limited in the embodiments of the present application.
Based on any of the above embodiments, the positioning unit 102 and the display unit 103 are connected wirelessly. Specifically, information can be transmitted between the positioning unit 102 and the display unit 103 through wireless connections such as 2G/3G/4G, WiFi, Bluetooth, or ZigBee, which is not specifically limited in the embodiments of the present application.
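The application leaves the wireless link unspecified beyond naming 2G/3G/4G, WiFi, Bluetooth, and ZigBee. Purely as an assumed example, each measurement could be serialized as JSON and pushed from the positioning unit to the display unit over a UDP socket on the local WiFi network; the address and message layout below are invented for illustration.

    # Hypothetical wire format for the positioning-unit -> display-unit link.
    import json, socket

    DISPLAY_ADDR = ("192.168.1.50", 9000)   # assumed address of the display unit

    def send_poses(sock, instrument_pose, area_pose):
        msg = {
            "instrument": {"pos": instrument_pose[0], "quat": instrument_pose[1]},
            "area":       {"pos": area_pose[0],       "quat": area_pose[1]},
        }
        sock.sendto(json.dumps(msg).encode("utf-8"), DISPLAY_ADDR)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_poses(sock,
               ([102.4, 55.1, 230.7], [0.98, 0.0, 0.17, 0.0]),
               ([95.0, 60.2, 250.3], [1.0, 0.0, 0.0, 0.0]))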
Based on any of the above embodiments, when the surgical navigation device is applied to dental implant surgery, the second visual mark 202 is mounted on the patient's jawbone.
Specifically, when the patient's head moves, the visual sensor 203 detects the change in position of the second visual mark 202 mounted on the patient's jawbone and then updates the spatial position and posture of the patient's jawbone in the visual coordinate system, thereby updating the virtual surgical scene.
To aid the understanding and application of the surgical navigation device proposed in the present application, the following example is given; the present application is not limited to this example.
Dental implant surgery is a precision operation performed under local anesthesia in a confined space. The doctor holds an implant handpiece (i.e., the surgical instrument) to place an implant compatible with human bone into the alveolar bone of the edentulous area. After a period of time, once the artificial tooth root has bonded closely with the alveolar bone, a crown is made on the artificial root. The key factor affecting the outcome of the operation is the placement accuracy of the implant, and because the oral cavity cannot be viewed directly, the operating space is small, and the doctor may lack experience, the failure rate of the operation is relatively high. In current navigation devices for dental implant surgery, after the positioning unit captures the positions and postures of the optical marker points, the corresponding virtual surgical instrument and a three-dimensional model of the patient's jaw are displayed on a screen. During implantation the doctor needs to watch the screen while manipulating the surgical instrument, so that the virtual surgical instrument accurately reaches the planned implant position and misoperation that would injure the patient is avoided. This approach demands a high level of attention from the doctor: the line of sight has to switch between the screen and the patient, and the rapid change of the external environment together with the constant refocusing of the eyes causes delay and discomfort. In severe cases this may lead to medical accidents.
To solve this problem, this example provides a navigation device for dental implant surgery. Referring to FIG. 2, the virtual surgical scene is displayed on the display unit 103 mounted on the implant handpiece, so that the doctor can observe the internal anatomy of the patient's mouth more effectively and intuitively and adjust the operation in time according to the actual situation, improving the efficiency of the surgical operation.
The positioning unit 102 of the navigation device for dental implant surgery mainly consists of a visual sensor 203 and visual marks. When a visual mark is within the field of view of the visual sensor 203, the sensor can calculate the spatial position and posture of the visual mark in the visual coordinate system according to the binocular camera model.
A first visual mark 201 is mounted on the implant handpiece, and the virtual implant handpiece (the virtual surgical instrument 101) in the virtual surgical scene is updated according to the actual position and posture of the first visual mark 201. A set of second visual marks 202 also needs to be fixed on the patient's jawbone; when the patient's head moves, the anatomical model of the patient's jaw in the virtual surgical scene is updated according to the position and posture of the second visual mark 202.
Referring to FIG. 3, the connection unit is used to attach the display unit 103 to the implant handpiece; its upper end, namely the screen clamping mechanism 301, is connected to the first visual mark 201. After the display unit 103 (for example, a mobile phone or a tablet) is placed into the screen bracket 303 of the screen clamping mechanism 301, it is locked in place by the tightening knob 304 at the lower end of the screen bracket 303. The main body of the screen bracket 303 is a rectangular structure and can accommodate display units 103 of different sizes within a certain size range.
The lower end of the connection unit is the instrument clamping mechanism 302. In this example the instrument clamping mechanism 302 is a snap-ring structure configured to clamp onto the outer contour surface of the implant handpiece. After clamping, the relative position and posture relationship between the first visual mark 201 and the implant handpiece is fixed. Once this relative position and posture relationship has been calibrated, it can be applied in dental implant navigation surgery: after the visual sensor 203 detects the position of the first visual mark 201, the position and posture of the implant handpiece in the visual coordinate system can be calculated indirectly from the previously calibrated relationship. For example, the position and posture of the instrument tip (the needle point), which the doctor cares about most, can then be updated in real time in the virtual surgical scene.
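The indirect computation described above is a fixed transform chain: if T_cam_marker is the marker pose measured by the visual sensor 203 and T_marker_tip is the marker-to-tip transform obtained once during calibration, then T_cam_tip = T_cam_marker * T_marker_tip. A sketch with 4x4 homogeneous matrices follows; the calibration values are invented for illustration.

    # Chain the measured marker pose with the pre-calibrated marker-to-tip
    # transform to get the handpiece tip pose in the visual coordinate system.
    import numpy as np

    def make_T(R, t):
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    # Pose of the first visual mark 201 as measured by the visual sensor 203 (made up).
    T_cam_marker = make_T(np.eye(3), [120.0, 40.0, 300.0])

    # Marker-to-tip transform from a one-time calibration: here the tip is assumed
    # to sit 85 mm along the marker's z axis (an invented value).
    T_marker_tip = make_T(np.eye(3), [0.0, 0.0, 85.0])

    T_cam_tip = T_cam_marker @ T_marker_tip
    tip_position = T_cam_tip[:3, 3]      # -> [120., 40., 385.]
    tip_rotation = T_cam_tip[:3, :3]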
When the doctor holds the implant handpiece to drill and perform other operations, the situation in the virtual surgical scene can be conveniently observed on the display unit 103 mounted on the handpiece, for example the drilling depth and angle and whether the maxillary or mandibular nerve canal has been encroached upon; this information cannot be obtained directly with the naked eye. Moreover, because the display unit 103 and the implant handpiece share the same orientation during the operation, the doctor does not need to frequently redirect the line of sight. This design feeds the information of the surgical navigation device back to the doctor more intuitively to guide the operation in the real environment and makes the surgical operation simpler.
Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the embodiments of the present application. Although the embodiments of the present application have been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (9)

  1. A surgical navigation device, characterized by comprising a surgical instrument, a positioning unit, and a display unit, wherein the display unit is mounted on the surgical instrument;
    the positioning unit is configured to capture the spatial positions and postures of the surgical instrument and a surgical area, and the display unit is configured to display a virtual surgical scene and to update the positions and postures of a virtual surgical instrument and a virtual surgical area in the virtual surgical scene according to the spatial positions and postures of the surgical instrument and the surgical area.
  2. The device according to claim 1, characterized in that the positioning unit comprises a first visual mark, a second visual mark, and a visual sensor;
    the first visual mark is mounted on the surgical instrument, and the second visual mark is mounted on the surgical area;
    the visual sensor is configured to capture the spatial position of the first visual mark and to obtain, according to the spatial position of the first visual mark, the spatial position and posture of the surgical instrument in a visual coordinate system; and is further configured to capture the spatial position of the second visual mark and to obtain, according to the spatial position of the second visual mark, the spatial position and posture of the surgical area in the visual coordinate system.
  3. The device according to claim 2, characterized by further comprising a connection unit configured to mount the display unit on the surgical instrument;
    the connection unit comprises a screen clamping mechanism and an instrument clamping mechanism, wherein the screen clamping mechanism is configured to clamp the display unit and the instrument clamping mechanism is configured to clamp the surgical instrument.
  4. The device according to claim 3, characterized in that the first visual mark is mounted on the connection unit.
  5. The device according to claim 3, characterized in that the screen clamping mechanism comprises a screen bracket, a threaded through hole is provided in a wall of the screen bracket, and a tightening knob is mounted on the screen bracket through the threaded through hole.
  6. The device according to claim 3, characterized in that the instrument clamping mechanism is a snap-ring structure that clamps onto the outer contour surface of the surgical instrument.
  7. The device according to claim 2, characterized in that the visual sensor obtains the spatial position and posture of the surgical instrument in the visual coordinate system from the spatial position of the first visual mark through a binocular camera model;
    and the visual sensor obtains the spatial position and posture of the surgical area in the visual coordinate system from the spatial position of the second visual mark through the binocular camera model.
  8. The device according to any one of claims 1 to 7, characterized in that the positioning unit and the display unit are connected wirelessly.
  9. The device according to any one of claims 2 to 7, characterized in that, when the device is applied to dental implant surgery, the second visual mark is mounted on the patient's jawbone.
PCT/CN2019/085685 2018-08-02 2019-05-06 Surgical navigation device WO2020024638A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810871774.X 2018-08-02
CN201810871774.XA CN108742876A (zh) 2018-08-02 2018-08-02 Surgical navigation device

Publications (1)

Publication Number Publication Date
WO2020024638A1 true WO2020024638A1 (zh) 2020-02-06

Family

ID=63968695

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/085685 WO2020024638A1 (zh) 2018-08-02 2019-05-06 一种手术导航装置

Country Status (2)

Country Link
CN (1) CN108742876A (zh)
WO (1) WO2020024638A1 (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108742876A (zh) * 2018-08-02 2018-11-06 雅客智慧(北京)科技有限公司 一种手术导航装置
CN109528329A (zh) * 2019-01-15 2019-03-29 浙江科惠医疗器械股份有限公司 口腔种植系统、口腔种植方法及计算机可读存储介质
CN109700550B (zh) * 2019-01-22 2020-06-26 雅客智慧(北京)科技有限公司 一种用于牙科手术的增强现实方法及装置
CN110664483A (zh) * 2019-07-09 2020-01-10 苏州迪凯尔医疗科技有限公司 根尖外科手术的导航方法、装置、电子设备和存储介质
CN111388087A (zh) * 2020-04-26 2020-07-10 深圳市鑫君特智能医疗器械有限公司 手术导航系统及执行手术导航方法的计算机与存储介质
CN111658065A (zh) * 2020-05-12 2020-09-15 北京航空航天大学 一种下颌骨切削手术的数字化引导系统
CN114748201A (zh) * 2022-04-19 2022-07-15 深圳广成创新技术有限公司 一种牙科种植体的三维参数的获取方法、装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130218142A1 (en) * 2008-10-21 2013-08-22 Brainlab Ag Integration of surgical instrument and display device for assisting in image-guided surgery
CN105055021A (zh) * 2015-06-30 2015-11-18 华南理工大学 手术导航穿刺针的标定装置及其标定方法
CN205964186U (zh) * 2016-07-09 2017-02-22 荣春 一种骨科手术用固定装置
CN108742876A (zh) * 2018-08-02 2018-11-06 雅客智慧(北京)科技有限公司 一种手术导航装置
CN209059467U (zh) * 2018-08-02 2019-07-05 雅客智慧(北京)科技有限公司 一种手术导航装置

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201353203Y (zh) * 2009-02-09 2009-12-02 李晴航 计算机辅助手术术中定位系统
CN101889857B (zh) * 2009-05-22 2012-09-19 许杰 手术导航设备
KR20130080909A (ko) * 2012-01-06 2013-07-16 삼성전자주식회사 수술 로봇 및 그 제어 방법
CN103211655B (zh) * 2013-04-11 2016-03-09 深圳先进技术研究院 一种骨科手术导航系统及导航方法
CN105395252A (zh) * 2015-12-10 2016-03-16 哈尔滨工业大学 具有人机交互的可穿戴式血管介入手术三维立体图像导航装置
CN105852969A (zh) * 2016-03-29 2016-08-17 鞠克丰 一种新型骨科手术导航系统
CN207306723U (zh) * 2017-01-23 2018-05-04 新博医疗技术有限公司 Ct图像引导下的手术导航系统
CN107714178A (zh) * 2017-10-28 2018-02-23 深圳市前海安测信息技术有限公司 手术导航定位机器人及其控制方法
CN107874832B (zh) * 2017-11-22 2020-03-10 合肥美亚光电技术股份有限公司 骨科手术器械导航系统及方法


Also Published As

Publication number Publication date
CN108742876A (zh) 2018-11-06

Similar Documents

Publication Publication Date Title
WO2020024638A1 (zh) 一种手术导航装置
CN107847278B (zh) 用于为医疗器械提供轨迹的可视化的靶向系统
US11382700B2 (en) Extended reality headset tool tracking and control
JP6804876B2 (ja) 位置調整デバイス並びにロボット支援手術のための装置
CA2924230C (en) Optical targeting and visusalization of trajectories
JP6889703B2 (ja) 患者の3d表面画像を手術中に観察するための方法及び装置
US20200054421A1 (en) Methods for conducting guided oral and maxillofacial procedures, and associated system
US20140221819A1 (en) Apparatus, system and method for surgical navigation
US11690697B2 (en) Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
JP2017000772A (ja) ロボット支援手術のための装置及び方法
KR101474098B1 (ko) 파노라마 엑스선 장치 및 파노라마 이미징을 위한 이미징될 층의 위치 설정
US20190380811A1 (en) Implant surgery guiding method
US20190108645A1 (en) Method and system for registration verification
CN108846866B (zh) 基于光学成像的颅颌面软组织矢状向中轴面确定方法及系统
US20210169605A1 (en) Augmented reality headset for navigated robotic surgery
US20160000514A1 (en) Surgical vision and sensor system
JP2016158911A (ja) 画像表示装置を使った外科手術方法及び、その外科手術に用いる装置
CN113558762A (zh) 将手术工具与由扩展现实头戴装置的摄像机跟踪的参考阵列配准以用于手术期间的辅助导航
JP2021194538A (ja) 基準シードを介した可視光での外科手術での対象の追跡および合成画像登録
WO2019037605A1 (zh) Ar眼镜及其追踪系统
CN109688403A (zh) 一种应用于手术室内的裸眼3d人眼追踪方法及其设备
KR20200056492A (ko) 정맥 탐지 장치
US20230310098A1 (en) Surgery assistance device
CN209358681U (zh) 一种应用于手术室内的裸眼3d人眼追踪设备
CN209059467U (zh) 一种手术导航装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19845441

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19/05/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19845441

Country of ref document: EP

Kind code of ref document: A1