JP2019055182A - Patient face as touchpad user interface - Google Patents
Patient face as touchpad user interface
- Publication number
- JP2019055182A JP2019055182A JP2018170344A JP2018170344A JP2019055182A JP 2019055182 A JP2019055182 A JP 2019055182A JP 2018170344 A JP2018170344 A JP 2018170344A JP 2018170344 A JP2018170344 A JP 2018170344A JP 2019055182 A JP2019055182 A JP 2019055182A
- Authority
- JP
- Japan
- Prior art keywords
- patient
- face
- image
- dimensional
- center
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 claims abstract description 78
- 238000002591 computed tomography Methods 0.000 claims description 27
- 230000008569 process Effects 0.000 claims description 11
- 238000012545 processing Methods 0.000 description 28
- 230000006870 function Effects 0.000 description 21
- 210000003128 head Anatomy 0.000 description 10
- 238000005259 measurement Methods 0.000 description 5
- 210000001061 forehead Anatomy 0.000 description 4
- 238000004519 manufacturing process Methods 0.000 description 4
- 230000004044 response Effects 0.000 description 4
- 238000003384 imaging method Methods 0.000 description 3
- 238000003860 storage Methods 0.000 description 3
- 210000000988 bone and bone Anatomy 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 238000002595 magnetic resonance imaging Methods 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 210000000056 organ Anatomy 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 238000001356 surgical procedure Methods 0.000 description 2
- 238000002604 ultrasonography Methods 0.000 description 2
- 238000013473 artificial intelligence Methods 0.000 description 1
- 238000013170 computed tomography imaging Methods 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 230000001815 facial effect Effects 0.000 description 1
- 238000002594 fluoroscopy Methods 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 230000036512 infertility Effects 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000011164 ossification Effects 0.000 description 1
- 230000002085 persistent effect Effects 0.000 description 1
- 238000000053 physical method Methods 0.000 description 1
- 238000003825 pressing Methods 0.000 description 1
- 239000000523 sample Substances 0.000 description 1
- 210000001519 tissue Anatomy 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000010200 validation analysis Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/24—Surgical instruments, devices or methods, e.g. tourniquets for use in the oral cavity, larynx, bronchial passages or nose; Tongue scrapers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2072—Reference field transducer attached to an instrument or patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30016—Brain
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Robotics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radiology & Medical Imaging (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Multimedia (AREA)
- General Business, Economics & Management (AREA)
- Business, Economics & Management (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Otolaryngology (AREA)
- Pulmonology (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
- Image Analysis (AREA)
Abstract
Description
FIG. 1 is a schematic illustration of a surgical system 100, in accordance with an embodiment of the present invention. System 100 comprises a magnetic position-tracking system configured to track the position of one or more position sensors (not shown) in the head of a patient 101. The magnetic position-tracking system comprises a magnetic field generator 104 and one or more position sensors. As described herein, the position sensors generate position signals in response to the external magnetic fields sensed from the field generator, enabling processing apparatus 102 to map the position of each sensor in the coordinate system of the position-tracking system.
(1) A method, comprising:
acquiring a three-dimensional anatomical image of a patient's face, and identifying multiple anatomical points corresponding to respective predefined locations on the patient's face in a first coordinate system;
assigning at least one function to at least one predefined location on the patient's face;
receiving multiple positions, in a second coordinate system, at the respective predefined locations on the patient's face;
registering the first and second coordinate systems by correlating the positions with the respective anatomical points on the patient's face;
triggering a command corresponding to the at least one assigned function; and
communicating the command to an electronic device.
(2) The method according to embodiment 1, wherein the three-dimensional anatomical image comprises a computed tomography image.
(3) The method according to embodiment 1, wherein the identifying further comprises determining the predefined locations on the three-dimensional anatomical image.
(4) The method according to embodiment 3, wherein the determining comprises finding a highest point on the three-dimensional anatomical image.
(5) The method according to embodiment 4, wherein the determining further comprises finding respective centers of the right eye and the left eye on the three-dimensional anatomical image.
(7) The method according to embodiment 1, wherein the multiple anatomical points comprise at least two points at each of the predefined locations.
(8) The method according to embodiment 1, wherein the at least one predefined location is selected from a group consisting of an upper-left quadrant, an upper-right quadrant, a lower-left quadrant, and a lower-right quadrant of the patient's face.
(9) The method according to embodiment 1, wherein receiving the multiple positions comprises receiving the positions from a registration tool comprising a position sensor.
(10) The method according to embodiment 1, wherein receiving the multiple positions comprises receiving the positions by scanning the patient's face with a three-dimensional scanner.
(12) The method according to embodiment 1, wherein the electronic device is a computer.
(13) A system, comprising:
a registration tool, comprising a position sensor of a position-tracking system, which is configured to acquire multiple positions in a second coordinate system by positioning the registration tool at respective predefined locations on a patient's face; and
a processor, which is configured to:
identify, in a three-dimensional anatomical image of the patient's face, multiple anatomical points corresponding to the respective predefined locations in a first coordinate system,
assign at least one function to at least one predefined location on the patient's face,
receive the multiple positions measured in the second coordinate system,
register the first and second coordinate systems by correlating the positions with the respective anatomical points on the patient's face, and
retrieve a command corresponding to the at least one assigned function.
(14) The system according to embodiment 13, wherein the three-dimensional anatomical image comprises a computed tomography image.
(15) The system according to embodiment 13, wherein the three-dimensional anatomical image comprises a three-dimensional scan.
(16) The system according to embodiment 13, wherein the processor is further configured to determine the predefined locations on the three-dimensional anatomical image by:
finding a highest point on the three-dimensional anatomical image;
finding respective centers of the right eye and the left eye on the three-dimensional anatomical image; and
dividing the three-dimensional anatomical image of the patient's face into quadrants by two orthogonal lines, wherein the first orthogonal line is located on, and parallel with, the line connecting the center of the right eye and the center of the left eye, and the second orthogonal line intersects the highest point of the three-dimensional image.
(17) The system according to embodiment 13, wherein the predefined locations are selected from a group consisting of an upper-left quadrant, an upper-right quadrant, a lower-left quadrant, and a lower-right quadrant of the patient's face.
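The registration step recited above, which correlates the positions measured in the tracking coordinate system with the corresponding anatomical image points, is in essence a rigid point-set alignment between the two coordinate systems. The sketch below shows one standard way to compute such an alignment, a Kabsch least-squares fit; it is an illustrative assumption for exposition, not the implementation claimed in the patent, and the function name `register_points` and the sample arrays are hypothetical.

```python
import numpy as np

def register_points(image_pts, tracker_pts):
    """Least-squares rigid registration (Kabsch algorithm): find the
    rotation R and translation t that map corresponding points measured
    in the tracker's (second) coordinate system onto the anatomical
    points identified in the image's (first) coordinate system."""
    P = np.asarray(tracker_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    p_c, q_c = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_c).T @ (Q - q_c)              # cross-covariance of the centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_c - R @ p_c
    return R, t

# Synthetic check: tracker frame = image frame rotated 90 degrees about z, then shifted.
rng = np.random.default_rng(0)
image_points = rng.normal(size=(6, 3))       # anatomical points (image frame)
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0]])
tracker_points = image_points @ Rz.T + np.array([5.0, -2.0, 1.0])
R, t = register_points(image_points, tracker_points)
```

Applying `R` and `t` to any subsequent tracker-frame position expresses it in image coordinates, which is what allows a touch on the physical face to be looked up against the segmented anatomical image.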
Claims (17)
- A method, comprising:
acquiring a three-dimensional anatomical image of a patient's face, and identifying multiple anatomical points corresponding to respective predefined locations on the patient's face in a first coordinate system;
assigning at least one function to at least one predefined location on the patient's face;
receiving multiple positions, in a second coordinate system, at the respective predefined locations on the patient's face;
registering the first and second coordinate systems by correlating the positions with the respective anatomical points on the patient's face;
triggering a command corresponding to the at least one assigned function; and
communicating the command to an electronic device. - The method according to claim 1, wherein the three-dimensional anatomical image comprises a computed tomography image.
- The method according to claim 1, wherein the identifying further comprises determining the predefined locations on the three-dimensional anatomical image.
- The method according to claim 3, wherein the determining comprises finding a highest point on the three-dimensional anatomical image.
- The method according to claim 4, wherein the determining further comprises finding respective centers of the right eye and the left eye on the three-dimensional anatomical image.
- The method according to claim 5, wherein the determining further comprises dividing the three-dimensional anatomical image of the patient's face into quadrants by two orthogonal lines, wherein the first orthogonal line is located on, and parallel with, the line connecting the center of the right eye and the center of the left eye, and the second orthogonal line intersects the highest point of the three-dimensional image.
- The method according to claim 1, wherein the multiple anatomical points comprise at least two points at each of the predefined locations.
- The method according to claim 1, wherein the at least one predefined location is selected from a group consisting of an upper-left quadrant, an upper-right quadrant, a lower-left quadrant, and a lower-right quadrant of the patient's face.
- The method according to claim 1, wherein receiving the multiple positions comprises receiving the positions from a registration tool comprising a position sensor.
- The method according to claim 1, wherein receiving the multiple positions comprises receiving the positions by scanning the patient's face with a three-dimensional scanner.
- The method according to claim 1, wherein triggering the command comprises touching the at least one predefined location on the patient's face with a surgical tool comprising a position sensor.
- The method according to claim 1, wherein the electronic device is a computer.
- A system, comprising:
a registration tool, comprising a position sensor of a position-tracking system, which is configured to acquire multiple positions in a second coordinate system by positioning the registration tool at respective predefined locations on a patient's face; and
a processor, which is configured to:
identify, in a three-dimensional anatomical image of the patient's face, multiple anatomical points corresponding to the respective predefined locations in a first coordinate system,
assign at least one function to at least one predefined location on the patient's face,
receive the multiple positions measured in the second coordinate system,
register the first and second coordinate systems by correlating the positions with the respective anatomical points on the patient's face, and
retrieve a command corresponding to the at least one assigned function. - The system according to claim 13, wherein the three-dimensional anatomical image comprises a computed tomography image.
- The system according to claim 13, wherein the three-dimensional anatomical image comprises a three-dimensional scan.
- The system according to claim 13, wherein the processor is further configured to determine the predefined locations on the three-dimensional anatomical image, the determining comprising:
finding a highest point on the three-dimensional anatomical image;
finding respective centers of the right eye and the left eye on the three-dimensional anatomical image; and
dividing the three-dimensional anatomical image of the patient's face into quadrants by two orthogonal lines, wherein the first orthogonal line is located on, and parallel with, the line connecting the center of the right eye and the center of the left eye, and the second orthogonal line intersects the highest point of the three-dimensional image. - The system according to claim 13, wherein the predefined locations are selected from a group consisting of an upper-left quadrant, an upper-right quadrant, a lower-left quadrant, and a lower-right quadrant of the patient's face.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/703,418 US10452263B2 (en) | 2017-09-13 | 2017-09-13 | Patient face as touchpad user interface |
US15/703,418 | 2017-09-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2019055182A true JP2019055182A (ja) | 2019-04-11 |
JP7330677B2 JP7330677B2 (ja) | 2023-08-22 |
Family
ID=63579083
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2018170344A Active JP7330677B2 (ja) | 2018-09-12 | Patient face as touchpad user interface |
Country Status (7)
Country | Link |
---|---|
US (1) | US10452263B2 (ja) |
EP (1) | EP3457406B1 (ja) |
JP (1) | JP7330677B2 (ja) |
CN (1) | CN109481016B (ja) |
AU (1) | AU2018229470A1 (ja) |
CA (1) | CA3017166A1 (ja) |
IL (1) | IL261727B (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11527002B2 (en) | 2019-12-05 | 2022-12-13 | Biosense Webster (Israel) Ltd. | Registration of an image with a tracking system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201307266Y (zh) * | 2008-06-25 | 2009-09-09 | 韩旭 | Binocular gaze-tracking device |
WO2012173001A1 (ja) * | 2011-06-13 | 2012-12-20 | Citizen Holdings Co., Ltd. | Information input device |
JP2015212898A (ja) * | 2014-05-02 | 2015-11-26 | Canon Inc | Image processing apparatus, information processing method, and program |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5391199A (en) | 1993-07-20 | 1995-02-21 | Biosense, Inc. | Apparatus and method for treating cardiac arrhythmias |
DE69514238T2 (de) | 1994-08-19 | 2000-05-11 | Biosense Inc | Medical diagnosis, treatment and display system |
US6690963B2 (en) | 1995-01-24 | 2004-02-10 | Biosense, Inc. | System for determining the location and orientation of an invasive medical instrument |
WO1997029685A1 (en) | 1996-02-15 | 1997-08-21 | Biosense, Inc. | Independently positionable transducers for location system |
US6239724B1 (en) | 1997-12-30 | 2001-05-29 | Remon Medical Technologies, Ltd. | System and method for telemetrically providing intrabody spatial position |
US6560354B1 (en) | 1999-02-16 | 2003-05-06 | University Of Rochester | Apparatus and method for registration of images to physical space using a weighted combination of points and surfaces |
US6632089B2 (en) | 1999-11-30 | 2003-10-14 | Orametrix, Inc. | Orthodontic treatment planning with user-specified simulation of tooth movement |
US6484118B1 (en) | 2000-07-20 | 2002-11-19 | Biosense, Inc. | Electromagnetic position single axis system |
US6431711B1 (en) | 2000-12-06 | 2002-08-13 | International Business Machines Corporation | Multiple-surface display projector with interactive input capability |
JP3711038B2 (ja) | 2001-06-15 | 2005-10-26 | Babcock-Hitachi Co., Ltd. | Skull superimposition method and apparatus |
US7729742B2 (en) | 2001-12-21 | 2010-06-01 | Biosense, Inc. | Wireless position sensor |
US20040068178A1 (en) | 2002-09-17 | 2004-04-08 | Assaf Govari | High-gradient recursive locating system |
US10546396B2 (en) * | 2010-12-30 | 2020-01-28 | St. Jude Medical International Holding S.à r. l. | System and method for registration of fluoroscopic images in a coordinate system of a medical system |
US9069164B2 (en) * | 2011-07-12 | 2015-06-30 | Google Inc. | Methods and systems for a virtual input device |
WO2013095330A1 (en) * | 2011-12-19 | 2013-06-27 | Shackelford Howard L | Anatomical orientation system |
US8908904B2 (en) * | 2011-12-28 | 2014-12-09 | Samsung Electrônica da Amazônia Ltda. | Method and system for make-up simulation on portable devices having digital cameras |
JP6363608B2 (ja) * | 2012-10-12 | 2018-07-25 | Koninklijke Philips N.V. | System for accessing patient facial data |
WO2014169225A1 (en) * | 2013-04-12 | 2014-10-16 | Iconics, Inc. | Virtual touch screen |
EP3007635B1 (en) | 2013-08-23 | 2016-12-21 | Stryker European Holdings I, LLC | Computer-implemented technique for determining a coordinate transformation for surgical navigation |
US9921658B2 (en) | 2014-02-06 | 2018-03-20 | Sony Mobile Communications, Inc. | Device and method for detecting gestures on the skin |
US20160042557A1 (en) * | 2014-08-08 | 2016-02-11 | Asustek Computer Inc. | Method of applying virtual makeup, virtual makeup electronic system, and electronic device having virtual makeup electronic system |
GB201501157D0 (en) | 2015-01-23 | 2015-03-11 | Scopis Gmbh | Instrument guidance system for sinus surgery |
CA2973479C (en) * | 2015-07-21 | 2019-02-26 | Synaptive Medical (Barbados) Inc. | System and method for mapping navigation space to patient space in a medical procedure |
EP3423972A1 (en) * | 2016-03-02 | 2019-01-09 | Truinject Corp. | Sensory enhanced environments for injection aid and social training |
US10152786B2 (en) * | 2016-10-11 | 2018-12-11 | Biosense Webster (Israel) Ltd. | Registration of a magnetic tracking system with an imaging device |
-
2017
- 2017-09-13 US US15/703,418 patent/US10452263B2/en active Active
-
2018
- 2018-09-11 CA CA3017166A patent/CA3017166A1/en active Pending
- 2018-09-12 AU AU2018229470A patent/AU2018229470A1/en not_active Abandoned
- 2018-09-12 IL IL261727A patent/IL261727B/en unknown
- 2018-09-12 JP JP2018170344A patent/JP7330677B2/ja active Active
- 2018-09-12 EP EP18194067.7A patent/EP3457406B1/en active Active
- 2018-09-13 CN CN201811068953.6A patent/CN109481016B/zh active Active
Also Published As
Publication number | Publication date |
---|---|
CN109481016B (zh) | 2024-01-02 |
CA3017166A1 (en) | 2019-03-13 |
IL261727A (en) | 2019-02-28 |
US10452263B2 (en) | 2019-10-22 |
US20190076197A1 (en) | 2019-03-14 |
CN109481016A (zh) | 2019-03-19 |
AU2018229470A1 (en) | 2019-03-28 |
EP3457406A1 (en) | 2019-03-20 |
JP7330677B2 (ja) | 2023-08-22 |
EP3457406B1 (en) | 2022-03-09 |
IL261727B (en) | 2022-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7106258B2 (ja) | Preoperative registration of anatomical images with a position-tracking system using ultrasound | |
TWI615126B (zh) | Method of image-enhanced reality and its application to surgical guidance with wearable glasses | |
JP7171220B2 (ja) | Registration of anatomical images with a position-tracking coordinate system based on proximity to bone tissue | |
JP6643362B2 (ja) | Method and apparatus for providing updated patient images during robotic surgery | |
US20070038223A1 (en) | Computer-assisted knee replacement apparatus and method | |
NL2014772B1 (en) | A lumbar navigation method, a lumbar navigation system and a computer program product. | |
JP6559532B2 (ja) | Real-time simulation of fluoroscopic X-ray images | |
JP7191539B2 (ja) | Improved registration of anatomical images with a position-tracking coordinate system based on visual proximity to bone tissue | |
EP1667574A2 (en) | System and method for providing computer assistance with spinal fixation procedures | |
JP7330677B2 (ja) | Patient face as touchpad user interface | |
AU2017221893A1 (en) | Ent image registration | |
CN114732518A (zh) | System and method for single-image registration update | |
IL272871A (en) | Map of body space | |
Danilchenko | Fiducial-based registration with anisotropic localization error | |
EP4091570A1 (en) | Probe for improving registration accuracy between a tomographic image and a tracking system | |
US10376335B2 (en) | Method and apparatus to provide updated patient images during robotic surgery | |
Thomas | Real-time Navigation Procedure for Robot-assisted Surgery |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A621 | Written request for application examination |
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20210818 |
|
A977 | Report on retrieval |
Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 20220629 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20220712 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20221012 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20230131 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20230424 |
|
TRDD | Decision of grant or rejection written | ||
A01 | Written decision to grant a patent or to grant a registration (utility model) |
Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 20230711 |
|
A61 | First payment of annual fees (during grant procedure) |
Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 20230809 |
|
R150 | Certificate of patent or registration of utility model |
Ref document number: 7330677 Country of ref document: JP Free format text: JAPANESE INTERMEDIATE CODE: R150 |