WO2022206417A1 - Method for positioning objects by spatial calibration - Google Patents

Method for positioning objects by spatial calibration

Info

Publication number
WO2022206417A1
Authority
WO
WIPO (PCT)
Prior art keywords
positioning device
position information
positioning
space
spatial position
Prior art date
Application number
PCT/CN2022/081521
Other languages
English (en)
Chinese (zh)
Inventor
孙非
朱奕
郭晓杰
崔芙粒
单莹
Original Assignee
上海复拓知达医疗科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海复拓知达医疗科技有限公司 filed Critical 上海复拓知达医疗科技有限公司
Publication of WO2022206417A1 publication Critical patent/WO2022206417A1/fr

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/108Computer aided selection or customisation of medical implants or cutting guides
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points

Definitions

  • the invention relates to the technical field of image processing, in particular to a method for calibrating and positioning objects in space.
  • Augmented reality technology usually captures images of real scenes through cameras; it analyzes and processes the captured images and superimposes additional information on top of the real scene for the user, that is, it augments reality.
  • The process of analyzing and processing images of real scenes often includes locating objects in the scene. Certain applications place extremely high demands on the accuracy of object positioning, and the positioning accuracy achievable in the prior art cannot meet these requirements.
  • When augmented reality technology is applied to surgical navigation scenarios, the positional relationships between medical devices, the patient, and the scene must be determined very accurately to ensure that accurate navigation information is provided to users.
  • Puncture navigation based on augmented reality technology can realize fast and accurate surgical navigation with simple, convenient, easy-to-learn and easy-to-use equipment.
  • One of the cores of precise navigation is the precise spatial positioning of surgical instruments based on visible-light patterns, together with the registration of virtual organs to the real human body; before positioning, however, performing rapid positioning matching and correction is very important.
  • The positioning and calibration of the surgical instrument and the photographing device need to be performed first, in order to ensure the accuracy of the navigation position throughout the navigation process and, in turn, the accuracy of the augmented display at later stages.
  • the purpose of the present invention is to provide a method for calibrating and positioning objects in space.
  • An object space calibration and positioning method, comprising:
  • capturing an image of a positioning device in space and identifying the identification characteristics of the positioning device in the image to obtain spatial position information of the positioning device; when an object to be positioned is at a specific position, capturing image information of the object to be positioned in space and identifying the identification characteristics of the object to be positioned in its image to obtain spatial position information of the object to be positioned, wherein the specific position has a determined relative position with respect to the positioning device;
  • correcting the spatial position information of the object to be positioned according to the spatial position information of the positioning device and the specific position, to obtain final spatial position information of the object to be positioned.
  • the identification characteristics of the positioning device include at least the morphological characteristics of the positioning device body and/or the marking identification characteristics of the positioning device; the morphological characteristics of the positioning device body at least include the structure, shape or color of the positioning device body; the marking identification characteristics at least include the patterns, graphics or QR codes set on the positioning device.
  • the identification characteristics of the object to be positioned include at least the morphological characteristics of the object to be positioned and/or the marking identification characteristics of the object to be positioned; the morphological characteristics include at least the structure, shape or color of the object to be positioned;
  • the marking identification characteristics at least include the patterns, graphics or two-dimensional codes set on the object to be positioned.
  • the spatial position information of the positioning device includes at least the spatial coordinates of the positioning device and/or the orientation of the positioning device; the spatial position information of the object to be positioned includes at least the spatial coordinates of the object to be positioned and/or the orientation of the object to be positioned.
  • the specific position is a position when the object to be positioned has a specific positional relationship with a preset point, line or surface on the positioning device, and the specific positional relationship includes coincidence or partial coincidence of points, lines or surfaces.
  • the correcting the spatial position information of the object to be positioned according to the spatial position information of the positioning device and the specific position includes:
  • the calibrating the spatial position information of the object to be positioned includes: calibrating the x and y coordinates of the object to be positioned.
  • It also includes correcting the spatial position information of the positioning device according to the spatial position information of the object to be positioned and the specific position.
  • the correcting the spatial position information of the positioning device according to the spatial position information of the object to be positioned and the specific position includes:
  • the calibrating the spatial position information of the positioning device includes: calibrating the z-coordinate of the positioning device.
  • the object to be positioned is a surgical instrument.
  • the object to be positioned is a puncture needle.
  • the positioning device includes: a support part, and a feature part and a limit part arranged on the support part;
  • the feature part comprises a display board, the display board is connected with the support part, and the display board is provided with an optical feature for being photographed and identified;
  • the limiting portion is configured to limit the object to be positioned.
  • the feature portion further includes a connection mechanism through which the display panel is connected to the support portion.
  • the connecting mechanism includes a hinge mechanism, and the display board is rotatably mounted on the support portion through the hinge mechanism.
  • the optical features include one or any combination of specific graphics, structures, colors for optical identification.
  • the optical feature is a pattern attached or printed on the display board, and the pattern is a two-dimensional code.
  • the limiting portion is disposed on one side of the feature portion. When the object to be positioned is moved to a predetermined position, the limiting portion limits the object to be positioned and forms a specific spatial positional relationship with the object to be positioned.
  • the limiting portion is a detachable structure, and can be installed on one side of the feature portion or replaced.
  • the position-limiting portion is a columnar structure, a positioning groove is formed on the columnar structure, and an opening spanning a horizontal angle α is formed on the positioning groove.
  • a through hole or a blind hole for limiting the position of the object to be positioned is provided along the central axis of the cylindrical structure.
  • the display board is provided with a shielding member for shielding the optical feature.
  • The invention provides an object space calibration and positioning method. For objects in a specific scene, it uses the identification characteristics of objects with different error characteristics in the same scene: by bringing two objects into a specific angular and positional relationship and then spatially associating them, image acquisition and mutual correction of the positions of the two different objects can be performed, improving the optical positioning accuracy of one or both of them.
  • The device and method can be applied on various occasions, such as positioning medical devices during surgery, simulated operations in teaching, and game activities; accurate positioning and augmented display of position can help the user carry out precise and complete operations.
  • Fig. 1 is a flow chart of the positioning method of the present invention.
  • Fig. 2 is an example diagram of a specific embodiment of the present invention.
  • Fig. 3 is a first schematic diagram of a positioning device in the object space calibration and positioning method of the present invention.
  • Fig. 4 is a second schematic diagram of the positioning device.
  • Fig. 5 is a third schematic diagram of the positioning device.
  • Fig. 6 is a schematic diagram of the mutual calibration of the present invention.
  • The present invention provides an augmented reality method based on correcting the position of an object in space, which can be applied to a surgical scene, a simulated surgical scene in teaching, or a game process.
  • The present invention provides the user with the positioning of surgical instruments relative to the subject.
  • The user is the observer of the whole in-body navigation process, and is also the operator who advances the instrument into the body of the subject.
  • The subject can be a person or another animal on which the user needs to operate.
  • the instrument can be any tool that can be penetrated into the body of the subject.
  • the instrument may be, for example, a puncture needle, a biopsy needle, a radiofrequency or microwave ablation needle, an ultrasound probe, a rigid endoscope, an endoscopic oval forceps, an electric knife or a stapler and other medical instruments.
  • the positioning device is a fixture in a surgical scene; the object to be positioned is a medical instrument in a surgical scene.
  • the present invention provides a method for calibrating and positioning objects in space, including:
  • the spatial position information at least includes the spatial coordinates of the positioning device and/or the orientation of the positioning device, so that the spatial position of the fixed positioning device can be determined.
  • the identifying characteristics of the positioning device include at least the morphological characteristics of the positioning device body and/or the identifying characteristics of the marking of the positioning device.
  • the morphological characteristics of the positioning device body at least include the structure, shape or color of the positioning device body, but in the specific implementation process, it is not limited to this, and may also be other identifiable characteristics of the object.
  • The present invention can fix an object with a fixed shape. Before calibration, the structure and shape of the positioning device are identified; during identification, different display modes can be used to prompt the user as to whether the capture and identification processes have succeeded. The positioning device is positioned and identified to obtain accurate spatial position information of the surgical positioning device.
  • the marking identification characteristic of the positioning device includes at least a pattern, graphic or two-dimensional code set on the positioning device.
  • The patterns, graphics or two-dimensional codes can be applied to the positioning device through a printing process; identifiable patterns have different spatial accuracies depending on their own pattern rules and production characteristics. Combining recognizable patterns with different characteristics allows rapid spatial calibration of navigation instruments.
  • the device for capturing the image of the positioning device is a device capable of image capturing, and the capturing angle is consistent with the user's viewing direction.
  • the photographing device is a head-mounted optical camera.
  • the acquisition angle of the head-mounted optical camera can be well kept consistent with its viewing direction.
  • The image of the positioning device is obtained by the photographing device, the marking identification characteristics of the positioning device are identified, the morphological characteristics are obtained from the identification characteristics, and the position of the positioning device in the xyz spatial coordinate system is obtained, wherein the z coordinate is the coordinate along the camera's shooting depth direction and the x and y coordinates lie in directions perpendicular to the z axis; the current spatial coordinates of the positioning device are denoted X1, Y1, Z1.
  • When the object to be positioned is at a specific position, capture the image information of the object to be positioned in space, and identify the identification characteristics of the object to be positioned in its image to obtain spatial position information of the object to be positioned; the specific position has a determined relative position with respect to the positioning device;
  • the object to be positioned is a moving instrument
  • the spatial position information of the object to be positioned includes the spatial coordinates of the object to be positioned and/or the orientation of the object to be positioned.
  • the identification characteristics of the object to be positioned include at least the morphological characteristics of the object to be positioned and/or the marking identification characteristics of the object to be positioned; the morphological characteristics include at least the structure, shape, or color of the object to be positioned;
  • the marking identification characteristics at least include the patterns, figures or two-dimensional codes set on the object to be positioned.
  • A two-dimensional code is a black-and-white planar figure; the points on it are very easy to identify, and the code can be positioned by identifying at least three of them. Because the two-dimensional code is fixed to an object or instrument, positioning the code also positions the object or instrument to which it is fixed.
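  • As an illustrative sketch (not part of the patent text), the pose implied by at least three identified marker points can be recovered as a least-squares rigid transform; the function name and point values below are our own assumptions:

```python
import numpy as np

def marker_pose(p_marker, p_camera):
    """Estimate rotation R and translation t mapping marker-frame points to
    camera-frame points (Kabsch least-squares fit). At least three
    non-collinear correspondences are needed, which is why identifying
    three points on the two-dimensional code suffices for positioning."""
    cm, cc = p_marker.mean(axis=0), p_camera.mean(axis=0)
    H = (p_marker - cm).T @ (p_camera - cc)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # reject reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cc - R @ cm
    return R, t
```

  • Given the code's corner points at known marker-frame coordinates and their observed camera-frame positions, `marker_pose` returns the marker's pose, and hence that of the object it is fixed to.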
  • The marking identification characteristic of the object to be positioned may also be another planar figure such as a checkerboard.
  • Using a QR code or checkerboard as the marking makes positioning objects or instruments more accurate and fast, so that fast-moving instruments can be navigated more precisely.
  • The marking fixed on the surface of the instrument can also be a three-dimensional figure.
  • The marking can be the handle of the instrument, or a structure fixed on the side of the handle.
  • In the present invention the object to be positioned is a puncture needle used in an operation; the end of the puncture needle is provided with an identification structure on which a two-dimensional code is printed.
  • capturing an image of the object to be positioned in space specifically includes:
  • the positioning device is fixed in the space, and the object to be positioned is a moving object.
  • When the object to be positioned moves to a specific position, an image of the object to be positioned in the space is captured.
  • The specific position can be set as the position where the object to be positioned coincides with a preset position of the positioning device; alternatively, according to the needs of the actual operation, positioning is carried out when a certain part of the object to be positioned reaches a fixed position or completes a prescribed action.
  • the positioning device is fixedly arranged in the space, and the object to be positioned is a moving object.
  • the object to be positioned moves to a specific position
  • the object to be positioned is identified according to its identification characteristics, yielding the orientation of the object to be positioned, and its current spatial coordinates are set, denoted X2, Y2, Z2.
  • The relative position of the specific position and the positioning device is determined: the specific position is the position at which the object to be positioned has a specific positional relationship with a preset associated point, line, or surface on the positioning device, the specific positional relationship including coincidence or partial coincidence of points, lines or surfaces.
  • In this embodiment, the information board serves as the positioning device and the puncture needle is the object to be positioned; the two are positioned and aligned with each other.
  • The two objects can be corrected relative to each other according to the actual situation, for example: the theoretical position information of the object to be positioned is calculated from the spatial position information of the positioning device and the specific position, and the spatial position information of the object to be positioned is then corrected according to this theoretical position information;
  • the position information of the object in the space is calculated according to the image of the positioning device taken.
  • the coordinates of point A are calculated from the characteristics of the positioning device captured by the camera (mainly the pattern features on the panel);
  • the coordinates of point B of the needle tip of the puncture needle can be calculated according to the easily identifiable features set at the end of the puncture needle;
  • points A and B coincide at this moment, but the coordinates of A and B obtained through step 1 and step 2, respectively, are not necessarily the same.
  • The accuracy of the x and y coordinates of point A on the positioning device is high but the accuracy of its z coordinate is relatively low, while the accuracy of the z coordinate of point B on the object to be positioned (the puncture needle) is relatively high. Therefore the X2 and Y2 coordinates of the object to be positioned are corrected according to the X1 and Y1 coordinates of the positioning device, and the Z1 coordinate of the positioning device is corrected with the Z2 coordinate of the object to be positioned. The corresponding positions of the two structures in the database are then adjusted as follows:
  • the calibration point has the following two expressions in the needle tip coordinate system:
  • the above two coordinates are representations of the calibration point in the needle-marker coordinate system. Assuming that expression (a) is more accurate in the z coordinate component and expression (b) is more accurate in the x and y coordinate components, the result after mutual calibration is
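  • A minimal sketch of this mutual correction (illustrative only; the function name and sample values are our own assumptions, not the patent's): each side keeps the components it measures accurately.

```python
import numpy as np

def mutually_correct(p_device, p_needle):
    """Fuse two estimates of the coincident calibration point.
    The positioning device's planar pattern is accurate in x and y but
    weak in depth; the needle marker is accurate in z. The corrected
    point takes X1, Y1 from the device and Z2 from the needle, and both
    stored positions are updated to this fused value."""
    fused = np.array([p_device[0], p_device[1], p_needle[2]])
    return fused
```

  • For example, device estimate (10.0, 5.0, 99.0) and needle estimate (10.3, 4.8, 42.0) fuse to (10.0, 5.0, 42.0).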
  • C: camera coordinate system
  • T_{B←A}: the coordinate transformation matrix from coordinate system A to coordinate system B
  • The camera recognizes the positioning device and the puncture needle, from which T_{C←Q} and T_{C←N} are obtained. The puncture needle tip is placed on a fixed point p on the identification plate; from the machining model of the identification plate, the coordinates of this fixed point in the identification-plate coordinate system, i.e. p_Q, are known. Since the coordinates of this point in the camera coordinate system are unchanged, the following coordinate relationship can be obtained: T_{C←Q} p_Q = T_{C←N} p_N.
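  • The relation T_{C←Q} p_Q = T_{C←N} p_N can be solved for p_N with 4×4 homogeneous transforms; a hedged numpy sketch (the helper name is ours):

```python
import numpy as np

def solve_tip_in_needle_frame(T_C_Q, T_C_N, p_Q):
    """Both markers see the same physical point in the camera frame:
    T_{C<-Q} @ p_Q == T_{C<-N} @ p_N. Solving for the tip coordinates in
    the needle-marker frame gives p_N = T_{C<-N}^{-1} @ T_{C<-Q} @ p_Q."""
    p_Q_h = np.append(np.asarray(p_Q, float), 1.0)   # homogeneous point
    p_N_h = np.linalg.inv(T_C_N) @ T_C_Q @ p_Q_h
    return p_N_h[:3]
```

  • The invariant to check is that both transforms map their respective point representations to the same camera-frame coordinates.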
  • the present invention can also be calibrated by using direction calibration, which specifically includes:
  • the mutual calibration method consists of the following two parts, and the schematic diagram of the mutual calibration is shown in Figure 6:
  • the direction vector v N of the puncture needle in the needle identification object coordinate system is manually determined in advance.
  • the calibration direction has two expressions in the needle tip coordinate system:
  • the above two vectors are representations of the calibration direction in the needle-marker coordinate system. Assuming that expression (a) is more accurate in the w component and expression (b) is more accurate in the u and v components, the result after mutual calibration is
  • the orientation calibration method of the positioning device is shown in Figure 7.
  • T_{C←Q} and T_{C←N} can be obtained.
  • the direction vector of the hole in the identification-plate coordinate system, i.e. v_Q, can be determined. Since this direction vector is unchanged in the camera coordinate system, the following conversion relationship can be obtained: T_{C←Q} v_Q = T_{C←N} v_N.
  • the needle tip direction can then be calculated in real time as v_C = T_{C←N} v_N, where:
  • T_{C←N} is given by the camera after it identifies the needle marker, and
  • v_N is the calibration result calculated using mutual calibration or the orientation calibration of the positioning device.
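  • The two direction formulas above can be sketched with the same rotation-matrix notation (illustrative; function names are our own assumptions): calibration recovers v_N once, and the tip direction is then updated every frame.

```python
import numpy as np

def calibrate_needle_direction(T_C_Q, T_C_N, v_Q):
    """Orientation calibration: the hole direction v_Q (plate frame) and
    the needle direction v_N (marker frame) describe the same camera-frame
    direction, so R_{C<-Q} @ v_Q = R_{C<-N} @ v_N. Directions transform by
    the rotation part only (no translation)."""
    R_C_Q, R_C_N = T_C_Q[:3, :3], T_C_N[:3, :3]
    return R_C_N.T @ R_C_Q @ v_Q      # R^{-1} = R^T for rotation matrices

def realtime_tip_direction(T_C_N, v_N):
    """Per-frame needle direction in the camera frame: v_C = R_{C<-N} @ v_N."""
    return T_C_N[:3, :3] @ v_N
```

  • After calibration, only T_{C←N} needs to be tracked per frame to recover the needle direction in the camera frame.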
  • the camera captures the video of the object and the instrument in real time.
  • The user watches the video, in which not only the surfaces of the subject and the instrument captured by the camera are displayed, but also, at the corresponding three-dimensional positions, the subject's internal organs, lesions, and the parts of the instrument inside the body that are not actually visible.
  • These actually invisible internal organs, lesions, and in-body instrument parts are aligned with the human body and the actual instrument, guiding the user with a virtual scene image similar to the real environment and with the position at which to operate the instrument.
  • The positioning device and the object to be positioned can both be identified, making use of optical markers with different error characteristics in the same scene.
  • The correlation of the coordinates of the different identification patterns in the same space is determined by matching the geometric structure of the spatially associated instruments; using the known trusted values, the spatial recognition positions of the different recognition patterns are calibrated against each other.
  • the present invention further provides a positioning device, comprising: a support portion 1 , a feature portion 2 and a limit portion 3 disposed on the support portion 1 ;
  • the feature part 2 includes a display board 22, the display board 22 is connected with the support part 1, and the display board 22 is provided with an optical feature 221 for being photographed and identified;
  • the limiting portion 3 is disposed on one side of the feature 2, and is used to limit the position of the object to be positioned.
  • the feature portion 2 further includes a connecting mechanism 21 , and the display board 22 is connected to the supporting portion 1 through the connecting mechanism 21 .
  • The display board 22 can be foldably installed on the support part 1 through the connecting mechanism 21.
  • The connecting mechanism 21 can be configured as a hinge structure, connected to the support part 1 so that it can be flipped, or hinged through a hinge joint. When the angle of the display board 22 needs to be adjusted, rotating the connecting mechanism 21 allows flip adjustment to reach the angle best suited for shooting.
  • the display board 22 is provided with a shielding member 222 for shielding the optical features.
  • The shielding member 222 can be a baffle that opens and closes vertically or horizontally; it is opened when the planar marker is to be recognized and closed to block the optical feature when the needle-shaped marker is being identified.
  • the optical features 221 include one or any combination of specific graphics, structures, colors for being identified.
  • The patterns, figures, structures or two-dimensional codes can be applied to the positioning device through a printing process; identifiable patterns have different spatial accuracies depending on their own pattern rules and production characteristics. Combining identifiable patterns with different characteristics enables fast spatial calibration.
  • the optical feature is a pattern attached or printed on the display board 22, and the pattern is a two-dimensional code.
  • the photographing device recognizes the two-dimensional code, and calculates the position space information of the object according to the position information of many feature points on the two-dimensional code pattern.
  • the spatial position information includes one or more of the spatial coordinates of the positioning device and the placement form, and can specifically locate the spatial position of the fixed positioning device.
  • the limiting portion 3 is a detachable structure, which can be installed on one side of the characteristic portion or replaced.
  • The limiting portion 3 can be separated from the overall structure, so the overall structure can be reused and only needs to be sterilized.
  • The limiting portion 3, which contacts the sterile surgical instrument, may be a sterilized disposable component.
  • It is combined with the overall structure for use, which improves the safety of surgical use.
  • the limiting portion is provided on one side of the feature portion, and when the object to be positioned moves to a predetermined position, the limiting portion limits the object to be positioned and forms a specific spatial positional relationship with the object to be positioned.
  • the photographing device is a head-mounted optical camera. When the user uses it, no matter what posture the user adopts, the acquisition angle of the head-mounted optical camera can be well kept consistent with its viewing direction.
  • the object to be positioned is a surgical instrument, and the marking fixed on the surface of the instrument can also be a three-dimensional figure or structure.
  • the instrument may be, for example, a puncture needle, a biopsy needle, a radiofrequency or microwave ablation needle, an ultrasound probe, a rigid endoscope, an endoscopic oval forceps, an electric knife or a stapler and other medical instruments.
  • Taking a puncture needle as an example, the needle tip of the puncture needle is moved.
  • The needle tip and the positioning device form a specific spatial positional relationship to facilitate positioning.
  • The position information of this specific position is used to correct the position information of the needle tip.
  • the limiting portion 3 is a cylindrical structure; a positioning groove 31 is provided at the top of the cylindrical structure, and an opening spanning a horizontal angle α is formed on the positioning groove 31.
  • This structure ensures that the needle tip will not slip out of the plane while the needle body is moved through the vertical-to-horizontal range of 90 degrees and through the horizontal angle α range.
  • A through hole or a blind hole is provided along the central axis of the cylindrical structure, adapted to accommodate the insertion and placement of the puncture needle, so as to limit the position of the straight line along which the needle body lies.

Abstract

Method for positioning objects by spatial calibration, comprising the steps of: capturing an image of a positioning device in a space and identifying identification features of the positioning device in the image of the positioning device to obtain spatial position information of the positioning device (S1); when an object to be positioned is located at a specific position, capturing image information of the object in the space and identifying identification features of the object in an image of the object, so as to obtain spatial position information of the object (S2); and correcting the spatial position information of the object according to the spatial position information of the positioning device and the specific position so as to obtain final spatial position information of the object to be positioned (S3). According to the method, for objects in a specific scene, image acquisition and mutual position correction can be performed on two different objects by bringing the two objects into a specific angular and positional relationship, so that the optical positioning accuracy of one or both parties is improved and the user can be helped to carry out precise and complete operations.
PCT/CN2022/081521 2021-04-01 2022-03-17 Method for positioning objects by spatial calibration WO2022206417A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110357358.X 2021-04-01
CN202110357358.XA CN113509263A (zh) 2021-04-01 2021-04-01 Object space calibration and positioning method

Publications (1)

Publication Number Publication Date
WO2022206417A1 2022-10-06

Family

ID=78062303

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/081521 WO2022206417A1 (fr) 2022-10-06 2021-04-01 2022-03-17 Method for positioning objects by spatial calibration

Country Status (2)

Country Link
CN (1) CN113509263A (fr)
WO (1) WO2022206417A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113509263A (zh) * 2021-04-01 2021-10-19 上海复拓知达医疗科技有限公司 Object space calibration and positioning method
CN113509264A (zh) * 2021-04-01 2021-10-19 上海复拓知达医疗科技有限公司 Augmented reality system and method based on correcting the position of an object in space, and computer-readable storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
US6497134B1 (en) * 2000-03-15 2002-12-24 Image Guided Technologies, Inc. Calibration of an instrument
US20050187562A1 (en) * 2004-02-03 2005-08-25 Grimm James E. Orthopaedic component inserter for use with a surgical navigation system
US20070106282A1 (en) * 2003-05-02 2007-05-10 Perception Raisonnement Action En Medecine Determination of the position of an anatomical element
CN103006335A (zh) * 2013-01-06 2013-04-03 新博医疗技术有限公司 一种手术导航用的通用标定模及标定方法
WO2015117644A1 (fr) * 2014-02-05 2015-08-13 Brainlab Ag Identification and calibration method
US20180250079A1 (en) * 2017-03-01 2018-09-06 Eped Inc. Handpiece register
CN110537983A (zh) * 2019-09-26 2019-12-06 重庆博仕康科技有限公司 Integrated optical-magnetic puncture surgery navigation platform
CN113509263A (zh) * 2021-04-01 2021-10-19 上海复拓知达医疗科技有限公司 Object space calibration and positioning method

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
KR101367366B1 (ko) * 2012-12-13 2014-02-27 주식회사 사이버메드 Method and tool for calibrating a surgical tool for image-guided surgery
USRE49930E1 (en) * 2015-03-26 2024-04-23 Universidade De Coimbra Methods and systems for computer-aided surgery using intra-operative video acquired by a free moving camera
CN105931237A (zh) * 2016-04-19 2016-09-07 北京理工大学 Image calibration method and system
CN107468350B (zh) * 2016-06-08 2020-12-08 北京天智航医疗科技股份有限公司 Dedicated calibrator for three-dimensional images, surgical positioning system and positioning method
US10130430B2 (en) * 2016-11-14 2018-11-20 Intai Technology Corp. No-touch surgical navigation method and system thereof
WO2018206086A1 (fr) * 2017-05-09 2018-11-15 Brainlab Ag Generation of an augmented reality image of a medical device
CN109833092A (zh) * 2017-11-29 2019-06-04 上海复拓知达医疗科技有限公司 In-vivo navigation system and method
CN108294825B (zh) * 2017-12-26 2019-08-02 刘洋 Registration system and method for surgical navigation
TWI678181B (zh) * 2018-04-30 2019-12-01 長庚大學 Surgical guidance system
CN111388087A (zh) * 2020-04-26 2020-07-10 深圳市鑫君特智能医疗器械有限公司 Surgical navigation system, and computer and storage medium for executing a surgical navigation method

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
US6497134B1 (en) * 2000-03-15 2002-12-24 Image Guided Technologies, Inc. Calibration of an instrument
US20070106282A1 (en) * 2003-05-02 2007-05-10 Perception Raisonnement Action En Medecine Determination of the position of an anatomical element
US20050187562A1 (en) * 2004-02-03 2005-08-25 Grimm James E. Orthopaedic component inserter for use with a surgical navigation system
CN103006335A (zh) * 2013-01-06 2013-04-03 新博医疗技术有限公司 Universal calibration phantom and calibration method for surgical navigation
WO2015117644A1 (fr) * 2014-02-05 2015-08-13 Brainlab Ag Identification and calibration method
US20180250079A1 (en) * 2017-03-01 2018-09-06 Eped Inc. Handpiece register
CN110537983A (zh) * 2019-09-26 2019-12-06 重庆博仕康科技有限公司 Integrated optical-magnetic puncture surgery navigation platform
CN113509263A (zh) * 2021-04-01 2021-10-19 上海复拓知达医疗科技有限公司 Object space calibration and positioning method

Also Published As

Publication number Publication date
CN113509263A (zh) 2021-10-19

Similar Documents

Publication Publication Date Title
WO2022206417A1 (fr) Method for positioning objects by spatial calibration
JP6889703B2 (ja) Method and apparatus for intraoperative observation of a 3D surface image of a patient
US10165981B2 (en) Surgical navigation method
EP2637593B1 (fr) Visualisation de données anatomiques par réalité augmentée
JP7202737B2 (ja) Tracking method and apparatus for dental implant navigation surgery
KR102105974B1 (ko) Medical imaging system
Lathrop et al. Minimally invasive holographic surface scanning for soft-tissue image registration
CN107468350A (zh) Dedicated calibrator for three-dimensional images, surgical positioning system and positioning method
JP2016043211A (ja) Image registration apparatus, method and program
CN107049489B (zh) Surgical navigation method and system
Zeng et al. A surgical robot with augmented reality visualization for stereoelectroencephalography electrode implantation
KR101491922B1 (ko) Hybrid navigation system and position tracking method thereof
CN111821024B (zh) Surgical navigation system and imaging method thereof
CN109674533B (zh) Surgical navigation system and method based on a portable color ultrasound device
US20230390021A1 (en) Registration degradation correction for surgical navigation procedures
Liu et al. On-demand calibration and evaluation for electromagnetically tracked laparoscope in augmented reality visualization
Jiang et al. Optical positioning technology of an assisted puncture robot based on binocular vision
WO2022206406A1 (fr) Augmented reality system and method based on the corrected spatial position of an object, and computer-readable storage medium
CN114943802A (zh) Knowledge-guided surgical interaction method based on deep learning and augmented reality
CN113786228B (zh) Assisted puncture navigation system based on AR augmented reality
CN216535498U (zh) Positioning device based on an object in space
Sun et al. Virtually transparent epidermal imagery for laparo-endoscopic single-site surgery
WO2022206436A1 (fr) Dynamic position identification and prompting system and method
Wengert et al. Endoscopic navigation for minimally invasive suturing
TWI749921B (zh) Surgical space registration system and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22778600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22778600

Country of ref document: EP

Kind code of ref document: A1