KR20160022705A - Position tracking for tool - Google Patents

Position tracking for tool

Info

Publication number
KR20160022705A
Authority
KR
South Korea
Prior art keywords
tool
camera
axis
image
tracking
Prior art date
Application number
KR1020140108599A
Other languages
Korean (ko)
Inventor
전인호
케콥퍼 아쉐이
홍재성
최현석
정경화
Original Assignee
재단법인 아산사회복지재단 (Asan Social Welfare Foundation)
재단법인대구경북과학기술원 (Daegu Gyeongbuk Institute of Science and Technology)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 재단법인 아산사회복지재단 (Asan Social Welfare Foundation) and 재단법인대구경북과학기술원 (Daegu Gyeongbuk Institute of Science and Technology)
Priority to KR1020140108599A priority Critical patent/KR20160022705A/en
Priority to PCT/KR2015/008682 priority patent/WO2016028095A1/en
Publication of KR20160022705A publication Critical patent/KR20160022705A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an apparatus for tracking the position of a tool on an image obtained by a camera. According to the present invention, the camera is attached to the tool, the position of the tool in the forward-backward (x-axis), left-right (y-axis), and up-down (z-axis) directions is tracked using an optical flow technique or other image features based on the image obtained by the camera, and the result is displayed on the image, so that the accuracy of the positional information of the tool on the image can be improved.

Description

Position tracking for tool

The present invention relates to an apparatus for tracking the positions of a patient and a tool and, more particularly, to an apparatus for tracking the position of a tool to which at least one camera, or at least one camera together with an inertial or acceleration sensor, is attached, so that all directions of the tool can be tracked on an image acquired by the camera.

In general, an optical flow technique is used as a method of tracking and indicating the movement of an object. Optical flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by relative motion between an observer (which may be an eye or a camera) and the scene.

That is, when the observer is a camera, feature points are extracted from the image acquired through the camera, and the feature points are continuously tracked while the image is updated. Direction information of an object can be obtained by acquiring the loci of the feature points between one image (A) and the next image (B); that is, the feature points are continuously extracted and tracked to obtain the direction information of the object.
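
As a concrete illustration of this feature-point tracking, the following is a minimal sketch using OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade optical flow. It is not taken from the patent; the camera device index and all parameter values are illustrative assumptions.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # camera attached to the tool (assumed device index)
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Extract initial feature points from image (A).
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Track the same points into the updated image (B);
    # 'status' marks the points that were found again.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    good_new = new_points[status.flatten() == 1]
    good_old = points[status.flatten() == 1]

    # The loci (displacement vectors) of the feature points between
    # image (A) and image (B) give the direction of apparent motion.
    flow = good_new - good_old

    prev_gray = gray
    points = good_new.reshape(-1, 1, 2)
    # Re-extract feature points when too many have dropped out, so that
    # tracking continues as the image is updated.
    if len(points) < 50:
        points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                         qualityLevel=0.01, minDistance=7)
```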

This general optical flow technique is mainly used to obtain motion information of a target object while the camera is held in a fixed state. In the present invention, however, the camera is not fixed but is instead attached to the surgical tool, so the positional information of the surgical tool is obtained by tracking the feature points on the image of the camera, that is, on the background scene.

Among prior examples in which a camera is attached to a surgical tool, the conventional invention described in Patent Document 2 of the prior art documents obtains the position and orientation information of a surgical tool by mounting a camera and a sensor on the tool. It differs from the present invention, however, in that the camera and the sensor are mounted using a separate frame, whereas in the present invention the camera and the sensor are installed on the surgical tool itself.

Korean Patent Registration No. 10-1305806; U.S. Published Patent Application No. US 2014/0107471 A1

SUMMARY OF THE INVENTION The present invention has been made in order to solve the above problems, and it is an object of the present invention to provide an apparatus for tracking the position of a tool to which at least one camera, or at least one camera together with an inertial or acceleration sensor such as an IMU sensor, is attached, so that all directions of the tool can be tracked on an image acquired by the camera.

In order to accomplish the above object, there is provided an apparatus for tracking the position of a tool on an image acquired by a camera, in which the camera is attached to the tool and the position of the tool in the forward-backward (x-axis), left-right (y-axis), and up-down (z-axis) directions is tracked using an optical flow technique or other image features based on the image acquired by the camera and displayed on the image.

In addition, at least one camera may be attached to the tool at different angles, and the position of the tool in the forward-backward (x-axis), left-right (y-axis), and up-down (z-axis) directions may be tracked using an optical flow technique or other image features based on the images acquired by the one or more cameras and displayed on the image.

In addition, any one of the at least one camera may track the position of the tool in the left-right (y-axis) and up-down (z-axis) directions, while another camera attached at a different angle tracks the position of the tool in the forward-backward (x-axis) direction to be displayed on the image.

Preferably, an inertial or acceleration sensor is attached to the tool together with the camera, and the inertial or acceleration sensor is an IMU sensor.

Also, the position of the tool in the left-right (y-axis) and up-down (z-axis) directions may be tracked through the optical flow technique based on the image acquired by the camera, while the motion of the tool in the forward-backward (x-axis) direction is detected by the inertial or acceleration sensor and displayed on the image.

Further, it is preferable that the image acquired through the camera is an image of the surrounding environment around the object toward which the tool is directed.

The tool is preferably a surgical tool.

According to the position tracking apparatus of the present invention, not only the left-right (y-axis) and up-down (z-axis) directions of the tool but also its forward-backward (x-axis) direction are tracked and displayed on the image, so that the accuracy of the positional information of the tool on the image can be further enhanced.

BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is an explanatory diagram of the optical flow technique used in the position tracking apparatus for a tool according to the present invention;
FIG. 2 is a conceptual diagram showing extraction of feature points by the position tracking apparatus for a tool according to the present invention;
FIG. 3 is a conceptual diagram showing extraction of feature points when the tool is moved left or right;
FIG. 4 is a conceptual diagram showing extraction of feature points when the tool is moved up or down;
FIG. 5 is a coordinate diagram showing the up-down, left-right, and forward-backward directions of movement of the tool.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, a preferred embodiment of an apparatus for tracking the position of a tool according to the present invention will be described in detail with reference to the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.

FIG. 2 is a conceptual diagram showing extraction of feature points by the apparatus for tracking the position of a tool according to the present invention.

As shown in FIG. 2, the apparatus for tracking the position of a tool according to the present invention attaches one camera to a surgical tool, tracks the position of the surgical tool in the three axial (x, y, z) directions, reads the positional information by the optical flow technique, and displays it on the image. The optical flow technique is used here to track the three axial directions of the surgical tool, but the direction information of the surgical tool can also be represented using other image features.

However, when the position of the tool is tracked using one camera as described above, the directions of all three axes (x, y, and z) can be obtained, but the information about the forward-backward (x-axis) direction is relatively inaccurate. To compensate for this, one or more additional cameras, or one or more cameras together with an inertial or acceleration sensor such as an IMU sensor, are used to improve the accuracy of the forward-backward direction information.

Thus, for example, when two cameras are attached to a surgical tool, one camera is oriented toward the surgical target and the other camera is oriented in a direction 90 degrees apart from it. The camera facing the surgical target basically images the surrounding environment around the surgical target, that is, the operating room rather than the target itself, so the positional information of the surgical tool is read by the optical flow technique, or by other image features, using the feature points of objects extracted from the operating-room environment.

FIG. 3 is a conceptual diagram showing extraction of feature points when the tool is moved left or right by the tool position tracking apparatus according to the present invention, and FIG. 4 is a conceptual diagram showing extraction of feature points when the tool is moved up or down.

As shown in FIG. 3, when the surgical tool is moved in the left-right (y-axis) direction on the image acquired by one camera, the positional information of the surgical tool moving left and right is read with the optical flow technique, or another image feature, using the feature points of objects tracked and extracted in the operating-room environment as described above. Likewise, as shown in FIG. 4, when the surgical tool is moved in the up-down (z-axis) direction on the image acquired by one camera, the positional information of the surgical tool moving up and down is read in the same way.
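
A short sketch of how the left-right and up-down readings of FIGS. 3 and 4 could be derived from the tracked background points follows; the median aggregation, the sign convention, and the 2-pixel threshold are assumptions for illustration, not details given in the patent.

```python
import numpy as np

def yz_motion(flow: np.ndarray, threshold: float = 2.0):
    """flow: (N, 2) per-point (dx, dy) displacements in pixels.

    Because the camera moves with the tool and the operating-room
    background is static, the background appears to shift opposite to
    the tool: background flow to the left means the tool moved right,
    and (with image rows growing downward) background flow downward
    means the tool moved up.
    """
    dx = float(np.median(flow[:, 0]))
    dy = float(np.median(flow[:, 1]))
    y_dir = "right" if dx < -threshold else "left" if dx > threshold else "still"
    z_dir = "up" if dy > threshold else "down" if dy < -threshold else "still"
    return y_dir, z_dir

# Example: with the flow array from the tracking loop above,
# yz_motion(flow.reshape(-1, 2)) might return ("right", "still").
```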

As shown in FIG. 5, when the feature points are tracked on the image obtained through the camera while the surgical tool carrying the camera is moved, information on the left-right (y-axis) and up-down (z-axis) directions of the surgical tool, shown on the right side of FIG. 5, is easy to track, whereas information on the forward-backward (x-axis) direction of the surgical tool moving back and forth into the image, shown on the left side of FIG. 5, is difficult to track. Therefore, the forward-backward (x-axis) information is tracked by the other camera, attached to the surgical tool at an interval of 90 degrees from the camera facing the surgical target. In other words, the positional information of the surgical tool moving in the forward-backward (x-axis) direction is read by tracking the feature points on the image acquired by that second camera.
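
Combining the two views could look like the following sketch, under the assumed geometry that the forward-facing camera sees y- and z-axis motion as horizontal and vertical background flow while the camera mounted 90 degrees apart sees x-axis motion as horizontal flow; the function name and the pixels_to_mm calibration scale are hypothetical.

```python
import numpy as np

def tool_displacement(front_flow: np.ndarray, side_flow: np.ndarray,
                      pixels_to_mm: float = 0.1) -> np.ndarray:
    """Estimate an (x, y, z) tool displacement in mm from two cameras.

    front_flow / side_flow: (N, 2) point displacements from the camera
    facing the surgical target and from the camera 90 degrees apart.
    The static background moves opposite to the tool, hence the sign
    flips; image rows grow downward, so positive vertical flow
    (background moving down) means the tool moved up.
    """
    fy, fz = np.median(front_flow, axis=0)  # front camera: y and z components
    sx, _ = np.median(side_flow, axis=0)    # side camera: x appears as horizontal flow
    return np.array([-sx, -fy, fz]) * pixels_to_mm
```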

In this way, the two cameras attached to the surgical tool at an interval of 90 degrees read the positional information of the surgical tool tracked in the up-down, left-right, and forward-backward directions, but an inertial or acceleration sensor such as an IMU sensor may be used instead of the second camera to read and display this positional information. That is, one or more cameras read the positional information representing the movement of the surgical tool in the left-right (y-axis) and up-down (z-axis) directions through the optical flow technique as described above, while the inertial or acceleration sensor detects the motion in the forward-backward (x-axis) direction. The IMU sensor, a kind of inertial or acceleration sensor, measures the speed, direction, and gravity of an object and serves to detect motion of the surgical tool in the forward-backward (x-axis) direction.
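
For the IMU variant, the following is a crude sketch of recovering forward-backward displacement by twice integrating the accelerometer's x-axis samples; the function is hypothetical, and a real implementation would need bias removal and drift correction (for example, a complementary or Kalman filter fusing the camera data).

```python
def x_from_imu(accel_x_samples, dt: float) -> float:
    """Integrate forward (x-axis) acceleration twice to estimate displacement.

    accel_x_samples: iterable of accelerations in m/s^2 sampled every dt
    seconds along the tool's forward axis, with gravity already removed.
    """
    vx, x = 0.0, 0.0
    for ax in accel_x_samples:
        vx += ax * dt  # acceleration -> velocity
        x += vx * dt   # velocity -> displacement
    return x

# Example: x_from_imu([0.5] * 100, dt=0.01) yields 0.2525 m after one
# second (vs. the analytic 0.25 m, an artifact of simple Euler integration).
```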

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments, and various modifications may be made by those skilled in the art without departing from the scope of the invention.

Claims (8)

1. An apparatus for tracking a position of a tool on an image acquired by a camera,
wherein the camera is attached to the tool, and the position of the tool in the forward-backward (x-axis), left-right (y-axis), and up-down (z-axis) directions is tracked using an optical flow technique or other image features based on the image acquired by the camera and displayed on the image.
2. The apparatus according to claim 1,
wherein at least one camera is attached to the tool at different angles, and the position of the tool in the forward-backward (x-axis), left-right (y-axis), and up-down (z-axis) directions is tracked using the optical flow technique or other image features based on the images acquired by the one or more cameras and displayed on the image.
3. The apparatus according to claim 2,
wherein one of the at least one camera tracks the position of the tool in the left-right (y-axis) and up-down (z-axis) directions, and another camera attached at a different angle tracks the position of the tool in the forward-backward (x-axis) direction to be displayed on the image.
4. The apparatus according to claim 2,
wherein an inertial or acceleration sensor is attached to the tool together with the camera.
5. The apparatus according to claim 4,
wherein the inertial or acceleration sensor is an IMU sensor.
6. The apparatus according to claim 4,
wherein the position of the tool in the left-right (y-axis) and up-down (z-axis) directions is tracked through the optical flow technique based on the image acquired by the camera, and motion of the tool in the forward-backward (x-axis) direction is detected by the inertial or acceleration sensor and displayed on the image.
7. The apparatus according to any one of claims 1 to 6,
wherein the image acquired through the camera is an image of the surrounding environment around an object toward which the tool is directed.
8. The apparatus according to claim 7,
wherein the tool is a surgical tool.
KR1020140108599A 2014-08-20 2014-08-20 Position tracking for tool KR20160022705A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020140108599A KR20160022705A (en) 2014-08-20 2014-08-20 Position tracking for tool
PCT/KR2015/008682 WO2016028095A1 (en) 2014-08-20 2015-08-20 Tool location tracking apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140108599A KR20160022705A (en) 2014-08-20 2014-08-20 Position tracking for tool

Publications (1)

Publication Number Publication Date
KR20160022705A (en) 2016-03-02

Family

ID=55350968

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140108599A KR20160022705A (en) 2014-08-20 2014-08-20 Position tracking for tool

Country Status (2)

Country Link
KR (1) KR20160022705A (en)
WO (1) WO2016028095A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4142460B2 (en) * 2003-01-31 2008-09-03 オリンパス株式会社 Motion detection device
JP4472362B2 (en) * 2004-01-16 2010-06-02 オリンパス株式会社 Endoscopic treatment tool
JP2010200894A (en) * 2009-03-02 2010-09-16 Tadashi Ukimura Surgery support system and surgical robot system
KR101159469B1 (en) * 2009-05-14 2012-06-25 국립암센터 Method of Detection for Surgical Instruments, Recording Media of the same, Surgical System of using the same and Surgical Subsystem of using the same
KR20130121521A (en) * 2012-04-27 2013-11-06 주식회사 고영테크놀러지 Method for tracking of the affected part and surgery instrument

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140107471A1 (en) 2011-06-27 2014-04-17 Hani Haider On-board tool tracking system and methods of computer assisted surgery
KR101305806B1 (en) 2011-11-30 2013-09-06 성균관대학교산학협력단 Surgery navigation apparatus and method for total knee arthroplasty

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220090597A (en) 2020-12-22 2022-06-30 한국전자기술연구원 Location tracking device and method using feature matching
US11847785B2 (en) 2020-12-22 2023-12-19 Korea Electronics Technology Institute Location tracking device and method using feature matching

Also Published As

Publication number Publication date
WO2016028095A1 (en) 2016-02-25


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
E902 Notification of reason for refusal