JP2000298427A - Medical practice device and method for evaluating medical practice result - Google Patents

Medical practice device and method for evaluating medical practice result

Info

Publication number
JP2000298427A
Authority
JP
Japan
Prior art keywords
targets
image
training
space
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP11105217A
Other languages
Japanese (ja)
Other versions
JP3790638B2 (en)
Inventor
Masato Miyahara
征人 宮原
Masakazu Suzuki
正和 鈴木
Teruzo Nakayama
照三 中山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
J Morita Manufacturing Corp
Original Assignee
J Morita Manufacturing Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by J Morita Manufacturing Corp
Priority to JP10521799A
Publication of JP2000298427A
Application granted
Publication of JP3790638B2
Anticipated expiration
Status: Expired - Fee Related

Links

Abstract

PROBLEM TO BE SOLVED: To provide a medical practice device, and a method for evaluating medical practice results, with which the result of a practice can be observed visually and evaluated objectively in real time. SOLUTION: The medical practice device photographs three targets 44A, 44B, and 44C with two image pickup cameras 32A and 32B. Video capture circuitry processes the images photographed by the cameras and forms position data for each target. From these position data, three-dimensional coordinate data of a specific position (cutting appliance 50) associated with the targets is obtained. Based on the three-dimensional coordinate data and the image data of the object of practice (tooth 52), a synthesized image superposing the movement of the cutting appliance on the image of the tooth is displayed on a display device.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a medical training apparatus for objectively and visually evaluating the results of medical training, and to a method for evaluating such results.

[0002]

2. Description of the Related Art

Conventionally, as a device for visually evaluating the results of dental training, there is one proposed in Japanese Patent Application Laid-Open No. 10-97187. In that evaluation device, teacher cavity shape data is first obtained as the difference between uncut shape data, obtained by measuring an uncut tooth model, and cut shape data, obtained by measuring a tooth model in which a teacher has actually formed a cavity as a model. Similarly, target cavity shape data is obtained as the difference between the uncut shape data and cut shape data obtained by measuring a tooth model in which a student has actually formed a cavity as training. Then, based on the teacher cavity shape data and the target cavity shape data, the teacher cavity shape and the target cavity shape are superimposed on a display screen, and the excess or deficiency of the superimposed portions is converted into a score to evaluate the target cavity shape formed by the student.

[0003]

However, this evaluation apparatus evaluates the target cavity shape only on the already-cut tooth model in which the cavity has been formed; the process of cavity formation cannot be observed in real time and evaluated objectively.

[0004]

It is an object of the present invention to provide a medical training apparatus and a medical training evaluation method with which the results of training can be observed visually in real time and evaluated objectively.

[0005]

Means for Solving the Problems

To achieve this object, the medical training apparatus of the present invention comprises: (a) a subject having at least three targets arranged so as to describe a polygon in space, the subject moving in accordance with the movement of an instrument handled by a medical trainee; (b) first imaging means fixed at a predetermined location in space for photographing the movement of the subject including the at least three targets; (c) second imaging means fixed at a location in space apart from the first imaging means for photographing, simultaneously with the first imaging means, the movement of the subject including the at least three targets; (d) first image processing means for processing the image photographed by the first imaging means to obtain first two-dimensional coordinate data representing the positions of the at least three targets; (e) second image processing means for processing the image photographed by the second imaging means to obtain second two-dimensional coordinate data representing the positions of the at least three targets; (f) third image processing means for obtaining, from the first and second two-dimensional coordinate data, three-dimensional coordinate data of a specific position spatially associated with the at least three targets; (g) a training object spatially associated with the first and second imaging means; (h) a storage unit storing image data of the training object; and (i) fourth image processing means for obtaining, from the image data of the training object and the three-dimensional coordinate data, a composite image in which the movement of the specific position is superimposed on the image of the training object.

[0006]

In another form of the medical training apparatus according to the present invention, the image data of the training object includes shape data representing the ideal final finished shape corresponding to the content of the training.

[0007]

In another form of the medical training apparatus according to the present invention, the three targets have different colors.

[0008]

In another form of the medical training apparatus according to the present invention, the three targets have different reflectances.

[0009]

The method of evaluating medical training results according to the present invention comprises: (a) a step of storing image data of a training object; (b) a step of preparing a subject having at least three targets arranged so as to describe a polygon in space, the subject moving in accordance with the movement of an instrument handled by a medical trainee; (c) a step of photographing the movement of the subject including the at least three targets with first imaging means fixed at a predetermined location in space; (d) a step of photographing, simultaneously with the first imaging means, the movement of the subject including the at least three targets with second imaging means fixed in space apart from the first imaging means; (e) a first image processing step of processing the image photographed by the first imaging means to obtain first two-dimensional coordinate data representing the positions of the at least three targets; (f) a second image processing step of processing the image photographed by the second imaging means to obtain second two-dimensional coordinate data representing the positions of the at least three targets; (g) a third image processing step of obtaining, from the first and second two-dimensional coordinate data, three-dimensional coordinate data of a specific position spatially associated with the at least three targets; and (h) a fourth image processing step of obtaining, from the image data of the training object and the three-dimensional coordinate data, a composite image in which the movement of the specific position is superimposed on the image of the training object.

[0010]

Operation and Effects of the Invention

In the medical training apparatus and the training result evaluation method configured as above, the first and second imaging means photograph, during medical training, the subject (in particular the at least three targets) provided on the instrument handled by the trainee. The images photographed by the first and second imaging means are processed by the first and second image processing means, respectively, and two-dimensional coordinate data representing the positions of the targets is created for each image. The third image processing means correlates the first and second two-dimensional coordinate data and obtains the three-dimensional coordinate data of a specific position spatially associated with the targets relative to a reference (for example, the cutting tool in the case of a dental cutting instrument). Meanwhile, the training object (for example, a tooth in the case of dental training) is spatially associated with the first and second imaging means, and its image data is stored in the storage unit. Using the image data of the tooth stored in the storage unit and the three-dimensional coordinate data of the specific position, the fourth image processing means creates a composite image in which the movement of the specific position is superimposed on the image of the training object. The created composite image is displayed on a display device and provided to the trainee and the supervising instructor in real time.

[0011]

Therefore, the trainee can carry out the training while visually evaluating his or her own training situation. The supervising instructor can also watch the composite image displayed on the display device in real time and give the trainee advice as appropriate.

[0012]

When the image data of the training object includes shape data representing the ideal final finished shape corresponding to the content of the training, the ideal finished shape and the actual finished shape can be compared visually, so the training result can be evaluated more objectively.

[0013]

Furthermore, when the targets have different colors or different reflectances, they can easily be distinguished in the images.

[0014]

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Embodiments of the present invention will be described below with reference to the accompanying drawings. Reference numerals common to the drawings denote the same members and portions. An embodiment in which the present invention is applied to a dental training apparatus will be described; however, the present invention is not limited to this and can be applied in the same way to other medical training apparatuses and training result evaluation methods.

[0015]

FIG. 1 shows the entirety of a dental treatment training apparatus 10. The training apparatus 10 is equipped with the devices necessary for a trainee 12 (for example, a dental school student or a practicing dentist) to carry out the intended dental training. Specifically, the training apparatus 10 includes a training treatment table 14, which comprises an elevating base 16, a table 18 fixed on top of the base 16, and an instrument holder 20 connected to the table 18 or another location. A human head model 22 is fixed to an end of the table 18, and the head model 22 contains a dental arch model 24 (see FIG. 3). A support column 26 is arranged at one side of the table 18; its lower end is fixed to the table 18 or to the upper part of the base 16, which moves up and down together with the table 18. The column 26 has a horizontal support 28 extending horizontally above the table 18 so as to cross over the head model 22.

[0016]

The horizontal support 28 carries imaging cameras 32A and 32B, such as CCD cameras, one on each side of a vertical line 30 extending vertically through approximately the mouth of the head model 22. Each of the cameras 32A and 32B is focused on the dental arch model 24. Since the cameras 32A and 32B move up and down together with the table 18, raising or lowering the table 18 neither changes the distance between the cameras and the dental arch model 24 nor puts the cameras out of focus. In this embodiment, two illumination lamps 36A and 36B are also fixed to the horizontal support 28 to illuminate the area photographed by the cameras 32A and 32B.

[0017]

As shown in FIG. 2, an object to be photographed by the imaging cameras 32A and 32B, that is, a subject 42, is fixed to the instrument 40 used by the trainee 12 for dental training. The subject 42 has three targets 44A, 44B, and 44C arranged so as to describe a triangle in space. These targets are fixed to the tips of three arms 46A, 46B, and 46C, whose base ends are connected to a ring 48 fixed to the base end side of the instrument 40. Because the three targets 44A, 44B, and 44C are thus fixed with respect to the instrument 40, the spatial positional relationship between these three targets and the cutting tool 50 mounted at the tip of the instrument 40 is uniquely determined: once the spatial positions of the three targets are known, the spatial position of the cutting tool 50 can be determined. Further, as shown in FIG. 3, if the spatial positional relationship of the tooth 52 to be treated in training is obtained in advance with respect to the cameras 32A and 32B, then by determining in this state the spatial positions of the three targets 44A, 44B, and 44C, and hence of the cutting tool 50, relative to the cameras, the engagement of the cutting tool 50 with the tooth 52 (that is, the cutting situation) can be estimated.

[0018]

The targets 44A, 44B, and 44C are preferably colored in different colors, for example red, blue, and green, so that they can be distinguished when photographed by the cameras 32A and 32B. Alternatively, the targets themselves may be formed of lamps or light-emitting elements emitting different colors; in that case the lamps must be connected to a power source in order to emit light. The ring 48 may be fixed to the instrument 40 so as not to rotate, but it is preferable to provide, for example, a known detent mechanism so that the ring can be repositioned and fixed at predetermined angles (for example, every 15, 30, or 45 degrees) about the long axis of the instrument 40.

[0019]

To visually capture the cutting of the tooth 52 by the cutting tool 50, a data processing device 60 shown in FIG. 4 is provided. The data processing device 60 has an arithmetic processing unit (microcomputer) 62, to which various input units 64 are connected. Besides a keyboard and a mouse, the input units 64 include other computers connected to this control system via a transmission line such as a communication cable. The data entered from the keyboard and the like include: the spatial positional relationship of the three targets 44A, 44B, and 44C of the subject 42 (for example, the distances between the three targets); the spatial positional relationship between the three targets and the cutting tool 50; the external shape (shape data) of the cutting tool 50; the spatial coordinates of the image receiving surfaces and the optical axis coordinates (directions) of the cameras 32A and 32B, both referred to a fixed reference point in space; the spatial coordinates and external shape (shape data) of the training target tooth 52 with respect to the reference point; and, when the subject 42 can rotate with respect to the instrument 40, the rotation angle of the subject 42 (step #1 in FIG. 5). The information entered from the input units 64 is stored in a ROM 66 and used in the arithmetic processing described later.

[0020]

During training, as shown in FIGS. 1 to 3, the three targets 44A, 44B, and 44C fixed to the instrument 40 held in the hand of the trainee 12 are photographed (step #2 in FIG. 5). The captured information is output from the imaging cameras 32A and 32B to the corresponding video capture boards (VCBs) 68A and 68B. In this embodiment, the VCBs 68A and 68B convert the outputs of the cameras 32A and 32B into digital image data (step #3 in FIG. 5). The VCBs 68A and 68B also binarize this digital image data (step #4 in FIG. 5) and create image data of the three targets 44A, 44B, and 44C photographed by the corresponding cameras. The target image data created here is output to the ROM 66 via the arithmetic processing unit 62 and stored there.

[0021]

There are various methods for extracting the image data of each of the targets 44A, 44B, and 44C while distinguishing it from the other image data. For example, when the targets are colored with distinct colors (for example red, blue, and green), or when lamps emitting a color unique to each target are used, an appropriate threshold is set and only the image data of the specific color is extracted from the digitized image data. This is generally the same technique as that employed in color copying machines and the like. When each target has a distinctive appearance, the image data of each target can be extracted by recognizing that appearance in the image data. Furthermore, by keeping the appearance (color, shape) identical and covering the target surfaces with materials of different reflection efficiencies, the image data of each target can be extracted from the differences in image density of the targets as photographed by the cameras 32A and 32B.

[0022]

The target image data stored in the ROM 66 is read out by an arithmetic circuit 70 and processed according to a processing program stored in a RAM 72 (step #5 in FIG. 5). Specifically, from the image data of the three targets 44A, 44B, and 44C photographed by the camera 32A, the arithmetic circuit 70 computes, for each target, two-dimensional coordinates (position data A) indicating its position on the screen of the camera 32A. Similarly, from the image data of the three targets photographed by the camera 32B, the arithmetic circuit 70 computes, for each target, two-dimensional coordinates (position data B) indicating its position on the screen of the camera 32B.

[0023]

Next, the arithmetic circuit 70 associates the position data A and B of the three targets 44A, 44B, and 44C obtained for the respective cameras 32A and 32B with each other (step #6 in FIG. 5).

[0024]

Subsequently, using the initial conditions entered from the input unit 64 as described above (for example, the spatial coordinates of the image receiving surfaces and the optical axis coordinates of the cameras 32A and 32B) together with the position data A and B, the arithmetic circuit 70 determines the spatial coordinates (x', y', z') of the three targets 44A, 44B, and 44C (step #7 in FIG. 5).

[0025]

From these spatial coordinates, the arithmetic circuit 70 also determines the space vector (x, y, z, α, β, γ) of a specific point of the cutting tool 50, which is held in a fixed spatial positional relationship to the three targets 44A, 44B, and 44C (step #8 in FIG. 5). Together with the initially entered shape data of the cutting tool 50, this space vector is used to determine the space occupied by the outer shape of the cutting tool 50.

[0026]

The spatial coordinates of the three targets 44A, 44B, and 44C and of the cutting tool 50 are values referred to a specific fixed point in space as the origin; this origin can be chosen arbitrarily.

[0027]

The arithmetic circuit 70 further uses the space vector of the cutting tool 50 and the shape data of the cutting tool 50 to obtain trajectory data representing the movement locus of the cutting tool 50 (step #9 in FIG. 5). This locus is not a locus of points but a spatial locus occupying a certain volume in space.

[0028]

This trajectory data is transmitted to a graphics circuit 74 together with the shape data of the tooth 52 to be cut in training. Based on the shape data of the tooth 52, the graphics circuit 74 renders the shape of the tooth 52 on a suitable display 76. The graphics circuit 74 also superimposes the trajectory data on the shape data to determine the region of spatial interference between the cutting tool 50 and the tooth 52, that is, the portion of the tooth cut by the cutting tool 50, and overlays that cut portion 80 on the tooth image rendered on the display 76 (step #10 in FIG. 5).

[0029]

For example, FIGS. 6(a), 6(b), and 6(c) show the plan view, the B-B section, and the C-C section of the tooth 52 rendered on the display 76; the cut portion 80 is displayed within the rendered tooth shape. Although FIG. 6 shows the tooth 52 and the cut portion 80 only two-dimensionally, it is of course also possible to display them three-dimensionally. The shape data of the tooth 52 may include data of an ideal cutting shape corresponding to the content of the training, and this ideal cutting shape may be displayed at the same time, as shown by the dotted line 82 in FIG. 6. In that case the ideal cutting shape 82 and the actually produced cutting shape can be compared visually, which improves the objectivity of the evaluation.

[0030]

The cut portion 80 of the tooth 52 is displayed in real time as the cutting training progresses. Therefore, if, for example, the display 76 is placed where the trainee can see it, the trainee can carry out the training while visually evaluating his or her own performance. If the display 76 is placed so that the supervising instructor can see it, the instructor can watch the trainee's progress and give the trainee appropriate advice.

[0031]

In the embodiment described above, the three targets 44A, 44B, and 44C are photographed, but there is no problem in using more targets as long as they can describe a polygon in space. The targets may also be attached directly to the outer surface of the instrument 40, provided that they can describe a polygon (a triangle or more) in space and that the plural targets, while describing that polygon, can be photographed simultaneously by the two imaging cameras.

[0032]

Although the video capture boards 68A and 68B are used as the devices that process the signals output from the imaging cameras 32A and 32B to create binary image data, a video processor or a digital signal processor may be used instead.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a perspective view showing an entire medical training apparatus according to the present invention.

FIG. 2 is a perspective view showing a trainee holding the instrument, and the subject.

FIG. 3 is a diagram schematically showing the positional relationship among the dental arch model, the instrument, the subject, and the imaging cameras.

FIG. 4 is a circuit block diagram of the data processing device included in the medical training apparatus.

FIG. 5 is a flowchart explaining the processing of the data processing device.

FIG. 6 shows the tooth rendered on the display: FIG. 6(a) is a plan view of the tooth, FIG. 6(b) is an A-A sectional view, and FIG. 6(c) is a C-C sectional view.

DESCRIPTION OF REFERENCE NUMERALS

10: training apparatus, 12: trainee, 32A, 32B: imaging cameras, 40: instrument, 42: subject, 44A, 44B: targets, 50: cutting tool, 52: tooth, 70: arithmetic circuit, 80: cut portion, 82: ideal cutting shape.

Continuation of front page: (72) Inventor: Teruzo Nakayama, c/o J. Morita Manufacturing Corporation, 680 Higashihama-minami-cho, Fushimi-ku, Kyoto-shi, Kyoto. F-terms (reference): 2C028 AA10 BB01 CA13; 2C032 CA12; 4C052 AA20 LL02 LL07 NN01 NN11 NN15

Claims (5)

[Claims]

1. A medical training apparatus comprising: (a) a subject having at least three targets arranged so as to describe a polygon in space, the subject moving in accordance with the movement of an instrument handled by a medical trainee; (b) first imaging means, fixed at a predetermined location in space, for photographing the movement of the subject including the at least three targets; (c) second imaging means, fixed in space at a location away from the first imaging means, for photographing the movement of the subject including the at least three targets simultaneously with the first imaging means; (d) first image processing means for processing the image photographed by the first imaging means to obtain first two-dimensional coordinate data representing the positions of the at least three targets; (e) second image processing means for processing the image photographed by the second imaging means to obtain second two-dimensional coordinate data representing the positions of the at least three targets; (f) third image processing means for obtaining, from the first and second two-dimensional coordinate data, three-dimensional coordinate data of a specific position associated in space with the at least three targets; (g) a training object associated in space with the first and second imaging means; (h) a storage unit storing image data of the training object; (i) fourth image processing means for obtaining, on the basis of the image data of the training object and the three-dimensional coordinate data, a composite image in which the movement of the specific position is superimposed on the image of the training object; and (j) a display device for displaying the composite image.
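Element (f), turning the two cameras' 2-D coordinates into one 3-D position, can be illustrated with the simplest stereo geometry: two rectified cameras separated by a known baseline. The patent does not specify this formulation; the sketch below is an assumption-laden illustration (focal length and image coordinates in the same units, baseline in world units):

```python
def triangulate(uv_left, uv_right, focal, baseline):
    """Rectified-stereo triangulation: two cameras with parallel optical
    axes separated by `baseline` along x.  Image coordinates are taken
    relative to each camera's principal point, in the same units as
    `focal`.  Returns (X, Y, Z) in the left camera's frame."""
    disparity = uv_left[0] - uv_right[0]
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    z = focal * baseline / disparity   # depth from disparity
    x = uv_left[0] * z / focal         # back-project using depth
    y = uv_left[1] * z / focal
    return (x, y, z)
```

For example, a point 1 m in front of a rig with a 0.1 m baseline and a 500-pixel focal length appears with a 50-pixel disparity between the two images.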
2. The training apparatus according to claim 1, wherein the image data of the training object includes shape data representing an ideal final form corresponding to the content of the training.
3. The training apparatus according to claim 1 or 2, wherein the three targets have different colors.
4. The training apparatus according to claim 1 or 2, wherein the three targets have different reflectances.

5. A method for evaluating a medical training result, comprising: (a) storing image data of a training object; (b) preparing a subject having at least three targets arranged so as to describe a polygon in space, the subject moving in accordance with the movement of an instrument handled by a medical trainee; (c) photographing the movement of the subject including the at least three targets with first imaging means fixed at a predetermined location in space; (d) photographing the movement of the subject including the at least three targets, simultaneously with the first imaging means, with second imaging means fixed in space at a location away from the first imaging means; (e) a first image processing step of processing the image photographed by the first imaging means to obtain first two-dimensional coordinate data representing the positions of the at least three targets; (f) a second image processing step of processing the image photographed by the second imaging means to obtain second two-dimensional coordinate data representing the positions of the at least three targets; (g) a third image processing step of obtaining, from the first and second two-dimensional coordinate data, three-dimensional coordinate data of a specific position associated in space with the at least three targets; (h) a fourth image processing step of obtaining, on the basis of the image data of the training object and the three-dimensional coordinate data, a composite image in which the movement of the specific position is superimposed on the image of the training object; and (i) displaying the composite image.
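Claim 2's shape data for the ideal final form implies a straightforward way to score the result shown in the composite image: compare the material the trainee actually removed with the ideal cut shape. A hypothetical voxel-set sketch (not specified by the patent):

```python
def evaluate_cut(cut_voxels, ideal_voxels):
    """Compare the voxels actually removed with the ideal cut shape.
    Returns (correctly_cut, under_cut, over_cut) voxel counts:
    under_cut is material still to be removed; over_cut is material
    removed beyond the ideal shape (e.g. healthy tooth cut by mistake)."""
    cut, ideal = set(cut_voxels), set(ideal_voxels)
    return (len(cut & ideal), len(ideal - cut), len(cut - ideal))
```

These three counts map naturally onto the displayed comparison between the cut portion 80 and the ideal cut shape 82.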
JP10521799A 1999-04-13 1999-04-13 Medical training device and evaluation method of medical training result Expired - Fee Related JP3790638B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP10521799A JP3790638B2 (en) 1999-04-13 1999-04-13 Medical training device and evaluation method of medical training result


Publications (2)

Publication Number Publication Date
JP2000298427A true JP2000298427A (en) 2000-10-24
JP3790638B2 JP3790638B2 (en) 2006-06-28

Family

ID=14401513

Family Applications (1)

Application Number Title Priority Date Filing Date
JP10521799A Expired - Fee Related JP3790638B2 (en) 1999-04-13 1999-04-13 Medical training device and evaluation method of medical training result

Country Status (1)

Country Link
JP (1) JP3790638B2 (en)

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010014773A1 (en) 2009-04-14 2010-11-25 J. Morita Manufacturing Corporation Medical cutting device and medical cutting device
WO2014074297A1 (en) * 2012-11-09 2014-05-15 Illinois Tool Works Inc. Systems and device for welding training comprising different markers
US9101994B2 (en) 2011-08-10 2015-08-11 Illinois Tool Works Inc. System and device for welding training
US9352411B2 (en) 2008-05-28 2016-05-31 Illinois Tool Works Inc. Welding training system
US9511443B2 (en) 2012-02-10 2016-12-06 Illinois Tool Works Inc. Helmet-integrated weld travel speed sensing system and method
US9583014B2 (en) 2012-11-09 2017-02-28 Illinois Tool Works Inc. System and device for welding training
US9583023B2 (en) 2013-03-15 2017-02-28 Illinois Tool Works Inc. Welding torch for a welding training system
US9589481B2 (en) 2014-01-07 2017-03-07 Illinois Tool Works Inc. Welding software for detection and control of devices and for analysis of data
US9666100B2 (en) 2013-03-15 2017-05-30 Illinois Tool Works Inc. Calibration devices for a welding training system
US9672757B2 (en) 2013-03-15 2017-06-06 Illinois Tool Works Inc. Multi-mode software and method for a welding training system
US9713852B2 (en) 2013-03-15 2017-07-25 Illinois Tool Works Inc. Welding training systems and devices
US9724788B2 (en) 2014-01-07 2017-08-08 Illinois Tool Works Inc. Electrical assemblies for a welding system
US9728103B2 (en) 2013-03-15 2017-08-08 Illinois Tool Works Inc. Data storage and analysis for a welding training system
US9724787B2 (en) 2014-08-07 2017-08-08 Illinois Tool Works Inc. System and method of monitoring a welding environment
US9751149B2 (en) 2014-01-07 2017-09-05 Illinois Tool Works Inc. Welding stand for a welding system
US9757819B2 (en) 2014-01-07 2017-09-12 Illinois Tool Works Inc. Calibration tool and method for a welding system
US9862049B2 (en) 2014-06-27 2018-01-09 Illinois Tool Works Inc. System and method of welding system operator identification
US9875665B2 (en) 2014-08-18 2018-01-23 Illinois Tool Works Inc. Weld training system and method
US9937578B2 (en) 2014-06-27 2018-04-10 Illinois Tool Works Inc. System and method for remote welding training
US10056010B2 (en) 2013-12-03 2018-08-21 Illinois Tool Works Inc. Systems and methods for a weld training system
US10105782B2 (en) 2014-01-07 2018-10-23 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US10170019B2 (en) 2014-01-07 2019-01-01 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US10204406B2 (en) 2014-11-05 2019-02-12 Illinois Tool Works Inc. System and method of controlling welding system camera exposure and marker illumination
US10210773B2 (en) 2014-11-05 2019-02-19 Illinois Tool Works Inc. System and method for welding torch display
US10239147B2 (en) 2014-10-16 2019-03-26 Illinois Tool Works Inc. Sensor-based power controls for a welding system
US10307853B2 (en) 2014-06-27 2019-06-04 Illinois Tool Works Inc. System and method for managing welding data
US10373517B2 (en) 2015-08-12 2019-08-06 Illinois Tool Works Inc. Simulation stick welding electrode holder systems and methods
US10373304B2 (en) 2014-11-05 2019-08-06 Illinois Tool Works Inc. System and method of arranging welding device markers
US10402959B2 (en) 2014-11-05 2019-09-03 Illinois Tool Works Inc. System and method of active torch marker control
US10417934B2 (en) 2014-11-05 2019-09-17 Illinois Tool Works Inc. System and method of reviewing weld data
US10427239B2 (en) 2015-04-02 2019-10-01 Illinois Tool Works Inc. Systems and methods for tracking weld training arc parameters
US10438505B2 (en) 2015-08-12 2019-10-08 Illinois Tool Works Welding training system interface
US10490098B2 (en) 2014-11-05 2019-11-26 Illinois Tool Works Inc. System and method of recording multi-run data
US10593230B2 (en) 2015-08-12 2020-03-17 Illinois Tool Works Inc. Stick welding electrode holder systems and methods
US10657839B2 (en) 2015-08-12 2020-05-19 Illinois Tool Works Inc. Stick welding electrode holders with real-time feedback features
US10665128B2 (en) 2014-06-27 2020-05-26 Illinois Tool Works Inc. System and method of monitoring welding information
US11014183B2 (en) 2014-08-07 2021-05-25 Illinois Tool Works Inc. System and method of marking a welding workpiece
US11090753B2 (en) 2013-06-21 2021-08-17 Illinois Tool Works Inc. System and method for determining weld travel speed
US11247289B2 (en) 2014-10-16 2022-02-15 Illinois Tool Works Inc. Remote power supply parameter adjustment
US11288978B2 (en) 2019-07-22 2022-03-29 Illinois Tool Works Inc. Gas tungsten arc welding training systems
US11776423B2 (en) 2019-07-22 2023-10-03 Illinois Tool Works Inc. Connection boxes for gas tungsten arc welding training systems

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4908457B2 (en) * 2008-06-03 2012-04-04 醫百科技股份有限公司 Dental clinical and educational training simulation tracking system and its evaluation method

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10748442B2 (en) 2008-05-28 2020-08-18 Illinois Tool Works Inc. Welding training system
US9352411B2 (en) 2008-05-28 2016-05-31 Illinois Tool Works Inc. Welding training system
US11423800B2 (en) 2008-05-28 2022-08-23 Illinois Tool Works Inc. Welding training system
US11749133B2 (en) 2008-05-28 2023-09-05 Illinois Tool Works Inc. Welding training system
DE102010014773A1 (en) 2009-04-14 2010-11-25 J. Morita Manufacturing Corporation Medical cutting device and medical cutting device
US10096268B2 (en) 2011-08-10 2018-10-09 Illinois Tool Works Inc. System and device for welding training
US9101994B2 (en) 2011-08-10 2015-08-11 Illinois Tool Works Inc. System and device for welding training
US9511443B2 (en) 2012-02-10 2016-12-06 Illinois Tool Works Inc. Helmet-integrated weld travel speed sensing system and method
US10596650B2 (en) 2012-02-10 2020-03-24 Illinois Tool Works Inc. Helmet-integrated weld travel speed sensing system and method
US11590596B2 (en) 2012-02-10 2023-02-28 Illinois Tool Works Inc. Helmet-integrated weld travel speed sensing system and method
US11612949B2 (en) 2012-02-10 2023-03-28 Illinois Tool Works Inc. Optical-based weld travel speed sensing system
US9522437B2 (en) 2012-02-10 2016-12-20 Illinois Tool Works Inc. Optical-based weld travel speed sensing system
US10417935B2 (en) 2012-11-09 2019-09-17 Illinois Tool Works Inc. System and device for welding training
WO2014074297A1 (en) * 2012-11-09 2014-05-15 Illinois Tool Works Inc. Systems and device for welding training comprising different markers
US9368045B2 (en) 2012-11-09 2016-06-14 Illinois Tool Works Inc. System and device for welding training
US9583014B2 (en) 2012-11-09 2017-02-28 Illinois Tool Works Inc. System and device for welding training
US9672757B2 (en) 2013-03-15 2017-06-06 Illinois Tool Works Inc. Multi-mode software and method for a welding training system
US10482788B2 (en) 2013-03-15 2019-11-19 Illinois Tool Works Inc. Welding torch for a welding training system
US9728103B2 (en) 2013-03-15 2017-08-08 Illinois Tool Works Inc. Data storage and analysis for a welding training system
US9713852B2 (en) 2013-03-15 2017-07-25 Illinois Tool Works Inc. Welding training systems and devices
US9583023B2 (en) 2013-03-15 2017-02-28 Illinois Tool Works Inc. Welding torch for a welding training system
US9666100B2 (en) 2013-03-15 2017-05-30 Illinois Tool Works Inc. Calibration devices for a welding training system
US11090753B2 (en) 2013-06-21 2021-08-17 Illinois Tool Works Inc. System and method for determining weld travel speed
US11127313B2 (en) 2013-12-03 2021-09-21 Illinois Tool Works Inc. Systems and methods for a weld training system
US10056010B2 (en) 2013-12-03 2018-08-21 Illinois Tool Works Inc. Systems and methods for a weld training system
US10913126B2 (en) 2014-01-07 2021-02-09 Illinois Tool Works Inc. Welding software for detection and control of devices and for analysis of data
US10105782B2 (en) 2014-01-07 2018-10-23 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US9589481B2 (en) 2014-01-07 2017-03-07 Illinois Tool Works Inc. Welding software for detection and control of devices and for analysis of data
US10170019B2 (en) 2014-01-07 2019-01-01 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US11241754B2 (en) 2014-01-07 2022-02-08 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US9724788B2 (en) 2014-01-07 2017-08-08 Illinois Tool Works Inc. Electrical assemblies for a welding system
US11676509B2 (en) 2014-01-07 2023-06-13 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US9751149B2 (en) 2014-01-07 2017-09-05 Illinois Tool Works Inc. Welding stand for a welding system
US9757819B2 (en) 2014-01-07 2017-09-12 Illinois Tool Works Inc. Calibration tool and method for a welding system
US10964229B2 (en) 2014-01-07 2021-03-30 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US10307853B2 (en) 2014-06-27 2019-06-04 Illinois Tool Works Inc. System and method for managing welding data
US9862049B2 (en) 2014-06-27 2018-01-09 Illinois Tool Works Inc. System and method of welding system operator identification
US10839718B2 (en) 2014-06-27 2020-11-17 Illinois Tool Works Inc. System and method of monitoring welding information
US10665128B2 (en) 2014-06-27 2020-05-26 Illinois Tool Works Inc. System and method of monitoring welding information
US9937578B2 (en) 2014-06-27 2018-04-10 Illinois Tool Works Inc. System and method for remote welding training
US11014183B2 (en) 2014-08-07 2021-05-25 Illinois Tool Works Inc. System and method of marking a welding workpiece
US9724787B2 (en) 2014-08-07 2017-08-08 Illinois Tool Works Inc. System and method of monitoring a welding environment
US9875665B2 (en) 2014-08-18 2018-01-23 Illinois Tool Works Inc. Weld training system and method
US10861345B2 (en) 2014-08-18 2020-12-08 Illinois Tool Works Inc. Weld training systems and methods
US11475785B2 (en) 2014-08-18 2022-10-18 Illinois Tool Works Inc. Weld training systems and methods
US11247289B2 (en) 2014-10-16 2022-02-15 Illinois Tool Works Inc. Remote power supply parameter adjustment
US10239147B2 (en) 2014-10-16 2019-03-26 Illinois Tool Works Inc. Sensor-based power controls for a welding system
US10417934B2 (en) 2014-11-05 2019-09-17 Illinois Tool Works Inc. System and method of reviewing weld data
US11482131B2 (en) 2014-11-05 2022-10-25 Illinois Tool Works Inc. System and method of reviewing weld data
US10402959B2 (en) 2014-11-05 2019-09-03 Illinois Tool Works Inc. System and method of active torch marker control
US10373304B2 (en) 2014-11-05 2019-08-06 Illinois Tool Works Inc. System and method of arranging welding device markers
US11127133B2 (en) 2014-11-05 2021-09-21 Illinois Tool Works Inc. System and method of active torch marker control
US10490098B2 (en) 2014-11-05 2019-11-26 Illinois Tool Works Inc. System and method of recording multi-run data
US10204406B2 (en) 2014-11-05 2019-02-12 Illinois Tool Works Inc. System and method of controlling welding system camera exposure and marker illumination
US10210773B2 (en) 2014-11-05 2019-02-19 Illinois Tool Works Inc. System and method for welding torch display
US10427239B2 (en) 2015-04-02 2019-10-01 Illinois Tool Works Inc. Systems and methods for tracking weld training arc parameters
US11462124B2 (en) 2015-08-12 2022-10-04 Illinois Tool Works Inc. Welding training system interface
US10438505B2 (en) 2015-08-12 2019-10-08 Illinois Tool Works Welding training system interface
US11081020B2 (en) 2015-08-12 2021-08-03 Illinois Tool Works Inc. Stick welding electrode with real-time feedback features
US11594148B2 (en) 2015-08-12 2023-02-28 Illinois Tool Works Inc. Stick welding electrode holder systems and methods
US10657839B2 (en) 2015-08-12 2020-05-19 Illinois Tool Works Inc. Stick welding electrode holders with real-time feedback features
US10373517B2 (en) 2015-08-12 2019-08-06 Illinois Tool Works Inc. Simulation stick welding electrode holder systems and methods
US10593230B2 (en) 2015-08-12 2020-03-17 Illinois Tool Works Inc. Stick welding electrode holder systems and methods
US11288978B2 (en) 2019-07-22 2022-03-29 Illinois Tool Works Inc. Gas tungsten arc welding training systems
US11776423B2 (en) 2019-07-22 2023-10-03 Illinois Tool Works Inc. Connection boxes for gas tungsten arc welding training systems

Also Published As

Publication number Publication date
JP3790638B2 (en) 2006-06-28

Similar Documents

Publication Publication Date Title
JP2000298427A (en) Medical practice device and method for evaluating medical practice result
JP7124011B2 (en) Systems and methods of operating bleeding detection systems
US9895131B2 (en) Method and system of scanner automation for X-ray tube with 3D camera
KR101295471B1 (en) A system and method for 3D space-dimension based image processing
KR101407986B1 (en) Medical robotic system providing three-dimensional telestration
JP5032343B2 (en) Method and apparatus for displaying a virtual object, and method and apparatus for overlaying a virtual object on an environmental image
US8139087B2 (en) Image presentation system, image presentation method, program for causing computer to execute the method, and storage medium storing the program
KR20180027492A (en) System and method for scanning anatomical structures and displaying scanning results
JP5039808B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US4851901A (en) Stereoscopic television apparatus
US11678950B2 (en) Multiple-viewpoint video image viewing system and camera system
JP2000172431A (en) Information input device and game device
JP2002210055A (en) Swing measuring system
US20190043215A1 (en) Endoscope apparatus
US10386633B2 (en) Virtual object display system, and display control method and display control program for the same
CN110208947B (en) Display device and display method based on human eye tracking
WO2015046152A1 (en) Endoscopy system
NL2022371B1 (en) Method and assembly for spatial mapping of a model of a surgical tool onto a spatial location of the surgical tool, as well as a surgical tool
KR20170095007A (en) Simulation system and method for surgery training
JPH0919441A (en) Image displaying device for assisting operation
JP2002263053A (en) Medical image display device and method
CN113660986A (en) Device comprising a plurality of markers
KR20140077029A (en) Apparatus and method for simulating laparoscopic surgery
JP2017205343A (en) Endoscope device and method for operating endoscope device
KR20130109794A (en) Virtual arthroscope surgery simulation system

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20051021

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20051025

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20051226

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20060131

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20060307

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20060322

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20060403

R150 Certificate of patent or registration of utility model

Ref document number: 3790638

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150


FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100407

Year of fee payment: 4

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110407

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120407

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130407

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140407

Year of fee payment: 8

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees