WO2020138671A1 - Virtual reality-based surgery assessment system, using simulator, for otolaryngology and neurosurgery - Google Patents


Info

Publication number
WO2020138671A1
Authority
WO
WIPO (PCT)
Application number
PCT/KR2019/013827
Other languages
French (fr)
Korean (ko)
Inventor
김성원
김도현
최요철
김태균
김현문
허재일
Original Assignee
주식회사 홀로웍스
가톨릭대학교 산학협력단
Application filed by 주식회사 홀로웍스, 가톨릭대학교 산학협력단 filed Critical 주식회사 홀로웍스
Publication of WO2020138671A1 publication Critical patent/WO2020138671A1/en

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B23/285Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for injections, endoscopy, bronchoscopy, sigmoidscopy, insertion of contraceptive devices or enemas
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions

Definitions

  • the present invention relates to a surgical evaluation system for otorhinolaryngology and neurosurgery, and more specifically to a system that evaluates, on the basis of virtual reality technologies such as VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality), the surgical performance of a practitioner in otolaryngology or neurosurgery.
  • the medical field is emerging as an application field for VR, which has so far developed mainly around games, and hospitals are actively attempting to introduce VR.
  • the VR technology industry likewise predicts that medical services will become a representative VR B2B market, and is expanding its sales efforts through a variety of projects.
  • Patent Document 1 discloses an augmented reality-based laparoscopic surgery simulation system and a method using the same.
  • the augmented reality-based simulation system for laparoscopic surgery includes a laparoscopic surgical tool for generating first laparoscopic surgical information; a simulator module provided with an internal human body model, which generates second laparoscopic surgical information when it senses that the laparoscopic surgical instrument is cutting or being inserted into the human body model; an information processing module for generating augmented reality information based on the first and second laparoscopic surgical information; and augmented reality glasses for outputting the augmented reality information.
  • the learner can simulate laparoscopic surgery, compare previously stored image information of laparoscopic surgery with the surgery the learner has simulated, and evaluate how closely the simulated surgery matches the stored image information, so that the surgical technique required for laparoscopic surgery can be effectively polished; the educator, in turn, can easily use the augmented reality information to produce educational 3D content for laparoscopic surgery.
  • there is also a laparoscopic surgical education system (Publication No. 10-2018-0123310, hereinafter referred to as Patent Document 2).
  • Patent Document 2 relates to a laparoscopic surgical education system using augmented reality that gives the user the feeling of actually performing an operation and allows practice while being guided through the surgical method.
  • a laparoscopic surgical practice system using virtual reality glasses is provided; according to that document, it overcomes the difficulty of practicing in person, gives an opportunity to handle virtual internal organs, and therefore provides effective, if indirect, education to students receiving medical training.
  • there is also a "virtual reality training system and method for dentistry" (Publication No. 2003-0044909, hereinafter referred to as Patent Document 3).
  • Patent Document 3 detects data regarding the spatial position of a real element that can be held and used in the hand, displays a three-dimensional representation of a virtual object on a screen, and processes the spatial location data so that a virtual instrument corresponding to the actual spatial location of the real element is represented in space.
  • it is thus a virtual reality practice system for acquiring procedural movements in dentistry by providing a virtual instrument operating on the virtual object and modeling the interaction between the virtual object and the virtual instrument.
  • the hand-held element is a tactile human-machine interface (IHM) device having an actuator that is controlled to provide force feedback to the user holding the real element when the virtual instrument interacts with the virtual object.
  • there is also a "virtual surgery simulation apparatus and its operation method" (Publication No. 10-2016-0092425, hereinafter referred to as Patent Document 4).
  • the virtual surgical simulation apparatus includes a control unit for performing virtual surgery on a patient using a 3D image of the patient's surgical target area; an image processing unit generating a surgical image including at least part of the process of the virtual surgery; and a communication unit for transmitting at least one of the surgical image and the surgical results.
  • the surgical evaluation system of the virtual reality-based otolaryngology and neurosurgery simulator according to the present invention is devised to solve the conventional problems described above, and presents the following problems to be solved.
  • for the above-mentioned problems, the surgical evaluation system according to the present invention provides the following means of solution.
  • a surgical evaluation system of a virtual reality-based otolaryngology and neurosurgery simulator includes goggles mounted on the front face of a medical practitioner to visually provide spatial information to the practitioner; a controller held in the practitioner's hand to map moving information of the practitioner's hand motion into the spatial information; a spatial information receiving unit receiving the spatial information provided by the goggles; a moving information receiving unit receiving the practitioner's moving information mapped by the controller; a reference data setting unit for acquiring and setting, in advance, preset reference data for the moving information on the spatial information; and an evaluation measurement unit receiving the preset reference data from the reference data setting unit and comparing it with the practitioner's moving information on the spatial information to calculate a quantitative score.
  • the preset reference data of the system may be obtained by acquiring, in advance, the moving information provided by a plurality of specialists, and may be set to the average value of the traces, in three-dimensional coordinates, of the moving information produced by the plurality of specialists on the spatial information.
  • the goggles of the system may be characterized by providing at least one of virtual reality (VR), augmented reality (AR), and mixed reality (MR).
  • the evaluation measurement unit of the system includes: a spatial coordinate recognition unit for reading and acquiring the spatial information as three-dimensional spatial coordinates; a moving coordinate recognition unit for acquiring the three-dimensional spatial coordinates of the moving information within the three-dimensional spatial coordinates of the spatial information; and a dimension extraction unit for extracting a predetermined dimension for evaluating the moving information at a predetermined surgical site in the spatial information.
  • the dimension extraction unit may set the predetermined dimension to a plurality of one-dimensional lines, and the evaluation measurement unit may further include a point triggering unit in which the predetermined surgical site is set as a one-dimensional touch line, with a fail line set at a predetermined depth below the touch line.
  • the point triggering unit includes: a target point setting unit configured to set the predetermined surgical site and provide the spatial information through the goggles; a touch line recognition unit recognizing whether the practitioner's moving information input through the controller touches the touch line; and a fail line recognition unit recognizing whether that moving information touches the fail line formed below the touch line.
  • the point triggering unit may further comprise: a touch time recognition unit for acquiring the total time that the practitioner's moving information stays at the predetermined surgical site; a depth weighting unit continuously calculating a score for the moving information according to the depth to which it is inserted into the space extending from the touch line to the fail line, subtracting points in proportion to that depth; and a depth dispersion measurement unit configured to measure the variance of the depth of the moving information inserted into that space and to assign a score in inverse proportion to the variance.
  • the dimension extraction unit may set the predetermined dimension to two dimensions, and the evaluation measurement unit may further include a line triggering unit that sets the predetermined surgical site in a two-dimensional area and calculates the distance of the moving information from the origin of that area.
  • the line triggering unit includes: a target area setting unit for setting the origin of the two-dimensional area as a target region for the predetermined surgical site; a coordinate setting unit configured to set the two-dimensional area as x-y coordinates and to recognize the coordinates of the moving information within them; a deviation diameter calculation unit for measuring the distance by which the moving information is spaced from the origin on the x-y coordinates; and a coordinate shift unit configured to shift the two-dimensional area so that the target area setting unit resets the target region along the surgical site.
  • the line triggering unit may further comprise a touch counting unit that counts the number of times the practitioner's moving information touches the x-y coordinates of the two-dimensional area, recognizing whether the moving information touches those coordinates multiple times.
  • the virtual reality-based otolaryngology and neurosurgery simulator surgical evaluation system according to the present invention having the above configuration provides the following effects.
  • FIG. 1 is a conceptual diagram of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a surgical evaluation system of a virtual reality based otorhinolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 3 is a block diagram showing sub-components of an evaluation measurement unit that is one component of a surgical evaluation system of a virtual reality-based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 4 is an exemplary screen of a virtual reality that a medical trainee can access through VR or the like for use of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 5 is an exemplary screen of a virtual reality that a medical trainee can access through AR or MR for use in a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 6 is a block diagram showing sub-components of a point triggering unit which is a component of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 7 is a front cross-sectional view showing a touch line and a fail line for a target point of a point triggering unit of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 8 is a side cross-sectional view showing a touch line and a fail line for a target point of a point triggering unit of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 9 is a block diagram showing sub-components of a line triggering unit which is a component of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • FIG. 10 is a conceptual diagram illustrating coordinates of a target region of a line triggering unit, which is a component of a surgical evaluation system of a virtual reality-based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
  • as shown in FIG. 1, the surgical evaluation system of the virtual reality-based otorhinolaryngology and neurosurgery simulator according to the present invention belongs to the technical field of systems that allow medical students, interns, residents, general practitioners, specialists, and other persons engaged in the medical system (hereinafter referred to as 'medical practitioners') to experience surgery through spatial information, that is, a virtual reality, without using tangible objects, and to be quantitatively evaluated through that experience.
  • the medical practitioner wears goggles (goggle, 11) that provide a virtual reality, performs the surgical procedure through the motion of his or her hands within this virtual reality, and is then quantitatively evaluated on how close that procedure is to the procedure normally performed by medical professionals proficient in the operation; this is the technical idea disclosed here.
  • as shown in FIG. 1, the surgical evaluation system of the virtual reality-based otorhinolaryngology and neurosurgery simulator according to the present invention may include goggles 11 that provide VR to the practitioner, and a controller 12 that reads the motion of the practitioner's hand, that is, moving information, in three dimensions.
  • the goggles 11 are configured to provide spatial information to the medical practitioner while being mounted on the front face of the medical practitioner.
  • various spatial information is provided from a storage in which raw data 13 are stored; this spatial information is three-dimensional stereoscopic information about the nasal cavity handled in otorhinolaryngology, as shown in FIGS. 4 and 5.
  • At least one virtual reality among virtual reality (VR), augmented reality (AR), and mixed reality (MR) may be provided.
  • as shown in FIG. 1, the controller 12 is held in the hand of the medical practitioner to read the movement of the practitioner's hand, that is, moving information, and maps it onto the spatial information provided by the goggles 11 as described above.
  • through the controller, all the three-dimensional moving information of the practitioner's hands necessary to proceed with the surgery can be read.
  • the controller 12 is preferably able to read fine motions confined within roughly the volume of an adult male's fist, rather than covering an area with a large radius of action.
  • the spatial information receiving unit 100 is a configuration that receives the spatial information provided to the goggles 11.
  • this is the 3D stereoscopic information visually provided to the practitioner's face through the goggles 11.
  • the spatial information receiving unit 100 may receive the visual elements of the spatial information provided by the virtual reality providing unit 10 to the goggles 11 as they are, or may instead receive only the abbreviated spatial coordinate information without the visual elements.
  • the moving information receiving unit 200 is configured to receive moving information of a medical practitioner mapped by the controller 12.
  • the moving information receiving unit 200 is computationally connected to the controller 12 and receives the motion of the practitioner's hands, that is, the moving information read, in three-dimensional spatial coordinates.
  • the reference data setting unit 300 obtains and sets, in advance, the desirable moving information that the practitioner should reproduce, that is, the preset reference data, on the spatial information received from the goggles 11.
  • the preset reference data corresponds to the average value of traces, in three-dimensional coordinates, of the desirable hand movement for the surgical site that the practitioner is currently experiencing and being tested on. More specifically, the moving information of a plurality of specialists on the spatial information is obtained in three-dimensional coordinates, and the average value of the traces of that moving information is set as the preset reference data.
  • the preset reference data may be obtained by having a plurality of specialists each wear the goggles 11 and operate the controller 12.
  • the evaluation measurement unit 400 receives the preset reference data from the reference data setting unit 300, compares the practitioner's moving information on the spatial information as described above with the preset reference data (the moving information of professional medical staff), and calculates a quantitative score.
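As a minimal sketch (not from the patent), the preset reference data could be computed as the point-wise average of several specialists' traces, assuming each trace has already been resampled to the same number of 3D sample points; the function name and data layout are illustrative:

```python
def average_traces(traces):
    """Point-wise mean of equal-length 3D traces.

    traces: list of traces, each a list of (x, y, z) tuples of the same length.
    Returns the averaged trace used as preset reference data.
    """
    n = len(traces)
    length = len(traces[0])
    assert all(len(t) == length for t in traces), "resample traces to equal length first"
    reference = []
    for i in range(length):
        # average each coordinate across all specialists at sample index i
        x = sum(t[i][0] for t in traces) / n
        y = sum(t[i][1] for t in traces) / n
        z = sum(t[i][2] for t in traces) / n
        reference.append((x, y, z))
    return reference
```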
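The comparison the evaluation measurement unit performs could, for illustration, map the mean point-wise deviation between the practitioner's trace and the reference trace to a 0-100 score; the `tolerance` parameter and the linear mapping are assumptions for the sketch, not the patent's actual formula:

```python
import math

def quantitative_score(trace, reference, tolerance):
    """Map the mean point-wise deviation between two equal-length 3D traces
    to a 0-100 score; a mean deviation of `tolerance` or more scores zero."""
    dists = [math.dist(p, r) for p, r in zip(trace, reference)]
    mean_dev = sum(dists) / len(dists)
    return max(0.0, 100.0 * (1.0 - mean_dev / tolerance))
```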
  • as shown in FIG. 3, the evaluation measurement unit 400 may include a spatial coordinate recognition unit 410, a moving coordinate recognition unit 420, and a dimension extraction unit 430.
  • the spatial coordinate recognition unit 410 reads and acquires the spatial information, as shown in FIG. 4, as three-dimensional spatial coordinates.
  • the spatial information of FIGS. 4 and 5 is obtained by reflecting microscopic coordinate information on a three-dimensional x-y-z axis, in addition to the shape visually confirmed through the goggles 11.
  • the moving coordinate recognition unit 420 acquires, over time, the three-dimensional spatial coordinates of the moving information within the spatial information onto which the moving information is reflected and mapped.
  • the dimension extraction unit 430 is a configuration for extracting a predetermined dimension for evaluating the moving information at a predetermined surgical site in the spatial information.
  • the fail line of FIG. 7 can be set as a blood vessel among the surgical sites that must not be touched, and the touch line can be set as the line along which the practitioner must incise and cut neatly with the scalpel.
  • the predetermined surgical site can thus be viewed as a touch line, and, to evaluate the practitioner's hand motion, that is, the moving information, the predetermined dimension may be set as a straight line, that is, one-dimensional, as shown in FIG. 8.
  • the solid black line in FIG. 10 indicates a preferred incision line, along which the practitioner is required to cut a specific skin surface in the nasal cavity into a gentle S-shape with the scalpel.
  • a plane orthogonal to the corresponding incision line, that is, a two-dimensional plane, is extracted.
  • the predetermined dimension means the dimension extracted for quantitative evaluation of the practitioner's hand gesture, that is, the moving information.
  • a point triggering unit 440 may be further included.
  • when the dimension extraction unit 430 sets the predetermined dimension to a plurality of one-dimensional lines, the point triggering unit 440 sets the predetermined surgical site as a one-dimensional touch line, with a fail line set at a predetermined depth below the touch line, as described above.
  • the predetermined depth may be set arbitrarily for the ENT surgical site; for example, the surface of the arteries, veins, or nerves beneath the swollen skin tissue may be set as the fail line.
  • a target point setting unit 441, a touch line recognition unit 442, and a fail line recognition unit 443 may be included.
  • the target point setting unit 441 sets the surgical site, that is, the predetermined surgical site, and visually provides the spatial information to the practitioner through the goggles 11.
  • for example, the target point setting unit 441 visually provides the crystalline species in the nasal cavity through the goggles 11.
  • the practitioner can determine the existence of these tissues only by visual observation, and through such observation can find the crystalline species.
  • the touch line recognition unit 442 recognizes whether the practitioner's moving information input through the controller touches the touch line.
  • the fail line recognition unit 443 recognizes whether the practitioner's moving information input through the controller 12, that is, the 3D coordinate information of the hand gesture, touches the fail line formed below the touch line.
  • the fail line recognition unit 443 thus recognizes whether an area that must not be touched, such as a nerve, an artery, or a vein, has been touched.
  • the fail line recognition unit 443 recognizes only whether the fail line has been touched or not; if the fail line is touched, the practitioner's operation is judged to have failed.
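The touch-line/fail-line decision can be sketched as a simple depth classification, assuming depth increases downward and both lines are given as depth thresholds (all names here are illustrative, not from the patent):

```python
def classify_depth(z, touch_z, fail_z):
    """Classify the instrument tip's depth z against the touch line (touch_z)
    and the fail line (fail_z), with z growing as the tip goes deeper."""
    if z < touch_z:
        return "no_touch"  # blade has not reached the tissue surface
    if z < fail_z:
        return "touch"     # within the permitted incision depth
    return "fail"          # reached the vessel/nerve layer: operation fails
```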
  • the point triggering unit 440 may further include a touch time recognition unit 444, a depth weighting unit 445, and a depth dispersion measurement unit 446.
  • the area to be operated on should be incised smoothly with the scalpel at an appropriate speed, which must take into account the sharpness of the scalpel blade, the elasticity of the surgical site, tissue bleeding, and moisture evaporation.
  • because the speed of the incision matters, the touch time recognition unit 444 acquires the total time that the practitioner's moving information stays at the surgical site.
  • in other words, the touch time recognition unit 444 measures the time during which the practitioner wields the scalpel and cuts in the virtual space.
  • through the touch line recognition unit 442 and the fail line recognition unit 443, it is possible to determine how well the incision follows the touch line without touching the fail line; the depth weighting unit 445, in turn, imposes penalties proportionally, the deeper the scalpel enters past the touch line, the larger the deduction.
  • in FIG. 8(a), only the recognition of the touch line and the recognition of the fail line are determined.
  • in FIG. 8(b), the depth weighting unit 445 is set so that the deduction increases continuously according to the depth from the touch line toward the fail line.
  • the depth dispersion measurement unit 446 measures the variance of the depth of the moving information inserted into the space from the touch line to the fail line, and scores the moving information in inverse proportion to that variance.
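The depth weighting and depth dispersion scoring could be combined as below; the linear deduction and the 1/(1+variance) factor are illustrative weightings chosen for the sketch, not the patent's actual formula:

```python
def depth_score(depths, touch_z, fail_z, base=100.0):
    """Score an incision from sampled tip depths between the touch and fail lines.

    The deduction grows linearly with mean penetration past the touch line
    (depth weighting), and the variance of penetration further lowers the
    score, so a steady incision depth scores higher (depth dispersion).
    """
    span = fail_z - touch_z
    # normalized penetration in [0, 1]: 0 at the touch line, 1 at the fail line
    pen = [max(0.0, min(d, fail_z) - touch_z) / span for d in depths]
    mean_pen = sum(pen) / len(pen)
    variance = sum((p - mean_pen) ** 2 for p in pen) / len(pen)
    return base * (1.0 - mean_pen) / (1.0 + variance)
```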
  • the evaluation measurement unit 400 may further include a line triggering unit 450 together with or separately from the point triggering unit 440.
  • the dimension extraction unit 430 sets the predetermined dimension to two dimensions, and the line triggering unit 450 sets the predetermined surgical site in a two-dimensional area and calculates the distance (l) of the moving information from the origin (c) of that area.
  • the solid line in FIG. 10 represents the preset reference data, that is, the desirable incision line traced when a plurality of specialists perform the surgery.
  • the line triggering unit 450 may include a target area setting unit 451, a coordinate setting unit 452, a deviation diameter calculation unit 454, and a coordinate shift unit 455.
  • the origin c of the two-dimensional area is set as a target region for a predetermined surgical site.
  • the incision is set as the origin (c), and another virtual two-dimensional plane perpendicular to the incision line, that is, an area is extracted in a predetermined dimension.
  • the coordinate setting unit 452 by setting the two-dimensional area as the xy coordinate as described above, the moving information of the medical practitioner, that is, the part where the mass passes through the nasal cavity, which is a virtual space, passes through FIG. 10. It is recognized as the xy coordinate of.
  • c means the solid line in Fig. 10, and the two-dimensional area corresponds to the area w crossing the solid line.
  • the distance l from c in the x-y coordinates of the moving information recognized by the coordinate setting unit 452 is measured.
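The deviation diameter calculation reduces to the Euclidean distance between the recognized x-y coordinate of the moving information and the origin c; a minimal sketch with illustrative names:

```python
import math

def deviation(point, origin):
    """Distance l between the practitioner's incision point and the origin c
    of the two-dimensional area, in its x-y coordinates."""
    return math.hypot(point[0] - origin[0], point[1] - origin[1])
```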
  • The coordinate shift unit 455 shifts the two-dimensional area, causing the target area setting unit 451 to reset the target region for the surgical site.
  • The line triggering unit 450 may further include a touch counting unit 453.
  • The touch counting unit 453 counts the number of times the practitioner's moving information touches the x-y coordinates of the two-dimensional area, recognizing whether the moving information touches them multiple times.
  • The number of incisions is counted because the incision line shown in FIG. 10 must not be passed twice; that is, the incision should not be performed twice.
  • The moving time measuring unit 456, similarly to the touch time recognition unit 442, measures the time taken for the scalpel to cut along the incision line of FIG. 10.
  • The reference data backup unit 500 is a storage that backs up information previously obtained from a plurality of specialists, that is, the moving information of the desirable scalpel movement that medical practitioners should imitate; the preset reference data may be physically divided and stored in this storage.
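As an illustrative sketch of the line-triggering measurements above, the deviation distance (l) from the origin (c) and the count of repeated crossings of the two-dimensional area can be computed as follows. The function names, the sign-change crossing detection, and the sampling of the trajectory are assumptions for illustration; the patent does not specify an implementation.

```python
import math

def deviation_distance(origin, crossing_point):
    """Distance l between the origin c (a point on the reference
    incision line) and the x-y coordinate at which the practitioner's
    moving information crosses the two-dimensional area w."""
    cx, cy = origin
    px, py = crossing_point
    return math.hypot(px - cx, py - cy)

def count_crossings(depth_samples, plane_depth=0.0):
    """Count how many times the tool trajectory crosses the plane of
    the two-dimensional area, detected as sign changes of the depth
    relative to the plane. A clean incision crosses exactly once;
    more than one crossing means the cut was made repeatedly."""
    count, prev_side = 0, None
    for d in depth_samples:
        side = d > plane_depth
        if prev_side is not None and side != prev_side:
            count += 1
        prev_side = side
    return count

# A crossing 3 units right and 4 units above the origin deviates by l = 5.
assert deviation_distance((0.0, 0.0), (3.0, 4.0)) == 5.0
assert count_crossings([-1.0, 1.0]) == 1             # single clean pass
assert count_crossings([-1.0, 1.0, -0.5, 2.0]) == 3  # repeated cutting
```

In this sketch the touch counting unit 453 would flag any trajectory for which `count_crossings` exceeds one.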

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Computational Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Medicinal Chemistry (AREA)
  • Algebra (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pulmonology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a system for quantitatively assessing skill level in otolaryngology or neurosurgery surgery on the basis of virtual reality, the system comprising: goggles for visually providing space information to a medical trainee; a controller for mapping, in the space information, moving information about the movement of the hands of the medical trainee; a space information receiving unit for receiving the space information provided by the goggles; a moving information receiving unit for receiving the moving information, of the medical trainee, mapped by the controller; a reference data configuration unit for receiving, in advance, and configuring, on the space information, pre-configured reference data associated with the moving information; and an assessment measurement unit for calculating a quantitative score of the moving information of the medical trainee.

Description

Virtual reality-based surgery evaluation system for otolaryngology and neurosurgery simulators
The present invention relates to a surgery evaluation system for otolaryngology and neurosurgery, and more particularly, to a system for evaluating the surgical performance of medical trainees and others in otolaryngology or neurosurgery on the basis of virtual reality technologies such as VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality).
Virtual reality (VR) technology is emerging as an alternative for training medical professionals and for psychotherapy, in response to the growing demand for medical services such as treatment of psychological disorders driven by a super-aging society and intensifying competition.
The medical field is newly emerging as an application area of VR, which has so far developed mainly around games, and hospitals are actively attempting to introduce VR.
The VR technology industry also predicts that medical services will become a representative B2B market for VR, and is expanding its business by attempting various projects.
Accordingly, there have been several technical attempts to combine the educational and medical functions of VR technology. A representative example is "Augmented reality-based simulation system for laparoscopic surgery and method using the same" (Registration No. 10-1887805, hereinafter referred to as Patent Document 1).
Patent Document 1 discloses an augmented reality-based simulation system for laparoscopic surgery and a method using the same. The system comprises: a laparoscopic surgical tool through which first laparoscopic surgery information is generated; a simulator module provided with an internal human body model, which generates second laparoscopic surgery information when the surgical tool is detected cutting the model or being inserted into it; an information processing module that generates augmented reality information based on the first and second laparoscopic surgery information; and augmented reality glasses that output the augmented reality information. Using the augmented reality information, a learner can simulate laparoscopic surgery, or compare pre-stored video of laparoscopic surgery with the surgery he or she has simulated and be evaluated on how closely the two match, thereby effectively practicing the clinical skills required for laparoscopic surgery; in addition, educators can easily produce educational three-dimensional content for laparoscopic surgery using the augmented reality information.
Likewise, there is "Laparoscopic surgery education system using augmented reality" (Publication No. 10-2018-0123310, hereinafter referred to as Patent Document 2).
Patent Document 2 relates to a laparoscopic surgery education system using augmented reality that gives the user the feeling of actually performing an operation and allows practice while being guided through the surgical procedure. To this end, it provides a laparoscopic surgery practice system using virtual reality glasses. According to that invention, students in medical education are given an opportunity to handle virtual internal organs, overcoming the practical difficulty of hands-on training, and can thus be educated effectively, if indirectly.
There is also "Virtual reality practice system and method for dentistry" (Publication No. 2003-0044909, hereinafter referred to as Patent Document 3).
Patent Document 3 relates to a virtual reality practice system for acquiring procedural movements in dentistry, which senses data on the spatial position of a hand-held real element, displays a three-dimensional representation of a virtual object on a screen, processes the spatial position data to provide a spatial representation of a virtual instrument corresponding to the real element's actual position, provides the virtual instrument operating on the virtual object, and models the interaction between the virtual object and the virtual instrument. The hand-held element belongs to a tactile human-machine interface (IHM) device equipped with an actuator that is controlled to provide force feedback to the user holding the real element when the virtual instrument interacts with the virtual object. The invention is useful for educational or professional purposes.
There is also "Virtual surgery simulation apparatus and operation method thereof" (Publication No. 10-2016-0092425, hereinafter referred to as Patent Document 4).
Patent Document 4 provides a virtual surgery simulation apparatus. The apparatus comprises: a control unit that performs virtual surgery on a patient using a three-dimensional image of the patient's surgical target site; an image processing unit that generates a surgical image including at least part of the virtual surgery process; and a communication unit that transmits at least one of the surgical image and the surgical result.
However, most of these patent documents address only specific medical fields such as orthopedics, plastic surgery, and dentistry, with few documents covering other medical fields, and because they provide no means of evaluating the surgical process in those fields, their educational methods are inefficient.
Existing prior art, based on virtual reality technologies such as VR, AR, and MR, has offered technical support for specific procedures or operations by medical students and doctors, or for remote treatment and diagnosis. However, there has been no technical platform that goes beyond VR-, AR-, or MR-based practice for trainees (medical students, medical specialists, or anyone requiring up-to-date medical training) to provide quantitative evaluation of, and guidelines for, their performance.
The virtual reality-based surgery evaluation system for otolaryngology and neurosurgery simulators according to the present invention has been devised to solve the conventional problems described above, and addresses the following objectives.
First, to provide a technical system that allows medical trainees to be trained on the basis of virtual reality, rather than through physical practice materials such as cadavers or mannequins.
Second, to allow trainees to gain experience in operating on the key sites of major otolaryngology and neurosurgery procedures, and to receive a quantitative evaluation of that experience.
Third, to enable medical trainees' surgical performance to be evaluated against objective data, and to share the results with external servers, thereby securing the public's right to high-quality medical services.
The objectives of the present invention are not limited to those mentioned above, and other objectives not mentioned will be clearly understood by those skilled in the art from the description below.
To achieve the above objectives, the virtual reality-based surgery evaluation system for otolaryngology and neurosurgery simulators according to the present invention has the following means of solution.
The system according to the present invention comprises: goggles mounted on the front face of a medical practitioner to visually provide spatial information to the practitioner; a controller provided in the practitioner's hand to map moving information of the practitioner's hand movements into the spatial information; a spatial information receiving unit that receives the spatial information provided by the goggles; a moving information receiving unit that receives the practitioner's moving information mapped by the controller; a reference data setting unit that acquires in advance and sets, on the spatial information, preset reference data relating to the moving information; and an evaluation measurement unit that receives the preset reference data from the reference data setting unit and calculates a quantitative score by comparing the practitioner's moving information against it on the spatial information.
In the system according to the present invention, the preset reference data may be set as the average of the traces, in three-dimensional coordinates on the spatial information, of moving information acquired in advance from a plurality of specialists.
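The averaging of specialists' traces described above can be sketched as follows. The pointwise average over traces that have already been aligned and resampled to the same number of points is an assumption for illustration; the patent does not specify the alignment or resampling method.

```python
def average_trace(traces):
    """Build preset reference data as the pointwise average of several
    specialists' 3D traces. Assumes each trace has been resampled to
    the same number of (x, y, z) points beforehand."""
    n = len(traces)
    return [
        tuple(sum(trace[i][axis] for trace in traces) / n for axis in range(3))
        for i in range(len(traces[0]))
    ]

specialist_a = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
specialist_b = [(0.0, 2.0, 0.0), (1.0, 3.0, 2.0)]
assert average_trace([specialist_a, specialist_b]) == [(0.0, 1.0, 0.0), (1.0, 2.0, 1.0)]
```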
In the system according to the present invention, the goggles may provide at least one of VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality).
The evaluation measurement unit may include: a spatial coordinate recognition unit that reads and acquires the spatial information as three-dimensional spatial coordinates; a moving coordinate recognition unit that acquires the three-dimensional spatial coordinates of the moving information within those of the spatial information; and a dimension extraction unit that extracts, on a predetermined surgical site within the spatial information, a predetermined dimension for evaluating the moving information.
The dimension extraction unit may set the predetermined dimension as a plurality of one-dimensional lines, and the evaluation measurement unit may further include a point triggering unit that sets the predetermined surgical site as a one-dimensional touch line and sets a fail line below the touch line at a predetermined depth.
The point triggering unit may include: a target point setting unit that sets the predetermined surgical site and provides it as part of the spatial information through the goggles; a touch line recognition unit that recognizes whether the practitioner's moving information, input through the controller, touches the touch line; and a fail line recognition unit that recognizes whether the practitioner's moving information, input through the controller, touches the fail line formed below the touch line.
The point triggering unit may further include: a touch time recognition unit that acquires the total time the practitioner's moving information stays on the predetermined surgical site; a depth weight unit that continuously calculates the score of the moving information according to the depth to which it penetrates the space between the touch line and the fail line, continuously deducting points in proportion to that depth; and a depth variance measurement unit that measures the variance of the penetration depth within that space and scores the moving information in inverse proportion to the variance.
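The two scoring rules of the point triggering unit, depth-proportional deduction and variance-based scoring, can be sketched as follows. The linear forms and the constants (`max_score`, `penalty`, the line depths) are assumptions for illustration; the claims state only proportionality and inverse proportionality, not exact formulas.

```python
def depth_score(depth, touch_line=0.0, fail_line=5.0, max_score=100.0):
    """Deduct points continuously in proportion to how far the tool
    penetrates past the touch line toward the fail line; reaching the
    fail line scores zero."""
    if depth <= touch_line:
        return max_score           # touch line not yet penetrated
    if depth >= fail_line:
        return 0.0                 # fail line reached
    fraction = (depth - touch_line) / (fail_line - touch_line)
    return max_score * (1.0 - fraction)

def variance_score(depths, max_score=100.0, penalty=10.0):
    """Score in inverse proportion to the variance of the penetration
    depths between the touch line and the fail line; `penalty` is an
    assumed tuning constant."""
    n = len(depths)
    mean = sum(depths) / n
    variance = sum((d - mean) ** 2 for d in depths) / n
    return max(0.0, max_score - penalty * variance)

assert depth_score(2.5) == 50.0
# A steady hand (low depth variance) outscores a shaky one.
assert variance_score([2.0, 2.0, 2.0]) > variance_score([0.5, 3.5, 1.0])
```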
Alternatively, the dimension extraction unit may set the predetermined dimension to two dimensions, and the evaluation measurement unit may further include a line triggering unit that sets the predetermined surgical site as a two-dimensional area and calculates the distance by which the moving information deviates from the origin of that area.
The line triggering unit may include: a target area setting unit that sets the origin of the two-dimensional area as a target region for the predetermined surgical site; a coordinate setting unit that sets the two-dimensional area as x-y coordinates and recognizes the coordinates of the moving information within them; a deviation diameter calculating unit that measures the distance of the moving information from the origin on the x-y coordinates; and a coordinate shift unit that shifts the two-dimensional area so that the target area setting unit resets the target region for the surgical site.
The line triggering unit may further include a touch counting unit that counts the number of times the practitioner's moving information touches the x-y coordinates of the two-dimensional area, thereby recognizing whether the moving information touches them multiple times.
The virtual reality-based surgery evaluation system for otolaryngology and neurosurgery simulators according to the present invention, configured as above, provides the following effects.
First, it allows medical trainees to have a variety of experiences based on virtual reality technologies such as VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality).
Second, it provides a system in which medical trainees can themselves perform surgery on a specific site within VR, AR, or MR, and receive a quantitative evaluation of that performance.
Third, it allows benchmarking of how far a trainee's performance in VR, AR, or MR deviates from the reference data of specialist medical staff for the site in question.
The effects of the present invention are not limited to those mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the description below.
FIG. 1 is a conceptual diagram of a surgery evaluation system of a virtual reality-based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
FIG. 2 is a block diagram of a surgery evaluation system of a virtual reality-based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
FIG. 3 is a block diagram showing the sub-components of the evaluation measurement unit, one component of the surgery evaluation system according to an embodiment of the present invention.
FIG. 4 is an exemplary screen of the virtual reality that a medical trainee may access through VR for use of the surgery evaluation system according to an embodiment of the present invention.
FIG. 5 is an exemplary screen of the virtual reality that a medical trainee may access through AR or MR for use of the surgery evaluation system according to an embodiment of the present invention.
FIG. 6 is a block diagram showing the sub-components of the point triggering unit, a component of the surgery evaluation system according to an embodiment of the present invention.
FIG. 7 is a front sectional view showing the touch line and the fail line set for a target point by the point triggering unit of the surgery evaluation system according to an embodiment of the present invention.
FIG. 8 is a side sectional view showing the touch line and the fail line set for a target point by the point triggering unit of the surgery evaluation system according to an embodiment of the present invention.
FIG. 9 is a block diagram showing the sub-components of the line triggering unit, a component of the surgery evaluation system according to an embodiment of the present invention.
FIG. 10 is a conceptual diagram showing the coordinates of the target region of the line triggering unit, a component of the surgery evaluation system according to an embodiment of the present invention.
The virtual reality-based surgery evaluation system for otolaryngology and neurosurgery simulators according to the present invention is capable of various modifications and may have various embodiments; specific embodiments are illustrated in the drawings and described in detail below. However, this is not intended to limit the present invention to those specific embodiments, and the invention should be understood to include all modifications, equivalents, and substitutes falling within its technical spirit and scope.
The virtual reality-based surgery assessment system for an otolaryngology and neurosurgery simulator according to the present invention, as shown in FIG. 1, relates to a system that lets medical students, interns, residents, general practitioners, residents in specialty training, or other persons working in the medical field (hereinafter, "medical trainees") experience surgery through spatial information, i.e., a virtual reality, without using any physical objects, and that produces a quantitative evaluation from that experience.
As shown in FIG. 1, after the medical trainee puts on the goggles 11, a virtual reality is provided to the trainee, and the present invention discloses the technical idea of quantitatively evaluating, from the trainee's hand motions within this virtual reality, how closely the trainee's course of the operation approaches the course commonly followed by medical specialists proficient in that operation.
As shown in FIG. 1, the surgery assessment system according to the present invention is an otolaryngology surgery assessment system, and at its front end it may include the goggles 11, the equipment that provides the VR, and a controller 12 that reads the trainee's hand motions, i.e., moving information, in three dimensions.
As described above, the goggles 11 are mounted on the front face of the medical trainee and provide the trainee with the spatial information.
The goggles 11 are supplied with various spatial information from a repository in which raw data 13 is stored; this spatial information is three-dimensional stereoscopic information about, for example, the nasal cavity of the otolaryngological system, as shown in FIGS. 4 and 5.
The goggles 11 may provide at least one virtual reality among VR (virtual reality), AR (augmented reality), and MR (mixed reality).
As shown in FIG. 1, the controller 12 is provided in the medical trainee's hand, reads the movement of the trainee's hand, i.e., the moving information, and maps it onto the spatial information provided by the goggles 11 as described above.
The controller 12 can read all of the three-dimensional moving information needed for the trainee's hand to lead and carry out the operation.
Rather than covering an area with a large radius of action, the controller 12 preferably resolves delicate movements, so that it can read the fine motions made within roughly the volume of an adult male's fist.
The spatial information receiving unit 100 receives the spatial information provided by the goggles 11.
The spatial information receiving unit 100 is supplied with the three-dimensional stereoscopic information that is presented visually to the medical trainee's face through the goggles 11.
The spatial information receiving unit 100 may receive the spatial information that the virtual reality providing unit 10 supplies to the goggles 11 in its full visual form, or it may take in only the spatial coordinate information, in abbreviated form, without the visual elements.
The moving information receiving unit 200 receives the moving information of the medical trainee mapped by the controller 12.
The moving information receiving unit 200 is computationally connected to the controller 12 and takes in, as three-dimensional spatial coordinates, the trainee's hand motions, i.e., the moving information, read by the controller 12.
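The moving information taken in by the unit 200 can be pictured as a time-stamped series of three-dimensional samples. The following is a minimal sketch under that assumption; the `MovingSample` structure and the `dwell_time` helper are illustrative names, not components named in the text.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MovingSample:
    t: float  # seconds since the exercise started
    x: float  # three-dimensional spatial coordinates of the hand
    y: float
    z: float

def dwell_time(samples: List[MovingSample]) -> float:
    """Total time spanned by the recorded moving information."""
    return samples[-1].t - samples[0].t if len(samples) > 1 else 0.0
```

`dwell_time` yields the kind of total-dwell figure later used by the touch time recognition unit 444.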
The reference data setting unit 300 acquires and sets in advance the desirable moving information that the medical trainee should perform on the spatial information received by the spatial information receiving unit 100 from the goggles 11, i.e., the preset reference data.
Here, the preset reference data corresponds to the average, in three-dimensional coordinates, of the traces of the desirable hand movements for the surgical site that the trainee is currently experiencing and being tested on. More specifically, the moving information of a plurality of specialists is acquired in three-dimensional coordinates on the spatial information, and the average of the traces formed by the moving information of these specialists is set as the preset reference data.
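The trace-averaging just described can be sketched as follows: each expert trace is resampled to a common number of samples and the point-wise mean is taken as the reference. The resampling scheme, the fixed sample count, and the function names are illustrative assumptions, not details given in the text.

```python
# Sketch: derive preset reference data as the point-wise mean of several
# expert hand-motion traces, each a list of (x, y, z) samples.

def resample(trace, n):
    """Linearly resample a trace of (x, y, z) points to n samples."""
    m = len(trace)
    out = []
    for i in range(n):
        t = i * (m - 1) / (n - 1)        # fractional index into the trace
        lo, frac = int(t), t - int(t)
        hi = min(lo + 1, m - 1)
        out.append(tuple(a + frac * (b - a)
                         for a, b in zip(trace[lo], trace[hi])))
    return out

def reference_trace(expert_traces, n=100):
    """Point-wise mean of the experts' resampled traces."""
    resampled = [resample(t, n) for t in expert_traces]
    return [tuple(sum(p[k] for p in pts) / len(pts) for k in range(3))
            for pts in zip(*resampled)]
```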
Naturally, the preset reference data may be obtained by having each of the plurality of specialists wear the goggles 11 and operate with the controller 12 in hand.
The evaluation measurement unit 400 receives the preset reference data from the reference data setting unit 300, compares the medical trainee's moving information on the spatial information described above against the preset reference data (the moving information of professional medical staff), and calculates a quantitative score.
More specifically, as shown in FIG. 3, the evaluation measurement unit 400 may include a spatial coordinate recognition unit 410, a moving coordinate recognition unit 420, and a dimension extraction unit 430.
First, the spatial coordinate recognition unit 410 reads and acquires spatial information such as that shown in FIG. 4 as three-dimensional spatial coordinates.
That is, the spatial information of FIGS. 4 and 5 is acquired with not only the shape that can be confirmed by the naked eye through the goggles 11, but also the microscopic coordinate information along the three-dimensional x-y-z axes reflected in it.
The moving coordinate recognition unit 420 acquires, over the passage of time, the three-dimensional spatial coordinates that the moving information takes within the spatial information onto which the moving information is reflected and mapped.
The dimension extraction unit 430 extracts, on a predetermined surgical site within the spatial information, a predetermined dimension for the evaluation of the moving information.
Referring to FIG. 7, the fail line of FIG. 7 may be set as a blood vessel within the surgical site that must never be touched, and the touch line may be set as the line along which the medical trainee, leading the operation, must make a clean incision with the scalpel.
In FIG. 7, the predetermined surgical site can be viewed as the touch line, and, as described above, in order to judge the trainee's hand motion, i.e., the moving information, the predetermined dimension may be set to a straight line, i.e., one dimension, as in FIG. 8.
As another example, referring to FIG. 10, the solid black line in FIG. 10 denotes the desirable incision line along which the medical trainee should cut, with the scalpel, a specific skin surface within the nasal cavity in a gentle S shape. In this case, a plane orthogonal to that incision line, i.e., a two-dimensional plane, is extracted.
Accordingly, the predetermined dimension means, as described above, the dimension extracted for the quantitative evaluation of the trainee's hand motion, i.e., the moving information.
As shown in FIG. 6, the dimension extraction unit 430 may further include a point triggering unit 440.
As shown in FIGS. 7 and 8, the dimension extraction unit 430 sets the predetermined dimension as a plurality of one-dimensional lines, here two; in this case, the point triggering unit 440 sets the predetermined surgical site as a one-dimensional touch line, as described above, and sets a fail line at the lower level lying a certain depth below the touch line.
The depth here can be set arbitrarily within the otolaryngological surgical site; for example, beneath the swollen skin tissue in the nasal cavity, the surface of the arteries, veins, or nerves below that swollen tissue may be set as the fail line.
Furthermore, the point triggering unit 440 may include a target point setting unit 441, a touch line recognition unit 442, and a fail line recognition unit 443.
The target point setting unit 441 sets the surgical site, i.e., the predetermined surgical site, and presents it to the medical trainee visually as spatial information through the goggles 11.
If the predetermined surgical site is a specific tissue in the nasal cavity, for example a ganglion cyst, the target point setting unit 441 presents the intranasal ganglion visually through the goggles 11. The trainee can be made to judge the presence or absence of this tissue by visual observation alone, and through this observation the trainee is to locate the ganglion.
The touch line recognition unit 442 recognizes whether the medical trainee's moving information, input through the controller, touches the touch line.
As in the example above, if there is a ganglion in the nasal cavity, the optimal line along which the scalpel must cut and pierce in order to excise it is recognized as the touch line, and the touch line recognition unit 442 recognizes whether this touch line is being touched.
This mechanism of the touch line recognition unit 442 naturally presupposes that the coordinate points of the spatial information and of the moving information are interlinked in a shared three-dimensional coordinate space.
The fail line recognition unit 443 recognizes whether the medical trainee's moving information input through the controller 12, i.e., the three-dimensional coordinate information of the hand motion, touches the fail line formed below the touch line.
To explain again with the example above: if there is a ganglion in the nasal cavity, the optimal line the scalpel must cut and pierce in order to excise it is recognized as the touch line; after the scalpel tears through this touch line and descends, the fail line recognition unit 443 recognizes whether it touches any region that must never be touched, such as a nerve, an artery, or a vein.
The fail line recognition unit 443 recognizes only whether the fail line is touched or not; if the fail line is touched, the medical trainee's surgical act is registered as having failed.
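The touch-line and fail-line checks of units 442 and 443 can be sketched as a simple depth classification, assuming each sample of the moving information carries a depth below the touch line (depth 0) and that the fail line lies a fixed distance below it; the constant and the names are hypothetical.

```python
FAIL_DEPTH = 5.0  # assumed distance from touch line down to fail line

def classify_sample(depth):
    """Classify one depth sample relative to the touch and fail lines."""
    if depth < 0.0:
        return "above"    # scalpel has not reached the touch line
    if depth >= FAIL_DEPTH:
        return "fail"     # fail line touched
    return "touch"        # between touch line and fail line

def attempt_failed(depths):
    """An attempt fails as soon as any sample reaches the fail line."""
    return any(classify_sample(d) == "fail" for d in depths)
```

Note that, as in the text, the fail check is binary: a single sample at or below the fail line marks the whole attempt as failed.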
Furthermore, the point triggering unit 440 may further include a touch time recognition unit 444, a depth weighting unit 445, and a depth variance measurement unit 446.
The medical trainee must incise the site to be operated on smoothly with the scalpel, at an appropriate speed, because the sharpness of the scalpel's blade, the elasticity of the surgical site, and the tissue's bleeding and moisture evaporation must all be taken into account.
For the speed of the incision, therefore, the total time that the trainee's moving information dwells on the surgical site matters, and the touch time recognition unit 444 acquires this time.
The touch time recognition unit 444 takes its measurement centered on the time during which the medical trainee, holding the scalpel, makes the incision in the virtual space.
As described above, the touch line recognition unit 442 and the fail line recognition unit 443 determine only how well the incision follows the touch line and whether the fail line is touched; going further, the depth weighting unit 445 applies a deduction in proportion to how much farther the scalpel penetrates below the touch line after piercing it.
In FIG. 8(a), only whether the touch line is recognized and whether the fail line is recognized are determined, whereas in FIG. 8(b) the depth weighting unit 445 sets the degree of deduction to grow continuously with the degree of depth from the touch line toward the fail line.
The depth variance measurement unit 446 measures the variance of the depth of the moving information inserted into the space extending from the touch line to the fail line, and assigns the moving information a score inversely proportional to that variance.
That is, in FIG. 8(b), if the trainee pierces through the touch line with the scalpel and the depth of the scalpel then rises and falls unevenly between the touch line and the fail line, the variance grows and a low score is assigned; conversely, even when the scalpel has penetrated past the touch line up to just before the fail line, a uniform depth earns a better score than a non-uniform one.
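The two depth criteria above, the proportional deduction of the depth weighting unit 445 and the variance-inverse score of the depth variance measurement unit 446, can be combined in a sketch like the following; the particular scoring constants and the way the two terms are combined are illustrative assumptions, not anything the text specifies.

```python
def depth_score(depths, fail_depth=5.0, base=100.0):
    """Score depth samples between touch line (0) and fail line (fail_depth)."""
    if any(d >= fail_depth for d in depths):
        return 0.0                              # fail line touched
    mean = sum(depths) / len(depths)
    var = sum((d - mean) ** 2 for d in depths) / len(depths)
    depth_penalty = base * (mean / fail_depth)  # proportional to depth
    steadiness = base / (1.0 + var)             # inverse to variance
    return max(0.0, steadiness - depth_penalty)
```

As the text requires, a uniform incision depth scores better than an uneven one at the same mean depth, and touching the fail line zeroes the score outright.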
The evaluation measurement unit 400 may further include a line triggering unit 450, either together with the point triggering unit 440 or separately.
As shown in FIG. 10, in the case of the line triggering unit 450, the dimension extraction unit 430 sets the predetermined dimension as two-dimensional, and the line triggering unit 450 sets the predetermined surgical site as a two-dimensional area and calculates the distance l by which the moving information is separated from the origin c of that two-dimensional area.
As described above, the solid line of FIG. 10 denotes the preset reference data, i.e., the desirable incision line, exhibited when a plurality of specialists lead and carry out the operation.
To this end, as shown in FIGS. 9 and 10, the line triggering unit 450 may include a target region setting unit 451, a coordinate setting unit 452, a deviation diameter calculation unit 454, and a coordinate shift unit 455.
First, the target region setting unit 451 sets the origin c of the two-dimensional area as the target region for the predetermined surgical site.
As shown in FIG. 10, the solid line is the part along which a specific intranasal surface must be incised, and this part corresponds to the preset reference data.
This incision line is set as the origin c, and another virtual two-dimensional plane perpendicular to the incision line, i.e., an area, is extracted as the predetermined dimension.
The coordinate setting unit 452 sets the two-dimensional area described above as x-y coordinates, and recognizes, as the x-y coordinates of FIG. 10, the point where the medical trainee's moving information, i.e., the scalpel making its incision through the nasal cavity of the virtual space, passes. Here, of course, c denotes the solid line of FIG. 10, and the two-dimensional area corresponds to w, the area crossing the solid line.
The deviation diameter calculation unit 454 measures the distance l from c, within the x-y coordinates, of the moving information recognized by the coordinate setting unit 452.
Of course, the greater the distance l by which the medical trainee's moving information is separated from the origin c in the x-y coordinates, as calculated by the deviation diameter calculation unit 454, the lower the score assigned.
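The deviation measurement can be sketched as follows: at each sampling plane w, the scalpel's crossing point is compared with the plane's origin c, and the per-plane score falls as the distance l grows. The linear falloff and its constants are illustrative assumptions.

```python
import math

def deviation(point, origin=(0.0, 0.0)):
    """Distance l of the crossing point from the plane's origin c."""
    return math.hypot(point[0] - origin[0], point[1] - origin[1])

def line_score(crossings, max_dev=4.0, base=100.0):
    """Average per-plane score; each plane loses points linearly with l."""
    per_plane = [max(0.0, base * (1.0 - deviation(p) / max_dev))
                 for p in crossings]
    return sum(per_plane) / len(per_plane)
```

Shifting the plane from w to w', as the coordinate shift unit 455 does, simply adds another crossing to the list being averaged.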
As with the movement from w to w' in FIG. 10, the coordinate shift unit 455 shifts the two-dimensional area and causes the target region setting unit 451 to reset the target region for the surgical site.
Furthermore, the line triggering unit 450 may further include a touch count unit 453.
The touch count unit 453 counts the number of times the medical trainee's moving information touches the x-y coordinates of the two-dimensional area, and thereby recognizes whether the moving information touches the x-y coordinates of the two-dimensional area a plurality of times.
The touch count unit 453 counts the number of incisions because the incision line of FIG. 10 must not be passed over twice, i.e., must not be incised twice.
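The repeated-pass check of the touch count unit 453 can be sketched by counting rising edges in a per-sample "scalpel is in the plane" signal; representing the samples as booleans is an assumption made for illustration.

```python
def count_touches(in_plane):
    """Count rising edges in a boolean per-sample 'inside the plane' signal."""
    touches, prev = 0, False
    for inside in in_plane:
        if inside and not prev:
            touches += 1
        prev = inside
    return touches

def double_incision(in_plane):
    """True if the incision line was passed more than once."""
    return count_touches(in_plane) > 1
```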
Similarly to the touch time recognition unit 444, the moving time measurement unit 456 measures the incision time of the scalpel moving along the incision line of FIG. 10.
The reference data backup unit 500 is a repository that backs up the information acquired in advance from the plurality of specialists as described above, i.e., the desirable scalpel movements, the moving information, that the medical trainee should imitate as far as possible; the preset reference data may be physically partitioned and stored in this repository.
The scope of the rights of the present invention is determined by the matters set forth in the claims; the parentheses used in the claims are not written for optional limitation but are used for clarity of the elements, and the matter within the parentheses is also to be construed as essential elements.

Claims (10)

  1. Goggles mounted on the front face of a medical trainee, the goggles visually providing spatial information to the medical trainee;
    a controller provided in the hand of the medical trainee, the controller mapping moving information of the movement of the medical trainee's hand into the spatial information;
    a spatial information receiving unit receiving the spatial information provided by the goggles;
    a moving information receiving unit receiving the moving information of the medical trainee mapped by the controller;
    a reference data setting unit pre-acquiring and setting, on the spatial information, preset reference data for the moving information; and
    an evaluation measurement unit receiving the preset reference data from the reference data setting unit and comparing the moving information of the medical trainee against it on the spatial information to calculate a quantitative score; an otolaryngology surgery assessment system characterized by comprising the foregoing.
  2. The otolaryngology surgery assessment system of claim 1, wherein the preset reference data
    is set by acquiring in advance moving information provided by a plurality of specialists and taking the average of the traces, in three-dimensional coordinates, of the moving information of the plurality of specialists on the spatial information.
  3. The otolaryngology surgery assessment system of claim 1, wherein the goggles
    provide at least one virtual reality among VR (virtual reality), AR (augmented reality), and MR (mixed reality).
  4. The otolaryngology surgery assessment system of claim 1, wherein the evaluation measurement unit comprises:
    a spatial coordinate recognition unit reading and acquiring the spatial information as three-dimensional spatial coordinates;
    a moving coordinate recognition unit acquiring, within the three-dimensional spatial coordinates of the spatial information, the three-dimensional spatial coordinates of the moving information; and
    a dimension extraction unit extracting, on a predetermined surgical site within the spatial information, a predetermined dimension for evaluation of the moving information.
  5. The otolaryngology surgery assessment system of claim 4, wherein the dimension extraction unit
    sets the predetermined dimension as a plurality of one-dimensional lines, and
    the evaluation measurement unit
    further comprises a point triggering unit setting the predetermined surgical site as a one-dimensional touch line and setting a fail line at the lower level lying a predetermined depth below the touch line.
  6. The otolaryngology surgery assessment system of claim 5, wherein the point triggering unit comprises:
    a target point setting unit setting the predetermined surgical site and providing it as the spatial information through the goggles;
    a touch line recognition unit recognizing whether the moving information of the medical trainee input by the controller touches the touch line; and
    a fail line recognition unit recognizing whether the moving information of the medical trainee input by the controller touches the fail line formed below the touch line.
  7. The otolaryngology surgery assessment system of claim 6, wherein the point triggering unit further comprises:
    a touch time recognition unit acquiring the total time the moving information of the medical trainee stays at the predetermined surgical site;
    a depth weighting unit continuously calculating a score for the moving information according to the depth of the moving information inserted into the space extending from the touch line to the fail line, continuously deducting points in proportion to the depth of the moving information; and
    a depth variance measurement unit measuring the variance of the depth of the moving information inserted into the space extending from the touch line to the fail line and assigning the moving information a score inversely proportional to the variance.
  8. The otolaryngology surgery assessment system of claim 4, wherein the dimension extraction unit
    sets the predetermined dimension as two-dimensional, and
    the evaluation measurement unit
    further comprises a line triggering unit setting the predetermined surgical site as a two-dimensional area and calculating a distance of separation from the origin of the two-dimensional area.
  9. The otolaryngology surgery assessment system of claim 8, wherein the line triggering unit comprises:
    a target region setting unit setting the origin of the two-dimensional area as a target region for the predetermined surgical site;
    a coordinate setting unit setting the two-dimensional area as x-y coordinates and recognizing the coordinates of the moving information within the x-y coordinates of the two-dimensional area;
    a deviation diameter calculation unit measuring the distance by which the moving information is separated from the origin in the x-y coordinates; and
    a coordinate shift unit shifting the two-dimensional area and causing the target region setting unit to reset the target region for the surgical site.
  10. The otolaryngology surgery assessment system of claim 9, wherein the line triggering unit
    further comprises a touch count unit counting the number of times the moving information of the medical trainee touches the x-y coordinates of the two-dimensional area, thereby recognizing whether the moving information touches the x-y coordinates of the two-dimensional area a plurality of times.
PCT/KR2019/013827 2018-12-27 2019-10-21 Virtual reality-based surgery assessment system, using simulator, for otolaryngology and neurosurgery WO2020138671A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0170079 2018-12-27
KR1020180170079A KR102143784B1 (en) 2018-12-27 2018-12-27 System for estimating otorhinolaryngology and neurosurgery surgery based on simulator of virtual reality

Publications (1)

Publication Number Publication Date
WO2020138671A1 true WO2020138671A1 (en) 2020-07-02

Family

ID=71126540

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/013827 WO2020138671A1 (en) 2018-12-27 2019-10-21 Virtual reality-based surgery assessment system, using simulator, for otolaryngology and neurosurgery

Country Status (2)

Country Link
KR (1) KR102143784B1 (en)
WO (1) WO2020138671A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113077662A (en) * 2021-04-03 2021-07-06 刘铠瑞 Laparoscopic surgery and training system based on 5G network technology application

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102532795B1 * 2020-07-31 2023-05-15 Chonnam National University Industry-Academic Cooperation Foundation Collection system and method for performance behavior data of dental training subjects based on virtual reality

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100106834A * 2009-03-24 2010-10-04 Eterne Inc. Surgical robot system using augmented reality and control method thereof
KR101166554B1 * 2011-01-14 2012-07-18 The Catholic University of Korea Industry-Academic Cooperation Foundation Apparatus and method for generating animation effects of cauterization
KR20150132681A * 2014-05-15 2015-11-26 Rich & Time Co., Ltd. Virtual network training processing unit included client system of immersive virtual training system that enables recognition of respective virtual training space and collective and organizational cooperative training in shared virtual workspace of number of trainees through multiple access and immersive virtual training method using thereof
JP2016500157A * 2012-11-13 2016-01-07 Eidos-Medicine LLC Hybrid medical laparoscopic simulator
KR101887805B1 * 2017-03-23 2018-08-10 Choi Jae-yong System for simulating laparoscopic surgery based on augmented reality and method for using the same

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2808366B1 (en) 2000-04-26 2003-12-19 Univ Paris Vii Denis Diderot VIRTUAL REALITY LEARNING METHOD AND SYSTEM, AND APPLICATION IN ODONTOLOGY
TWI605795B (en) * 2014-08-19 2017-11-21 鈦隼生物科技股份有限公司 Method and system of determining probe position in surgical site
KR20160092425A (en) 2015-01-27 2016-08-04 국민대학교산학협력단 Apparatus for virtual surgery simulation and the operation method thereof
KR20180123310A (en) 2017-05-08 2018-11-16 서정훈 Laparoscopic surgery education system using augmented reality



Also Published As

Publication number Publication date
KR20200080534A (en) 2020-07-07
KR102143784B1 (en) 2020-08-12

Similar Documents

Publication Publication Date Title
CN107067856B (en) Medical simulation training system and method
EP2915157B1 (en) System for injection training
Gallagher et al. PicSOr: an objective test of perceptual skill that predicts laparoscopic technical skill in three initial studies of laparoscopic performance
Banerjee et al. Accuracy of ventriculostomy catheter placement using a head-and hand-tracked high-resolution virtual reality simulator with haptic feedback
ES2597809T3 (en) Simulation system for training in arthroscopic surgery
US20030031993A1 (en) Medical examination teaching and measurement system
Linke et al. Assessment of skills using a virtual reality temporal bone surgery simulator
WO2018062722A1 (en) Acupuncture training simulation system
Müller et al. Virtual reality in surgical arthroscopic training
WO2021125547A1 (en) Cardiopulmonary resuscitation training system based on virtual reality
Wheeler et al. Interactive computer-based simulator for training in blade navigation and targeting in myringotomy
WO2020138671A1 (en) Virtual reality-based surgery assessment system, using simulator, for otolaryngology and neurosurgery
Wei et al. Augmented optometry training simulator with multi-point haptics
KR102146719B1 (en) System for estimating orthopedics surgery based on simulator of virtual reality
Crossan et al. A horse ovary palpation simulator for veterinary training
RU2687564C1 (en) System for training and evaluating medical personnel performing injection and surgical minimally invasive procedures
JP2021043443A (en) Laparoscopic simulator
Komizunai et al. An immersive nursing education system that provides experience of exemplary procedures from first person viewpoint with haptic feedback on wrist
WO2020145455A1 (en) Augmented reality-based virtual training simulator system for cardiovascular system surgical procedure, and method therefor
KR20040084243A (en) Virtual surgical simulation system for total hip arthroplasty
WO2020091224A1 (en) Virtual reality-based cataract surgery simulator system
Kabuye et al. A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning
RU2615686C2 (en) Universal simulator of surdologist, audiologist
Kaluschke et al. The Impact of 3D Stereopsis and Hand-Tool Alignment on Effectiveness of a VR-based Simulator for Dental Training
CN107993506A (en) A kind of force feedback endoscopy virtual training system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19903445

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29/09/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19903445

Country of ref document: EP

Kind code of ref document: A1