CN108922300A - Surgical simulation 3D system based on digitized humans - Google Patents
- Publication number
- CN108922300A (application number CN201810821433.1A)
- Authority
- CN
- China
- Prior art keywords
- scene
- sub
- tracker
- scalpel
- level
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
The invention provides a surgical simulation 3D system based on a digitized human body, comprising a scalpel, positioning gloves, VR glasses, a scene database, a selecting unit and a simulation unit. Trackers are mounted on the scalpel and on the positioning gloves. The scene database stores preset simulated scenarios, which are constructed from two-dimensional (2D) pictures. The selecting unit provides a selection list containing first categories and second categories; it reads the position information of the trackers on the positioning gloves, derives the user's gesture motions from that information, and selects a simulated scenario accordingly. After a simulated scenario is entered, the simulation unit starts the surgical simulation, reads the position information of the trackers on the positioning gloves and the scalpel, and switches between sub-scenes of the simulated scenario according to that information. The VR glasses are worn by the user. The system enables physicians to perform surgical simulation and improves their sense of immersion.
Description
Technical field
The invention belongs to the field of computer technology, and in particular relates to a surgical simulation 3D system based on a digitized human body.
Background technique
Training the procedural skills of surgeons requires a large number of animal or human specimens. Animal specimens are not only scarce but also cannot faithfully represent the internal tissue structures of the human body; training on human specimens faces, beyond the same scarcity, ethical and legal restrictions. Applying virtual reality (VR) to surgical training is both an effective solution and a promising direction. When trainee surgeons receive VR-based surgical skills training, immersion is achieved by combining visual experience with hands-on operation. However, current VR surgical training systems attend only to the user's visual experience and neglect the operating experience. For a trainee, the operating experience affects not only the cognitive grasp of an operative method but also the manual intuition for it; it therefore directly determines training effectiveness, and neglecting it makes the training inefficient.
Because the physiological tissue structures and pathological anatomy of individuals differ to some degree, surgeons have a strong demand for preoperative simulation to improve the success rate of operations, especially for complicated cases.
Summary of the invention
In view of the defects in the prior art, the present invention provides a surgical simulation 3D system based on a digitized human body, which enables physicians to perform surgical simulation and improves their sense of immersion.
In a first aspect, a surgical simulation 3D system based on a digitized human body comprises an acquisition unit, a modeling unit and a converting unit.
The acquisition unit is configured to receive the sizes of multiple human body parts as input; the body parts include the head, neck, upper trunk, lower trunk, thighs, shanks, feet, upper arms, forearms and hands.
The modeling unit is configured to construct the digitized human body according to the sizes of the body parts.
The converting unit is configured to convert the digitized human body into a 3D scene.
In a second aspect, a surgical simulation 3D system based on a digitized human body comprises a scalpel, positioning gloves, VR glasses, a scene database, a selecting unit and a simulation unit.
The scalpel comprises a blade and a handle, the lower face of the blade forming the cutting edge; trackers are mounted on the scalpel's handle at positions corresponding to the front end, the middle and the rear end of the blade.
The positioning gloves comprise a back-of-hand portion that wraps the back of the hand and multiple fingerstall portions arranged on it; a tracker is mounted on each fingerstall portion.
The scene database stores preset simulated scenarios. Each simulated scenario is named after an operation and comprises multiple sub-scenes. A simulated scenario is constructed as follows: a two-dimensional (2D) picture is set up, and the 2D objects in the picture, together with their positions, are identified; the model of the three-dimensional (3D) object corresponding to each 2D object is obtained; from the 2D object's position in the picture, the corresponding position of its 3D object on the horizontal plane of the 3D scene is calculated; the model of the 3D object is then dropped into the 3D scene from a predetermined height above it, the model's landing point on the horizontal plane being that corresponding position.
A selection list is provided in the selecting unit. The selection list contains first categories, classified and named by human organ, and, associated with each first category, second categories classified and named by operation name.
The selecting unit reads the position information of the trackers on the positioning gloves and derives the user's gesture motions from it. It also receives the user's selection instruction and enters a preset selection scene, which displays the first categories of the selection list with the cursor on the first row. When the recognized gesture is an upward slide, the cursor moves up one row in the list; when it is a downward slide, the cursor moves down one row; when it is a leftward slide, the first categories are displayed with the cursor on the first row; when it is a rightward slide, the second categories associated with the first category at the cursor's current position are displayed, with the cursor on the first row; when the recognized gesture is a fist, the simulated scenario associated with the second category at the cursor's current position is read and entered.
After a simulated scenario is entered, the simulation unit starts the surgical simulation, reads the position information of the trackers on the positioning gloves and the scalpel, and switches between sub-scenes of the simulated scenario according to that information.
The VR glasses are worn by the user to view the selection scene of the selecting unit and the simulated scenario of the simulation unit.
Further, the sub-scenes of a simulated scenario include a first-level sub-scene and multiple second-level sub-scenes. The first-level sub-scene contains the body image within the operative field; each second-level sub-scene contains the body image after a different body part has been cut open, and is provided with a trigger position.
Further, switching between sub-scenes according to the trackers' position information specifically includes:
identifying the scalpel's moving position in the first-level sub-scene from the trackers' position information; and,
when the moving position lies within the trigger position of a second-level sub-scene, switching to that second-level sub-scene.
Further, the simulation unit displays an error prompt when it detects that the scalpel's moving position in the first-level sub-scene matches none of the second-level sub-scenes' trigger positions.
Further, the system also includes a scoring unit, which scores the user's surgical simulation process.
Further, the scoring unit also generates a simulation report from the user's surgical simulation process; the report contains operation pictures obtained by taking a screenshot of the current sub-scene whenever the scalpel is detected moving in the first-level sub-scene.
Further, identifying the scalpel's moving position in the first-level sub-scene from the trackers' position information specifically includes:
providing an effective moving area in the first-level sub-scene; and
obtaining the scalpel's moving position in the first-level sub-scene only when the trackers' position information shows the scalpel moving within that effective moving area.
Further, each tracker is an HTC VIVE Tracker.
As the above technical solutions show, in the surgical simulation 3D system based on a digitized human body provided by the invention, the trackers on the scalpel acquire its position and angle information to determine its moving position, and the trackers on the fingerstall portions of the positioning gloves acquire their position and angle information to determine the user's gestures and hand position, so that both the scalpel and the user's hand are located. The system constructs simulated scenarios in a simple and fast way and, combined with VR technology, tracks a real scalpel and a real hand, allowing the user to put on the VR glasses and accurately simulate an operation while watching the scene. This improves immersion, gives the user an on-the-spot experience, and enables physicians to perform surgical simulation with a heightened sense of presence.
Brief description of the drawings
To illustrate the specific embodiments of the invention or the prior-art technical solutions more clearly, the drawings needed for describing them are briefly introduced below. Throughout the drawings, similar elements or parts are generally identified by similar reference numerals, and elements are not necessarily drawn to scale.
Fig. 1 is a module block diagram of the system provided by embodiment two.
Specific embodiments
The technical solutions of the present invention are described in detail below with reference to the drawings and embodiments. The following embodiments are only intended to clearly illustrate the technical solutions and serve merely as examples; they must not be used to limit the scope of protection of the invention. Unless otherwise indicated, technical or scientific terms used in this application have the ordinary meaning understood by those skilled in the art to which the invention belongs.
Embodiment one:
A surgical simulation 3D system based on a digitized human body comprises an acquisition unit, a modeling unit and a converting unit.
The acquisition unit receives the sizes of multiple human body parts as input; the body parts include the head, neck, upper trunk, lower trunk, thighs, shanks, feet, upper arms, forearms and hands.
The modeling unit constructs the digitized human body according to these sizes, and the converting unit converts the digitized human body into a 3D scene.
Embodiment two:
A surgical simulation 3D system based on a digitized human body, referring to Fig. 1, comprises a scalpel, positioning gloves, VR glasses, a scene database, a selecting unit and a simulation unit.
The scalpel comprises a blade and a handle, the lower face of the blade forming the cutting edge; trackers are mounted on the scalpel's handle at positions corresponding to the front end, the middle and the rear end of the blade.
Specifically, each tracker is an HTC VIVE Tracker. The three trackers mounted on the scalpel accurately locate its blade; the position information is then synchronized to the scalpel in the virtual environment, keeping the real and virtual scalpels in step.
The positioning gloves comprise a back-of-hand portion that wraps the back of the hand and multiple fingerstall portions arranged on it; a tracker is mounted on each fingerstall portion.
Specifically, the tracker on each fingerstall portion accurately locates the position of the user's finger; the position information is then synchronized to the hand in the virtual environment, keeping the real and virtual hands in step.
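The per-frame synchronization described above can be sketched as follows. This is an illustrative sketch only, not part of the patent text: the `Pose` fields and the `VirtualObject` class are assumptions, and a real implementation would read poses from the tracking runtime rather than construct them by hand.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float; y: float; z: float   # position reported by a tracker
    yaw: float                     # angle information (one axis, for brevity)

class VirtualObject:
    """A virtual counterpart (scalpel blade, fingertip) of a tracked real object."""
    def __init__(self, name):
        self.name = name
        self.pose = Pose(0.0, 0.0, 0.0, 0.0)

    def sync(self, tracker_pose):
        # Each frame, copy the real tracker's pose onto the virtual object,
        # so real and virtual movements stay in step.
        self.pose = tracker_pose

blade = VirtualObject("scalpel_blade")
blade.sync(Pose(0.3, 1.1, 0.8, 90.0))   # one tracker reading -> virtual scalpel
```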
The scene database stores preset simulated scenarios. Each simulated scenario is named after an operation and comprises multiple sub-scenes. A simulated scenario is constructed as follows: a two-dimensional (2D) picture is set up, and the 2D objects in the picture, together with their positions, are identified; the model of the three-dimensional (3D) object corresponding to each 2D object is obtained; from the 2D object's position in the picture, the corresponding position of its 3D object on the horizontal plane of the 3D scene is calculated; the model of the 3D object is then dropped into the 3D scene from a predetermined height above it, the model's landing point on the horizontal plane being that corresponding position.
Specifically, each simulated scenario is constructed from factors such as the positions, sizes and arrangement of the organs involved in the operation. Sub-scenes are generated from the different body images arising during the operation; for example, sub-scene A is constructed from the image after organ A has been cut open, and sub-scene B from the image after organ B has been cut open.
Specifically, a 2D object is a two-dimensional figure produced with a 2D drawing tool, such as a colour block, a geometric figure or a text block. Identifying the 2D objects and their positions in a 2D picture includes, for example, identifying the colour blocks in the picture together with their colours and positions. The model of the 3D object corresponding to a 2D object can be obtained through a preset correspondence or mapping between 2D objects and 3D objects. The corresponding position of the 3D object on the horizontal plane of the 3D scene can be calculated from the coordinates of the 2D object's position in the picture, using a mapping between the picture and the horizontal area of the 3D scene. Dropping the model of the 3D object into the scene from a predetermined height above it includes letting the model come to rest on whatever virtual object it encounters during its descent.
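The 2D-to-3D placement described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the `Object2D` fields, the normalized 0..1 picture coordinates, the model names in `MODEL_MAP` and the drop height are all assumptions made for the example.

```python
from dataclasses import dataclass

DROP_HEIGHT = 5.0  # predetermined distance above the 3D scene's horizontal plane

@dataclass
class Object2D:
    kind: str      # e.g. a colour block identified in the 2D picture
    u: float       # horizontal position in the picture, normalized 0..1
    v: float       # vertical position in the picture, normalized 0..1

# Preset correspondence between 2D object kinds and 3D models (assumed names)
MODEL_MAP = {"red_block": "heart_model", "blue_block": "liver_model"}

def place_in_scene(obj, scene_w, scene_d):
    """Return (model, x, y, z): (x, z) is the point on the 3D scene's horizontal
    plane corresponding to the object's position in the 2D picture, and y is the
    height from which the model is dropped before resting on whatever it meets."""
    model = MODEL_MAP[obj.kind]     # 2D object -> 3D model via the preset mapping
    x = obj.u * scene_w             # picture coordinates -> plane coordinates
    z = obj.v * scene_d
    return model, x, DROP_HEIGHT, z

model, x, y, z = place_in_scene(Object2D("red_block", 0.5, 0.25), 20.0, 10.0)
```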
A selection list is provided in the selecting unit. The selection list contains first categories, classified and named by human organ, and, associated with each first category, second categories classified and named by operation name.
Specifically, the first categories are named after human organs, such as heart, liver and stomach, while the second categories are classified by operation name; operations related to the heart, for example, include bypass surgery, congenital heart disease surgery and valve replacement surgery. The second categories associated with the first category "heart" therefore include bypass surgery, congenital heart disease surgery, valve replacement surgery and so on.
The selecting unit also reads the position information of the trackers on the positioning gloves and derives the user's gesture motions from it.
Specifically, the movement of each finger is located from the positions of the trackers on the positioning gloves, and the gesture motion is obtained from those movements.
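One way the gesture motion might be derived from fingerstall tracker positions is sketched below. This is purely illustrative and not specified by the patent: the two-frame comparison, the thresholds, the coordinate convention (larger y meaning "up") and the fist heuristic (fingertips clustered near their centroid) are all assumptions.

```python
def classify_gesture(prev, curr, move_thresh=0.05, fist_thresh=0.02):
    """prev/curr: lists of (x, y) fingertip positions from the five trackers,
    at the previous and current frame. Returns a gesture name or None."""
    n = len(curr)
    cx = sum(x for x, _ in curr) / n
    cy = sum(y for _, y in curr) / n
    # Fist: all fingertips gathered tightly around their centroid.
    if max(abs(x - cx) + abs(y - cy) for x, y in curr) < fist_thresh:
        return "fist"
    # Slide: dominant displacement of the hand's centroid between frames.
    dx = cx - sum(x for x, _ in prev) / len(prev)
    dy = cy - sum(y for _, y in prev) / len(prev)
    if max(abs(dx), abs(dy)) < move_thresh:
        return None                       # hand essentially still
    if abs(dx) >= abs(dy):
        return "slide_right" if dx > 0 else "slide_left"
    return "slide_up" if dy > 0 else "slide_down"
```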
The selecting unit also receives the user's selection instruction and enters a preset selection scene, which displays the first categories of the selection list with the cursor on the first row. When the recognized gesture is an upward slide, the cursor moves up one row in the list; when it is a downward slide, the cursor moves down one row; when it is a leftward slide, the first categories are displayed with the cursor on the first row; when it is a rightward slide, the second categories associated with the first category at the cursor's current position are displayed, with the cursor on the first row; when the recognized gesture is a fist, the simulated scenario associated with the second category at the cursor's current position is read and entered.
Specifically, the selection instruction indicates that scenario selection is to start. The user first chooses which organ the operation concerns, i.e. which first category it belongs to, and then chooses the specific operation, i.e. the second category, under that first category. The two-level selection list helps the user pick a simulated scenario quickly. When the user slides up, the cursor moves to the previous row; when the user slides down, it moves to the next row. A leftward slide returns to the first-category selection; a rightward slide enters the second categories associated with the first category under the cursor. When the user's gesture is recognized as a fist, the simulated scenario of the second category currently indicated by the cursor is entered.
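The two-level, gesture-driven selection list above can be sketched as a small state machine. This is an illustrative sketch under stated assumptions: the gesture names and the sample category data in `CATEGORIES` are invented for the example, not taken from the patent.

```python
# Sample two-level data: first categories (organs) -> second categories (operations)
CATEGORIES = {
    "heart": ["bypass surgery", "congenital heart disease surgery", "valve replacement"],
    "liver": ["liver resection"],
}

class SelectionList:
    def __init__(self):
        self.first = list(CATEGORIES)    # first-category rows (organ names)
        self.items = self.first          # rows currently displayed
        self.level = 1                   # 1 = first categories, 2 = second categories
        self.cursor = 0                  # cursor starts on the first row

    def on_gesture(self, g):
        """Apply one recognized gesture; return the chosen scenario on a fist."""
        if g == "slide_up":
            self.cursor = max(0, self.cursor - 1)
        elif g == "slide_down":
            self.cursor = min(len(self.items) - 1, self.cursor + 1)
        elif g == "slide_right" and self.level == 1:
            # show the second categories associated with the selected first category
            self.items = CATEGORIES[self.first[self.cursor]]
            self.level, self.cursor = 2, 0
        elif g == "slide_left":
            # return to the first-category display, cursor back on the first row
            self.items, self.level, self.cursor = self.first, 1, 0
        elif g == "fist" and self.level == 2:
            return self.items[self.cursor]   # enter the indicated simulated scenario
        return None

menu = SelectionList()
menu.on_gesture("slide_right")   # "heart" -> its operations
menu.on_gesture("slide_down")
chosen = menu.on_gesture("fist")
```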
After a simulated scenario is entered, the simulation unit starts the surgical simulation, reads the position information of the trackers on the positioning gloves and the scalpel, and switches between sub-scenes of the simulated scenario according to that information.
Specifically, for example, when the system detects that the user has cut open organ A in the virtual world, it switches to sub-scene A; if organ B has been cut open, it switches to sub-scene B.
The VR glasses are worn by the user to view the selection scene of the selecting unit and the simulated scenario of the simulation unit.
In use, the user puts on the VR glasses. Because the trackers locate both the scalpel and the user's hand, their positions in the real environment are synchronized to the scalpel and hand in the virtual environment, keeping real and virtual in step. After putting on the VR glasses, the user can therefore see the positions and movements of the virtual scalpel and of his or her own hand in the virtual environment, perform real scalpel movements, and at the same time see the corresponding sub-scene. By combining VR technology with the tracking of a real scalpel and a real hand, the system lets the user accurately simulate an operation while watching the scene, improves immersion, gives an on-the-spot experience, and enables physicians to perform surgical simulation with a heightened sense of presence.
Embodiment three:
Embodiment three adds the following to embodiment two:
The sub-scenes of a simulated scenario include a first-level sub-scene and multiple second-level sub-scenes. The first-level sub-scene contains the body image within the operative field; each second-level sub-scene contains the body image after a different body part has been cut open, and is provided with a trigger position.
Specifically, the first-level sub-scene shows the body image before the operation has started, so the intact form of the body is visible. The trigger positions trigger the switch to the second-level sub-scenes; for example, sub-scene A is constructed from the image after organ A has been cut open, and sub-scene B from the image after organ B has been cut open.
After the sub-scenes have been constructed, switching between sub-scenes according to the trackers' position information specifically includes:
identifying the scalpel's moving position in the first-level sub-scene from the trackers' position information; and,
when the moving position lies within the trigger position of a second-level sub-scene, switching to that second-level sub-scene.
Specifically, the scalpel's moving position in the first-level sub-scene represents the scalpel's cutting position on the body in the simulated environment. When the moving position lies within a trigger position, the scalpel has cut at the correct place, and the system switches to the sub-scene showing that place cut open.
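The trigger-position check above can be sketched as follows. An illustrative sketch only: modelling each trigger position as an axis-aligned rectangle, and the sub-scene names and coordinates used, are assumptions for the example.

```python
# Trigger regions of the second-level sub-scenes, as ((x_min, x_max), (y_min, y_max))
TRIGGERS = {
    "sub_scene_A": ((1.0, 2.0), (0.0, 1.0)),
    "sub_scene_B": ((3.0, 4.0), (0.0, 1.0)),
}

def switch_sub_scene(pos):
    """Return the second-level sub-scene whose trigger region contains the
    scalpel's moving position, or None if no trigger matches (in which case
    the simulation unit would show an error prompt)."""
    x, y = pos
    for name, ((x0, x1), (y0, y1)) in TRIGGERS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

target = switch_sub_scene((1.5, 0.5))   # a cut inside sub_scene_A's trigger
```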
This embodiment also provides an error-prompt function: the simulation unit displays an error prompt when it detects that the scalpel's moving position in the first-level sub-scene matches none of the second-level sub-scenes' trigger positions.
Specifically, if the cutting position of the scalpel in the simulated environment differs from every second-level sub-scene's trigger position, the user's cutting position is considered wrong and an error prompt is issued.
Further, the system also includes a scoring unit, which scores the user's surgical simulation process.
Specifically, the fewer the errors during the user's surgical simulation, the higher the score; the score, on a range of 0 to 10, assesses the user's surgical simulation process.
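A minimal scoring rule consistent with the description above is sketched below. The patent only states that fewer errors give a higher score on a 0-10 range; the linear deduction per error is an assumption invented for this sketch.

```python
def score_simulation(error_count, deduction=2):
    """Start from the maximum score of 10 and deduct per recorded cutting
    error (e.g. each error prompt issued), floored at 0."""
    return max(0, 10 - deduction * error_count)
```

A flawless run keeps the full 10 points, while repeated mismatched cuts drive the score toward 0.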
Further, the scoring unit also generates a simulation report from the user's surgical simulation process; the report contains operation pictures obtained by taking a screenshot of the current sub-scene whenever the scalpel is detected moving in the first-level sub-scene.
Besides scoring the simulation process, the system of this embodiment also generates a simulation report. The report contains a screenshot taken each time the scalpel moves in the first-level sub-scene, making it convenient for the user to review the scalpel's moving positions throughout the simulation and to analyse erroneous operations.
Further, identifying the scalpel's moving position in the first-level sub-scene from the trackers' position information specifically includes:
providing an effective moving area in the first-level sub-scene; and
obtaining the scalpel's moving position in the first-level sub-scene only when the trackers' position information shows the scalpel moving within that effective moving area.
Specifically, the effective moving area is defined per operation. To avoid misjudging every scalpel movement in the first-level sub-scene as effective, each first-level sub-scene is provided with an effective moving area; only movement within that area is considered effective, and only then is the scalpel's moving position in the first-level sub-scene identified.
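The effective-moving-area filter above can be sketched as follows. An illustrative sketch: modelling the area as a single axis-aligned rectangle, and the bounds chosen, are assumptions for the example.

```python
# Effective moving area of a first-level sub-scene: ((x_min, x_max), (y_min, y_max))
EFFECTIVE_AREA = ((0.0, 10.0), (0.0, 5.0))

def effective_positions(tracked):
    """Keep only the tracker readings that fall inside the effective moving
    area; readings outside it are discarded as ineffective movement."""
    (x0, x1), (y0, y1) = EFFECTIVE_AREA
    return [(x, y) for x, y in tracked if x0 <= x <= x1 and y0 <= y <= y1]

moves = effective_positions([(1.0, 1.0), (20.0, 1.0), (5.0, 4.0)])
```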
For brevity, details of the system provided by this embodiment that are not mentioned here can be found in the corresponding parts of embodiment two.
Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described there may still be modified, and some or all of their technical features replaced by equivalents, without departing from the essence of the corresponding technical solutions; all such modifications and replacements fall within the scope of the claims and description of the invention.
Claims (9)
1. A surgical simulation 3D system based on a digitized human body, characterized by comprising an acquisition unit, a modeling unit and a converting unit;
the acquisition unit being configured to receive the sizes of multiple human body parts as input, the body parts including the head, neck, upper trunk, lower trunk, thighs, shanks, feet, upper arms, forearms and hands;
the modeling unit being configured to construct the digitized human body according to the sizes of the body parts; and
the converting unit being configured to convert the digitized human body into a 3D scene.
2. A surgical simulation 3D system based on a digitized human body, characterized by comprising a scalpel, positioning gloves, VR glasses, a scene database, a selecting unit and a simulation unit;
the scalpel comprising a blade and a handle, the lower face of the blade forming the cutting edge, trackers being mounted on the scalpel's handle at positions corresponding to the front end, the middle and the rear end of the blade;
the positioning gloves comprising a back-of-hand portion that wraps the back of the hand and multiple fingerstall portions arranged on it, a tracker being mounted on each fingerstall portion;
the scene database storing preset simulated scenarios, each simulated scenario being named after an operation and comprising multiple sub-scenes, a simulated scenario being constructed as follows: a two-dimensional (2D) picture is set up, and the 2D objects in the picture, together with their positions, are identified; the model of the three-dimensional (3D) object corresponding to each 2D object is obtained; from the 2D object's position in the picture, the corresponding position of its 3D object on the horizontal plane of the 3D scene is calculated; the model of the 3D object is dropped into the 3D scene from a predetermined height above it, the model's landing point on the horizontal plane being that corresponding position;
a selection list being provided in the selecting unit, the selection list containing first categories, classified and named by human organ, and, associated with each first category, second categories classified and named by operation name;
the selecting unit being configured to read the position information of the trackers on the positioning gloves and derive the user's gesture motions from it, and to receive the user's selection instruction and enter a preset selection scene, the selection scene displaying the first categories of the selection list with the cursor on the first row; when the recognized gesture is an upward slide, the cursor moving up one row in the list; when it is a downward slide, the cursor moving down one row; when it is a leftward slide, the first categories being displayed with the cursor on the first row; when it is a rightward slide, the second categories associated with the first category at the cursor's current position being displayed, with the cursor on the first row; when the recognized gesture is a fist, the simulated scenario associated with the second category at the cursor's current position being read and entered;
the simulation unit being configured, after a simulated scenario is entered, to start the surgical simulation, read the position information of the trackers on the positioning gloves and the scalpel, and switch between sub-scenes of the simulated scenario according to that information; and
the VR glasses being worn by the user to view the selection scene of the selecting unit and the simulated scenario of the simulation unit.
3. The surgical simulation 3D system based on a digitized human body according to claim 2, characterized in that the sub-scenes of a simulated scenario include a first-level sub-scene and multiple second-level sub-scenes; the first-level sub-scene contains the body image within the operative field; each second-level sub-scene contains the body image after a different body part has been cut open, and is provided with a trigger position.
4. The surgical simulation 3D system based on a digitized human body according to claim 3, characterized in that switching between sub-scenes according to the trackers' position information specifically includes:
identifying the scalpel's moving position in the first-level sub-scene from the trackers' position information; and,
when the moving position lies within the trigger position of a second-level sub-scene, switching to that second-level sub-scene.
5. The surgical simulation 3D system based on a digitized human body according to claim 4, characterized in that the simulation unit is further configured to display an error prompt when it detects that the scalpel's moving position in the first-level sub-scene matches none of the second-level sub-scenes' trigger positions.
6. The digitized-human-based surgical simulation 3D system according to claim 4, wherein the system further comprises a scoring unit;
the scoring unit is configured to score the user's surgical simulation process.
7. The digitized-human-based surgical simulation 3D system according to claim 6, wherein the scoring unit is further configured to generate a simulation report from the user's surgical simulation process; the simulation report includes operation pictures obtained by taking a screenshot of the current sub-scene whenever movement of the scalpel is detected in the first-level sub-scene.
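Claim 7's report — a screenshot of the current sub-scene each time scalpel movement is detected — amounts to accumulating captures keyed by sub-scene. The sketch below is illustrative only; the `SimulationReport` class and the injected capture callback are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of claim 7's simulation report: every detected
# scalpel movement in the first-level sub-scene records a screenshot of
# the current sub-scene. The capture function is injected so the engine's
# actual screenshot API (unspecified in the patent) stays abstract.

class SimulationReport:
    def __init__(self, capture):
        self.capture = capture   # callable: sub_scene_name -> image handle
        self.frames = []         # (sub_scene_name, image handle) pairs

    def on_scalpel_moved(self, current_sub_scene):
        self.frames.append((current_sub_scene,
                            self.capture(current_sub_scene)))

    def summary(self):
        # Ordered list of sub-scenes visited, for scoring or review.
        return [name for name, _ in self.frames]
```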
8. The digitized-human-based surgical simulation 3D system according to claim 4, wherein identifying the movement position of the scalpel within the first-level sub-scene from the tracker position information specifically comprises:
providing an effective moving area within the first-level sub-scene;
when the tracker position information indicates that the scalpel is moving within the effective moving area of the first-level sub-scene, obtaining the movement position of the scalpel within the first-level sub-scene.
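Claim 8 gates position identification on an "effective moving area": tracker readings only yield a scalpel position while they fall inside that area. A minimal sketch, assuming a rectangular area (the patent does not state its shape):

```python
# Hypothetical sketch of claim 8: tracker readings outside the
# first-level sub-scene's effective moving area are ignored; readings
# inside it become the scalpel's movement position.

def scalpel_position(tracker_pos, effective_area):
    lo, hi = effective_area   # corners of the assumed rectangular area
    inside = all(l <= c <= h for c, l, h in zip(tracker_pos, lo, hi))
    return tracker_pos if inside else None
```

Only positions this filter passes would then be tested against the trigger positions of claim 4.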
9. The digitized-human-based surgical simulation 3D system according to claim 2, wherein the tracker is an HTC VIVE Tracker.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810821433.1A CN108922300A (en) | 2018-07-24 | 2018-07-24 | Surgical simulation 3D system based on digitized humans |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108922300A true CN108922300A (en) | 2018-11-30 |
Family
ID=64416349
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810821433.1A Pending CN108922300A (en) | 2018-07-24 | 2018-07-24 | Surgical simulation 3D system based on digitized humans |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108922300A (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100783552B1 (en) * | 2006-10-11 | 2007-12-07 | 삼성전자주식회사 | Input control method and device for mobile phone |
US20100229125A1 (en) * | 2009-03-09 | 2010-09-09 | Samsung Electronics Co., Ltd. | Display apparatus for providing a user menu, and method for providing user interface (ui) applicable thereto |
CN103136781A (en) * | 2011-11-30 | 2013-06-05 | 国际商业机器公司 | Method and system of generating three-dimensional virtual scene |
CN103578133A (en) * | 2012-08-03 | 2014-02-12 | 浙江大华技术股份有限公司 | Method and device for reconstructing two-dimensional image information in three-dimensional mode |
US20140104274A1 (en) * | 2012-10-17 | 2014-04-17 | Microsoft Corporation | Grasping virtual objects in augmented reality |
CN104821122A (en) * | 2015-03-11 | 2015-08-05 | 张雁儒 | Human anatomy teaching method |
CN106128205A (en) * | 2016-07-15 | 2016-11-16 | 安徽工业大学 | A kind of dangerous materials leak analog systems and construction method thereof and using method |
CN106708260A (en) * | 2016-11-30 | 2017-05-24 | 宇龙计算机通信科技(深圳)有限公司 | Generation method and device for virtual reality surgery scene |
CN107067856A (en) * | 2016-12-31 | 2017-08-18 | 歌尔科技有限公司 | A kind of medical simulation training system and method |
CN107168530A (en) * | 2017-04-26 | 2017-09-15 | 腾讯科技(深圳)有限公司 | Object processing method and device in virtual scene |
Non-Patent Citations (4)
Title |
---|
Beijing Aoyi Technology Co., Ltd.: "VR Medical Education - 'VR Live Body' Surgical Operation Demonstration", 《HTTPS://V.QQ.COM/X/PAGE/M0601NBH3EE.HTML》 * |
Video user: "VR Surgery", 《HTTPS://V.QQ.COM/X/PAGE/N0533HLG5U8.HTML》 * |
Video user: "VR Appendectomy", 《HTTPS://V.QQ.COM/X/PAGE/Y0631YH7EGL.HTML》 * |
Wei Baozhi: "Patent Examination Research 2013", 30 June 2014, Intellectual Property Publishing House * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113971896A (en) * | 2021-11-17 | 2022-01-25 | 苏州大学 | Operation training system and training method |
CN113971896B (en) * | 2021-11-17 | 2023-11-24 | 苏州大学 | Surgical training system and training method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107067856B (en) | Medical simulation training system and method | |
US11751957B2 (en) | Surgical system with training or assist functions | |
CN108701429B (en) | Method, system, and storage medium for training a user of a robotic surgical system | |
JP2016524262A (en) | 3D user interface | |
CN106725860B (en) | Method and apparatus for the gesture control in micro-wound surgical operation system | |
CN103793060B (en) | A kind of user interactive system and method | |
CN1973780B (en) | System and method for facilitating surgical | |
CN104834384B (en) | Improve the device and method of exercise guidance efficiency | |
WO2019204777A1 (en) | Surgical simulator providing labeled data | |
CN110390851A (en) | Augmented reality training system | |
CN109979600A (en) | Orbital Surgery training method, system and storage medium based on virtual reality | |
US10172676B2 (en) | Device and method for the computer-assisted simulation of surgical interventions | |
CN106980383A (en) | A kind of dummy model methods of exhibiting, module and the virtual human body anatomical model display systems based on the module | |
CN109806004A (en) | A kind of surgical robot system and operating method based on cloud data technique | |
Long et al. | Integrating artificial intelligence and augmented reality in robotic surgery: An initial dvrk study using a surgical education scenario | |
Rivas-Blanco et al. | A surgical dataset from the da Vinci Research Kit for task automation and recognition | |
CN114387836B (en) | Virtual operation simulation method and device, electronic equipment and storage medium | |
CN109064817A (en) | Surgery simulation system based on CT Three-dimension Reconstruction Model | |
CN105630155A (en) | Computing apparatus and method for providing three-dimensional (3d) interaction | |
CN108922300A (en) | Surgical simulation 3D system based on digitized humans | |
CN104517016A (en) | Surgery simulation system using motion sensing technology and virtual reality technology | |
Gras et al. | Context-aware modeling for augmented reality display behaviour | |
Zhou | Role of human body posture recognition method based on wireless network Kinect in line dance aerobics and gymnastics training | |
CN110996178A (en) | Intelligent interactive data acquisition system for table tennis game video | |
TW201619754A (en) | Medical image object-oriented interface auxiliary explanation control system and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20181130 ||