CN108777001A - Surgical simulation method and device - Google Patents
- Publication number
- CN108777001A CN108777001A CN201810675056.5A CN201810675056A CN108777001A CN 108777001 A CN108777001 A CN 108777001A CN 201810675056 A CN201810675056 A CN 201810675056A CN 108777001 A CN108777001 A CN 108777001A
- Authority
- CN
- China
- Prior art keywords
- models
- operation tool
- organ
- information
- collision body
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Public Health (AREA)
- Primary Health Care (AREA)
- Architecture (AREA)
- Medical Informatics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Epidemiology (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Geometry (AREA)
- Processing Or Creating Images (AREA)
Abstract
An embodiment of the present invention provides a surgical simulation method and device. The method first obtains 3D image information of a surgical tool while an operation is being performed, then obtains, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, and then determines, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in a 3D organ model included in the 3D model. When such a collision is determined to occur, a simulated collision reaction is generated. Thus, by detecting whether the surgical tool collides with the collision body corresponding to the target organ and generating a simulated collision reaction when a collision is detected, the doctor can be reminded that the surgical tool has contacted the target organ during the actual operation, so that the operation is performed on the accurately located target organ and proceeds smoothly and safely.
Description
Technical field
The present invention relates to the field of data processing, and in particular to a surgical simulation method and device.
Background technology
Since the beginning of the 21st century, various diseases have become increasingly prominent, the number of operations performed keeps growing, and expectations for surgical success rates are ever higher. The medical industry also faces the need to practice all kinds of operations, and as such practice becomes more widespread it poses a great challenge to Internet research and development; moreover, the trial-and-error involved in surgery is itself a severe test for both patient and doctor.
In the prior art, when operating on a patient a doctor can rely only on auxiliary medical equipment and on visual observation of the surgical procedure. When the operation targets certain organs inside the body, however, the procedure inside the body cannot be observed directly, so the doctor can only judge by feel whether the scalpel is acting on the organ that currently needs surgery. In such a case, a mis-operation may cause the scalpel to miss the target organ or the surgical site, resulting in a serious medical accident.
Summary of the invention
In view of this, embodiments of the present invention aim to provide a surgical simulation method and device to improve the above problem.
In a first aspect, an embodiment of the present invention provides a surgical simulation method, the method including: obtaining 3D image information of a surgical tool while an operation is being performed;
obtaining, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, wherein the 3D model includes a 3D organ model to which a collision body has been added in advance;
determining, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in the 3D organ model;
and, if so, generating a simulated collision reaction.
Further, obtaining the 3D image information of the surgical tool while the operation is being performed includes:
obtaining the 3D image information of the surgical tool through a camera.
Further, the 3D model also includes a 3D surgical tool model to which a collision body has been added in advance, and determining, based on the position information, whether the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model and, if so, generating a simulated collision reaction, includes:
determining, based on the position information, whether the collision body corresponding to the surgical tool in the 3D surgical tool model collides with the collision body corresponding to the target organ in the 3D organ model;
and, if so, generating a simulated collision reaction.
Further, obtaining, based on the 3D image information, the position information of the surgical tool in the pre-created 3D model includes:
obtaining, based on the 3D image information, position coordinates of the surgical tool in the 3D image information;
and converting the position coordinates into target coordinates in the 3D model, the target coordinates being the position information of the surgical tool in the 3D model.
Further, after the simulated collision reaction is generated, the method also includes:
obtaining collision depth information of the collision between the surgical tool and the collision body corresponding to the target organ in the 3D organ model;
and generating a corresponding simulated collision reaction based on the collision depth information.
In a second aspect, an embodiment of the present invention provides a surgical simulation device, the device including:
an image information acquisition module, configured to obtain 3D image information of a surgical tool while an operation is being performed;
a position information acquisition module, configured to obtain, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, wherein the 3D model includes a 3D organ model to which a collision body has been added in advance;
a collision judgment module, configured to determine, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in the 3D organ model;
and a reaction simulation module, configured to generate a simulated collision reaction when it is determined, based on the position information, that the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model.
Further, the image information acquisition module is specifically configured to obtain the 3D image information of the surgical tool through a camera.
Further, the 3D model also includes a 3D surgical tool model to which a collision body has been added in advance, and the collision judgment module is specifically configured to determine, based on the position information, whether the collision body corresponding to the surgical tool in the 3D surgical tool model collides with the collision body corresponding to the target organ in the 3D organ model.
Further, the position information acquisition module is specifically configured to obtain, based on the 3D image information, position coordinates of the surgical tool in the 3D image information, and to convert the position coordinates into target coordinates in the 3D model, the target coordinates being the position information of the surgical tool in the 3D model.
Further, the device also includes:
a collision depth simulation module, configured to obtain collision depth information of the collision between the surgical tool and the collision body corresponding to the target organ in the 3D organ model, and to generate a corresponding simulated collision reaction based on the collision depth information.
In a third aspect, an embodiment of the present application provides an electronic device including a processor and a memory, the memory storing computer-readable instructions which, when executed by the processor, perform the steps of the method provided in the first aspect above.
In a fourth aspect, an embodiment of the present application provides a readable storage medium on which a computer program is stored, the computer program, when executed by a processor, performing the steps of the method provided in the first aspect above.
Advantageous effects of the embodiments of the present invention:
An embodiment of the present invention provides a surgical simulation method and device. The method first obtains 3D image information of a surgical tool while an operation is being performed, then obtains, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, the 3D model including a 3D organ model to which a collision body has been added in advance, and then determines, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in the 3D organ model. When such a collision is determined to occur, a simulated collision reaction is generated. Thus, by detecting whether the surgical tool collides with the collision body corresponding to the target organ, it can be determined whether the surgical tool contacts the target organ in the real procedure; when such a collision is detected, a simulated collision reaction is generated to remind the doctor that the surgical tool has contacted the target organ during the actual operation, so that the operation is performed on the accurately located target organ and proceeds smoothly and safely.
Other features and advantages of the present invention will be set forth in the following description and, in part, will become apparent from the description or be understood by implementing the embodiments of the present invention. The objects and other advantages of the invention can be realized and obtained by the structures particularly pointed out in the written description, the claims, and the accompanying drawings.
Description of the drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the present invention and should therefore not be regarded as limiting the scope; those of ordinary skill in the art can derive other related drawings from these drawings without creative effort.
Fig. 1 shows a structural block diagram of an electronic device applicable to an embodiment of the present application;
Fig. 2 is a flowchart of a surgical simulation method provided by an embodiment of the present invention;
Fig. 3 is a structural block diagram of a surgical simulation device provided by an embodiment of the present invention;
Fig. 4 is a structural schematic diagram of an electronic device provided by an embodiment of the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated in the drawings herein, may be arranged and designed in a wide variety of different configurations. Therefore, the following detailed description of the embodiments of the present invention provided in the drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without creative effort shall fall within the protection scope of the present invention.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further defined or explained in subsequent drawings. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
Referring to Fig. 1, Fig. 1 shows a structural block diagram of an electronic device 100 applicable to an embodiment of the present application. The electronic device 100 may include a surgical simulation device, a memory 101, a storage controller 102, a processor 103, a peripheral interface 104, an input-output unit 105, an audio unit 106, and a display unit 107.
The memory 101, storage controller 102, processor 103, peripheral interface 104, input-output unit 105, audio unit 106, and display unit 107 are electrically connected to one another, directly or indirectly, to realize the transmission or interaction of data. For example, these elements may be electrically connected to one another through one or more communication buses or signal lines. The surgical simulation device includes at least one software function module which may be stored in the memory 101 in the form of software or firmware, or solidified in the operating system (OS) of the surgical simulation device. The processor 103 is configured to execute executable modules stored in the memory 101, such as the software function modules or computer programs included in the surgical simulation device.
The memory 101 may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or the like. The memory 101 is configured to store programs, and the processor 103 executes the programs after receiving execution instructions. The method performed by the server, defined by the flow disclosed in any embodiment of the present invention, may be applied to, or implemented by, the processor 103.
The processor 103 may be an integrated circuit chip with signal processing capability. The processor 103 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. It can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor, or the processor 103 may be any conventional processor or the like.
The peripheral interface 104 couples various input/output devices to the processor 103 and the memory 101. In some embodiments, the peripheral interface 104, the processor 103, and the storage controller 102 may be implemented in a single chip. In other examples, they may each be implemented by an independent chip.
The input-output unit 105 is configured to provide the user with input of data so as to realize interaction between the user and the server (or local terminal). The input-output unit 105 may be, but is not limited to, a mouse, a keyboard, and the like.
The audio unit 106 provides an audio interface to the user, and may include one or more microphones, one or more loudspeakers, and audio circuitry.
The display unit 107 provides an interactive interface (for example, a user operation interface) between the electronic device 100 and the user, or displays image data for the user's reference. In this embodiment, the display unit 107 may be a liquid crystal display or a touch display. If it is a touch display, it may be a capacitive or resistive touch screen supporting single-point and multi-point touch operations. Supporting single-point and multi-point touch operations means that the touch display can sense touch operations generated simultaneously at one or more positions on the display and hand the sensed touch operations to the processor 103 for calculation and processing.
It can be understood that the structure shown in Fig. 1 is only illustrative; the electronic device 100 may include more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1. Each component shown in Fig. 1 may be implemented in hardware, software, or a combination thereof.
Referring to Fig. 2, Fig. 2 is a flowchart of a surgical simulation method provided by an embodiment of the present invention. The method includes the following steps:
Step S110: obtain 3D image information of the surgical tool while the operation is being performed.
While the operation is being performed, real-time recording can be carried out by a camera, and the camera sends the captured image information to the above electronic device. However, since the image information obtained by the camera may be a 2D image, the electronic device needs to convert the 2D image into a 3D image after obtaining it. Of course, the 3D image information of the surgical tool during the operation may also be obtained directly by a camera.
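As one hedged illustration of how a 3D position for the tool could be recovered from camera data, the sketch below assumes a depth-capable (RGB-D) camera and the standard pinhole model; the intrinsic parameters and the backproject_pixel helper are hypothetical names introduced here and are not part of the original disclosure.

```python
import numpy as np

def backproject_pixel(u, v, depth, fx, fy, cx, cy):
    """Back-project an image pixel (u, v) with a measured depth (metres)
    into a 3D point in the camera coordinate frame, using the pinhole model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Example: the tool tip detected at pixel (640, 360) with 0.42 m depth,
# for a camera with assumed intrinsics.
tool_tip_cam = backproject_pixel(640, 360, 0.42, fx=920.0, fy=920.0, cx=640.0, cy=360.0)
print(tool_tip_cam)  # 3D coordinates of the tool tip in the camera frame
```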
Step S120: obtain, based on the 3D image information, position information of the surgical tool in the pre-created 3D model.
The 3D model is created in advance. In order to simulate the surgical procedure, models of the human organs are built; that is, the 3D model includes a 3D organ model to which collision bodies have been added in advance. When the 3D model is created, collision bodies are added to it, and a collision body can use a mesh resource and build its collider based on that mesh. A collision body can be added separately for each organ, so that the collision bodies of different organs are independent of one another and can be controlled independently. In order to simulate whether the surgical tool actually contacts a human organ during the operation, a mesh collider is added to every human organ as its collision body, and a rigid body is added to the surgical tool in the 3D model: collision bodies are one class of physics components, rigid bodies allow objects to be controlled by the physics engine, and a collision body must be added to the relevant object in the 3D model together with a rigid body before a collision can be triggered. During collision detection, a mesh collider marked as convex can be hit by other mesh colliders, so the mesh corresponding to the target organ can be marked as convex and its collision body used to detect whether the target organ collides with other collision bodies. For example, if the target organ is the heart, a collision body is added to the corresponding heart model and marked as convex; if the heart is then operated on, a collision between the surgical tool and the mesh corresponding to the heart can be detected by the collision bodies, which indicates that the surgical tool has collided with (for example, contacted) the heart during the actual operation.
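The collider and rigid-body behaviour described above (mesh-based collision bodies, a convex flag, collisions firing only when a rigid body is present) matches common game-engine physics components such as Unity's MeshCollider and Rigidbody, although the disclosure does not name a specific engine. As a self-contained sketch outside any engine, the snippet below builds a crude axis-aligned bounding-box collision body for each organ mesh and flags the target organ; the OrganCollider type and all names are hypothetical, introduced only for illustration.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class OrganCollider:
    name: str
    aabb_min: np.ndarray   # minimum corner of the axis-aligned bounding box
    aabb_max: np.ndarray   # maximum corner of the axis-aligned bounding box
    convex: bool = False   # only the target organ's collider is marked for tool collisions

def build_collider(name, mesh_vertices, convex=False):
    """Build a crude collision body (an AABB) from an organ's mesh vertices."""
    verts = np.asarray(mesh_vertices, dtype=float)
    return OrganCollider(name, verts.min(axis=0), verts.max(axis=0), convex)

# One independent collider per organ (dummy vertex arrays stand in for real organ meshes).
heart = build_collider("heart", [[0.0, 0.0, 0.0], [0.08, 0.10, 0.06]], convex=True)
liver = build_collider("liver", [[0.10, -0.05, 0.0], [0.25, 0.08, 0.07]], convex=False)
```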
After the 3D image information of the surgical tool is obtained, the position information corresponding to the surgical tool in the 3D model can be obtained. Specifically, the position coordinates of the surgical tool in the 3D image information are obtained based on the 3D image information, and the position coordinates are then converted into target coordinates in the 3D model, the target coordinates being the position information of the surgical tool in the 3D model; the position information of the surgical tool in the 3D model is thereby obtained.
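Assuming the camera frame and the pre-created 3D model frame have been registered beforehand, the conversion from position coordinates in the image to target coordinates in the model can be a fixed homogeneous transform, as in the minimal sketch below; the CAM_TO_MODEL matrix is a hypothetical calibration result used only for illustration.

```python
import numpy as np

# Hypothetical calibration: rotation + translation from the camera frame to the model frame.
CAM_TO_MODEL = np.array([
    [1.0, 0.0,  0.0,  0.02],
    [0.0, 0.0, -1.0,  0.15],
    [0.0, 1.0,  0.0, -0.30],
    [0.0, 0.0,  0.0,  1.00],
])

def camera_to_model(point_cam):
    """Convert tool position coordinates in the camera/image frame into
    target coordinates in the pre-created 3D model."""
    p = np.append(np.asarray(point_cam, dtype=float), 1.0)  # homogeneous coordinates
    return (CAM_TO_MODEL @ p)[:3]

tool_in_model = camera_to_model([0.0, 0.05, 0.42])  # e.g. the back-projected tool tip
print(tool_in_model)
```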
Step S130: determine, based on the position information, whether the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model.
After the position information of the surgical tool in the 3D model is obtained, it is determined whether the surgical tool collides with the collision body corresponding to the target organ; that is, the collision body detects, based on the position information of the surgical tool, the distance between the surgical tool and the target organ, and when the distance between the surgical tool and the target organ is 0 this indicates that the surgical tool has collided with the target organ. In order to simulate whether the surgical tool acts on the target organ, attribute information can be added for the corresponding collision body when it is created, such as its material, radius, center, size (the collider size in the X, Y, and Z dimensions), and the simulated reaction to be generated when the surgical tool collides with the target organ; for example, when the surgical tool collides with the heart, a vibration reaction is generated to remind the doctor that the surgical tool has collided with the target organ. In order to locate the target organ accurately, a collision body may be added only to the target organ: if the surgical tool touches another organ, no simulated collision reaction is generated, indicating that that organ is not the target organ, whereas if the surgical tool contacts the target organ a simulated collision reaction is generated. Accurate positioning of the target organ is thereby achieved; through the simulated reaction the doctor knows that this organ is the one to be operated on, and the subsequent surgical procedure can be carried out.
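A minimal sketch of the distance check and reaction described above follows, treating the tool tip as a point in model coordinates and the target organ's collision body as an axis-aligned box (consistent with the earlier collider sketch); the trigger_vibration stub and the zero-distance test stand in for whatever haptic or audio prompt and engine-level collision callback the device actually uses.

```python
import numpy as np

def distance_to_aabb(point, aabb_min, aabb_max):
    """Distance from a point (the tool tip in model coordinates) to an axis-aligned
    box collision body; 0.0 means the point touches or penetrates the body."""
    p = np.asarray(point, dtype=float)
    clamped = np.minimum(np.maximum(p, np.asarray(aabb_min, dtype=float)),
                         np.asarray(aabb_max, dtype=float))
    return float(np.linalg.norm(p - clamped))

def trigger_vibration(duration_s=0.5):
    """Stand-in for the device's actual haptic/audio prompt."""
    print(f"simulated collision reaction: vibrate for {duration_s:.2f} s")

def check_target_collision(tool_tip_model, target_aabb_min, target_aabb_max):
    """Steps S130/S140: react only when the tool collides with the target organ's collider."""
    if distance_to_aabb(tool_tip_model, target_aabb_min, target_aabb_max) == 0.0:
        trigger_vibration()
        return True
    return False

# Hypothetical heart collider bounds (metres) and a tool tip lying inside them.
print(check_target_collision([0.04, 0.05, 0.03], [0.0, 0.0, 0.0], [0.08, 0.10, 0.06]))  # True
```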
In addition, collision bodies are used together with rigid bodies to make the objects in the 3D model behave realistically: rigid bodies allow objects to be controlled by the physics engine, while collision bodies allow objects to collide with one another. A collision body can therefore also be added to the surgical tool; that is, the 3D model further includes a 3D surgical tool model to which a collision body has been added in advance. It can then be determined, based on the position information of the surgical tool, whether the collision body corresponding to the surgical tool in the 3D surgical tool model collides with the collision body corresponding to the target organ in the 3D organ model and, if so, a simulated collision reaction is generated. Specifically, it can be determined, based on the position information of the surgical tool, whether the mesh corresponding to the surgical tool in the 3D surgical tool model collides with the mesh corresponding to the target organ in the 3D organ model and, if so, a simulated collision reaction is generated. Two objects that have only collision bodies added cannot collide with each other unless they are marked as convex, so the collision body corresponding to the target organ can be marked as convex; a convex collision body must have no more than 255 triangles. A collision body can therefore also be added in the surgical tool model, and whether the surgical tool collides with the target organ can likewise be detected.
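The 255-triangle limit mentioned here matches the convex mesh-collider restriction of common game engines (Unity, for example). As a hedged illustration, the sketch below uses SciPy's convex hull to check whether an organ mesh stays within such a limit before it is marked as convex; the constant and function name are assumptions introduced only for illustration.

```python
import numpy as np
from scipy.spatial import ConvexHull

CONVEX_TRIANGLE_LIMIT = 255  # engine-imposed limit assumed from the description above

def can_mark_convex(mesh_vertices):
    """Return True if the convex hull of the organ mesh stays within the triangle
    limit, i.e. the mesh may safely be marked as a convex collision body."""
    hull = ConvexHull(np.asarray(mesh_vertices, dtype=float))
    return len(hull.simplices) <= CONVEX_TRIANGLE_LIMIT  # each simplex is one triangular facet in 3D

# Example with a random point cloud standing in for an organ mesh.
rng = np.random.default_rng(0)
print(can_mark_convex(rng.random((500, 3))))
```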
When it is determined that the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model, step S140 is executed: generate a simulated collision reaction. That is, when the surgical tool collides with the collision body corresponding to the target organ, reactions such as vibration or an audio prompt can be issued so that the doctor perceives them and the surgical procedure can proceed smoothly.
In addition, as an implementation, after step S140 the method may further include: obtaining collision depth information of the collision between the surgical tool and the collision body corresponding to the target organ in the 3D organ model, and generating a corresponding simulated collision reaction based on the collision depth information.
Specifically, after it is determined that the surgical tool has collided with the target organ, collision depth information can also be obtained based on the force of the collision. For example, image information showing the deformation of the target organ at the moment of collision can be obtained and compared with image information of the target organ before it was contacted by the surgical tool, so that the displacements produced along the X, Y, and Z axes can be obtained; the vector sum of the displacements along the three axes can then be taken as the collision depth information. Different simulated collision reactions can thus be generated for different collision depths: for example, when the collision depth is larger the simulated collision reaction can be a longer vibration, and when the collision depth is smaller the simulated collision reaction can be a shorter vibration.
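A minimal sketch of the depth computation and the depth-to-reaction mapping follows; the example displacements and the linear scaling from collision depth to vibration duration are illustrative assumptions rather than parameters given in the disclosure.

```python
import numpy as np

def collision_depth(displacement_xyz):
    """Collision depth as the magnitude of the X/Y/Z displacement of the target
    organ between the pre-contact and post-contact images."""
    return float(np.linalg.norm(np.asarray(displacement_xyz, dtype=float)))

def reaction_duration(depth, base_s=0.2, gain_s_per_mm=0.05):
    """Map a larger collision depth to a longer vibration (assumed linear scaling)."""
    return base_s + gain_s_per_mm * depth

depth_mm = collision_depth([1.2, -0.8, 0.5])        # displacements in millimetres
print(f"vibrate for {reaction_duration(depth_mm):.2f} s")
```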
Referring to Fig. 3, Fig. 3 is a structural block diagram of a surgical simulation device 200 provided by an embodiment of the present invention. The device includes:
an image information acquisition module 210, configured to obtain 3D image information of a surgical tool while an operation is being performed;
a position information acquisition module 220, configured to obtain, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, the 3D model including a 3D organ model to which a collision body has been added in advance;
a collision judgment module 230, configured to determine, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in the 3D organ model;
and a reaction simulation module 240, configured to generate a simulated collision reaction when it is determined, based on the position information, that the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model.
As an implementation, the image information acquisition module 210 is specifically configured to obtain the 3D image information of the surgical tool through a camera.
As an implementation, the 3D model further includes a 3D surgical tool model to which a collision body has been added in advance, and the collision judgment module 230 is specifically configured to determine, based on the position information, whether the collision body corresponding to the surgical tool in the 3D surgical tool model collides with the collision body corresponding to the target organ in the 3D organ model.
As an implementation, the position information acquisition module 220 is specifically configured to obtain, based on the 3D image information, position coordinates of the surgical tool in the 3D image information, and to convert the position coordinates into target coordinates in the 3D model, the target coordinates being the position information of the surgical tool in the 3D model.
As an implementation, the device further includes:
a collision depth simulation module, configured to obtain collision depth information of the collision between the surgical tool and the collision body corresponding to the target organ in the 3D organ model, and to generate a corresponding simulated collision reaction based on the collision depth information.
Referring to Fig. 4, Fig. 4 is a structural schematic diagram of an electronic device provided by an embodiment of the present application. The electronic device may include at least one processor 410 (such as a CPU), at least one communication interface 420, at least one memory 430, and at least one communication bus 440. The communication bus 440 is used to realize direct connection and communication between these components. The communication interface 420 of the device in the embodiment of the present invention is used to exchange signaling or data with other node devices. The memory 430 may be a high-speed RAM memory, or a non-volatile memory, such as at least one disk memory. Optionally, the memory 430 may also be at least one storage device located remotely from the aforementioned processor. Computer-readable instructions are stored in the memory 430, and when the processor 410 executes the computer-readable instructions in the memory 430, the steps of the above surgical simulation method are performed.
An embodiment of the present application also provides a readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the surgical simulation method described above are performed.
It is clear to those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding processes in the foregoing method for the specific working processes of the device described above, which are not repeated here.
In summary, embodiments of the present invention provide a surgical simulation method and device. The method first obtains 3D image information of a surgical tool while an operation is being performed, then obtains, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, the 3D model including a 3D organ model to which a collision body has been added in advance, and then determines, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in the 3D organ model. When it is determined that the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model, a simulated collision reaction is generated. Thus, by detecting whether the surgical tool collides with the collision body corresponding to the target organ, it can be determined whether the surgical tool contacts the target organ in the real procedure; when such a collision is detected, a simulated collision reaction is generated to remind the doctor that the surgical tool has contacted the target organ during the actual operation, so that the operation is performed on the accurately located target organ and proceeds smoothly and safely.
In the several embodiments provided in the present application, it should be understood that the disclosed device and method may also be implemented in other ways. The device embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the drawings show the possible architectures, functions, and operations of the devices, methods, and computer program products of multiple embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, program segment, or part of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings; for example, two consecutive blocks may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified function or action, or by a combination of dedicated hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, each module may exist separately, or two or more modules may be integrated to form an independent part.
If the functions are implemented in the form of software function modules and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is only a preferred embodiment of the present invention and is not intended to limit the present invention; for those skilled in the art, the present invention may have various modifications and changes. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.
The above description is merely a specific embodiment, but the protection scope of the present invention is not limited thereto. Any change or replacement that can easily be conceived by those familiar with the technical field within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device including that element.
Claims (10)
1. A surgical simulation method, characterized in that the method comprises:
obtaining 3D image information of a surgical tool while an operation is being performed;
obtaining, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, wherein the 3D model comprises a 3D organ model to which a collision body has been added in advance;
determining, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in the 3D organ model;
and, if so, generating a simulated collision reaction.
2. The method according to claim 1, characterized in that obtaining the 3D image information of the surgical tool while the operation is being performed comprises:
obtaining the 3D image information of the surgical tool through a camera.
3. The method according to claim 1, characterized in that the 3D model further comprises a 3D surgical tool model to which a collision body has been added in advance, and determining, based on the position information, whether the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model and, if so, generating a simulated collision reaction, comprises:
determining, based on the position information, whether the collision body corresponding to the surgical tool in the 3D surgical tool model collides with the collision body corresponding to the target organ in the 3D organ model;
and, if so, generating a simulated collision reaction.
4. The method according to claim 1, characterized in that obtaining, based on the 3D image information, the position information of the surgical tool in the pre-created 3D model comprises:
obtaining, based on the 3D image information, position coordinates of the surgical tool in the 3D image information;
and converting the position coordinates into target coordinates in the 3D model, the target coordinates being the position information of the surgical tool in the 3D model.
5. The method according to claim 1, characterized in that, after the simulated collision reaction is generated, the method further comprises:
obtaining collision depth information of the collision between the surgical tool and the collision body corresponding to the target organ in the 3D organ model;
and generating a corresponding simulated collision reaction based on the collision depth information.
6. A surgical simulation device, characterized in that the device comprises:
an image information acquisition module, configured to obtain 3D image information of a surgical tool while an operation is being performed;
a position information acquisition module, configured to obtain, based on the 3D image information, position information of the surgical tool in a pre-created 3D model, wherein the 3D model comprises a 3D organ model to which a collision body has been added in advance;
a collision judgment module, configured to determine, based on the position information, whether the surgical tool collides with a collision body corresponding to a target organ in the 3D organ model;
and a reaction simulation module, configured to generate a simulated collision reaction when it is determined, based on the position information, that the surgical tool collides with the collision body corresponding to the target organ in the 3D organ model.
7. The device according to claim 6, characterized in that the image information acquisition module is specifically configured to obtain the 3D image information of the surgical tool through a camera.
8. The device according to claim 6, characterized in that the 3D model further comprises a 3D surgical tool model to which a collision body has been added in advance, and the collision judgment module is specifically configured to determine, based on the position information, whether the collision body corresponding to the surgical tool in the 3D surgical tool model collides with the collision body corresponding to the target organ in the 3D organ model.
9. The device according to claim 6, characterized in that the position information acquisition module is specifically configured to obtain, based on the 3D image information, position coordinates of the surgical tool in the 3D image information, and to convert the position coordinates into target coordinates in the 3D model, the target coordinates being the position information of the surgical tool in the 3D model.
10. The device according to claim 6, characterized in that the device further comprises:
a collision depth simulation module, configured to obtain collision depth information of the collision between the surgical tool and the collision body corresponding to the target organ in the 3D organ model, and to generate a corresponding simulated collision reaction based on the collision depth information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810675056.5A CN108777001A (en) | 2018-06-26 | 2018-06-26 | Surgical simulation method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810675056.5A CN108777001A (en) | 2018-06-26 | 2018-06-26 | Surgical simulation method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108777001A true CN108777001A (en) | 2018-11-09 |
Family
ID=64029904
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810675056.5A Pending CN108777001A (en) | 2018-06-26 | 2018-06-26 | Surgical simulation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108777001A (en) |
Application events
- 2018-06-26: CN application CN201810675056.5A filed; published as CN108777001A (en); status: Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170282372A1 (en) * | 2006-06-29 | 2017-10-05 | Intuitive Surgical Operations, Inc. | Synthetic representation of a surgical robot |
CN102207997A (en) * | 2011-06-07 | 2011-10-05 | 哈尔滨工业大学 | Force-feedback-based robot micro-wound operation simulating system |
CN105992996A (en) * | 2014-04-04 | 2016-10-05 | 外科手术室公司 | Dynamic and interactive navigation in a surgical environment |
US20150320514A1 (en) * | 2014-05-08 | 2015-11-12 | Samsung Electronics Co., Ltd. | Surgical robots and control methods thereof |
CN105069301A (en) * | 2015-08-14 | 2015-11-18 | 南通大学 | Lumbar puncture virtual simulation and training system supporting haptic interaction |
CN105825752A (en) * | 2016-04-22 | 2016-08-03 | 吉林大学 | Force feedback device-based virtual corneal surgery training system |
CN107811710A (en) * | 2017-10-31 | 2018-03-20 | 微创(上海)医疗机器人有限公司 | Operation aided positioning system |
Non-Patent Citations (1)
Title |
---|
TED BOARDMAN et al.: "Modeling Methods", 3D Studio Max 2 Technical Essentials (Volume 2: Modeling and Materials) *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112379771A (en) * | 2020-10-10 | 2021-02-19 | 杭州翔毅科技有限公司 | Real-time interaction method, device and equipment based on virtual reality and storage medium |
CN115410702A (en) * | 2022-10-31 | 2022-11-29 | 武汉楚精灵医疗科技有限公司 | Endoscope protection early warning method and device and related equipment thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200402425A1 (en) | Device for training users of an ultrasound imaging device | |
Alaraj et al. | Virtual reality training in neurosurgery: review of current status and future applications | |
Gallo et al. | 3D interaction with volumetric medical data: experiencing the Wiimote | |
Samuelson et al. | Comparing image detection algorithms using resampling | |
US11647983B2 (en) | Automating ultrasound examination of a vascular system | |
Neumuth et al. | Online recognition of surgical instruments by information fusion | |
CN105407800A (en) | Brain dysfunction assessment method, brain dysfunction assessment device, and program thereof | |
JP7013236B2 (en) | Visualization of distance on an electrical anatomical map | |
CN105596021A (en) | Image analyzing device and image analyzing method | |
CN108014492A (en) | virtual reality interactive processing method and electronic device | |
Wright et al. | Design and evaluation of an augmented reality simulator using leap motion | |
CN108777001A (en) | Surgical simulation method and device | |
CN107315915A (en) | A kind of simulated medical surgery method and system | |
CN109363718A (en) | Ultrasonic imaging method and supersonic imaging apparatus | |
JP2008134373A (en) | Method and system of preparing biological data for operation simulation, operation simulation method, and operation simulator | |
KR20180028764A (en) | Apparatus and method for children learning using augmented reality | |
JP7236694B2 (en) | Information processing method and information processing system | |
CN109171604A (en) | A kind of intelligent endoscope operating system having AR function | |
CN112650391A (en) | Human-computer interaction method, device and equipment based on virtual reality and storage medium | |
CN109310392A (en) | The method and system of interactive laparoscopy ultrasound guidance ablation plan and surgical procedures simulation | |
KR20200080534A (en) | System for estimating otorhinolaryngology and neurosurgery surgery based on simulator of virtual reality | |
EP3337418B1 (en) | Simulating breast deformation | |
CN114694442B (en) | Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment | |
Schütz et al. | Interactive Shape Sonification for Tumor Localization in Breast Cancer Surgery | |
JP7044297B2 (en) | Excitement propagation visualization device, excitement propagation visualization method, and excitement propagation visualization program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20181109