WO2019132167A1 - Method and apparatus for rendering a three-dimensional elastic model, and program - Google Patents

Method and apparatus for rendering a three-dimensional elastic model, and program

Info

Publication number
WO2019132167A1
Authority
WO
WIPO (PCT)
Prior art keywords
node
voxel
tree
determining
condition
Prior art date
Application number
PCT/KR2018/010331
Other languages
English (en)
Korean (ko)
Inventor
김호승
Original Assignee
(주)휴톰
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020180026573A external-priority patent/KR101862677B1/ko
Application filed by (주)휴톰 filed Critical (주)휴톰
Publication of WO2019132167A1 publication Critical patent/WO2019132167A1/fr

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 - Surgical robots
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 99/00 - Subject matter not provided for in other groups of this subclass
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 - General purpose image data processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/08 - Volume rendering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/10 - Geometric effects
    • G06T 15/20 - Perspective computation
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • the present invention relates to a three-dimensional elastic model rendering method, apparatus and program.
  • Rendering is a computer graphics term that refers to the process of producing a realistic two-dimensional image from a three-dimensional model, taking into consideration external information such as light source, position, and color.
  • Rendering is a computer graphics process that adds realism to a solid object by giving shadows or changes in the density of the object.
  • An octree is a tree data structure in which each internal node is divided into eight children, and it is often used to recursively partition a three-dimensional space.
  • An octree represents a three-dimensional space enclosed in a cube, and is used for voxel-based rendering.
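The recursive eight-way partition described above can be sketched as a minimal data structure; the class and field names below are illustrative and not taken from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class OctreeNode:
    """One octree node; it corresponds to one cubic voxel of space."""
    center: tuple               # (x, y, z) center of this cube
    half_size: float            # half of this cube's edge length
    children: list = field(default_factory=list)  # empty, or 8 octants

    def subdivide(self):
        """Split this cube into eight equal child cubes (octants)."""
        h = self.half_size / 2
        cx, cy, cz = self.center
        for dx in (-h, h):
            for dy in (-h, h):
                for dz in (-h, h):
                    self.children.append(
                        OctreeNode((cx + dx, cy + dy, cz + dz), h))

# A root cube of edge length 2 centered at the origin:
root = OctreeNode((0.0, 0.0, 0.0), 1.0)
root.subdivide()
print(len(root.children))  # 8
```

Subdividing each child in turn yields the hierarchical voxel structure used throughout the embodiments.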
  • the present invention provides a three-dimensional elastic model rendering method, apparatus, and program.
  • a method of rendering a three-dimensional elastic model according to one aspect comprises: searching, by a computer, a tree; determining a limit node for a first path included in the tree; obtaining information about a first voxel corresponding to the determined limit node; and displaying an object including the first voxel using the obtained information, wherein the tree includes one or more nodes each corresponding to one or more voxels into which three-dimensional model data of the object has been hierarchically divided.
  • the information about the first voxel may include information about the elasticity value of the first voxel, and the elasticity value of the first voxel may be obtained based on the color of the position corresponding to the first voxel in a medical image of the object.
  • the step of displaying may include calculating a state change of the object based on the information about the elasticity value of the first voxel, and displaying the object based on the calculation result.
  • the limit node may be a node that satisfies a first condition that a difference between elastic values of voxels corresponding to child nodes of the limit node is less than or equal to a predetermined threshold value.
  • the limit node may be a node that satisfies a second condition that color information of a position corresponding to each child node of the limit node in the medical image falls within a predetermined range.
  • the color information may be a gray scale value of a position corresponding to each child node of the limit node in the medical image.
  • the limit node may be a node having the lowest level among the nodes on the first path satisfying the first condition or the second condition.
  • the step of determining the limit node may include determining the end node of the first path as the limit node when there is no node satisfying the first condition or the second condition on the first path.
  • the step of determining the limit node may include sequentially searching along the first path from the root node until a node satisfying the first condition or the second condition is found, determining the discovered node as the limit node if such a node is found, and determining the end node as the limit node if no node satisfying the first condition or the second condition is found up to the end node.
  • the step of determining the limit node may also include determining the level of the limit node based on the distance from the rendering viewpoint.
  • the tree may be an octree having a hexahedron containing the three-dimensional model data as a root.
  • a method for rendering a three-dimensional elastic model according to another aspect comprises: obtaining, by a computer, a medical image of a body part of an object; obtaining three-dimensional model data of the body part based on the medical image; hierarchically dividing the three-dimensional model data into one or more voxels; generating a tree including nodes corresponding to each of the hierarchically divided voxels; determining an elasticity value of a first voxel corresponding to a first node included in the tree, based on the color of the position corresponding to the first voxel in the medical image; and storing the determined elasticity value of the first voxel in the first node.
  • an apparatus for rendering a three-dimensional elastic model according to one aspect comprises: a memory storing one or more instructions; and a processor executing the one or more instructions stored in the memory, wherein the processor, by executing the one or more instructions, searches a tree, determines a limit node for a first path included in the tree, obtains information about a first voxel corresponding to the determined limit node, and displays an object including the first voxel using the obtained information, and wherein the tree includes one or more nodes each corresponding to one or more voxels into which the three-dimensional model data of the object has been hierarchically divided.
  • an apparatus for rendering a three-dimensional elastic model according to another aspect comprises: a memory storing one or more instructions; and a processor executing the one or more instructions stored in the memory, wherein the processor, by executing the one or more instructions, obtains a medical image of a body part of an object, obtains three-dimensional model data of the body part based on the medical image, hierarchically divides the three-dimensional model data into one or more voxels, generates a tree including nodes corresponding to each of the hierarchically divided voxels, determines an elasticity value of a first voxel corresponding to a first node included in the tree based on the color of the position corresponding to the first voxel in the medical image, and stores the determined elasticity value of the first voxel in the first node.
  • a computer program according to another aspect is stored in a computer-readable recording medium and performs, in combination with a computer, the three-dimensional elastic model rendering method according to the disclosed embodiments.
  • FIG. 1 is a simplified schematic diagram of a system capable of performing robotic surgery in accordance with the disclosed embodiments.
  • FIG. 2 is a diagram illustrating a three-dimensional elastic model rendering system in accordance with one embodiment.
  • FIG. 3 is a flowchart illustrating a method of rendering a three-dimensional elastic model according to an embodiment.
  • FIG. 4 is a diagram showing an example of a method of dividing three-dimensional model data into voxels and a method of obtaining elastic values of voxels based on a medical image.
  • FIG. 5 is a diagram showing three-dimensional model data and a tree corresponding thereto.
  • FIG. 6 is a diagram illustrating an example of methods for searching for a limit node.
  • FIG. 7 is a diagram showing an example of a result of determining the limit nodes of the tree.
  • FIG. 8 is a diagram illustrating a method of performing rendering based on a point of view according to one embodiment.
  • FIG. 9 is a diagram showing an example of a change in the tree structure when the object is cut.
  • FIG. 10 is a configuration diagram of an apparatus 900 according to one embodiment.
  • the term “part” or “module” refers to a software or hardware component, such as an FPGA or ASIC, and a “part” or “module” performs certain roles. However, a “part” or “module” is not limited to software or hardware. A “part” or “module” may be configured to reside on an addressable storage medium and configured to execute on one or more processors. Thus, by way of example, a “part” or “module” includes components such as software components, object-oriented software components, class components and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided within components and “parts” or “modules” may be combined into a smaller number of components and “parts” or “modules”, or further separated into additional components and “parts” or “modules”.
  • spatially relative terms can be used to easily describe a correlation between one element and other elements.
  • Spatially relative terms should be understood to include, in addition to the directions shown in the drawings, the different orientations of components in use or operation. For example, when an element shown in the figures is inverted, an element described as “below” or “beneath” another element may be placed “above” that element.
  • the exemplary term “below” can include both downward and upward directions.
  • the components can also be oriented in different directions, so that spatially relative terms can be interpreted according to orientation.
  • the rendering is understood as meaning encompassing all kinds of calculation processes performed to create a three-dimensional image.
  • FIG. 1 is a simplified schematic diagram of a system capable of performing robotic surgery in accordance with the disclosed embodiments.
  • the robotic surgery system includes a medical imaging apparatus 10, a server 20, a control unit 30 provided in an operating room, a display 32, and a surgical robot 34.
  • the medical imaging equipment 10 may be omitted from the robotic surgery system according to the disclosed embodiment.
  • the surgical robot 34 includes a photographing device 36 and a surgical tool 38.
  • robotic surgery is performed by the user controlling the surgical robot 34 using the control unit 30.
  • robot surgery may be performed automatically by the control unit 30 without user control.
  • the server 20 is a computing device including at least one processor and a communication unit.
  • the control unit 30 includes a computing device including at least one processor and a communication unit. In one embodiment, the control unit 30 includes hardware and software interfaces for controlling the surgical robot 34.
  • the photographing apparatus 36 includes at least one image sensor. That is, the photographing device 36 includes at least one camera device and is used to photograph a target object, that is, a surgical site. In one embodiment, the imaging device 36 includes at least one camera coupled with a surgical arm of the surgical robot 34.
  • the image photographed by the photographing device 36 is displayed on the display 32.
  • the surgical robot 34 includes one or more surgical tools 38 that can perform cutting, clipping, anchoring, grabbing, etc., of the surgical site.
  • the surgical tool 38 is used in combination with the surgical arm of the surgical robot 34.
  • the control unit 30 receives information necessary for surgery from the server 20, or generates information necessary for surgery and provides the information to the user. For example, the control unit 30 displays on the display 32 information necessary for surgery, which is generated or received.
  • the user operates the control unit 30 while viewing the display 32 to perform the robot surgery by controlling the movement of the surgical robot 34.
  • the server 20 generates information necessary for the robot surgery using the medical image data of the object photographed previously from the medical imaging apparatus 10 and provides the generated information to the control unit 30.
  • the control unit 30 provides the information received from the server 20 to the user by displaying the information on the display 32 or controls the surgical robot 34 using the information received from the server 20.
  • the means that can be used as the medical imaging equipment 10 is not limited, and various medical image acquiring means such as CT, X-ray, PET, and MRI may be used.
  • the surgical image obtained in the photographing device 36 is transmitted to the control section 30.
  • control unit 30 may segment the surgical image obtained during the operation in real time.
  • control unit 30 transmits a surgical image to the server 20 during or after surgery.
  • the server 20 can divide and analyze the surgical image.
  • FIG. 2 is a diagram illustrating a three-dimensional elastic model rendering system in accordance with one embodiment.
  • a three-dimensional elastic model rendering system includes a client 100 and a server 200.
  • client 100 and server 200 are computer devices that include at least one processor.
  • the client 100 may be a computing device in an operating room (surgical site).
  • the client 100 may correspond to the control unit 30 shown in FIG.
  • the client 100 may be provided in the operating room as a separate computing device from the control unit 30, and may be connected to the control unit 30.
  • the connection includes not only a physical connection but also an electronic connection, and can also be understood as a state in which the two devices are able to communicate with each other.
  • the client 100 may be connected to the control unit 30 by wire or wireless, and may be in a state where they can communicate with each other using short-range wireless communication or network communication.
  • FIG. 3 is a flowchart illustrating a method of rendering a three-dimensional elastic model according to an embodiment.
  • the method shown in FIG. 3 is a step-by-step illustration of operations performed in the client 100 or the server 200 shown in FIG. 2.
  • the computer is described as performing the method shown in FIG. 3 and the embodiments below, but some or all of the steps and embodiments may be performed by either the client 100 or the server 200, and the subject of execution is not limited.
  • a tree is defined.
  • the tree includes one or more nodes each corresponding to one or more voxels that hierarchically divide the three-dimensional model data of the object.
  • the tree may be, but is not limited to, an octree consisting of nodes corresponding to one or more voxels that divide the cube containing the object.
  • the three-dimensional model data of the object is obtained based on the medical image of the object.
  • information about the voxels corresponding to each node may be stored in each node of the tree.
  • the information about the voxel includes the elastic value of the voxel.
  • the elasticity value of the voxel may be obtained based on the color of the position corresponding to each voxel in the medical image taken of the object.
  • the medical image may be a CT image, and the color may mean a gray scale value, but is not limited thereto.
  • CT images show different gray scale values depending on the physical properties, such as the molecular structure, of the subject being photographed. Therefore, the physical property of each part can be determined based on the gray scale value of the CT image, and the elasticity value can be determined based on that property.
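A minimal sketch of this gray-scale-to-elasticity step follows; the disclosure does not specify the mapping, so the ranges and elasticity values used here are made-up placeholders for illustration.

```python
def elasticity_from_grayscale(gray: float) -> float:
    """Map a gray-scale value (0-255 here) to an elasticity value.

    The text only states that elasticity is determined from the gray
    scale at the voxel's position; these ranges/values are assumptions.
    """
    if gray < 60:      # dark regions: e.g. air or cavities
        return 0.1
    elif gray < 160:   # mid-gray: e.g. soft tissue such as liver
        return 0.5
    else:              # bright regions: e.g. dense tissue or bone
        return 0.9

def voxel_elasticity(pixel_values) -> float:
    """Average the gray values covered by the voxel's position, then map
    (averaging is one option mentioned in the text)."""
    mean_gray = sum(pixel_values) / len(pixel_values)
    return elasticity_from_grayscale(mean_gray)

print(voxel_elasticity([120, 130, 140]))  # mid-gray patch -> 0.5
```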
  • Referring to FIG. 4, an example of a method of dividing three-dimensional model data into voxels and a method of acquiring the elasticity value of each voxel based on a medical image is shown.
  • a three-dimensional model 300 and a hexahedron 400 containing the three-dimensional model 300 are shown.
  • the three-dimensional model 300 may be a three-dimensional model that models the liver.
  • the computer divides the hexahedron 400 into eight hexahedrons. Each hexahedron is divided again, and through this recursive division a hexahedron 410 having the limit (minimum) volume can be obtained.
  • the computer creates a tree (e.g., an octree) with nodes corresponding to each cube generated in the segmentation process.
  • a voxel is understood as a concept covering not only the terminal hexahedron 410 having the limit volume, but also the intermediate hexahedra 402 generated in the division process and the first hexahedron 400.
  • the voxels may be understood as the smallest unit having a limit volume, and may also include larger volumes of voxels produced by combining them.
  • the computer acquires the grayscale value of the position 520 corresponding to the terminal hexahedron 410 in the medical image 500 to obtain the elasticity value of the voxel 410.
  • the grayscale value may be obtained as an average of the grayscale values contained in location 520, but is not limited thereto.
  • hexahedra not including any part of the three-dimensional model 300 can be removed.
  • the removed cube is not included in the tree.
  • in the process of partitioning the hexahedron 400 and creating the tree, the computer can either halt the partitioning when certain conditions are met, or merge already-partitioned hexahedrons.
  • when such a condition is met, the computer can stop the division and determine the hexahedron to be a terminal hexahedron.
  • alternatively, the computer may divide a specific hexahedron, and if the grayscale values of each of the eight hexahedrons generated therefrom are similar (e.g., within a predetermined range, or each gray scale value is below a predetermined reference value), the eight hexahedrons can be merged, and the merged hexahedron can be determined as the terminal hexahedron.
  • the load can be reduced by not further dividing such hexahedrons.
  • the load of calculation can be reduced by adjusting the dividing step of each cube.
  • the determination of the rendering level may be performed in the process of creating the tree as described above.
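The stop-or-merge rule above can be illustrated as follows; the tolerance and minimum edge length below are assumed values, not taken from the disclosure.

```python
def should_stop_dividing(child_grays, max_spread=10.0, min_edge=1.0, edge=8.0):
    """Decide whether to stop subdividing a cube.

    Division stops (the cube becomes a terminal voxel) when either the
    cube has reached the minimum edge length, or the gray values of the
    eight prospective children are similar, i.e. their spread is within
    a tolerance. All thresholds here are illustrative.
    """
    if edge <= min_edge:
        return True
    return max(child_grays) - min(child_grays) <= max_spread

# Eight nearly identical children: no need to divide further.
print(should_stop_dividing([100, 102, 101, 99, 100, 103, 98, 100]))  # True
# A cube straddling a tissue boundary: keep dividing.
print(should_stop_dividing([20, 200, 25, 210, 30, 190, 22, 205]))    # False
```

Applying this test at every node during tree creation is what adjusts the division depth per region and reduces the calculation load.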
  • in step S110, the computer searches a tree generated based on the three-dimensional model data of the object.
  • the computer searches the tree from the root node of the tree.
  • the computer can search the tree in a top-down or bottom-up manner, but is not limited thereto.
  • in step S120, the computer determines a limit node for the first path included in the tree searched in step S110. For example, in the process of searching the first path included in the tree, the computer determines a limit node, which is the rendering target node.
  • in step S130, the computer obtains information on the first voxel corresponding to the limit node determined in step S120.
  • the information on the first voxel includes information on the elasticity values of the first voxel.
  • in step S140, the computer performs rendering of the first voxel, and of the object including the first voxel, using the information obtained in step S130.
  • the computer performs rendering of the first voxel and the object including it using the elasticity value of the first voxel.
  • the computer performs a calculation for displaying the object including the first voxel, performs rendering based on the calculation result, and displays the object including the first voxel.
  • an object comprising a first voxel may refer to a body part (e.g., organ) of a patient.
  • the computer computes a state change of an object corresponding to an external stimulus for an object comprising the first voxel using the elastic value of the first voxel.
  • a change in the state of an object may include, but is not limited to, movement and transformation of the object.
  • the elasticity value may be used in the calculation so that the model reacts similarly to the actual object. For example, when a rendered model of the object is pressed or cut during simulation, it can be pushed in or cut in a fashion similar to the actual object.
  • the computer uses the tree to determine the limit nodes for computing the motion and deformation of the object, and calculates the motion and deformation of the object using the information obtained from the determined limit nodes.
  • instead of calculating elasticity values for all the voxels included in the object, the computer determines a limit node for each path using the tree, and calculates the motion and deformation of the object using the elasticity value of the voxel corresponding to each limit node.
  • the computer can change the state of the object based on the calculation result, and display the object by performing rendering based on the changed state.
  • the computer also determines a limit node for rendering the object based on the distance from the rendering viewpoint to the object.
  • when the computer renders a three-dimensional model represented by a tree, instead of rendering all the end nodes, the computer determines a limit node on the path connecting the root to each end node and renders only the voxels corresponding to the limit nodes, reducing the load required for rendering. Specific methods for determining the limit nodes for rendering based on distance are described below.
  • Referring to FIG. 5, a tree 600 generated corresponding to the three-dimensional model data 300 is shown.
  • the computer searches the tree 600 and determines rendering levels, i.e., limit nodes, for rendering the three-dimensional model data 300.
  • the tree 600 may be an octree rooted at the hexahedron 400 including the three-dimensional model data 300, but is not limited thereto.
  • the tree 600 includes a first path that includes a root node 610 and its child nodes 620, child nodes 630 of the child nodes 620, and child nodes 640 of the child nodes 630.
  • the computer determines a limit node to render along the first path.
  • the lower the level of the limit node, the larger the voxels that are rendered and the lower the rendering load; the higher the level of the limit node, the finer, and more expensive, the rendering.
  • determining the rendering level determines the level of the limit node to render for each path included in the tree 600.
  • the level is defined as being smaller toward the root of the tree, and increasing as the distance from the root increases.
  • a level can be defined as the distance from the root.
  • the limit node may be a node that satisfies a first condition, namely that the difference between the elasticity values of the voxels corresponding to its child nodes is less than or equal to a predetermined threshold. Such a node on the first path can be the limit node of the first path.
  • the limit node may be a node that satisfies a second condition, namely that the color information (e.g., gray scale values) of the positions corresponding to each of its child nodes in the medical image 500 falls within a predetermined range. If node 630 satisfies this condition, the node 630 may be the limit node of the first path.
  • the limit node may be the node having the lowest level among the nodes on each path satisfying the first condition or the second condition.
  • a node 620 may be a limit node of one or more paths.
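The first and second conditions can be sketched as simple predicates; the threshold and gray-scale range below are illustrative assumptions, not values from the disclosure.

```python
def satisfies_first_condition(child_elasticities, threshold=0.05):
    """First condition: the difference between the elasticity values of
    the node's child voxels is at most a predetermined threshold."""
    return max(child_elasticities) - min(child_elasticities) <= threshold

def satisfies_second_condition(child_grays, allowed_range=(90, 110)):
    """Second condition: the gray value at each child's position in the
    medical image falls within a predetermined range."""
    lo, hi = allowed_range
    return all(lo <= g <= hi for g in child_grays)

def is_limit_node(child_elasticities, child_grays):
    """A node qualifies as a limit node if either condition holds."""
    return (satisfies_first_condition(child_elasticities)
            or satisfies_second_condition(child_grays))

print(is_limit_node([0.50, 0.52, 0.51], [95, 100, 105]))  # True
print(is_limit_node([0.1, 0.9], [20, 240]))               # False
```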
  • Referring to FIG. 6, an example of methods for searching for a limit node is shown.
  • the computer can search for the limit node in a top-down manner or in a bottom-up manner.
  • the computer may stop searching if a limit node is found, and may not search child nodes of the limit node.
  • the computer may compare, at node 630, the elasticity or gray scale values corresponding to the child nodes of node 630.
  • each elasticity or grayscale value may be stored in its node and obtained during the search. If the elasticity or grayscale values corresponding to the child nodes of node 630 are similar to each other, the computer can determine node 630 as a limit node and stop the search.
  • the computer can continuously search for nodes satisfying the condition at a lower level even when a node satisfying the condition of the limit node is found.
  • the computer can search the tree in a recursive manner and perform searches from the end node.
  • the computer may obtain the elasticity or grayscale values from each of the child nodes 640-644, compare the obtained values at the parent node 630, and determine the parent node 630 as the limit node according to the comparison result.
  • the computer then continues to search the sibling nodes of node 630 and the parent (and ancestor) nodes of node 630.
  • the parent node of node 630 may become a new limit node if all the siblings of node 630 also meet the condition of the limit node, and the parent node itself meets the condition of the limit node.
  • when no node on the first path satisfies the conditions, the computer renders the voxel corresponding to the end node of the first path. That is, the end node of the first path becomes the limit node of the first path.
  • the computer sequentially searches from the root node along the first path until a node satisfying the first condition or the second condition is found. If a node satisfying the first condition or the second condition is found, the computer determines the discovered node as the limit node.
  • if no node satisfying the first condition or the second condition is found up to the end node, the computer determines the end node as the limit node.
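The top-down search just described (return the first qualifying node on the path, else fall back to the end node) can be sketched as follows; the path representation is illustrative.

```python
def find_limit_node(path, qualifies):
    """Walk a root-to-leaf path top-down and return the first node that
    satisfies the limit-node condition; if none does, the end node of
    the path becomes the limit node. `path` is ordered root first."""
    for node in path:
        if qualifies(node):
            return node
    return path[-1]  # end node becomes the limit node

# Illustrative path of (name, children-similar?) pairs:
path = [("root", False), ("n620", False), ("n630", True), ("n640", False)]
print(find_limit_node(path, lambda n: n[1])[0])      # "n630" found first
print(find_limit_node(path[:2], lambda n: n[1])[0])  # no match -> "n620"
```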
  • FIG. 7 is a diagram showing an example of a result of determining the limit nodes of the tree.
  • FIG. 7 a tree 700 generated based on three-dimensional model data is shown.
  • the tree 700 is shown in the form of a binary tree in FIG. 7, but the form of the tree is not limited thereto.
  • the tree 700 may be an octree.
  • the length (level) of each path included in the tree 700 may be different. For example, in the process of creating a tree, if the corresponding hexahedron is empty (i.e., does not include a three-dimensional model), the corresponding node may be deleted from the tree.
  • sibling nodes having similar elasticity or grayscale values may be deleted from the tree, and their parent node may become a terminal node. These nodes may be deleted during preprocessing or tree creation, or may be retained but excluded from the search and rendering process.
  • nodes 720 are nodes corresponding to voxels whose sibling nodes have similar elastic or gray scale values to one another.
  • nodes 720 are excluded in the rendering process, and the nodes 710 become the limit nodes of their respective paths, so the voxels corresponding to the nodes 710 are rendered.
  • the elasticity values of the voxels corresponding to the nodes 710 may be determined by the sum of the elastic values of the voxels corresponding to the respective child nodes, but are not limited thereto.
  • the elasticity of the vertices of the voxels corresponding to the nodes 710 may be the sum of the elastic values of the vertices of the voxels corresponding to the child nodes.
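As a small illustration of the summation option mentioned above (summation is one option stated in the text; averaging or other combinations are equally possible):

```python
def merged_elasticity(child_elasticities):
    """Elasticity value stored at a limit node, taken here as the sum
    of its children's elasticity values (one option in the text)."""
    return sum(child_elasticities)

print(merged_elasticity([0.25, 0.25, 0.25, 0.25]))  # 1.0
```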
  • FIG. 8 is a diagram illustrating a method of performing rendering based on a point of view according to one embodiment.
  • the location of the imaging device 36 is used to determine the rendering viewpoint.
  • the position of the object 800, relative to the position of the photographing device 36, is used to determine the position of the three-dimensional model to be rendered.
  • the computer determines the level of the limit node for each path of the tree based on the distance from the rendering viewpoint.
  • the voxel 810 may correspond to a higher-level limit node than the voxel 820.
  • the path corresponding to the part 830 that is invisible from the rendering viewpoint can itself be excluded from the tree; in this case, the limit node of the path can be the root node of the tree, but is not limited thereto.
  • the path corresponding to the portion 830 invisible from the rendering viewpoint may be a path of a subtree included in the tree.
  • the limit node of the path may be the root node of the included subtree.
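A sketch of choosing the limit-node level from the distance to the rendering viewpoint: near voxels get deeper (finer) levels, far voxels get shallower (coarser) ones. The linear falloff and all bounds here are assumptions for illustration.

```python
def level_for_distance(distance, max_level=8, near=10.0, far=200.0):
    """Pick the limit-node level from the distance between the rendering
    viewpoint (e.g. the camera of imaging device 36) and the voxel."""
    if distance <= near:
        return max_level            # close: finest subdivision
    if distance >= far:
        return 1                    # far: coarsest subdivision
    frac = (far - distance) / (far - near)  # 1.0 near .. 0.0 far
    return max(1, round(frac * max_level))

print(level_for_distance(5.0))    # close to the camera -> 8
print(level_for_distance(500.0))  # far away -> 1
```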
  • FIG. 9 is a diagram showing an example of a change in the tree structure when the object is cut.
  • the object 800 may be a body part, i.e., an organ, and the result of rendering the object 800 may be used for a surgical simulation or the like, so that the object 800 may be cut.
  • Polygon-based rendering is vulnerable to the truncation of the model and has the disadvantage of requiring new rendering each time.
  • the cut object 800 can be expressed by dividing the tree 700 based on the cut surface 802 along which the object 800 is cut.
  • the computer may divide the tree 700 based on branch 702 that corresponds to the cut surface 802 of the object 800.
  • the divided subtrees correspond to the divided parts of the object, respectively.
  • the computer may divide the tree based on the node corresponding to the section plane 802 of the object 800, and the division method is not limited thereto.
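The tree split at a cut can be sketched as detaching the branches on one side of the cut surface, so that each resulting subtree corresponds to one separated piece; the dict-based nodes and the `is_cut_branch` predicate are illustrative.

```python
def split_at_cut(root, is_cut_branch):
    """Detach the branches crossing the cut surface from the root, so
    each resulting subtree corresponds to one separated piece of the
    object. Nodes are plain dicts here for brevity."""
    pieces, kept = [], []
    for child in root["children"]:
        if is_cut_branch(child):
            pieces.append(child)   # detached piece, now its own subtree
        else:
            kept.append(child)
    root["children"] = kept
    pieces.insert(0, root)         # the remainder is also a piece
    return pieces

tree = {"name": "root", "children": [
    {"name": "left", "children": []},
    {"name": "right", "children": []},
]}
pieces = split_at_cut(tree, lambda n: n["name"] == "right")
print([p["name"] for p in pieces])  # ['root', 'right']
```

Because only tree links change, this avoids the full re-rendering that polygon-based approaches need after each cut.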
  • FIG. 10 is a configuration diagram of an apparatus according to an embodiment.
  • the processor 902 may include one or more cores (not shown), a graphics processing unit (not shown), and/or a connection path (e.g., a bus) that transmits and receives signals with other components.
  • the processor 902 in accordance with one embodiment performs the three-dimensional elastic model rendering method described with reference to FIGS. 1 to 9 by executing one or more instructions stored in the memory 904.
  • the processor 902 may, by executing one or more instructions stored in the memory, search the tree, determine a limit node for a first path included in the tree, acquire information on a first voxel corresponding to the determined limit node, and render an object including the first voxel using the acquired information, wherein the tree includes one or more nodes corresponding to one or more voxels obtained by hierarchically dividing the three-dimensional model data of the object, and the information on the first voxel may include information on an elasticity value of the first voxel.
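A minimal sketch of this search, assuming a dict-based tree and an arbitrary `condition` predicate (both illustrative, not the patent's data layout): the search follows a path of child indices and stops at the deepest node that still satisfies the condition, which serves as the limit node whose voxel information (here, an elasticity value) is read.

```python
def find_limit_node(root, path, condition):
    """Follow child indices in `path`, stopping at the deepest node satisfying `condition`."""
    node = root
    for idx in path:
        child = node["children"][idx] if idx < len(node["children"]) else None
        if child is None or not condition(child):
            break                 # `node` is the limit node for this path
        node = child
    return node

# Toy tree: each node carries the elasticity value of its voxel.
tree = {"level": 0, "elasticity": 0.5,
        "children": [{"level": 1, "elasticity": 0.7, "children": []}]}
limit = find_limit_node(tree, path=[0, 0], condition=lambda n: n["level"] <= 1)
print(limit["elasticity"])   # 0.7
```

The condition stands in for whatever criterion (visibility, level of detail, etc.) an embodiment would use to bound a path.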
  • the processor 902 may further include a random access memory (RAM, not shown) and a read-only memory (ROM, not shown) that temporarily and/or permanently store signals (or data) processed in the processor 902.
  • the processor 902 may be implemented as a system-on-chip (SoC) including at least one of a graphics processing unit, a RAM, and a ROM.
  • the memory 904 may store programs (one or more instructions) for processing and control of the processor 902. Programs stored in the memory 904 can be divided into a plurality of modules according to functions.
  • the three-dimensional elastic model rendering method described above may be implemented as a program (or application) to be executed in combination with a computer, which is hardware, and stored in a medium.
  • for the computer to read the program and execute the methods implemented as a program, the above-described program may include code coded in a computer language, such as C, C++, JAVA, or machine language, that the processor (CPU) of the computer can read through a device interface of the computer.
  • the code may include functional code related to functions defining the operations necessary for executing the methods, and may include control code related to an execution procedure necessary for the processor of the computer to execute those functions according to a predetermined procedure.
  • the code may further include memory-reference code indicating at which location (address) of the internal or external memory of the computer the additional information or media needed for the processor of the computer to execute the functions should be referenced.
  • if the processor of the computer needs to communicate with a remote computer or server to execute the functions, the code may further include communication-related code indicating how to communicate with the remote computer or server using the communication module of the computer, and what information or media should be transmitted or received during communication.
  • the storage medium is not a medium that stores data for a short moment, such as a register, a cache, or a memory, but a medium that stores data semi-permanently and is readable by a device.
  • examples of the storage medium include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage.
  • the program may be stored in various recording media on various servers that the computer can access, or in various recording media on the user's computer.
  • the medium may be distributed over network-connected computer systems so that computer-readable code is stored in a distributed manner.
  • the steps of a method or algorithm described in connection with the embodiments of the present invention may be embodied directly in hardware, in software modules executed in hardware, or in a combination of both.
  • the software module may reside in RAM, ROM, EPROM (erasable programmable ROM), EEPROM (electrically erasable programmable ROM), flash memory, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable recording medium known in the art to which the invention pertains.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Computer Graphics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Image Generation (AREA)

Abstract

Disclosed is a three-dimensional elastic model rendering method comprising the steps of: searching a tree by means of a computer; determining a limit node for a first path included in the tree; acquiring information on a first voxel corresponding to the determined limit node; and rendering an object including the first voxel using the acquired information, wherein the tree includes one or more nodes each corresponding to one of the voxels obtained by hierarchically dividing three-dimensional model data of the object.
PCT/KR2018/010331 2017-12-28 2018-09-05 Three-dimensional elastic model rendering method and device, and program WO2019132167A1 (fr)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
KR10-2017-0182898 2017-12-28
KR10-2017-0182900 2017-12-28
KR20170182899 2017-12-28
KR20170182898 2017-12-28
KR20170182900 2017-12-28
KR10-2017-0182899 2017-12-28
KR1020180026573A KR101862677B1 (ko) 2018-03-06 2018-03-06 Three-dimensional elastic model rendering method, apparatus and program
KR10-2018-0026573 2018-03-06

Publications (1)

Publication Number Publication Date
WO2019132167A1 true WO2019132167A1 (fr) 2019-07-04

Family

ID=67067790

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/010331 WO2019132167A1 (fr) 2017-12-28 2018-09-05 Three-dimensional elastic model rendering method and device, and program

Country Status (1)

Country Link
WO (1) WO2019132167A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004005373A (ja) * 2001-11-27 2004-01-08 Samsung Electronics Co Ltd Node structure for representing three-dimensional objects based on depth image
US20090295800A1 (en) * 2008-05-30 2009-12-03 Siemens Corporate Research, Inc. Method for direct volumetric rendering of deformable bricked volumes
KR101175065B1 (ko) * 2011-11-04 2012-10-12 Apollo M Co., Ltd. Method for searching a bleeding site using a surgical image processing apparatus
JP2014064957A (ja) * 2014-01-22 2014-04-17 Mitsubishi Precision Co Ltd Biological data model creation method and apparatus, data structure of biological data model and storage apparatus therefor, and load distribution method and apparatus for three-dimensional data model
KR20160005490A (ko) * 2014-07-07 2016-01-15 Samsung Electronics Co., Ltd. Rendering system and rendering method thereof

Similar Documents

Publication Publication Date Title
WO2013015549A2 (fr) Plane-characteristic-based markerless augmented reality system and method for operating same
KR20190080702A (ko) Three-dimensional elastic model rendering method, apparatus and program
WO2019132168A1 (fr) Surgical image data learning system
WO2013168998A1 (fr) Apparatus and method for processing 3D information
WO2019132614A1 (fr) Surgical image segmentation method and apparatus
WO2019050360A1 (fr) Electronic device and method for automatic human segmentation in an image
JP2004235934A (ja) Calibration processing apparatus, calibration processing method, and computer program
KR101995411B1 (ko) Apparatus and method for generating a body model
WO2017007254A1 (fr) Device and method for generating and displaying a 3D map
WO2016006786A1 (fr) Rendering system and rendering method thereof
CN113129362B (zh) Method and device for acquiring three-dimensional coordinate data
WO2019132167A1 (fr) Three-dimensional elastic model rendering method and device, and program
JP7479793B2 (ja) Image processing apparatus, system for generating virtual viewpoint video, control method of image processing apparatus, and program
WO2020101300A1 (fr) Image processing apparatus and operation method thereof
JP2005092451A (ja) Head detection apparatus, head detection method, and head detection program
JP2004213481A (ja) Image processing apparatus and processing method thereof
JP2007315777A (ja) Three-dimensional shape measurement system
WO2018182066A1 (fr) Method and apparatus for applying a dynamic effect to an image
WO2023113471A1 (fr) Electronic device and method for acquiring three-dimensional skeleton data of a subject photographed using a plurality of cameras
WO2012074174A1 (fr) System using original identification data to implement augmented reality
KR101862677B1 (ko) Three-dimensional elastic model rendering method, apparatus and program
WO2024053876A1 (fr) Electronic device performing camera calibration and operation method thereof
US11640692B1 (en) Excluding objects during 3D model generation
WO2024090989A1 (fr) Multi-view segmentation and perceptual inpainting with neural radiance fields
US11410368B2 (en) Animation control rig generation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18897464

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18897464

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 01/02/2021)
