CN112001998B - Real-time simulation ultrasonic imaging method based on OptiX and Unity3D virtual reality platforms - Google Patents

Info

Publication number
CN112001998B
CN112001998B · Application CN202010910398.8A
Authority
CN
China
Prior art keywords
collision
ultrasonic
ray
light
virtual reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010910398.8A
Other languages
Chinese (zh)
Other versions
CN112001998A (en)
Inventor
彭博 (Peng Bo)
青芮冰 (Qing Ruibing)
汪强 (Wang Qiang)
王世元 (Wang Shiyuan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southwest Petroleum University
Original Assignee
Southwest Petroleum University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southwest Petroleum University
Priority to CN202010910398.8A
Publication of CN112001998A
Application granted
Publication of CN112001998B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B23/286 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for scanning or photography techniques, e.g. X-rays, ultrasonics
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 - Simulators for teaching or training purposes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 - Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 - Indexing scheme for editing of 3D models
    • G06T2219/2016 - Rotation, translation, scaling

Abstract

The invention discloses a real-time simulated ultrasound imaging method based on the OptiX and Unity3D virtual reality platforms, which comprises the following steps: importing 3D model data and acquiring the spatial coordinates of the 3D model through an ultrasonic probe; uniformly converting the spatial coordinates of the 3D model in the virtual reality environment into spatial coordinates in the ray tracing scene; starting ray tracing according to the initial ray intensity to obtain the collision point set of the ultrasonic simulation image; generating an image contour map and performing a convolution operation on it to obtain the pixel value information of the ultrasonic simulation image; and obtaining the ultrasonic simulation image from that pixel value information. By combining virtual reality with a ray-tracing acceleration framework, and on the premise of keeping the simulated ultrasound images at a high professional standard in the medical field, the invention uses immersive virtual reality technology to provide physician trainees with a new professional ultrasound training method with good real-time performance and interactivity.

Description

Real-time simulation ultrasonic imaging method based on OptiX and Unity3D virtual reality platforms
Technical Field
The invention relates to a method for simulating ultrasonic imaging in real time based on the OptiX and Unity3D virtual reality platforms, and belongs to the technical field of medical ultrasonic imaging teaching.
Background
Ultrasonic imaging is one of the low-risk, low-cost image diagnosis modalities widely applied in clinical diagnosis. Ultrasonic examination causes little harm to the human body and its hardware cost is lower than that of CT and MRI, while its realistic, high-resolution images allow rapid preliminary screening of pathology, which is a great advantage. Its principle is that ultrasonic waves form images through the refraction, reflection and scattering that occur as they propagate in human tissue. At present, ultrasonic image diagnosis is of great significance in abdominal and pregnancy examinations, such as evaluating gestational age, expected date of delivery, high-risk pregnancy factors and fetal heart rate, and transesophageal echocardiography has important value for flexible real-time image guidance during surgery.
Ultrasound imaging relies primarily on signal processing of the tissue boundaries formed by reflection of ultrasound echoes, the speckle noise formed by scattering, and the attenuation of acoustic energy. Reflection of the sound waves arises from the differing materials of human tissue structures, so the acoustic reflection coefficient is not uniform. Scattering originates from small tissue structures whose size is less than or comparable to the ultrasonic wavelength, and attenuation is the normal loss of acoustic energy as it propagates in a medium. The most classical simulation of this imaging process is linear simulation based on the convolution method; although it effectively reproduces the distribution of speckle noise, it cannot simulate the complex, nonlinear behavior of real human tissue. Methods based on neural network models can also be used for image simulation, with better results than a general convolution algorithm, but if the network model or its input data are imperfect the output image cannot be evaluated reliably, and training the model takes a certain amount of time. Machine-learning methods essentially map one specific dataset to an ultrasonic picture, which is unsuitable for a virtual reality system.
Disclosure of Invention
The invention mainly overcomes the defects in the prior art and provides a real-time simulation ultrasonic imaging method based on an OptiX and Unity3D virtual reality platform.
The technical scheme provided by the invention for solving the technical problems is as follows: the method for simulating ultrasonic imaging in real time based on the OptiX and Unity3D virtual reality platforms comprises the following steps:
step one, building a virtual reality environment in Unity3D, importing 3D model data, and acquiring a space coordinate of a 3D model through an ultrasonic probe;
step two, creating a ray tracing scene, loading a 3D model in the scene, and uniformly converting the space coordinates of the 3D model in the Unity3D virtual reality environment into the space coordinates in the ray tracing scene;
step three, starting ray tracing from the converted spatial coordinates according to the initial ray intensity to obtain the collision point set of the ultrasonic simulation image;
step four, generating an image contour map of the ultrasonic simulation image according to the collision point set of the ultrasonic simulation image, and performing a convolution operation on the contour map to obtain the pixel value information of the ultrasonic simulation image;
and step five, transmitting the pixel value information of the ultrasonic simulation image to Unity3D to calculate the gray value of each pixel, and finally displaying the ultrasonic simulation image obtained by scanning the cross section of the corresponding model at the current ultrasonic probe position.
The further technical scheme is that the 3D model data comprises point data and surface data.
The further technical scheme is that the uniform transformation of the spatial coordinates in step two comprises translation and rotation;
wherein the translation formula is as follows:
S = T · P

    T = | 1  0  0  tx |
        | 0  1  0  ty |
        | 0  0  1  tz |
        | 0  0  0  1  |

in the formula: S represents the position of the probe after the change; P represents the original position of the probe; T represents the translation matrix; tx, ty, tz respectively represent the translation lengths along the x, y and z axes;
wherein the rotation formula is as follows:
S = e · P
e = (X, Y, Z)
Q = w + ix + jy + kz
X = asin(2wx - 2yz)
Y = atan2(2wx + 2xz, 1 - 2x² - 2y²)
Z = atan2(2wz + 2xy, 1 - 2x² - 2z²)
in the formula: S represents the position of the probe after the change; P represents the original position of the probe; e represents the Euler angles; Q represents the quaternion; X, Y, Z respectively represent the rotation angles around the x, y and z axes; w, x, y and z are the real components of the quaternion, and i, j, k are the imaginary units.
A further technical scheme is that step three specifically comprises: transferring the converted spatial coordinates to the ray emission origin in the ray tracing scene; setting up collision boxes and collision detection functions using the point data and surface data; then performing ray collisions according to the ray tracing algorithm to obtain the information of the collision points; and finally generating the collision point set from the information of the collision points.
A further technical scheme is that the specific process of ray collision in the ray tracing algorithm is as follows: creating a bounding-box acceleration structure, creating the ray buffer according to the corresponding probe parameters, and creating the ray collision buffer; if a ray collides with a model bounding box, calculating the intensity and the collision count of the ray: if the ray intensity is greater than the minimum intensity and the collision count has not reached the collision threshold, recording the information of the collision point and then generating the corresponding reflected and refracted rays; if the ray intensity is less than the minimum intensity or the collision count equals the threshold, recording the relevant information and terminating the current ray; if the ray does not interact with any object in the scene, recording the relevant information and terminating the current ray.
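As a rough illustration of this termination logic (not the actual OptiX programs), the following self-contained C++ sketch reduces the scene to a single flat tissue interface; the intensity split, the thresholds, and all names are assumptions:

    #include <vector>

    struct Vec3 { double x, y, z; };
    struct Ray  { Vec3 origin, dir; double intensity; int bounces; };

    const double MIN_INTENSITY = 1e-3;  // assumed minimum traceable intensity
    const int    MAX_BOUNCES   = 8;     // assumed collision-count threshold

    // Toy stand-in for the accelerated scene test: one tissue interface at z = 1.
    bool intersectScene(const Ray& r, Vec3& hit) {
        if (r.dir.z <= 0.0) return false;
        double t = (1.0 - r.origin.z) / r.dir.z;
        if (t < 1e-9) return false;              // ignore re-hits at the origin
        hit = { r.origin.x + t * r.dir.x, r.origin.y + t * r.dir.y, 1.0 };
        return true;
    }

    // Termination logic as described: stop on a miss, on low intensity, or on
    // reaching the bounce threshold; otherwise record the collision point and
    // spawn the reflected and refracted child rays.
    void trace(Ray ray, std::vector<Vec3>& collisions) {
        Vec3 p;
        if (!intersectScene(ray, p)) return;                 // no interaction
        if (ray.intensity < MIN_INTENSITY ||
            ray.bounces >= MAX_BOUNCES) return;              // terminate ray
        collisions.push_back(p);                             // record collision
        Ray reflected = ray, refracted = ray;
        reflected.origin = refracted.origin = p;
        reflected.dir.z = -reflected.dir.z;                  // mirror at the plane
        reflected.intensity *= 0.3;                          // assumed energy split
        refracted.intensity *= 0.7;
        ++reflected.bounces; ++refracted.bounces;
        trace(reflected, collisions);
        trace(refracted, collisions);
    }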
The further technical scheme is that the specific process of generating the image contour map of the ultrasonic simulation image from the collision point set in step four is as follows:
according to the collision point set, the image matrix is traversed from left to right and from top to bottom, the traversal pointer moving by the ratio of the resolution to the ray step length; where no collision point is encountered, the value is described by the following formula:
I = I0 e^(-a·l·f)
in the formula: I represents the amplitude; a represents the attenuation coefficient of the tissue; l represents the distance; f represents the frequency; I0 represents the initial energy;
when a collision point is encountered, it is described by the following formula and the material value is switched before continuing downwards until the detection length is exceeded; finally the image contour map is drawn:
I(P_T) = Σ_{i=1..n} ∫₀ᵀ I_t(ω_i) dt
in the formula: P_T represents the collision point reached by rays R_i; I represents the collision point amplitude; ω_i represents the incidence angle of ray R_i; n represents the number of rays hitting this point; I_t represents the incident energy in direction ω_i; t represents the integration variable ranging from 0 to T.
The invention has the following beneficial effects:
1) The invention combines Unity3D virtual reality with the NVIDIA OptiX ray-tracing acceleration framework and, on the premise of keeping the simulated ultrasound images at a high professional standard in the medical field, uses immersive virtual reality technology to provide physician trainees with a new professional ultrasound training method with good real-time performance and interactivity;
2) in terms of real-time performance, the ray-tracing simulation algorithm reaches 25 fps.
Drawings
FIG. 1 is a block flow diagram of the present invention;
FIG. 2 is a diagram showing an example of a 3D tissue model in the OBJ format in the embodiment;
FIG. 3 is a schematic view of the probe elements and rays;
FIG. 4 is a schematic diagram of Unity3D and OptiX engine coordinate and rotation unification;
FIG. 5 is a schematic representation of an embodiment after convolution post-processing;
fig. 6 is a schematic diagram of the final effect of the embodiment.
Detailed Description
The present invention will be further described with reference to the following examples and the accompanying drawings.
As shown in FIG. 1, the method for simulating ultrasonic imaging in real time based on the OptiX and Unity3D virtual reality platforms comprises the following steps:
A. firstly, building the virtual reality environment in Unity3D and importing a 3D model drawn in modeling software such as 3ds Max; obtaining the world coordinate information of the model;
B. creating a new ray tracing scene; loading the 3D model in the scene; transmitting the world coordinates of the ultrasonic probe in the virtual reality environment to the ray emission origin in the ray tracing scene;
C. defining the ray behavior functions; creating a bounding-box acceleration structure, creating the ray buffer according to the corresponding probe parameters, and creating the ray collision buffer; if a ray collides with a model bounding box, calculating the intensity and the collision count of the ray: if the ray intensity is greater than the minimum intensity and the collision count has not reached the collision threshold, recording the information of the collision point and then generating the corresponding reflected and refracted rays; if the ray intensity is less than the minimum intensity or the collision count equals the threshold, recording the relevant information and terminating the current ray; if the ray does not interact with any object in the scene, recording the relevant information and terminating the current ray;
D. after the echo simulation of all rays is finished, performing the convolution operation to obtain the speckle information;
E. transmitting the post-processed image data to Unity3D to calculate the gray value of each pixel, and displaying the ultrasonic simulation image obtained by scanning the cross section of the corresponding model at the current ultrasonic probe position.
Examples
The invention performs simulated imaging taking an OBJ-format 3D model as an example (as shown in FIG. 2); the specific steps are as follows:
step one, importing the data of the human three-dimensional model group, reading the point data and surface data of each tissue model, and setting a mapping pointer from each tissue to its material;
dividing the memory areas: a result storage area for the collision points, a cache for the tissue parameter set, caches for the point data and surface data of the human three-dimensional models, and a cache for the corresponding tissue mapping pointers; then placing the positions of the probe elements and their initial ray direction vectors into the corresponding caches in the order in which the elements are arranged (a hypothetical layout of these buffers is sketched below);
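A hypothetical host-side layout for these buffers might look as follows; all structure and field names are illustrative assumptions, not taken from the patent's source code:

    #include <vector>

    struct Vec3          { double x, y, z; };
    struct TissueParams  { double attenuation, impedance, scattererDensity; };
    struct CollisionRec  { Vec3 point; double amplitude; int tissueId; };

    struct SimulationBuffers {
        std::vector<Vec3>         vertices;      // point data of the 3-D models
        std::vector<int>          faces;         // surface (triangle) index data
        std::vector<TissueParams> tissues;       // tissue parameter set cache
        std::vector<int>          tissueOfFace;  // face-to-tissue mapping "pointer"
        std::vector<Vec3>         probeOrigins;  // probe element positions, in order
        std::vector<Vec3>         probeDirs;     // initial ray direction per element
        std::vector<CollisionRec> results;       // collision-point result storage
    };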
step two, unifying the probe element coordinates and rotation calculation between Unity3D and OptiX; the unification of the Unity3D and OptiX engine coordinate and rotation conventions is shown in FIG. 4;
then acquiring the probe position from Unity3D to place the probe elements; when OptiX has obtained the coordinates and direction of the probe, an arc is drawn from the probe position and direction, the element positions are evenly divided along it according to the number of elements, and the starting rays are initialized with their initial positions on the arc, their directions pointing outward from the arc, and the material at the starting position set; a schematic of the probe elements and rays is shown in FIG. 3;
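The even division of the elements along the arc, each with an outward initial ray direction, can be sketched like this; placing the arc in the x-z plane and all parameter names are simplifying assumptions:

    #include <cmath>
    #include <vector>

    struct Vec3 { double x, y, z; };

    // Place nElements probe elements evenly along a circular arc (convex probe)
    // and give each an initial ray direction pointing outward from the arc.
    void layoutProbe(Vec3 center, double radius, double arcAngle, int nElements,
                     std::vector<Vec3>& origins, std::vector<Vec3>& dirs)
    {
        for (int i = 0; i < nElements; ++i) {
            // Evenly divide the arc among the elements.
            double t   = (nElements == 1) ? 0.5 : double(i) / (nElements - 1);
            double ang = -arcAngle / 2.0 + t * arcAngle;
            Vec3 outward = { std::sin(ang), 0.0, std::cos(ang) };  // unit vector
            origins.push_back({ center.x + radius * outward.x,
                                center.y,
                                center.z + radius * outward.z });
            dirs.push_back(outward);                               // ray direction
        }
    }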
step three, setting the collision boxes and collision detection functions using the point data and surface data imported in step one; on a collision, returning the tissue parameters of the collision point through the tissue-material mapping pointer; when a ray intersects one of the models in the model group, computing with the following formulas and the tissue parameters corresponding to that model, and recording the collision result into the collision point set; selecting one of the reflected and refracted rays generated by the collision using the Monte Carlo method (a sketch of this selection follows the formulas below); checking whether the ray still has sufficient energy and has not exceeded the collision upper limit: if so, continuing step three with the new ray; if not, ending the current ray thread;
[Four formulas, reproduced only as images in the original publication, define the reflection intensity Jr, the refraction intensity Ji, the reflection direction Vr, and the refraction direction Vi.]
in the formula: j. the design is a squarerRepresenting the intensity of the reflection; j. the design is a squareiRepresents the refractive strength; vrIndicating the direction of reflection; viRepresents the direction of refraction; θ represents an angle; n represents a normal vector;
step four, traversing the image matrix from left to right and from top to bottom according to the collision point set, the traversal pointer moving by the ratio of the resolution to the ray step length; where no collision point is encountered, the value is described by the following formula:
I = I0 e^(-a·l·f)
in the formula: I represents the amplitude; a represents the attenuation coefficient of the tissue; l represents the distance; f represents the frequency; I0 represents the initial energy;
when a collision point is encountered, it is described by the following formula and the material value is switched before continuing downwards until the detection length is exceeded; finally the image contour map is drawn:
I(P_T) = Σ_{i=1..n} ∫₀ᵀ I_t(ω_i) dt
in the formula: P_T represents the collision point reached by rays R_i; I represents the collision point amplitude; ω_i represents the incidence angle of ray R_i; n represents the number of rays hitting this point; I_t represents the incident energy in direction ω_i; t represents the integration variable ranging from 0 to T;
Fifthly, adding scatterer points which accord with the current tissue characteristics according to the parameters of the material, and finally performing convolution operation post-processing on the whole image (the schematic diagram after the convolution post-processing is shown in fig. 5) to obtain the pixel value information of the final ultrasonic simulation image;
step six, when the ray tracing computation of step five is complete, the OptiX program transmits the pixel value information of the simulated image to the Unity3D program through an array; Unity3D creates a Texture2D of the same pixel dimensions, sets the gray value of each pixel from the values stored in the array, and then hands the Texture2D to an image component to render the complete ultrasonic simulation image (as shown in FIG. 6).
Although the present invention has been described with reference to the above embodiments, it should be understood that the present invention is not limited to the above embodiments, and those skilled in the art can make various changes and modifications without departing from the scope of the present invention.

Claims (3)

1. The method for simulating ultrasonic imaging in real time based on the OptiX and Unity3D virtual reality platforms is characterized by comprising the following steps:
step one, building a virtual reality environment in Unity3D, importing 3D model data, and acquiring a space coordinate of a 3D model through an ultrasonic probe;
step two, creating a ray tracing scene, loading a 3D model in the scene, and uniformly converting the space coordinates of the 3D model in the Unity3D virtual reality environment into the space coordinates in the ray tracing scene;
step three, starting ray tracing from the converted spatial coordinates according to the initial ray intensity to obtain the collision point set of the ultrasonic simulation image;
wherein step three specifically comprises: transferring the converted spatial coordinates to the ray emission origin in the ray tracing scene; setting up collision boxes and collision detection functions using the point data and surface data; then performing ray collisions according to the ray tracing algorithm to obtain the information of the collision points; and finally generating the collision point set from the information of the collision points;
wherein the specific process of ray collision in the ray tracing algorithm is as follows: creating a bounding-box acceleration structure, creating the ray buffer according to the corresponding probe parameters, and creating the ray collision buffer; if a ray collides with a model bounding box, calculating the intensity and the collision count of the ray: if the ray intensity is greater than the minimum intensity and the collision count has not reached the collision threshold, recording the information of the collision point and then generating the corresponding reflected and refracted rays; if the ray intensity is less than the minimum intensity or the collision count equals the threshold, recording the relevant information and terminating the current ray; if the ray does not interact with any object in the scene, recording the relevant information and terminating the current ray;
step four, generating an image contour map of the ultrasonic simulation image according to the collision point set of the ultrasonic simulation image, and performing a convolution operation on the contour map to obtain the pixel value information of the ultrasonic simulation image;
wherein the specific process of generating the image contour map of the ultrasonic simulation image from the collision point set in step four is as follows:
according to the collision point set, the image matrix is traversed from left to right and from top to bottom, the traversal pointer moving by the ratio of the resolution to the ray step length; where no collision point is encountered, the value is described by the following formula:
I = I0 e^(-a·l·f)
in the formula: I represents the amplitude; a represents the attenuation coefficient of the tissue; l represents the distance; f represents the frequency; I0 represents the initial energy;
when a collision point is encountered, it is described by the following formula and the material value is switched before continuing downwards until the detection length is exceeded; finally the image contour map is drawn:
I(P_T) = Σ_{i=1..n} ∫₀ᵀ I_t(ω_i) dt
in the formula: P_T represents the collision point reached by rays R_i; I represents the collision point amplitude; ω_i represents the incidence angle of ray R_i; n represents the number of rays hitting this point; I_t represents the incident energy in direction ω_i; t represents the integration variable;
and step five, transmitting the pixel value information of the ultrasonic simulation image to Unity3D to calculate the gray value of each pixel, and finally displaying the ultrasonic simulation image obtained by scanning the cross section of the corresponding model at the current ultrasonic probe position.
2. The method for real-time simulated ultrasound imaging based on the OptiX and Unity3D virtual reality platforms of claim 1, wherein the 3D model data includes point data and surface data.
3. The method for simulating ultrasonic imaging in real time based on the OptiX and Unity3D virtual reality platforms according to claim 1, wherein the uniform transformation of the spatial coordinates in step two comprises translation and rotation;
wherein the translation formula is as follows:
S = T · P

    T = | 1  0  0  tx |
        | 0  1  0  ty |
        | 0  0  1  tz |
        | 0  0  0  1  |

in the formula: S represents the position of the probe after the change; P represents the original position of the probe; T represents the translation matrix; tx, ty, tz respectively represent the translation lengths along the x, y and z axes;
wherein the rotation formula is as follows:
S = e · P
e = (X, Y, Z)
Q = w + ix + jy + kz
X = asin(2wx - 2yz)
Y = atan2(2wx + 2xz, 1 - 2x² - 2y²)
Z = atan2(2wz + 2xy, 1 - 2x² - 2z²)
in the formula: S represents the position of the probe after the change; P represents the original position of the probe; e represents the Euler angles; Q represents the quaternion; X, Y, Z respectively represent the rotation angles around the x, y and z axes; w, x, y and z are the real components of the quaternion, and i, j, k are the imaginary units.
CN202010910398.8A · priority/filing date 2020-09-02 · Real-time simulation ultrasonic imaging method based on OptiX and Unity3D virtual reality platforms · granted as CN112001998B · Active

Priority Applications (1)

CN202010910398.8A · priority date 2020-09-02 · filing date 2020-09-02 · Real-time simulation ultrasonic imaging method based on OptiX and Unity3D virtual reality platforms


Publications (2)

Publication Number Publication Date
CN112001998A · 2020-11-27
CN112001998B · 2021-02-19

Family

ID=73465731

Family Applications (1)

CN202010910398.8A · Real-time simulation ultrasonic imaging method based on OptiX and Unity3D virtual reality platforms · priority/filing date 2020-09-02 · Active

Country Status (1)

Country Link
CN (1) CN112001998B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104224230A (en) * 2014-09-15 2014-12-24 声泰特(成都)科技有限公司 Three-dimensional and four-dimensional ultrasonic imaging method and device based on GPU (Graphics Processing Unit) platform and system
CN105530871A (en) * 2013-09-11 2016-04-27 波士顿科学国际有限公司 Systems and methods for selection and displaying of images using an intravascular ultrasound imaging system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103505288B * 2012-06-29 2017-11-17 General Electric Company Ultrasonic imaging method and ultrasonic imaging apparatus
EP2976609B1 (en) * 2013-03-19 2022-01-05 Koninklijke Philips N.V. System for hyperspectral imaging in visible light, method for recording a hyperspectral image and displaying the hyperspectral image in visible light
US9747709B2 (en) * 2014-12-19 2017-08-29 General Electric Company Method and apparatus for animate visualization of static 3-D data
CN111403007A * 2018-12-29 2020-07-10 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Ultrasonic imaging optimization method, ultrasonic imaging system and computer-readable storage medium
CN110298915A * 2019-06-19 2019-10-01 Tianjin University Fast-volume-rendering three-dimensional ultrasound image reconstruction algorithm incorporating a scattering model


Also Published As

Publication number Publication date
CN112001998A · 2020-11-27

Similar Documents

Publication Publication Date Title
US10565900B2 (en) Ray-tracing methods for realistic interactive ultrasound simulation
Burger et al. Real-time GPU-based ultrasound simulation using deformable mesh models
KR101717695B1 (en) Simulation of medical imaging
Tom et al. Simulating patho-realistic ultrasound images using deep generative networks with adversarial learning
Kutter et al. Visualization and GPU-accelerated simulation of medical ultrasound from CT images
JP6147489B2 (en) Ultrasonic imaging system
EP1844438B1 (en) Method and system for the simulation or digital synthesis of echographic images
JP4204095B2 (en) 3D imaging system and method for subject volume
US8241041B2 (en) Process and system for simulation or digital synthesis of sonographic images
CN107533808A (en) Ultrasonic simulation system and method
US20170032702A1 (en) Method and Apparatus For Generating an Ultrasound Scatterer Representation
WO2007100263A1 (en) Method for simulation of ultrasound images
CN102496320B Real-time ultrasonic image simulation method based on CT volume data
Perreault et al. Speckle simulation based on B-mode echographic image acquisition model
CN112001998B (en) Real-time simulation ultrasonic imaging method based on OptiX and Unity3D virtual reality platforms
Starkov et al. Ultrasound simulation with deformable and patient-specific scatterer maps
Law et al. Ultrasound image simulation with gpu-based ray tracing
Szostek et al. Real-time simulation of ultrasound refraction phenomena using ray-trace based wavefront construction method
Barnouin et al. A real-time ultrasound rendering with model-based tissue deformation for needle insertion
Starkov et al. Ultrasound simulation with animated anatomical models and on-the-fly fusion with real images via path-tracing
WO2007101346A1 (en) Ultrasound simulator and method of simulating an ultrasound examination
Wang et al. A real-time ultrasound simulator using Monte-Carlo path tracing in conjunction with OptiX engine
Varray et al. Hybrid strategy to simulate 3-D nonlinear radio-frequency ultrasound using a variant spatial PSF
Bürger et al. Simulation of dynamic ultrasound based on CT models for medical education
CN112716519A (en) Medical image reverse time migration imaging method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant