KR101501172B1 - Method and apparatus for providing stereoscopic image - Google Patents

Method and apparatus for providing stereoscopic image Download PDF

Info

Publication number
KR101501172B1
KR101501172B1 (Application No. KR20130115602A)
Authority
KR
South Korea
Prior art keywords
sample point
observation
observation light
angle
stereoscopic image
Prior art date
Application number
KR20130115602A
Other languages
Korean (ko)
Inventor
최장원
유성복
최윤식
김준호
Original Assignee
알피니언메디칼시스템 주식회사
연세대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 알피니언메디칼시스템 주식회사, 연세대학교 산학협력단 filed Critical 알피니언메디칼시스템 주식회사
Priority to KR20130115602A
Application granted granted Critical
Publication of KR101501172B1

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8993 Three dimensional imaging systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Acoustics & Sound (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Image Generation (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

A stereoscopic image generation method and apparatus therefor are disclosed. According to one aspect of the present invention, a stereoscopic image generation method comprises: acquiring volume data formed in a virtual space; setting observation rays projected onto the volume data from a first viewpoint and a second viewpoint, respectively; calculating a sample-point shift count for each set observation ray and selecting sample points based on the calculated shift count; and rendering the volume data using the selected sample points to generate a stereoscopic image.

Description

TECHNICAL FIELD: The present invention relates to a method and apparatus for generating a stereoscopic image.

The present invention relates to an image processing technique, and more particularly, to a stereoscopic image generation technique.

BACKGROUND ART: Ultrasonic diagnostic apparatuses, widely used in the medical field, generate images of the internal structure of a target object (for example, a patient's internal organs). An ultrasonic diagnostic apparatus generally uses transducer elements to transmit an ultrasonic signal to, and receive it from, the target object. That is, an acoustic transducer or acoustic transducer array is electrically stimulated to transmit an ultrasonic signal into the object. The ultrasonic signal is reflected wherever the propagating signal meets a discontinuity in the object's tissue, producing an ultrasonic echo signal. The echo signal is converted back into an electrical signal by the transducer elements and, after amplification and signal processing, yields ultrasound image data of the internal tissue.

The ultrasonic diagnostic apparatus is important in the medical field because it can provide real-time, high-resolution images of the internal structure of an object. A 2D ultrasound image is formed from the acquired ultrasound data, and the internal anatomy is judged by a physician's interpretation. However, to overcome the limitations of 2D ultrasound images and broaden the range of ultrasound applications, three-dimensional and four-dimensional ultrasound stereoscopic imaging technologies are being developed.

Korean Patent Publication No. 10-2006-0085596, published July 27, 2006. Korean Patent Publication No. 10-2012-0084644, published July 30, 2012. Korean Patent Publication No. 10-2012-0056934, published June 5, 2012.

According to an embodiment, a stereoscopic image generation method and apparatus are proposed that provide a stereoscopic image in real time by improving the rendering speed of volume data, and that provide an individualized adaptive stereoscopic image according to the visual differences of each observer.

According to an embodiment of the present invention, a stereoscopic image generation method comprises: acquiring volume data formed in a virtual space; setting observation rays projected onto the volume data from each of a first viewpoint and a second viewpoint of an observation plane spaced apart from the virtual space; calculating a sample-point shift count for each set observation ray and selecting sample points based on the calculated shift count; and rendering the volume data using the selected sample points to generate a stereoscopic image.

The step of selecting sample points according to an embodiment calculates the sample-point shift count based on a tangent function of the angle formed between the observation plane and each observation ray, and selects sample points based on the calculated count. When the angle formed between the observation plane and the observation ray is θ, the shift count x_shift may be computed as x_shift = └1/tan θ┘, x_shift = ┌1/tan θ┐, or x_shift = <1/tan θ>, where └ ┘ denotes the floor (Gaussian) function, ┌ ┐ the ceiling function, and < > rounding to the nearest integer.
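As a brief illustration, the three variants can be computed as follows. This is a minimal sketch, not the patented implementation; the function name `shift_count` and the `mode` parameter are assumptions introduced here for clarity:

```python
import math

def shift_count(theta_deg: float, mode: str = "floor") -> int:
    """Shift count x_shift for an observation ray forming the angle
    theta (degrees) with the observation plane, using the floor,
    ceiling, or rounding variant described above."""
    ratio = 1.0 / math.tan(math.radians(theta_deg))
    if mode == "floor":
        return math.floor(ratio)   # floor (Gaussian) variant
    if mode == "ceil":
        return math.ceil(ratio)    # ceiling variant
    return round(ratio)            # round-to-nearest variant
```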

The step of selecting sample points according to another embodiment may calculate the sample-point shift count according to the angle between the observation plane and the observation ray using a preset reference table, and select sample points based on the calculated shift count.

The step of selecting sample points may select, as a new sample point, a second sample point shifted one space to the left or right whenever the count of accumulated sample points, starting from a first sample point among the sample points formed on the observation ray, reaches the calculated shift count.
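For illustration only, this selection rule can be realized as the hedged sketch below; the helper name, the grid indexing as (depth step, column), and the `direction` convention are assumptions, not the patent's notation:

```python
def select_sample_points(depth: int, x_shift: int, start_col: int,
                         direction: int = -1) -> list[tuple[int, int]]:
    """Walk one sample point per depth step along the ray; every time
    x_shift accumulated sample points have been passed, shift one space
    (direction -1 = left, +1 = right) to pick the next sample point."""
    points, col = [], start_col
    for step in range(depth):
        if step > 0 and step % x_shift == 0:
            col += direction   # the shifted point becomes the new sample point
        points.append((step, col))
    return points

# With x_shift = 8 (the theta = 7 degree example), the column steps left
# every eight accumulated sample points:
print(select_sample_points(depth=18, x_shift=8, start_col=50))
```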

In the step of setting the observation rays, the angle of a left-viewpoint observation ray corresponding to the observer's left viewpoint and the angle of a right-viewpoint observation ray corresponding to the observer's right viewpoint may be set.

The step of setting the observation rays according to an exemplary embodiment may adjust the angles of the left and right viewpoint observation rays according to the visual difference of each observer, in order to generate an adaptive stereoscopic image customized to that difference; the step of selecting sample points may then adjust the per-viewpoint sample point selection by recomputing the shift count from the adjusted angles.

A stereoscopic image generating apparatus according to another embodiment includes: an observation ray setting unit that sets observation rays projected onto the volume data of a virtual space from each of a first viewpoint and a second viewpoint of an observation plane; a sampling unit that calculates a sample-point shift count for each set observation ray and selects sample points based on the calculated count; and a rendering unit that generates a stereoscopic image by rendering the volume data using the selected sample points.

According to one embodiment, the rendering speed of the volume data is improved, and a stereoscopic image can be provided in real time. That is, by automatically calculating the sample-point shift count for each observation ray projected onto the volume data and selecting sample points accordingly, volume rendering can be performed using only the selected sample points, which improves the stereoscopic image generation speed. In particular, a stereoscopic image can be generated within a short time by the simple method of computing the shift count from a tangent function of the angle formed between the observation plane and each observation ray, or from a preset reference table.

Furthermore, an adaptive stereoscopic image can be provided for each observer. In other words, by setting an observation-ray angle suited to, or preferred by, each observer and rendering the volume data with the adjusted angle, an adaptive stereoscopic image can be generated per observer.

Furthermore, by setting observation rays at the observer's two (left and right) viewpoints and performing volume rendering for each, more vivid and realistic stereoscopic images can be generated than when a stereoscopic image is produced from a single viewpoint. In this case, 3D ultrasound stereoscopic images of the inside of the human body, such as a fetus or an organ, can be provided more vividly and realistically.

FIG. 1 is a configuration diagram of a stereoscopic image generation system according to an embodiment of the present invention;
FIG. 2 is a detailed configuration diagram of the image processing unit of FIG. 1 according to an embodiment of the present invention;
FIG. 3 is a reference view showing an example of observation ray projection according to an embodiment of the present invention;
FIG. 4 is a three-dimensional view showing an example of observation ray projection at the left and right viewpoints of an observer according to an embodiment of the present invention;
FIGS. 5 and 6 are cross-sectional views illustrating examples of observation ray projection at the left and right viewpoints of an observer according to various embodiments of the present invention;
FIGS. 7 and 8 are cross-sectional views of sample point selection embodiments for left and right viewpoint observation rays according to various embodiments of the present invention;
FIG. 9 is a reference view showing a sample point selection embodiment for left and right viewpoint observation rays according to another embodiment of the present invention;
FIG. 10 is a reference view showing left and right viewpoint 3D ultrasound images obtained using a stereoscopic image generation technique according to an embodiment of the present invention;
FIG. 11 is a flowchart illustrating a method of generating a stereoscopic image according to an exemplary embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, detailed descriptions of known functions and configurations are omitted where they would obscure the subject matter of the present invention. The terms used below are defined in consideration of the functions of the present invention and may vary depending on the intention or practice of a user or operator; their definitions should therefore be based on the contents of this specification as a whole.

FIG. 1 is a configuration diagram of a stereoscopic image generation system 1 according to an embodiment of the present invention.

Referring to FIG. 1, the stereoscopic image generation system 1 includes a transducer 10, a signal processing unit 11, a scan converter 12, an image processing unit 13, and a display unit 14.

In this specification, the description focuses on a technique for generating an ultrasound stereoscopic image from ultrasound image data, for the purpose of diagnosing the internal structure of a target such as a human body using ultrasound. In this case, the stereoscopic image generation system 1 corresponds to an ultrasonic diagnostic apparatus. However, the technical field of the present invention is not limited to ultrasonic diagnosis. For example, it can be applied not only to ultrasound but also to medical imaging using CT or MRI, and indeed to any image processing field that generates stereoscopic images, medical or not. Here, stereoscopic images are three-dimensional (3D) or four-dimensional (4D) images.

Hereinafter, the configuration of the stereoscopic image generation system 1 will be described with reference to FIG. 1. The transducer 10 transmits an ultrasonic signal to a target object, converts the ultrasonic signal reflected from the boundaries between different media back into an electrical signal, and transfers it to the signal processing unit 11. The signal processing unit 11 performs signal processing such as TGC (Time Gain Compensation) amplification, echo processing, SDP (Spectral Doppler Processing), and CDP (Color Doppler Processing) on the signal received from the transducer 10, and transmits ultrasound image data to the scan converter 12. The ultrasound image data may include conic coordinates, angle information of each scan line with respect to the vertical scan line, and the like. Further, the signal processing unit 11 forms three-dimensional ultrasound image data based on the two-dimensional ultrasound image data of the object.
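As a side illustration of one of these steps, TGC amplification boosts later (deeper) echoes to offset depth-dependent attenuation. The sketch below is illustrative only; the function name and parameters are assumptions, not the patent's signal chain:

```python
def tgc_amplify(scanline, gain_db_per_cm, cm_per_sample):
    """Time Gain Compensation sketch: apply a gain that grows with
    sample index (i.e. with echo depth) to a received scan line."""
    return [s * 10 ** (gain_db_per_cm * i * cm_per_sample / 20.0)
            for i, s in enumerate(scanline)]
```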

The scan converter 12 spatially and temporally converts the 3D ultrasound image data of the object so as to be compatible with the format of the display unit 14. The scan converter 12 can scan-convert the three-dimensional ultrasound image data of the object represented by the conic coordinates into the three-dimensional ultrasound image data of the rectangular coordinates.

The image processing unit 13 generates volume data from the 3D ultrasound image data and renders the generated volume data to produce a 3D ultrasound stereoscopic image. The three-dimensional ultrasound image data and volume data generation examples described above are merely examples to aid understanding of the present invention, and various embodiments are possible. In particular, the volume data generation process is not the focus of the present invention: the description below assumes that volume data has already been obtained through one of various known methods.

The image processing unit 13 according to the embodiment sets observation rays projected onto the volume data from the first viewpoint and the second viewpoint of an observation plane spaced apart from the virtual space in which the acquired volume data is located. It then calculates the sample-point shift count for each set observation ray to select sample points, and generates a stereoscopic image by volume rendering using the selected sample points. The detailed configuration of the image processing unit 13 is described later with reference to FIG. 2. The display unit 14 outputs the stereoscopic image generated by the image processing unit 13 to the screen.

FIG. 2 is a detailed configuration diagram of the image processing unit 13 of FIG. 1 according to an embodiment of the present invention.

Referring to FIG. 2, the image processing unit 13 includes an observation ray setting unit 130, a sampling unit 132, and a rendering unit 134. Each component of the image processing unit 13 is described below with reference to FIG. 2, together with the observation ray projection example of FIG. 3 to aid understanding.

The observation ray setting unit 130 sets the observation rays 331 and 332 projected from the observer's first and second viewpoints on the observation plane 300 onto the volume data 320 of the virtual space 310. When the observation rays 331 and 332 are set, a predetermined angle is formed between the observation plane 300 and each of them.

The observation plane 300 corresponds to the screen on which the 3D ultrasound image is displayed, and the virtual space 310 is the space obtained by extending the observation plane 300 into 3D. The volume data 320 is located in the virtual space 310 and consists of an object space to be expressed in the image and an empty space that is not. For example, in the case of a fetus, the amniotic fluid corresponds to the empty space and the fetal face surface to the object space. The volume data 320 can be obtained from the scan conversion result of the scan converter (12 in FIG. 1).

The sampling unit 132 performs sampling by selecting sample points on the observation rays set by the observation ray setting unit 130. Two-dimensional (2D) stereoscopic image data can be obtained easily by using two cameras (or scanners) matched to the viewpoints of the left and right eyes. In contrast, a 3D stereoscopic image cannot be obtained as simply, because it is produced through post-processing, such as volume rendering, of the volume data. Accordingly, in the present invention, to generate a stereoscopic image from already-generated volume data, the sampling unit 132 selects sample points suited to the observation rays corresponding to the observer's left and right viewpoints, and the rendering unit 134 renders the volume data using the sample points selected by the sampling unit 132.

To select sample points, the sampling unit 132 calculates the shift count (x_shift) of sample points on each observation ray set by the observation ray setting unit 130. The sampling unit 132 according to one embodiment calculates the shift count (x_shift) based on the tangent function tan θ of the angle between the observation plane 300 and each of the observation rays 331 and 332, and then selects sample points based on the calculated shift count (x_shift). A specific embodiment is described below with reference to FIGS. 7 and 8.

The sampling unit 132 according to another embodiment calculates the shift count (x_shift) using a preset reference table. As shown in FIG. 9, described later, the reference table tabulates the shift count (x_shift) against the angle formed between the observation plane and the observation ray. The sampling unit 132 selects sample points based on the shift count (x_shift) obtained from the reference table. A concrete embodiment is described later with reference to FIG. 9.

The rendering unit 134 renders the volume data using the sampling points selected by the sampling unit 132 to generate a stereoscopic image. The generated stereoscopic image is displayed through the display unit (14 in Fig. 1).

FIG. 3 is a reference view showing an example of observation ray projection according to an embodiment of the present invention.

Referring to FIG. 3, a volume rendering process is required to generate a stereoscopic image from volume data. Volume ray casting is one such volume rendering method. According to one embodiment, virtual observation rays 331 and 332 are projected onto the volume data from the observer's first and second viewpoints formed at pixels of the observation plane 300; color and transparency values are obtained at the sample points along each ray; and these values are accumulated to determine the color of the pixel from which the ray was cast. The color at each intersection between a voxel (a three-dimensional pixel) and the observation rays 331 and 332 is determined, and this color becomes one pixel value of the output image. Repeating this process for all pixels completes the output image.
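For context, a minimal front-to-back accumulation along one ray might look like the sketch below. This is textbook volume ray casting, not code from the patent; the transfer-function tables `colors` and `alphas` are assumptions, and voxel values are assumed quantized to integers usable as table indices:

```python
def cast_ray(volume, points, colors, alphas):
    """Accumulate color and opacity at the selected sample points of one
    observation ray; the accumulated color becomes the pixel value."""
    acc_color, acc_alpha = 0.0, 0.0
    for z, x in points:                          # sample points along the ray
        v = volume[z][x]                         # voxel value at the intersection
        c, a = colors[v], alphas[v]              # transfer-function lookup
        acc_color += (1.0 - acc_alpha) * a * c   # front-to-back compositing
        acc_alpha += (1.0 - acc_alpha) * a
        if acc_alpha >= 0.99:                    # early ray termination
            break
    return acc_color
```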

FIG. 4 is a reference view showing, in three dimensions, an example of observation rays projected at the left and right viewpoints of an observer according to an embodiment of the present invention.

Referring to FIGS. 2 and 4, the observation ray setting unit 130 sets a left-viewpoint observation ray 331 projected onto the volume data 320 from the observer's left viewpoint, with respect to the observation plane 300 spaced apart from the virtual space. It then sets a right-viewpoint observation ray 332 projected onto the volume data 320 from the observer's right viewpoint.

The left and right viewpoint observation rays 331 and 332 each form a predetermined angle with the observation plane 300. That is, applying the observation-ray principle of the volume ray casting method, the observation ray setting unit 130 sets the angles of the left and right viewpoint observation rays 331 and 332 as shown in FIG. 4. The rendering unit 134 then generates a stereoscopic image by performing volume rendering with each of the set observation rays 331 and 332. Performing volume rendering with an observation ray projected from each of the observer's left and right viewpoints yields left and right stereoscopic images, providing a vivid and realistic image to the observer.

FIGS. 5 and 6 are cross-sectional views illustrating examples of observation ray projection at the left and right viewpoints of an observer according to various embodiments of the present invention.

Referring to FIGS. 5 and 6, a predetermined angle is formed between the observation plane 300 and the left-viewpoint observation ray 331, and between the observation plane 300 and the right-viewpoint observation ray 332, according to the ray setting. In FIG. 5, the angle formed between the observation plane 300 and the left-viewpoint observation ray 331 is 90 degrees, and the angle θ1 formed between the observation plane 300 and the right-viewpoint observation ray 332 is 7 degrees. In FIG. 6, on the other hand, the angle θ2 formed between the observation plane 300 and the left-viewpoint observation ray 331 and the angle θ2 formed between the observation plane 300 and the right-viewpoint observation ray 332 are the same, 10 degrees. The tangent function tan θ of the angle formed between the observation plane 300 and each of the left and right viewpoint observation rays 331 and 332 is used to calculate the sample-point shift count (x_shift) described later with reference to FIGS. 7 and 8.

According to one embodiment, the mutually different angles of the left and right viewpoint observation rays are adjusted when the observation rays are set. Since the distance between the left and right eyes differs from observer to observer, the viewing angles of the two eyes differ even when looking at the same object. Taking this into account, the present invention sets an observation-ray angle suited to, or preferred by, each observer. The shift count (x_shift) is then adjusted, by the methods described later with reference to FIGS. 7 to 9, according to the per-observer adjusted angles of the left and right viewpoint observation rays, and the volume data is rendered using the adjusted shift count (x_shift). Accordingly, an individualized adaptive stereoscopic image can be generated according to each observer's visual difference.
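For instance, reusing the illustrative `shift_count` helper from above (the angles here are simply the figure examples, not per-observer measurements), different ray angles yield different sampling patterns per observer:

```python
for theta in (7.0, 10.0):   # e.g. two observers' adjusted ray angles
    print(theta, shift_count(theta, mode="round"))
# 7.0 -> 8 and 10.0 -> 6: each observer gets a different sample selection
```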

FIGS. 7 and 8 are reference diagrams illustrating sample point selection embodiments for left and right viewpoint observation rays according to various embodiments of the present invention.

Referring to FIGS. 7 and 8, once the angle between the observation plane and the observation ray is set, the shift count (x_shift) of accumulated sample points can be determined through Equation 1 below.

x_shift = └1/tan θ┘ (Equation 1)

In Equation 1, └ ┘ denotes the Gaussian (floor) symbol, and the value obtained from Equation 1 is the shift count of accumulated sample points used during volume rendering. That is, each time the number of selected sample points reaches the shift count (x_shift), a new accumulated sample point is chosen by shifting one column to the left or right. For example, when θ1 is 7 degrees as shown in FIG. 5, x_shift = └1/tan 7°┘ = └8.14┘ = 8, so a new accumulated sample point is selected by shifting one column to the left every eight accumulated sample points, as shown in FIG. 7.

The shift count (x_shift) of the sample point according to another embodiment is calculated through Equation 2.

x_shift = ┌1/tan θ┐ (Equation 2)

In Equation 2, ┌ ┐ denotes the ceiling (round-up) function, and the value obtained from Equation 2 is the shift count of accumulated sample points used during volume rendering. For example, when θ2 is 10 degrees as shown in FIG. 6, x_shift = ┌1/tan 10°┐ = ┌5.67┐ = 6; as shown in FIG. 8, for the left-viewpoint observation ray 331 a new accumulated sample point is selected by shifting one space to the right every six accumulated sample points, and for the right-viewpoint observation ray 332 by shifting one space to the left every six accumulated sample points.

The shift count (x_shift) of the sample point according to yet another embodiment is calculated through Equation 3.

x_shift = <1/tan θ> (Equation 3)

In Equation 3, < > denotes rounding to the nearest integer, and the value obtained from Equation 3 is the shift count of accumulated sample points used during volume rendering. For example, if θ1 is 9 degrees, x_shift = <1/tan 9°> = <6.31> = 6, so a new accumulated sample point is selected by shifting one space to the left or right every six accumulated sample points.
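A quick numeric check of the three worked examples (using standard library trigonometry; the snippet itself is not part of the patent text):

```python
import math

for theta, fn, label in [(7, math.floor, "Eq. 1"), (10, math.ceil, "Eq. 2"),
                         (9, round, "Eq. 3")]:
    ratio = 1.0 / math.tan(math.radians(theta))
    print(f"{label}: theta={theta} deg, 1/tan={ratio:.2f}, x_shift={fn(ratio)}")
# Eq. 1: theta=7 deg, 1/tan=8.14, x_shift=8
# Eq. 2: theta=10 deg, 1/tan=5.67, x_shift=6
# Eq. 3: theta=9 deg, 1/tan=6.31, x_shift=6
```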

FIG. 9 is a reference view showing a sample point selection example for left and right viewpoint observation rays according to another embodiment of the present invention.

Referring to FIG. 9, a reference table of shift counts according to θ is set in advance, and the shift count (x_shift) is selected by looking up the table with the set θ. This is expressed as Equation 4.

x_shift = Table(θ) (Equation 4), where Table denotes the preset reference table of shift counts indexed by the angle θ.

For example, when the set θ1 is 9 degrees, the reference table gives a shift count (x_shift) of 6 for θ = 9, so a new accumulated sample point is selected by shifting one space to the left every six accumulated sample points.
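A sketch of the table-lookup variant follows. The table contents are assumptions generated here with the rounding formula; only the θ = 9 → 6 entry is given by the text:

```python
import math

# Hypothetical reference table: shift count per integer angle (degrees).
REF_TABLE = {t: round(1.0 / math.tan(math.radians(t))) for t in range(1, 46)}

def shift_count_from_table(theta_deg: int) -> int:
    return REF_TABLE[theta_deg]     # lookup replaces runtime trigonometry

print(shift_count_from_table(9))    # -> 6, matching the example above
```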

FIG. 10 is a reference view showing left and right viewpoint 3D ultrasound images obtained using a stereoscopic image generation technique according to an embodiment of the present invention.

Referring to FIG. 10, a high-quality 3D stereoscopic image can be generated by setting observation rays corresponding to the observer's left and right viewpoints and performing volume rendering using the set rays.

For example, when θ1 is 7 degrees as shown in FIG. 5, x_shift = └1/tan 7°┘ = 8, so a new accumulated sample point is selected by shifting one space to the left every eight accumulated sample points, as shown in FIG. 7. The volume data is then rendered using the selected accumulated sample points to generate the high-quality left and right viewpoint 3D ultrasound images shown in FIG. 10.

FIG. 11 is a flowchart illustrating a method of generating a stereoscopic image according to an exemplary embodiment of the present invention.

Referring to FIG. 11, the stereoscopic image generating apparatus acquires volume data formed in a virtual space (1100).

Next, observation rays projected onto the volume data are set from each of the first and second viewpoints of an observation plane spaced apart from the virtual space (1110). The observation ray setting step 1110 according to an exemplary embodiment sets the angle of the left-viewpoint observation ray corresponding to the observer's left viewpoint and the angle of the right-viewpoint observation ray corresponding to the observer's right viewpoint.

Next, the shift count is calculated for each set observation ray, and sample points are selected based on the calculated shift count (x_shift) (1120).

In the sample point selection step 1120 according to an exemplary embodiment, the shift count (x_shift) is calculated based on the tangent function of the angle formed between the observation plane and each observation ray, and sample points are selected based on the calculated shift count (x_shift). When the angle formed between the observation plane and the observation ray is θ, the shift count may be computed as x_shift = └1/tan θ┘, x_shift = ┌1/tan θ┐, or x_shift = <1/tan θ> (Equations 1 to 3).

In the sample point selection step 1120 according to another embodiment, the shift count according to the angle between the observation plane and the observation ray is obtained from the preset reference table, and sample points are selected based on the obtained shift count.

To generate an adaptive stereoscopic image customized to each observer's visual difference, the angles of the left and right viewpoint observation rays are adjusted per observer in the observation ray setting step 1110, and the per-viewpoint sample point selection can then be adjusted by recomputing the shift count from the adjusted angles.

Then, the volume data is rendered using the selected sample points to generate a stereoscopic image (1130).

In summary, observation rays are set at the observer's two viewpoints (1110), and the shift count is calculated for each set observation ray (1120). Sample points are then selected based on the calculated shift count, and a stereoscopic image is generated through volume rendering (1130). Because the shift-count calculation described with reference to FIGS. 7 to 9 is simple, it adds little to the overall computation, and the stereoscopic image can be generated in real time. In particular, since recent volume rendering achieves about 10 frames per second, stereoscopic images can be generated and provided in real time.
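Tying the steps together, a hedged end-to-end sketch is shown below, reusing the illustrative helpers `shift_count`, `select_sample_points`, and `cast_ray` defined above. All names and conventions are assumptions, and boundary handling at the image edges is omitted:

```python
def generate_stereo_pair(volume, theta_left, theta_right,
                         width, depth, colors, alphas):
    """Steps 1110-1130: set a ray per eye, derive its shift count, select
    sample points per pixel column, and render one image per viewpoint."""
    images = []
    for theta, direction in ((theta_left, +1), (theta_right, -1)):  # 1110
        x_shift = shift_count(theta, mode="floor")                  # 1120
        view = [cast_ray(volume,
                         select_sample_points(depth, x_shift, col, direction),
                         colors, alphas)                            # 1130
                for col in range(width)]
        images.append(view)
    return images  # [left_view, right_view]
```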

The embodiments of the present invention have been described above. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed embodiments should be considered in an illustrative rather than a restrictive sense. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.

1: stereoscopic image generation system 10: transducer
11: signal processing unit 12: scan converter
13: image processing unit 14: display unit
130: observation ray setting unit 132: sampling unit
134: rendering unit 300: observation plane
310: virtual space 320: volume data
331, 332: observation rays

Claims (13)

A stereoscopic image generation method comprising:
acquiring volume data formed in a virtual space;
setting observation rays projected onto the volume data from each of a first viewpoint and a second viewpoint of an observation plane spaced apart from the virtual space;
calculating a sample-point shift count for each set observation ray based on the angle formed between the observation plane and the observation ray, and selecting sample points based on the calculated shift count; and
generating a stereoscopic image by rendering the volume data using the selected sample points.
2. The method of claim 1, wherein selecting the sample points comprises calculating the sample-point shift count based on a tangent function of the angle formed between the observation plane and each observation ray, and selecting sample points based on the calculated shift count.
3. The method of claim 2, wherein, when the angle formed between the observation plane and the observation ray is θ, the shift count x_shift of the sample point is computed as x_shift = └1/tan θ┘, x_shift = ┌1/tan θ┐, or x_shift = <1/tan θ>.
4. The method of claim 1, wherein selecting the sample points comprises calculating the sample-point shift count according to the angle between the observation plane and the observation ray using a preset reference table, and selecting sample points based on the calculated shift count.
5. The method of claim 1, wherein selecting the sample points comprises selecting, as a new sample point, a second sample point shifted one space to the left or right whenever the count of accumulated sample points, starting from a first sample point among the sample points formed on the observation ray, reaches the calculated shift count.
6. The method of claim 1, wherein the first viewpoint is the observer's left viewpoint and the second viewpoint is the observer's right viewpoint, and wherein setting the observation rays comprises setting the angle of a left-viewpoint observation ray corresponding to the observer's left viewpoint and the angle of a right-viewpoint observation ray corresponding to the observer's right viewpoint.
7. The method of claim 6, wherein setting the observation rays comprises adjusting the angles of the left and right viewpoint observation rays according to the visual difference of each observer, in order to generate an individualized adaptive stereoscopic image for that observer.
8. The method of claim 7, wherein selecting the sample points comprises adjusting the per-observer sample point selection by recomputing the shift count from the adjusted angles of the left and right viewpoint observation rays.
9. A stereoscopic image generating apparatus comprising:
an observation ray setting unit that sets observation rays projected onto volume data of a virtual space from each of a first viewpoint and a second viewpoint of an observation plane;
a sampling unit that calculates a sample-point shift count for each set observation ray based on the angle formed between the observation plane and the observation ray, and selects sample points based on the calculated shift count; and
a rendering unit that generates a stereoscopic image by rendering the volume data using the selected sample points.
10. The apparatus of claim 9, wherein the sampling unit calculates the sample-point shift count based on a tangent function of the angle formed between the observation plane and each observation ray, and selects sample points based on the calculated shift count.
11. The apparatus of claim 9, wherein the sampling unit calculates the sample-point shift count according to the angle between the observation plane and the observation ray using a preset reference table, and selects sample points based on the calculated shift count.
12. The apparatus of claim 9, wherein the observation ray setting unit adjusts the angles of the left and right viewpoint observation rays according to the visual difference of each observer, in order to generate an adaptive stereoscopic image customized to that observer.
13. The apparatus of claim 12, wherein the sampling unit adjusts the per-observer sample point selection by recomputing the shift count from the angles of the left and right viewpoint observation rays adjusted by the observation ray setting unit.
KR20130115602A 2013-09-27 2013-09-27 Method and apparatus for providing stereoscopic image KR101501172B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20130115602A KR101501172B1 (en) 2013-09-27 2013-09-27 Method and apparatus for providing stereoscopic image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR20130115602A KR101501172B1 (en) 2013-09-27 2013-09-27 Method and apparatus for providing stereoscopic image

Publications (1)

Publication Number Publication Date
KR101501172B1 2015-03-11

Family

ID=53027110

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130115602A KR101501172B1 (en) 2013-09-27 2013-09-27 Method and apparatus for providing stereoscopic image

Country Status (1)

Country Link
KR (1) KR101501172B1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060085596A (en) * 2005-01-24 2006-07-27 지멘스 메디컬 솔루션즈 유에스에이, 인크. Stereoscopic three or four dimensional ultrasound imaging
JP2009238768A (en) * 2008-03-03 2009-10-15 National Institute Of Advanced Industrial & Technology Tunnel magnetoresistance element
KR101090660B1 (en) * 2011-09-14 2011-12-07 인하대학교 산학협력단 Method for real-time volume rendering using point-primitive
KR20140052176A (en) * 2012-10-22 2014-05-07 삼성전자주식회사 Method and apparatus for providing 3 dimensional image


Similar Documents

Publication Publication Date Title
JP4649219B2 (en) Stereo image generator
JP6058283B2 (en) Ultrasonic diagnostic equipment
JP7112128B2 (en) METHOD, APPARATUS, IMAGE PROCESSING DEVICE, STORAGE MEDIUM AND THREE-DIMENSION IMAGING SYSTEM FOR CONVERTING TWO-DIMENSIONAL IMAGES TO THREE-DIMENSIONAL IMAGES
CN106254854B (en) Preparation method, the apparatus and system of 3-D image
KR101538658B1 (en) Medical image display method and apparatus
JP2009022788A (en) Ultrasonic diagnostic apparatus
KR20090107748A (en) Apparatus of multiview three-dimensional image synthesis for autostereoscopic 3d-tv displays and method thereof
KR101100464B1 (en) Ultrasound system and method for providing three-dimensional ultrasound image based on sub region of interest
JP2013119035A (en) Ultrasonic image formation system and method
CN102893306B (en) Medical diagnostic imaging apparatus and image processing apparatus
US9474509B2 (en) Apparatus and method for generating ultrasonic image
US9224240B2 (en) Depth-based information layering in medical diagnostic ultrasound
KR102218308B1 (en) ultrasonic image processing apparatus and method
EP3018628B1 (en) Imaging apparatus and imaging method
CN102985013B (en) Medical image diagnosis device, image processing device, and ultrasound diagnosis device
KR20140035747A (en) Ultrasound imaging apparatus and control method for the same
EP3130273B1 (en) Stereoscopic visualization system and method for endoscope using shape-from-shading algorithm
KR20160014933A (en) Ultrasonic apparatus and control method for the same
KR102005326B1 (en) Method for acquiring 3d depth information in image and system therefor
KR101501172B1 (en) Method and apparatus for providing stereoscopic image
CN116058868A (en) Portable augmented reality ultrasonic image visualization method, device and system
JP2014236340A (en) Image processing device, method, program, and stereoscopic image display device
JP2006197036A (en) Device and method for stereoscopic image display
JPH11155861A (en) Ultrasonoraph
JP5395611B2 (en) Ultrasonic diagnostic apparatus, image data generation apparatus, and control program for image data generation

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment (Payment date: 20180302; Year of fee payment: 4)
FPAY Annual fee payment (Payment date: 20190304; Year of fee payment: 5)
Year of fee payment: 5