US20100321380A1 - Image Processing Method and Associated Apparatus for Rendering Three-dimensional Effect Using Two-dimensional Image - Google Patents
- Publication number
- US20100321380A1 (application Ser. No. 12/794,943)
- Authority
- US
- United States
- Prior art keywords
- image
- quadrilateral
- trapezoidal
- coordinate change
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
Definitions
- the present invention relates to a mechanism for generating a three-dimensional (3D) effect, and more particularly, to an image processing method and an associated apparatus for rendering a 3D effect using two-dimensional (2D) images.
- methods for rendering graphics or images include a 2D image rendering approach and a 3D image rendering approach.
- the 2D image rendering approach is easier and less expensive to implement, yet has a disadvantage of lacking depth information.
- the 3D image rendering approach, although having the advantage of providing better visual enjoyment to viewers using its depth information, is burdened with shortcomings including more complicated and more costly implementation.
- the present invention provides an image processing method for rendering a 3D effect by transforming a first quadrilateral image to a second quadrilateral image, with at least one of the first and second quadrilateral images being a trapezoidal image.
- the image processing method comprises: providing the first quadrilateral image; generating four coordinates of four vertices associated with the second quadrilateral image according to the first quadrilateral image and the 3D effect; determining a relative height and a relative width according to a height and a width of the first quadrilateral image and the coordinates of the four vertices; and generating a plurality of pixel values of the second quadrilateral image according to the coordinates of the four vertices, the relative height, and the relative width with reference to a plurality of pixel values of the first quadrilateral image.
- the present invention further provides an image processing apparatus for rendering a 3D effect by transforming a first quadrilateral image to a second quadrilateral image, with at least one of the first and second quadrilateral images being a trapezoidal image.
- the image processing apparatus comprises a target image determining unit, a pixel determining unit and a calculation unit.
- the target image determining unit generates coordinates of four vertices associated with the second quadrilateral image according to the first quadrilateral image and the 3D effect.
- the pixel determining unit determines a relative height and a relative width according to a height and a width of the first quadrilateral image and the coordinates of the vertices, and determines a corresponding relationship between a plurality of pixels of the first quadrilateral image and a plurality of pixels of the second quadrilateral image.
- the calculation unit generates a plurality of pixel values of the second quadrilateral image according to the corresponding relationship with reference to a plurality of pixel values of the first quadrilateral image.
- FIG. 1 is a schematic diagram of rendering a 3D effect using 2D images according to an embodiment of the invention;
- FIG. 2 is a schematic diagram of an image processing apparatus according to an embodiment of the invention;
- FIG. 3 shows a schematic diagram of the quadrilateral images Q 1 and Q 2 according to one embodiment of the invention;
- FIGS. 4A and 4B are schematic diagrams of a corresponding relationship between pixels of the trapezoidal image and pixels of the rectangular image shown in FIG. 3 ;
- FIG. 5 is a schematic diagram of a brightness change according to an embodiment of the invention;
- FIG. 6 is a schematic diagram of a vertical trapezoidal image being 2D rotated according to an embodiment of the invention;
- FIG. 7 is a schematic diagram of operations for rendering a 3D effect from a rotated image that is first 2D rotated by a pixel determining unit; and
- FIG. 8 is a schematic diagram of the image processing apparatus in FIG. 2 rendering a 3D cube rotation effect using two trapezoidal images at different time points.
- a two-dimensional rendering approach is provided as a novel image processing approach for rendering operation icons or images of a user interface system, so as to render a 3D effect using 2D images without degrading overall system performance while also bringing better visual enjoyment to a user operating the user interface system.
- the image processing method and apparatus generate images that render at least one 3D effect, including image reshaping, rotating, twisting or expansion effects, and lighting effects.
- the image processing method and apparatus, based on 2D images that require no Z-axis information (i.e., image depth information), are capable of rendering the 3D effect.
- resources that a processor or a calculation unit devotes to calculations are significantly reduced, enhancing overall system calculation performance.
- cost of hardware previously applied for the conventional 2D image drawing method may only be slightly increased while cost of hardware previously needed to show 3D effects is reduced. Therefore, the method and apparatus of the invention offer cost advantages whether being realized by software or hardware.
- Icons of a user interface system are mostly quadrilateral icons, and more particularly, rectangular and square icons.
- a quadrilateral icon flips or rotates toward a predetermined angle
- a series of images during the transformation from an original image to a final image (i.e., a generated image)
- the method and apparatus according to an embodiment of the invention calculate a shape associated with each quadrilateral image to be generated, in order to respectively generate each quadrilateral image.
- the shape of a quadrilateral icon at the latter of two successive time points is calculated according to a difference between rotation angles of the two successive time points; alternatively, the shape of a quadrilateral icon at a current time point is calculated according to a difference of rotation angles between the current-time-point image and an original image; other similar modifications are also within the scope of the invention.
- FIG. 1 shows a schematic diagram of rendering a 3D effect using 2D images according to an embodiment of the invention.
- icons of the user interface system render a flipping or rotation effect in a 3D space, such as flipping vertically (e.g., icons “0” and “1”), or flipping horizontally (e.g., icons “2” and “3”).
- flipping vertically e.g., icons “0” and “1”
- flipping horizontally e.g., icons “2” and “3”.
- a series of images transformed from an originally rectangular icon “0” are a plurality of different trapezoidal images when flipping vertically.
- a height of the series of trapezoidal images of the icons “0” gradually reduces (from time t 1 to t 5 ), so as to render a visual effect of the icon “0” flipping vertically in a 3D space.
- a series of images transformed from an icon “1” in an originally shallow trapezoidal image are a plurality of different trapezoidal and square images when flipping vertically.
- a height of the series of the trapezoidal images of the icon “1” gradually increases (from time t 1 to t 5 ), so that one may perceive the shape of the icon “1” restoring to a normal shape (i.e., the square image) from a shallow shape as the height of the trapezoidal images becomes larger, to render a visual effect of the icon “1” flipping vertically in a 3D space.
- a series of images transformed from the icon “2” in an original square image are a plurality of different trapezoidal images when flipping horizontally.
- a height of the series of the trapezoidal images of the icon “2” gradually reduces from time t 1 to t 5 (for horizontal flipping, this “height” is measured along the width of the images), so that one may perceive the shape of the icon “2” gradually become shallow as the height of the trapezoidal images becomes smaller, to render a visual effect of the icon “2” flipping horizontally in a 3D space.
- a height of a series of trapezoidal images of the icon “3” gradually becomes larger from time t 1 to t 5 , so as to render a visual effect of the icon “3” flipping horizontally in a 3D space.
- brightness of a series of trapezoidal images is appropriately adjusted. For example, suppose a light source is right in front of the images. Brightness of the series of images dims as the icon “0” flips from a front side downwards to face down, or as the icon “2” flips from a front side rightwards to face a side. In contrast, brightness of the series of images becomes brighter as the icon “1” flips frontwards from facing a side to a front side, or as the icon “3” flips rightwards from facing a side to a front side.
- in FIG. 1 , dots are used to indicate differences in brightness, i.e., images with a large number of dots are dimmer in brightness, images with a smaller number of dots are brighter in brightness, and the square image without any dots has maximum brightness.
- the icons “0” to “3” are used for illustrating effects provided by the method and apparatus of the invention, but are not to limit the invention thereto. Description of realizing an embodiment of the invention by hardware shall be given below.
- FIG. 2 shows a schematic diagram of an image processing apparatus 200 according to an embodiment of the invention.
- the image processing apparatus 200 comprises a target image determining unit 201 , a memory unit 205 , a pixel determining unit 210 , a calculation unit 215 , and a buffer 220 . How a predetermined quadrilateral image among the abovementioned series of quadrilateral images is generated is discussed below.
- pixel values of a first quadrilateral image Q 1 are used to generate pixel values of a second quadrilateral image Q 2
- the first and second quadrilateral images Q 1 and Q 2 may be temporally successive, e.g., the two quadrilateral images are the icon “0” at the time t 1 and t 2 in FIG. 1
- the memory unit 205 is for storing a plurality of pixel values of the quadrilateral image Q 1 corresponding to a 2D image.
- the target image determining unit 201 is for generating four coordinates of four vertices associated with the quadrilateral image Q 2 according to coordinates of four vertices of the quadrilateral image Q 1 and a desired 3D effect.
- At least one of the quadrilateral images Q 1 and Q 2 is a trapezoidal image.
- upon obtaining the coordinates of the four vertices of the quadrilateral image Q 2 , the pixel determining unit 210 correspondingly outputs the plurality of pixel values of the quadrilateral image Q 1 from the memory unit 205 to the buffer 220 according to those coordinates as well as a height and a width of the quadrilateral image Q 1 .
- the calculation unit 215 then accordingly generates the plurality of pixel values of the quadrilateral image Q 2 .
- the image processing apparatus 200 generates, one after another, the abovementioned series of quadrilateral images that render a 3D effect, which may be, for example, the quadrilateral image Q 1 flipping in a 3D space.
- the quadrilateral images are outputted and displayed on a display to allow a user to perceive the rendered 3D effect from using 2D images.
- the 2D image is an icon of a user interface system
- the quadrilateral images Q 1 and Q 2 are images of two successive time points
- the quadrilateral image Q 1 is an image at a former time point of the 2D image (i.e., the icon) during the flipping or rotation.
- the 2D image is the icon “0” in FIG. 1
- the quadrilateral image Q 1 is the square image of the image “0” at the time t 1
- the quadrilateral image Q 2 is the trapezoidal image of the icon “0” at the time t 2 .
- the quadrilateral image Q 1 may also be the square image of the icon “0” at the time t 3
- the quadrilateral image Q 2 may also be the trapezoidal image of the icon “0” at the time t 4 .
- the quadrilateral image Q 1 may be the trapezoidal image of the icon “1” at the time t 2
- the quadrilateral image Q 2 may be the trapezoidal image of the icon “1” at the time t 3 .
- the quadrilateral image Q 1 may also be the trapezoidal image of the icon “1” at the time t 4
- the quadrilateral image Q 2 may also be the square image of the icon “1” at the time t 5 .
- the image processing apparatus 200 is capable of generating flipped or rotated images according to different desired flipping or rotation angles of the icon in a 3D space.
- the target image determining unit 201 is for generating the coordinates of the four vertices of the quadrilateral image Q 2 according to the coordinates of the four vertices of the quadrilateral image Q 1 as well as desired angle and direction of flipping or rotation.
- the quadrilateral image Q 1 is a source image, whose data is stored in the memory unit 205 .
- the target image determining unit 201 may be realized by hardware or software.
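The patent does not spell out the formulas the target image determining unit 201 uses, but a minimal sketch of what such a unit might compute for a vertical flip is shown below. The function name `flip_vertices`, the `shrink` factor, and the cosine foreshortening are illustrative assumptions, not the claimed method.

```python
import math

def flip_vertices(w, h, angle_deg, shrink=0.5):
    # Hypothetical sketch: compute the four vertex coordinates of the
    # trapezoid that a w-by-h rectangle maps to when flipped vertically
    # by angle_deg. The receding (top) edge is narrowed and the overall
    # height is foreshortened by the cosine of the flip angle.
    a = math.radians(angle_deg)
    new_h = h * math.cos(a)                      # foreshortened height
    inset = w * shrink * (1 - math.cos(a)) / 2   # horizontal inset of the top edge
    # Vertex order mirrors P21..P24 in FIG. 3:
    # top-left, top-right, bottom-left, bottom-right.
    return [(inset, 0.0), (w - inset, 0.0), (0.0, new_h), (w, new_h)]
```

At 0 degrees the trapezoid degenerates to the original rectangle, and as the angle approaches 90 degrees the height shrinks toward zero, matching the icon "0" sequence in FIG. 1.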
- FIG. 3 shows a schematic diagram of the quadrilateral images Q 1 and Q 2 according to one embodiment of the invention. As shown in FIG.
- the quadrilateral image Q 1 is a rectangular image
- coordinates P 21 ⁇ P 24 of the four vertices outputted by the target image determining unit 201 define the quadrilateral image Q 2 as a trapezoidal image.
- the memory unit 205 further stores display image data corresponding to a display image, in which the coordinates P 21 ⁓ P 24 generated by the target image determining unit 201 are located.
- the pixel determining unit 210 comprises a source coordinate generating unit 2101 and a target coordinate generating unit 2102 .
- the target coordinate generating unit 2102 generates a coordinate of each pixel of the trapezoidal image Q 2 according to the coordinates P 21 ⁇ P 24 of the four vertices of the trapezoidal image Q 2 .
- the target coordinate generating unit 2102 calculates change rates of coordinates at two sides of the trapezoidal image Q 2 (i.e., the change rates of the planar coordinates P 21 and P 23 of the two left vertices of the trapezoidal image, and the change rates of the planar coordinates P 22 and P 24 of the two right vertices of the trapezoidal image) to respectively obtain a first coordinate change rate and a second coordinate change rate.
- as the trapezoidal image Q 2 increases or decreases by a row, a start point and an end point of the next scan line can be obtained; that is, coordinates of the two terminal points of each scan line of the trapezoidal image can be obtained.
- in the first method, the two coordinate change rates of the left and right sides are respectively used to obtain a distance between coordinate start points of each two adjacent rows and a distance between coordinate end points of each two adjacent rows, and the target coordinate generating unit 2102 respectively accumulates the two coordinate change rates onto a current start point and a current end point with every increasing/decreasing row, so as to obtain coordinates of the start point and end point of a next row.
- in the second method, the coordinate of the start point of each row is calculated by adding the product of the current row number and the coordinate change rate to the coordinate of a vertex (e.g., the start point P 21 ).
- the coordinate of the end point of each row can be calculated in the same manner; that is, by adding the product of the current row number and the coordinate change rate to the coordinate of a vertex (e.g., the end point P 24 ).
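The accumulation scheme of the first method can be sketched as follows. The vertex ordering (P21/P22 on the top row, P23/P24 on the bottom row) and the division by `rows - 1` are illustrative assumptions.

```python
def scanline_endpoints(p21, p22, p23, p24, rows):
    # Sketch of the first method: derive the left-side and right-side
    # coordinate change rates from the vertices, then accumulate them
    # row by row to obtain the start/end x-coordinate of every scan line.
    left_rate = (p23[0] - p21[0]) / (rows - 1)   # first coordinate change rate
    right_rate = (p24[0] - p22[0]) / (rows - 1)  # second coordinate change rate
    start, end = float(p21[0]), float(p22[0])
    lines = []
    for _ in range(rows):
        lines.append((start, end))
        start += left_rate                       # accumulate onto current start point
        end += right_rate                        # accumulate onto current end point
    return lines
```

Each row costs only two additions, which is why the accumulation scheme suits a hardware realization.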
- the methods described above are for illustrative purposes, and proper modifications made by a person skilled in the art are within the scope of the invention.
- the source coordinate generating unit 2101 determines at least one corresponding pixel of the rectangular image Q 1 with respect to each pixel of the trapezoidal image Q 2 , and outputs a coordinate of the at least one pixel to the memory unit 205 , which further outputs pixel values corresponding to the at least one pixel to the buffer 220 .
- the calculation unit 215 then generates a pixel value for each pixel of the trapezoidal image Q 2 according to the pixel values corresponding to the at least one pixel in the buffer 220 .
- FIG. 4A and 4B show schematic diagrams illustrating operations of the source coordinate generating unit 2101 for determining a corresponding relationship between the pixels of the trapezoidal image Q 2 and the pixels of the trapezoid image Q 1 in FIG. 3 .
- how a target pixel value of a target pixel among the plurality of pixels of the trapezoidal image Q 2 is generated shall be described below.
- the source coordinate generating unit 2101 calculates an interval ΔH by which the rectangular image Q 1 correspondingly moves when corresponding image content in the trapezoidal image Q 2 moves by one row.
- the interval ⁇ H is regarded as an average interval, according to which the source coordinate generating unit 2101 calculates in the rectangular image Q 1 a row number of a predetermined scan line corresponding to the target pixel.
- the source coordinate generating unit 2101 calculates a pixel distance ⁇ W by which corresponding image content in the rectangular image Q 1 moves when a pixel in the trapezoidal image Q 2 moves, as shown in FIG. 4B .
- the pixel distance ⁇ W is regarded as an average pixel distance.
- the source coordinate generating unit 2101 determines at least one pixel from the scan line, and outputs a coordinate of the at least one pixel to the memory unit 205 .
- the memory unit 205 outputs at least one pixel value of the at least one pixel to a source data buffer 2201 .
- the calculation unit 215 generates the target pixel value of the trapezoidal image Q 2 according to the at least one pixel value in the source data buffer 2201 , and stores the target pixel value of the target pixel in a target data buffer 2202 .
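A hedged sketch of the ΔH/ΔW mapping performed by the source coordinate generating unit 2101 is given below; the nearest-pixel truncation and the function name `source_pixel` are assumptions for illustration, not the unit's actual circuitry.

```python
def source_pixel(tx, ty, row_width, rows2, w1, h1):
    # Map a target pixel (tx, ty) of the trapezoidal image Q2 to a
    # source pixel of the w1-by-h1 rectangular image Q1. The average
    # row interval dH (ΔH in FIG. 4A) tells how many source rows
    # correspond to one target row; the average pixel distance dW
    # (ΔW in FIG. 4B) how many source columns correspond to one
    # target pixel on the current scan line of width row_width.
    dH = h1 / rows2
    dW = w1 / row_width
    sy = min(int(ty * dH), h1 - 1)   # row number of the source scan line
    sx = min(int(tx * dW), w1 - 1)   # column within that scan line
    return sx, sy
```

When the current scan line of Q2 is narrower than W1, dW exceeds one and source columns are skipped; when it is wider, dW is below one and source columns are repeated, matching the two cases described above.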
- the source coordinate generating unit 2101 calculates an average pixel distance ΔW by which content in the rectangular image Q 1 correspondingly moves as a pixel in the trapezoidal image Q 2 moves. For example, when the target pixel is located at a relatively narrow part (narrower than the width W 1 of the rectangular image Q 1 ) of the trapezoidal image Q 2 , since W 2 is narrower than W 1 , the average pixel distance ΔW correspondingly moved in the rectangular image Q 1 is greater than the actual distance by which the pixel moves in the trapezoidal image Q 2 .
- a coordinate of a pixel in the rectangular image Q 1 is selected according to the average pixel distance ⁇ W, and the source coordinate generating unit 2101 then outputs the selected coordinate to the memory unit 205 .
- upon receiving the selected coordinate, the memory unit 205 accordingly outputs the pixel value of the pixel in the rectangular image Q 1 through a data bus BUS to the source data buffer 2201 .
- the pixel value is temporarily stored in the source data buffer 2201 , and serves as the pixel value of the target pixel of the trapezoidal image Q 2 .
- the pixel value of the target pixel of the trapezoidal image Q 2 may also be an average pixel value of the pixel of the rectangular image Q 1 and its neighboring pixels.
- when determining the coordinate of the pixel of the rectangular image Q 1 , the source coordinate generating unit 2101 outputs coordinates of the pixel and neighboring pixels to the memory unit 205 , which outputs pixel values of the pixel and neighboring pixels to the source data buffer 2201 to temporarily store the pixel values therein.
- the calculation unit 215 calculates an average pixel value that serves as the pixel value of the target pixel of the trapezoidal image Q 2 .
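The averaging variant might look like the following sketch, which averages the selected source pixel with its in-row neighbors; the three-pixel window and integer averaging are illustrative assumptions.

```python
def averaged_sample(img, sx, sy):
    # Sketch of the averaging variant: the selected source pixel is
    # averaged with its horizontal neighbors, so that source columns
    # skipped when ΔW > 1 still contribute to the target pixel value.
    # img is a row-major list of rows of integer pixel values.
    w = len(img[0])
    xs = [x for x in (sx - 1, sx, sx + 1) if 0 <= x < w]
    return sum(img[sy][x] for x in xs) // len(xs)
```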
- when the target pixel is located at a relatively wide part (wider than the width W 1 of the rectangular image Q 1 ) of the trapezoidal image Q 2 , since W 2 is wider than W 1 , the average pixel distance ΔW correspondingly moved in the rectangular image Q 1 , as also calculated by the source coordinate generating unit 2101 , is smaller than the actual distance by which the pixel moves in the trapezoidal image Q 2 .
- a coordinate of a pixel in the rectangular image Q 1 is selected according to the average pixel distance ⁇ W, and the source coordinate generating unit 2101 then outputs the selected coordinate to the memory unit 205 .
- upon receiving the selected coordinate, the memory unit 205 accordingly outputs the pixel value of the pixel in the rectangular image Q 1 through a data bus BUS to the source data buffer 2201 .
- the pixel value is temporarily stored in the source data buffer 2201 , and serves as the pixel value of the target pixel of the trapezoidal image Q 2 .
- although FIGS. 4A and 4B elaborate how a pixel value of a target pixel of the trapezoidal image Q 2 is obtained to render a 3D effect, the quadrilateral image Q 1 and the quadrilateral image Q 2 , apart from being a rectangular image and a trapezoidal image, respectively, as shown in FIGS. 4A and 4B , may also be other combinations, such as respectively a trapezoidal image and a rectangular image, or two trapezoidal images. More specifically, although the operations in FIGS. 4A and 4B involve a rectangular image to generate a trapezoidal image to further render a 3D image effect, in other embodiments, a trapezoidal image may also be utilized to generate another trapezoidal image, or a trapezoidal image may be utilized to generate a rectangular image in order to render a 3D effect; proper modifications similar to the ones above are within the scope of the invention. Further, the quadrilateral images Q 1 and Q 2 may be images of non-successive time points. Again with reference to FIG. 1 , for example, the image processing apparatus 200 may also generate a trapezoidal image of the icon “0” at the time t 3 or a trapezoidal image at the time t 5 from the rectangular image of the icon “0” at the time t 1 .
- the pixel values of the selected pixels are outputted by the memory unit 205 to the source data buffer 2201 .
- the calculation unit 215 obtains the stored pixel values temporarily stored in the source data buffer 2201 to generate a pixel value of each of pixels of the quadrilateral image Q 2 , and temporarily stores the generated pixel values in the target data buffer 2202 .
- the target data buffer 2202 then writes the pixel values of the quadrilateral image Q 2 back to the memory unit 205 according to the coordinates that the target coordinate generating unit 2102 generates for each of the pixels of the quadrilateral image Q 2 .
- the calculation unit 215 may correspondingly adjust the pixel values of the quadrilateral image Q 2 temporarily stored in the source data buffer 2201 , so that the quadrilateral image Q 2 displays corresponding brightness different from that of the quadrilateral image Q 1 to render a more realistic 3D effect.
- FIG. 5 shows a schematic diagram of a brightness change according to an embodiment of the invention. As shown in FIG. 5 , a light source is located right in front of images.
- an icon “3” displays a 3D effect
- a series of images thereof gradually dims (the brightness decreases as the number of dots increases in this embodiment) so that an obvious brightness difference exists between the icon “3” and an icon “4” underneath to give a more realistic 3D effect.
- the image of the icon “3” is evenly dimmed; however, brightness of the image of the icon “3” may instead be varied in part; such a modification is also within the scope of the invention.
- the light source may be positioned at different angles, such as at an upper-left or upper-right angle.
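One plausible way to realize the brightness adjustment for a light source straight in front of the screen is cosine shading; the rule below is an illustrative assumption, not the patent's specified formula.

```python
import math

def shaded_value(value, angle_deg):
    # Illustrative assumption: scale a pixel's brightness by the cosine
    # of the flip angle, clamped at zero, so a face-on icon keeps full
    # brightness and an edge-on icon (90 degrees) goes dark.
    scale = max(math.cos(math.radians(angle_deg)), 0.0)
    return int(value * scale)
```

For an off-axis light source (upper-left or upper-right), the scale factor could instead vary across the icon, producing the partial dimming mentioned above.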
- a vertical trapezoidal image (i.e., a trapezoid having a pair of horizontal parallel sides)
- the spirit of the invention is also suitable for generating a horizontal trapezoidal image (i.e., a trapezoid having a pair of vertical parallel sides).
- a first coordinate change rate and a second coordinate change rate are generated respectively according to coordinate change rates of two upper vertices and coordinate change rates of two lower vertices. Therefore, for 3D effects of flipping vertically or horizontally, or for flipping or rotation at different angles, the invention is capable of rendering the desired 3D effect using 2D images.
- FIG. 6 shows a schematic diagram of a vertical trapezoidal image being 2D rotated according to an embodiment of the invention.
- the 2D rotation is realized by the pixel determining unit 210 to provide rotation by at least three different angles.
- a vertex R of an original trapezoidal image is rotated clockwise by 90 degrees, 180 degrees or 270 degrees.
- FIG. 7 shows a schematic diagram of operations for rendering a 3D effect from a rotated image that is first 2D rotated by the pixel determining unit 210 .
- as shown in FIG. 7 , the target coordinate generating unit 2102 first re-defines coordinates for the 2D rotation. For example, a reference point originally defined at an upper-left vertex of the icon “2” is re-defined to a lower-left vertex to render a 2D 90-degree rotation effect.
- the original icon “2” is a to-be-processed 2D image 800 .
- the target coordinate generating unit 2102 rotates the to-be-processed 2D image 800 by a predetermined angle (i.e., 90 degrees clockwise) to generate coordinates of vertices of a rotated 2D image 805 , generates a vertical trapezoidal image 810 according to the coordinates of the vertices of the rotated 2D image 805 , and horizontally rotates the vertical trapezoidal image 810 by 270 degrees (i.e., 90 degrees counterclockwise) to obtain coordinates of vertices of the desired horizontal trapezoidal image 815 by referencing the to-be-processed 2D image.
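The coordinate re-definition for the 2D rotation step can be sketched as a quarter-turn of vertex coordinates; the function below is a hypothetical illustration of that re-mapping, not the pixel determining unit's actual circuitry.

```python
def rotate90_cw(points, h):
    # Rotate vertex coordinates of an image of height h by 90 degrees
    # clockwise, so that a routine that only generates vertical
    # trapezoids can be reused to produce horizontal ones. The pixel
    # mapping (x, y) -> (h - 1 - y, x) is the usual quarter-turn for a
    # row/column grid; applying it three times yields the 270-degree
    # (counterclockwise) rotation used in the final step.
    return [(h - 1 - y, x) for (x, y) in points]
```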
- a 2D rotation may also be integrated with operations for generating a horizontal trapezoidal image to generate a vertical trapezoidal image; as such modification is within the scope of the invention.
- the image processing method and associated apparatus are capable of rendering a 3D effect by using 2D images without needing any depth information of icons of a user interface system. Therefore, compared with current 3D image drawing techniques, the apparatus and method of the invention significantly lower software and hardware costs and resources.
- although the foregoing embodiments describe only icons in a user interface system, with proper modifications, the concept of the invention is also applicable to displaying a 3D effect of an image in a user interface system, or even to a display interface of a non-user interface system.
- the apparatus and method according to another embodiment of the invention draw a trapezoidal image at each time point to exactly render a rotation effect of a 3D cube. Thus, a 3D effect is rendered without needing a large amount of calculation, better meeting a real-time requirement of a user.
Abstract
An image processing method for rendering a three-dimensional effect by transforming a first quadrilateral image to a second quadrilateral image is provided, with at least one of the first and second quadrilateral images being a trapezoidal image. The image processing method includes providing the first quadrilateral image; generating coordinates of four vertices associated with the second quadrilateral image according to the first quadrilateral image and the three-dimensional effect; determining a relative height and a relative width according to a height and a width of the first quadrilateral image and the coordinates of the four vertices; and generating a plurality of pixel values of the second quadrilateral image with reference to a plurality of pixel values of the first quadrilateral image according to the relative height and the relative width.
Description
- This patent application claims the benefit of U.S. provisional patent application No. 61/218,077 filed on Jun. 18, 2009.
- The present invention relates to a mechanism for generating a three-dimensional (3D) effect, and more particularly, to an image processing method and an associated apparatus for rendering a 3D effect using two-dimensional (2D) images.
- In a current user interface system, e.g., a user interface of a portable device like a mobile phone, methods for rendering graphics or images include a 2D image rendering approach and a 3D image rendering approach. The 2D image rendering approach is easier and less expensive to implement, yet has a disadvantage of lacking depth information. The 3D image rendering approach, although having an advantage of being capable of rendering better visual enjoyment to viewers using its depth information, is burdened with shortcomings including more complicated and more costly implementation. More specifically, when the 3D image rendering approach is realized by hardware, corresponding hardware cost is much higher than that of the 2D image rendering approach; when the 3D image rendering approach is realized by software, a processor needs to designate more resources and time in rendering the 3D images such that processor performance may be significantly degraded due to the 3D image rendering approach.
- Therefore, it is an objective of the invention to provide an image processing method and an associated apparatus for rendering a 3D effect using 2D images, so as to address complications involved in the 3D image rendering approach to reduce software and hardware costs as well as enhancing an overall system performance.
- The present invention provides an image processing method for rendering a 3D effect by transforming a first quadrilateral image to a second quadrilateral image, with at least one of the first and second quadrilateral images being a trapezoidal image. The image processing method comprises: providing the first quadrilateral image; generating four coordinates of four vertices associated with the second quadrilateral image according to the first quadrilateral image and the 3D effect; determining a relative height and a relative width according to a height and a width of the first quadrilateral image and the coordinates of the four vertices; and generating a plurality of pixel values of the second quadrilateral image according to the coordinates of the four vertices, the relative height, and the relative width with reference to a plurality of pixel values of the first quadrilateral image.
- The present invention further provides an image processing apparatus for rendering a 3D effect by transforming a first quadrilateral image to a second quadrilateral image, with at least one of the first and second quadrilateral images being a trapezoidal image. The image processing apparatus comprises a target image determining unit, a pixel determining unit and a calculation unit. The target image determining unit generates coordinates of four vertices associated with the second quadrilateral image according to the first quadrilateral image and the 3D effect. The pixel determining unit determines a relative height and a relative width according to a height and a width of the first quadrilateral image and the coordinates of the vertices, and determines a corresponding relationship between a plurality of pixels of the first quadrilateral image and a plurality of pixels of the second quadrilateral image. The calculation unit generates a plurality of pixel values of the second quadrilateral image according to the corresponding relationship with reference to a plurality of pixel values of the first quadrilateral image.
- The present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, in which:
-
FIG. 1 is a schematic diagram of rendering a 3D effect using 2D images according to an embodiment of the invention; -
FIG. 2 is a schematic diagram of an image processing apparatus according to an embodiment of the invention; -
FIG. 3 shows a schematic diagram of the quadrilateral images Q1 and Q2 according to one embodiment of the invention; -
FIGS. 4A and 4B are schematic diagrams of a corresponding relationship between pixels of the trapezoidal image and pixels of the rectangular image shown in FIG. 3 ; -
FIG. 5 is a schematic diagram of a brightness change according to an embodiment of the invention; -
FIG. 6 is a schematic diagram of a vertical trapezoidal image being 2D rotated according to an embodiment of the invention; -
FIG. 7 is a schematic diagram of operations for rendering a 3D effect from a rotated image that is first 2D rotated by a pixel determining unit; and -
FIG. 8 is a schematic diagram of the image processing apparatus in FIG. 2 rendering a 3D cube rotation effect using two trapezoidal images at different time points. - For a three-dimensional (3D) rendering approach, hardware costs and system resources for rendering operation icons or images of a user interface system are quite high. Therefore, a two-dimensional rendering approach according to one embodiment of the invention is provided as a novel image processing approach for rendering operation icons or images of a user interface system, so as to render a 3D effect using 2D images without degrading overall system performance while also bringing better visual enjoyment to a user operating the user interface system. According to an embodiment of the invention, the image processing method and apparatus generate images that render at least one 3D effect, including image reshaping, rotating, twisting or expansion effects, and lighting effects. The image processing method and apparatus, based on 2D images that require no Z-axis information (i.e., image depth information), are capable of rendering the 3D effect. Thus, the resources that a processor or a calculation unit dedicates to calculations are significantly reduced, enhancing overall system calculation performance. More specifically, when the image processing method and apparatus according to an embodiment of the invention are realized by hardware, the cost of hardware previously applied for the conventional 2D image drawing method may only be slightly increased, while the cost of hardware previously needed to show 3D effects is reduced. Therefore, the method and apparatus of the invention offer cost advantages whether realized by software or hardware.
- Icons of a user interface system are mostly quadrilateral icons, and more particularly, rectangular and square icons. According to a principle of the invention, when a quadrilateral icon flips or rotates toward a predetermined angle, a series of images during the transformation from an original image to a final image (i.e., a generated image) are simulated as a plurality of different successive images that are trapezoidal or rectangular in shape. Therefore, the method and apparatus according to an embodiment of the invention, according to flipping or rotation angles of different 3D effects, calculate a shape associated with each quadrilateral image to be generated in order to respectively generate a quadrilateral image. For example, the shape of a quadrilateral icon at the latter of two successive time points is calculated according to a difference between rotation angles of the two successive time points; alternatively, the shape of a quadrilateral icon at a current time point is calculated according to a difference in rotation angles between the current-time-point image and an original image; other similar modifications are also within the scope of the invention.
-
FIG. 1 shows a schematic diagram of rendering a 3D effect using 2D images according to an embodiment of the invention. As shown in FIG. 1 , icons of the user interface system render a flipping or rotation effect in a 3D space, such as flipping vertically (e.g., icons "0" and "1"), or flipping horizontally (e.g., icons "2" and "3"). Taking the icon "0" for example, in order to render a 3D effect, a series of images transformed from the originally rectangular icon "0" are a plurality of different trapezoidal images when flipping vertically. As the angle of flipping gets larger, the height of the series of trapezoidal images of the icon "0" gradually reduces (from time t1 to t5), so as to render a visual effect of the icon "0" flipping vertically in a 3D space. Taking the icon "1" for example, a series of images transformed from the icon "1", originally a shallow trapezoidal image, are a plurality of different trapezoidal and square images when flipping vertically. As the angle of flipping gets larger, the height of the series of trapezoidal images of the icon "1" gradually increases (from time t1 to t5), so that one may perceive the shape of the icon "1" restoring to a normal shape (i.e., the square image) from a shallow shape as the height of the trapezoidal images becomes larger, rendering a visual effect of the icon "1" flipping vertically in a 3D space. Again, taking the icon "2" for example, a series of images transformed from the icon "2", originally a square image, are a plurality of different trapezoidal images when flipping horizontally. 
As the angle of flipping gets larger, the height of the series of trapezoidal images of the icon "2" gradually reduces from time t1 to t5 (the height of the series of trapezoidal images of the icon "2" is the width of the images), so that one may perceive the shape of the icon "2" gradually becoming shallow as the height of the trapezoidal images becomes smaller, rendering a visual effect of the icon "2" flipping horizontally in a 3D space. Similarly, the height of a series of trapezoidal images of the icon "3" gradually becomes larger from time t1 to t5, so as to render a visual effect of the icon "3" flipping horizontally in a 3D space. To emphasize a 3D effect of the icon, the brightness of a series of trapezoidal images is appropriately adjusted. For example, suppose a light source is right in front of the images. The brightness of the series of images dims as the icon "0" flips from a front side downwards to face down or as the icon "2" flips from a front side rightwards to face a side. In contrast, the brightness of the series of images becomes brighter as the icon "1" flips frontwards from facing a side to a front side or as the icon "3" flips rightwards from facing a side to a front side. In FIG. 1 , dots are used to indicate differences in brightness, i.e., images with a large number of dots are dimmer, images with a smaller number of dots are brighter, and the square image without any dots has maximum brightness. The icons "0" to "3" are used for illustrating effects provided by the method and apparatus of the invention, but are not to limit the invention thereto. Description of realizing an embodiment of the invention by hardware shall be given below. 
- To render a 3D effect and reduce hardware cost at the same time, according to an embodiment of the invention, only the abovementioned series of images between the trapezoidal image and the rectangular image are involved to render the 3D effect, without needing depth information of the images.
FIG. 2 shows a schematic diagram of an image processing apparatus 200 according to an embodiment of the invention. The image processing apparatus 200 comprises a target image determining unit 201, a memory unit 205, a pixel determining unit 210, a calculation unit 215, and a buffer 220. How a predetermined quadrilateral image among the abovementioned series of quadrilateral images is generated is discussed below. In this embodiment, pixel values of a first quadrilateral image Q1 are used to generate pixel values of a second quadrilateral image Q2, and the first and second quadrilateral images Q1 and Q2 may be temporally successive; e.g., the two quadrilateral images are the icon "0" at the times t1 and t2 in FIG. 1 . The memory unit 205 is for storing a plurality of pixel values of the quadrilateral image Q1 corresponding to a 2D image. The target image determining unit 201 is for generating four coordinates of four vertices associated with the quadrilateral image Q2 according to coordinates of four vertices of the quadrilateral image Q1 and a desired 3D effect. At least one of the quadrilateral images Q1 and Q2 is a trapezoidal image. Upon obtaining the coordinates of the four vertices of the quadrilateral image Q2, the pixel determining unit 210 correspondingly outputs the plurality of pixel values of the quadrilateral image Q1 from the memory unit 205 to the buffer 220 according to the coordinates of the four vertices as well as a height and a width of the quadrilateral image Q2. The calculation unit 215 then accordingly generates the plurality of pixel values of the quadrilateral image Q2. Thus, the image processing apparatus 200 generates, one after another, the abovementioned series of quadrilateral images that render a 3D effect, which may be, for example, the quadrilateral image Q1 flipping in a 3D space. The quadrilateral images are outputted and displayed on a display to allow a user to perceive the 3D effect rendered using 2D images. 
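The embodiment does not disclose the formula by which the target image determining unit 201 derives the four vertex coordinates from a flip angle. As an illustrative, non-limiting sketch, a simple pinhole-projection model could produce a trapezoid from a rectangle flipped vertically about its horizontal midline; the function name, the focal length, and the flip-about-midline convention are all assumptions of this sketch, not part of the disclosure:

```python
import math

def trapezoid_vertices(x, y, w, h, theta_deg, focal=500.0):
    """Project a w-by-h rectangle at (x, y), flipped vertically by
    theta about its horizontal midline, onto the screen plane using an
    assumed pinhole model with the given focal length."""
    t = math.radians(theta_deg)
    half = h / 2.0
    # The top edge recedes from the viewer while the bottom edge
    # approaches (depths of opposite sign about the midline).
    z_top, z_bot = half * math.sin(t), -half * math.sin(t)
    y_off = half * math.cos(t)          # foreshortened half-height

    def scale(z):
        # Perspective scale factor for an edge at depth z.
        return focal / (focal + z)

    s_top, s_bot = scale(z_top), scale(z_bot)
    cx, cy = x + w / 2.0, y + h / 2.0
    return [
        (cx - w / 2.0 * s_top, cy - y_off * s_top),   # top-left
        (cx + w / 2.0 * s_top, cy - y_off * s_top),   # top-right
        (cx - w / 2.0 * s_bot, cy + y_off * s_bot),   # bottom-left
        (cx + w / 2.0 * s_bot, cy + y_off * s_bot),   # bottom-right
    ]
```

At a flip angle of 0 degrees the four vertices coincide with the rectangle's corners, and as the angle grows the top edge narrows while the bottom edge widens, matching the trapezoid series shown for the icons in FIG. 1.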
- In this embodiment, for example, the 2D image is an icon of a user interface system, the quadrilateral images Q1 and Q2 are images at two successive time points, and the quadrilateral image Q1 is the image at the former time point of the 2D image (i.e., the icon) during the flipping or rotation. For example, the 2D image is the icon "0" in
FIG. 1 , the quadrilateral image Q1 is the square image of the icon "0" at the time t1, and the quadrilateral image Q2 is the trapezoidal image of the icon "0" at the time t2. The quadrilateral image Q1 may also be the trapezoidal image of the icon "0" at the time t3, and the quadrilateral image Q2 may also be the trapezoidal image of the icon "0" at the time t4. Further, when the 2D image is the icon "1" in FIG. 1 , the quadrilateral image Q1 may be the trapezoidal image of the icon "1" at the time t2, and the quadrilateral image Q2 may be the trapezoidal image of the icon "1" at the time t3. The quadrilateral image Q1 may also be the trapezoidal image of the icon "1" at the time t4, and the quadrilateral image Q2 may also be the square image of the icon "1" at the time t5. In other words, the image processing apparatus 200 is capable of generating flipped or rotated images according to different desired flipping or rotation angles of the icon in a 3D space. - Again with reference to
FIG. 2 , the target image determining unit 201 is for generating the coordinates of the four vertices of the quadrilateral image Q2 according to the coordinates of the four vertices of the quadrilateral image Q1 as well as a desired angle and direction of flipping or rotation. In this embodiment, the quadrilateral image Q1 is a source image, whose data is stored in the memory unit 205. In practice, the target image determining unit 201 may be realized by hardware or software. FIG. 3 shows a schematic diagram of the quadrilateral images Q1 and Q2 according to one embodiment of the invention. As shown in FIG. 3 , for example, the quadrilateral image Q1 is a rectangular image, and the coordinates P21˜P24 of the four vertices outputted by the target image determining unit 201 define the quadrilateral image Q2 as a trapezoidal image. The memory unit 205 further stores display image data corresponding to a display image, in which the coordinates P21˜P24 generated by the target image determining unit 201 are located. - The
pixel determining unit 210 comprises a source coordinate generating unit 2101 and a target coordinate generating unit 2102. The target coordinate generating unit 2102 generates a coordinate of each pixel of the trapezoidal image Q2 according to the coordinates P21˜P24 of the four vertices of the trapezoidal image Q2. More specifically, the target coordinate generating unit 2102 calculates change rates of coordinates at the two sides of the trapezoidal image Q2 (i.e., the change rate between the planar coordinates P21 and P23 of the two left vertices of the trapezoidal image, and the change rate between the planar coordinates P22 and P24 of the two right vertices of the trapezoidal image) to respectively obtain a first coordinate change rate and a second coordinate change rate. With the first and second coordinate change rates, a start point and an end point at a predetermined row of a next scan line can be obtained as the trapezoidal image Q2 increases or decreases by a row; that is, coordinates of the two terminal points of each scan line of the trapezoidal image can be obtained. In this embodiment, two methods are available for determining the start and end points of each scan line of the trapezoidal image Q2 shown in FIG. 3 . In the first method, the two coordinate change rates of the left and right sides are respectively used to obtain a distance between coordinate start points of each two adjacent rows and a distance between coordinate end points of each two adjacent rows, and the target coordinate generating unit 2102 respectively accumulates the two coordinate change rates onto a current start point and a current end point with every increasing/decreasing row, so as to obtain the coordinates of the start point and end point of the next row. 
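The first method above can be sketched as follows. This is an illustrative, non-limiting sketch: the function name and the assumption that P21/P22 are the top vertices and P23/P24 the bottom vertices (all given as (x, y) pairs with integer y) are conventions of the sketch, not of the disclosure. Note that repeated floating-point accumulation introduces a small rounding drift per row:

```python
def scanline_endpoints(p21, p22, p23, p24):
    """Compute the (start_x, end_x, y) of each horizontal scan line of a
    vertical trapezoid by accumulating the left and right coordinate
    change rates row by row, as in the first method."""
    rows = int(p23[1] - p21[1])            # number of row steps top to bottom
    dx_left = (p23[0] - p21[0]) / rows     # first coordinate change rate
    dx_right = (p24[0] - p22[0]) / rows    # second coordinate change rate
    start, end = float(p21[0]), float(p22[0])
    lines = []
    for r in range(rows + 1):
        lines.append((start, end, p21[1] + r))
        start += dx_left                   # accumulate the change rates
        end += dx_right                    # onto the current endpoints
    return lines
```

For a trapezoid whose top edge runs from (20, 0) to (80, 0) and whose bottom edge runs from (0, 100) to (100, 100), the rates are -0.2 and +0.2 pixels per row, so each successive scan line widens by 0.4 pixels.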
Further, to prevent limited calculation accuracy from deviating the generated coordinates of the start and end points, in the second method the coordinate of the start point of each row is calculated by adding the product of the number of the current row and the coordinate change rate to the coordinate of a vertex (e.g., the start point P21). Similarly, the coordinate of the end point of each row can be calculated in the same manner; that is, the coordinate of the end point of each row is calculated by adding the product of the number of the current row and the coordinate change rate to the coordinate of a vertex (e.g., the end point P24). However, the methods described above are for illustrative purposes, and proper modifications made by a person skilled in the art are within the scope of the invention. - When generating the trapezoidal image Q2, the source coordinate generating
unit 2101 determines at least one corresponding pixel of the rectangular image Q1 with respect to each pixel of the trapezoidal image Q2, and outputs a coordinate of the at least one pixel to the memory unit 205, which further outputs pixel values corresponding to the at least one pixel to the buffer 220. The calculation unit 215 then generates a pixel value for each pixel of the trapezoidal image Q2 according to the pixel values corresponding to the at least one pixel in the buffer 220. FIGS. 4A and 4B show schematic diagrams illustrating operations of the source coordinate generating unit 2101 for determining a corresponding relationship between the pixels of the trapezoidal image Q2 and the pixels of the rectangular image Q1 in FIG. 3 . How a target pixel value of a target pixel among the plurality of pixels of the trapezoidal image Q2 is generated shall be described below. With reference to a height H2 of the trapezoidal image Q2 and a height H1 of the rectangular image Q1, the source coordinate generating unit 2101 calculates an interval ΔH by which the rectangular image Q1 correspondingly moves when corresponding image content in the trapezoidal image Q2 moves by one row. The interval ΔH is regarded as an average interval, according to which the source coordinate generating unit 2101 calculates in the rectangular image Q1 a row number of a predetermined scan line corresponding to the target pixel. With reference to a width W2 of the scan line where the target pixel is located in the trapezoidal image Q2 and a width W1 of the rectangular image Q1, the source coordinate generating unit 2101 calculates a pixel distance ΔW by which corresponding image content in the rectangular image Q1 moves when a pixel in the trapezoidal image Q2 moves, as shown in FIG. 4B . The pixel distance ΔW is regarded as an average pixel distance. 
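The mapping just described, with ΔH = H1/H2 and ΔW = W1/W2, can be sketched as follows. This is an illustrative, non-limiting sketch: the function name, the nearest-pixel rounding, and the representation of the target scan line by its start/end columns are assumptions of the sketch:

```python
def source_pixel(row, col, row_start, row_end, H1, W1, H2):
    """Map target pixel (row, col) of the trapezoidal image Q2 back to a
    pixel of the W1-by-H1 rectangular image Q1. row_start and row_end
    bound the target scan line containing the pixel."""
    dH = H1 / H2                  # average interval: source rows per target row
    src_row = int(row * dH)
    W2 = row_end - row_start      # width of this target scan line
    dW = W1 / W2                  # average pixel distance per target pixel
    src_col = int((col - row_start) * dW)
    # Clamp to the source image bounds.
    return min(src_row, H1 - 1), min(src_col, W1 - 1)
```

For a 100x100 source and a trapezoid of height 50 whose current scan line spans columns 20 to 80, each target row advances two source rows (ΔH = 2) and each target pixel advances 100/60 of a source pixel (ΔW ≈ 1.67).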
According to the average pixel distance and a position of the target pixel in the scan line, the source coordinate generating unit 2101 then determines at least one pixel from the scan line, and outputs a coordinate of the at least one pixel to the memory unit 205. The memory unit 205 outputs at least one pixel value of the at least one pixel to a source data buffer 2201. The calculation unit 215 generates the target pixel value of the trapezoidal image Q2 according to the at least one pixel value in the source data buffer 2201, and stores the target pixel value of the target pixel in a target data buffer 2202. - The source coordinate generating
unit 2101 calculates an average pixel distance ΔW by which a corresponding position in the rectangular image Q1 moves as a pixel in the trapezoidal image Q2 moves. For example, when the target pixel is located at a relatively narrow part (narrower than the width W1 of the rectangular image Q1) of the trapezoidal image Q2, since W2 is narrower than W1, the average pixel distance ΔW correspondingly moved in the rectangular image Q1 is greater than the actual distance by which the pixel moves in the trapezoidal image Q2. Thus, a coordinate of a pixel in the rectangular image Q1 is selected according to the average pixel distance ΔW, and the source coordinate generating unit 2101 then outputs the selected coordinate to the memory unit 205. Upon receiving the selected coordinate, the memory unit 205 accordingly outputs the pixel value of the pixel in the rectangular image Q1 through a data bus BUS to the source data buffer 2201. The pixel value is temporarily stored in the source data buffer 2201, and serves as the pixel value of the target pixel of the trapezoidal image Q2. In other embodiments, the pixel value of the target pixel of the trapezoidal image Q2 may also be an average pixel value of the pixel of the rectangular image Q1 and its neighboring pixels. In practice, when determining the coordinate of the pixel of the rectangular image Q1, the source coordinate generating unit 2101 outputs coordinates of the pixel and neighboring pixels to the memory unit 205, which outputs pixel values of the pixel and neighboring pixels to the source data buffer 2201 to temporarily store the pixel values therein. Next, according to the pixel values from the source data buffer 2201, the calculation unit 215 calculates an average pixel value that serves as the pixel value of the target pixel of the trapezoidal image Q2. 
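The neighbor-averaging variant just mentioned can be sketched as a simple box filter over the source pixels that one target step covers. This is an illustrative, non-limiting sketch; the function name and the choice of a box filter (rather than any particular filter of the calculation unit 215) are assumptions:

```python
def averaged_value(src_line, src_col, dW):
    """Pixel value for a target pixel whose average pixel distance dW
    exceeds one source pixel: average the selected source pixel with
    the neighbors that the step skips over (a simple box filter)."""
    span = max(1, int(round(dW)))          # source pixels covered per step
    window = src_line[src_col:src_col + span]
    return sum(window) / len(window)
```

When the trapezoid is narrower than the source (ΔW > 1), this blends the skipped neighbors into the target pixel; when ΔW ≤ 1, it degenerates to picking the single selected pixel.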
- In contrast, when the target pixel is located at a relatively wide part (wider than the width W1 of the rectangular image Q1) of the trapezoidal image Q2, since W2 is wider than W1, the average pixel distance ΔW correspondingly moved in the rectangular image Q1, as also calculated by the source coordinate generating
unit 2101, is smaller than the actual distance by which the pixel moves in the trapezoidal image Q2. Thus, a coordinate of a pixel in the rectangular image Q1 is selected according to the average pixel distance ΔW, and the source coordinate generating unit 2101 then outputs the selected coordinate to the memory unit 205. Upon receiving the selected coordinate, the memory unit 205 accordingly outputs the pixel value of the pixel in the rectangular image Q1 through a data bus BUS to the source data buffer 2201. The pixel value is temporarily stored in the source data buffer 2201, and serves as the pixel value of the target pixel of the trapezoidal image Q2. Although FIGS. 4A and 4B , which elaborate how a pixel value of a target pixel of the trapezoidal image Q2 is obtained to render a 3D effect, show the quadrilateral image Q1 and the quadrilateral image Q2 as a rectangular image and a trapezoidal image, respectively, the quadrilateral images Q1 and Q2 may also be other combinations, such as a trapezoidal image and a rectangular image, or two trapezoidal images. More specifically, although the operations in FIGS. 4A and 4B involve a rectangular image to generate a trapezoidal image to further render a 3D image effect, in other embodiments, a trapezoidal image may also be utilized to generate another trapezoidal image, or a trapezoidal image may be utilized to generate a rectangular image in order to render a 3D effect; proper modifications similar to the ones above are within the scope of the invention. Further, the quadrilateral images Q1 and Q2 may be images of non-successive time points. Again with reference to FIG. 1 , for example, the image processing apparatus 200 may also generate a trapezoidal image of the icon "0" at the time t3 or a trapezoidal image at the time t5 from the rectangular image of the icon "0" at the time t1. 
- Through a plurality of coordinates of different pixels selected from the quadrilateral image Q1 for each of the pixels of the quadrilateral image Q2 by the source coordinate generating
unit 2101, the pixel values of the selected pixels are outputted by the memory unit 205 to the source data buffer 2201. The calculation unit 215 obtains the pixel values temporarily stored in the source data buffer 2201 to generate a pixel value of each of the pixels of the quadrilateral image Q2, and temporarily stores the generated pixel values in the target data buffer 2202. The target data buffer 2202 then writes the pixel values of the quadrilateral image Q2 back to the memory unit 205 according to the coordinates that the target coordinate generating unit 2102 generates for each of the pixels of the quadrilateral image Q2. - Further, according to a flipping or rotation angle of the desired 3D effect, the
calculation unit 215 may correspondingly adjust the pixel values of the quadrilateral image Q2 temporarily stored in the source data buffer 2201, so that the quadrilateral image Q2 displays corresponding brightness different from that of the quadrilateral image Q1 to render a more realistic 3D effect. FIG. 5 shows a schematic diagram of a brightness change according to an embodiment of the invention. As shown in FIG. 5 , a light source is located right in front of the images. When an icon "3" displays a 3D effect, a series of images thereof gradually dims (the brightness decreases as the number of dots increases in this embodiment) so that an obvious brightness difference exists between the icon "3" and an icon "4" underneath, giving a more realistic 3D effect. In this embodiment, to simplify the overall design, when the icon "3" displays a 3D flipping effect, the image of the icon "3" is evenly dimmed; however, the brightness of the image of the icon "3" may also be varied in part, and such a modification is also within the scope of the invention. In other embodiments, for example, the light source may be positioned at different angles, such as at an upper-left or upper-right angle. - In the foregoing embodiments, details for generating a vertical trapezoidal image (i.e., a trapezoid having a pair of horizontal parallel sides) are given. However, the spirit of the invention is also suitable for generating a horizontal trapezoidal image (i.e., a trapezoid having a pair of vertical parallel sides). To generate a horizontal trapezoidal image, a first coordinate change rate and a second coordinate change rate are generated respectively according to coordinate change rates of the two upper vertices and coordinate change rates of the two lower vertices. Therefore, for 3D effects of flipping vertically or horizontally, or for flipping or rotation at different angles, the invention is capable of rendering the desired 3D effect using 2D images. 
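The even dimming described above, for a light source directly in front of the screen, can be sketched by scaling each pixel by the cosine of the flip angle. This is an illustrative, non-limiting sketch; the cosine law and the function name are assumptions, and the embodiment may use any other brightness mapping:

```python
import math

def dim_for_angle(pixel, theta_deg):
    """Evenly dim an (R, G, B) pixel of a flipping icon by cos(theta),
    assuming the light source is directly in front of the images, so
    that brightness falls to zero as the icon turns edge-on."""
    k = max(0.0, math.cos(math.radians(theta_deg)))
    return tuple(int(c * k) for c in pixel)
```

At 0 degrees the pixel is unchanged (maximum brightness, the dot-free square in FIG. 1), and at 90 degrees the icon is edge-on and fully dark.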
In addition, to make software and hardware designs simple, a horizontal trapezoidal image is generated by integrating a 2D rotation function with operations for generating a vertical trapezoidal image of the invention.
FIG. 6 shows a schematic diagram of a vertical trapezoidal image being 2D rotated according to an embodiment of the invention. For example, the 2D rotation is realized by the pixel determining unit 210 to provide rotation by at least three different angles. As shown in FIG. 6 , a vertex R of an original trapezoidal image is rotated clockwise by 90 degrees, 180 degrees or 270 degrees. FIG. 7 shows a schematic diagram of operations for rendering a 3D effect from a rotated image that is first 2D rotated by the pixel determining unit 210. As shown in FIG. 7 , to generate a horizontal trapezoidal image 815, the target coordinate generating unit 2102 first re-defines coordinates for the 2D rotation. For example, a reference point originally defined at an upper-left vertex of the icon "2" is re-defined to a lower-left vertex to render a 2D 90-degree rotation effect. In this embodiment, the original icon "2" is a to-be-processed 2D image 800. The target coordinate generating unit 2102 rotates the to-be-processed 2D image 800 by a predetermined angle (i.e., 90 degrees clockwise) to generate coordinates of vertices of a rotated 2D image 805, generates a vertical trapezoidal image 810 according to the coordinates of the vertices of the rotated 2D image 805, and rotates the vertical trapezoidal image 810 by 270 degrees (i.e., 90 degrees counterclockwise) to obtain coordinates of vertices of the desired horizontal trapezoidal image 815 by referencing the to-be-processed 2D image. A 2D rotation may also be integrated with operations for generating a horizontal trapezoidal image to generate a vertical trapezoidal image, and such a modification is within the scope of the invention. - With the embodiments of the invention, the image processing method and associated apparatus are capable of rendering a 3D effect by using 2D images without needing any depth information of icons of a user interface system. 
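The rotate/transform/rotate-back sequence above can be sketched as follows. This is an illustrative, non-limiting sketch operating on a 2-D list of pixels; the function names and the use of a caller-supplied vertical-trapezoid routine are assumptions of the sketch:

```python
def rot90cw(img):
    """Rotate a 2-D list of pixel values 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def horizontal_trapezoid(img, make_vertical_trapezoid):
    """Produce a horizontal trapezoid by reusing a vertical-trapezoid
    routine: rotate the image 90 degrees clockwise, apply the vertical
    transform, then rotate the result by 270 degrees clockwise (i.e.,
    90 degrees counterclockwise) back to the original orientation."""
    rotated = rot90cw(img)
    vertical = make_vertical_trapezoid(rotated)
    return rot90cw(rot90cw(rot90cw(vertical)))   # 270 degrees clockwise
```

With an identity transform in place of the vertical-trapezoid routine, the four quarter-turns compose to the identity, so the image is returned unchanged; substituting a real vertical-trapezoid generator yields the horizontal trapezoid.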
Therefore, compared with current 3D image drawing techniques, the apparatus and method of the invention significantly lower software and hardware costs and resources. Further, although the foregoing embodiments describe only icons in a user interface system, with proper modifications, the concept of the invention is also applicable to displaying a 3D effect of an image in a user interface system, or even to a display interface of a non-user interface system. In practice, referring to
FIG. 8 , the apparatus and method according to another embodiment of the invention draw a trapezoidal image at each time point to exactly render a rotation effect of a 3D cube. Thus, a 3D effect is rendered without needing a large amount of calculation, better meeting a real-time requirement of a user. - While the invention has been described in terms of what is presently considered to be the most practical and preferred embodiments, it is to be understood that the invention need not be limited to the above embodiments. On the contrary, it is intended to cover various modifications and similar arrangements included within the spirit and scope of the appended claims, which are to be accorded the broadest interpretation so as to encompass all such modifications and similar structures.
Claims (16)
1. An image processing method, for rendering a three-dimensional (3D) effect by transforming a first quadrilateral image to a second quadrilateral image, at least one of the first and second quadrilateral images being a trapezoidal image, the method comprising:
providing the first quadrilateral image;
generating coordinates of four vertices associated with the second quadrilateral image according to the first quadrilateral image and the 3D effect;
determining a relative height and a relative width according to a height and a width of the first quadrilateral image, and the coordinates of the four vertices; and
generating a plurality of pixel values of the second quadrilateral image with reference to a plurality of pixel values of the first quadrilateral image according to the relative height, and the relative width.
2. The method as claimed in claim 1 , wherein the 3D effect is a rotation effect of the first quadrilateral image in a 3D space.
3. The method as claimed in claim 2 , wherein the step of generating the coordinates of the four vertices comprises:
generating the coordinates of the four vertices according to a rotation angle associated with the 3D effect of the first quadrilateral image.
4. The method as claimed in claim 1 , further comprising:
calculating a plurality of coordinate change rates according to the coordinates of the four vertices;
wherein, the second quadrilateral image is the trapezoidal image, and the pixel values of the second quadrilateral image are generated according to the coordinate change rates, the relative height, and the relative width.
5. The method as claimed in claim 4 , wherein the trapezoidal image has a pair of horizontal parallel sides, and the step of calculating the coordinate change rates comprises:
calculating coordinate change rates of two left vertices of the trapezoidal image to generate a first coordinate change rate; and
calculating coordinate change rates of two right vertices of the trapezoidal image to generate a second coordinate change rate;
wherein, the first and second coordinate change rates are utilized for obtaining coordinates of two terminal points of each horizontal scan line of the second quadrilateral image.
6. The method as claimed in claim 4 , wherein the trapezoidal image has a pair of vertical parallel sides, and the step of calculating the coordinate change rates comprises:
calculating coordinate change rates of two upper vertices of the trapezoidal image to generate a first coordinate change rate; and
calculating coordinate change rates of two lower vertices of the trapezoidal image to generate a second coordinate change rate;
wherein, the first and second coordinate change rates are utilized for obtaining coordinates of two terminal points of each vertical scan line of the second quadrilateral image.
7. The method as claimed in claim 1 , wherein the second quadrilateral image comprises a target pixel, and the step of generating the pixel values of the second quadrilateral image comprises:
obtaining a corresponding scan line of the first quadrilateral image according to the relative height and a position of the target pixel;
determining a pixel distance according to the relative width; and
determining at least one pixel of the corresponding scan line of the first quadrilateral image according to the pixel distance and the position of the target pixel, and generating a target pixel value of the target pixel according to a pixel value of the at least one pixel.
8. The method as claimed in claim 1 , wherein the first quadrilateral image has different brightness from that of the second quadrilateral image.
9. The method as claimed in claim 1 , further comprising:
displaying the first quadrilateral image and the second quadrilateral image in sequence.
10. An image processing apparatus, for rendering a 3D effect by transforming a first quadrilateral image to a second quadrilateral image, at least one of the first and second quadrilateral images being a trapezoidal image, the apparatus comprising:
a target image determining unit, for generating coordinates of four vertices associated with the second quadrilateral image according to the first quadrilateral image and the 3D effect;
a pixel determining unit, for determining a relative height and a relative width according to a height and width of the first quadrilateral image and the coordinates of the four vertices, and accordingly determining a corresponding relationship between a plurality of pixels of the first quadrilateral image and a plurality of pixels of the second quadrilateral image; and
a calculation unit, for generating a plurality of pixel values of the second quadrilateral image with reference to a plurality of pixel values of the first quadrilateral image according to the corresponding relationship.
11. The apparatus as claimed in claim 10 , wherein the 3D effect is a rotation effect of the first quadrilateral image in a 3D space.
12. The apparatus as claimed in claim 11 , wherein the target image determining unit generates the coordinates of four vertices of the second quadrilateral image according to a rotation angle associated with the 3D effect of the first quadrilateral image.
13. The apparatus as claimed in claim 10 , wherein the pixel determining unit calculates a plurality of coordinate change rates according to the coordinates of the four vertices, and when the second quadrilateral image is the trapezoidal image, the pixel determining unit determines the corresponding relationship according to the coordinate change rates, the relative height and the relative width.
14. The apparatus as claimed in claim 13 , wherein the trapezoidal image has a pair of horizontal parallel sides, the pixel determining unit calculates coordinate change rates of two left vertices of the trapezoidal image to generate a first coordinate change rate, and calculates coordinate change rates of two right vertices of the trapezoidal image to generate a second coordinate change rate, and the first and second coordinate change rates are utilized for obtaining coordinates of two terminal points of each horizontal scan line of the second quadrilateral image.
15. The apparatus as claimed in claim 13 , wherein the trapezoidal image has a pair of vertical parallel sides, the pixel determining unit calculates coordinate change rates of two upper vertices of the trapezoidal image to generate a first coordinate change rate, and calculates coordinate change rates of two lower vertices of the trapezoidal image to generate a second coordinate change rate, and the first and second coordinate change rates are utilized for obtaining coordinates of two terminal points of each vertical scan line of the second quadrilateral image.
16. The apparatus as claimed in claim 10 , wherein the first quadrilateral image has different brightness from that of the second quadrilateral image.
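Read together, claims 10–15 describe a scan-line procedure: compute the target trapezoid's four vertices, derive a first and second coordinate change rate from its left and right vertex pairs, use those rates to locate the endpoints of each horizontal scan line, and fill each line by sampling the source image at a relative height and a per-line relative width. The sketch below is one illustrative reading of that procedure, not the patented implementation; the function name, the dict-based output, and the nearest-neighbour sampling choice are all assumptions made for brevity.

```python
def trapezoid_transform(src, verts):
    """Map a rectangular source image onto a trapezoid with horizontal
    parallel sides, scan line by scan line (nearest-neighbour sampling).

    src   : 2-D list of pixel values, indexed src[y][x]
    verts : ((tl_x, tl_y), (tr_x, tr_y), (bl_x, bl_y), (br_x, br_y))
    Returns a dict {(x, y): value} holding only pixels inside the trapezoid.
    """
    (tlx, tly), (trx, _), (blx, bly), (brx, _) = verts
    src_h, src_w = len(src), len(src[0])
    dst_h = bly - tly                      # trapezoid height in scan lines
    # First/second coordinate change rates (cf. claims 5 and 14):
    left_rate = (blx - tlx) / dst_h        # x-drift of the left edge per row
    right_rate = (brx - trx) / dst_h       # x-drift of the right edge per row
    rel_h = src_h / dst_h                  # relative height (source rows per target row)
    out = {}
    for row in range(dst_h):
        # Endpoints of this horizontal scan line (cf. claim 14):
        xl = tlx + left_rate * row
        xr = trx + right_rate * row
        line_w = xr - xl
        if line_w <= 0:
            continue
        rel_w = src_w / line_w             # relative width for this scan line
        src_y = min(int(row * rel_h), src_h - 1)
        for x in range(int(xl), int(xr)):
            src_x = min(int((x - xl) * rel_w), src_w - 1)
            out[(x, row + tly)] = src[src_y][src_x]
    return out
```

Claim 7's alternative (starting from a target pixel and walking back to one or more source pixels via the pixel distance) is the per-pixel inverse of the same mapping; a production version would typically blend several source pixels rather than pick the nearest one.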
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/794,943 US20100321380A1 (en) | 2009-06-18 | 2010-06-07 | Image Processing Method and Associated Apparatus for Rendering Three-dimensional Effect Using Two-dimensional Image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US21807709P | 2009-06-18 | 2009-06-18 | |
US12/794,943 US20100321380A1 (en) | 2009-06-18 | 2010-06-07 | Image Processing Method and Associated Apparatus for Rendering Three-dimensional Effect Using Two-dimensional Image |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100321380A1 true US20100321380A1 (en) | 2010-12-23 |
Family
ID=43353912
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/794,943 Abandoned US20100321380A1 (en) | 2009-06-18 | 2010-06-07 | Image Processing Method and Associated Apparatus for Rendering Three-dimensional Effect Using Two-dimensional Image |
US12/814,617 Active 2031-06-07 US8442346B2 (en) | 2009-06-18 | 2010-06-14 | Image processing method and associated apparatus for adjusting an edge pixel |
US12/817,244 Active 2032-01-16 US8576220B2 (en) | 2009-06-18 | 2010-06-17 | Image processing method and associated apparatus for rendering three-dimensional effect using two-dimensional image |
US12/818,734 Active 2031-09-06 US8749712B2 (en) | 2009-06-18 | 2010-06-18 | Method for processing on-screen display and associated embedded system |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/814,617 Active 2031-06-07 US8442346B2 (en) | 2009-06-18 | 2010-06-14 | Image processing method and associated apparatus for adjusting an edge pixel |
US12/817,244 Active 2032-01-16 US8576220B2 (en) | 2009-06-18 | 2010-06-17 | Image processing method and associated apparatus for rendering three-dimensional effect using two-dimensional image |
US12/818,734 Active 2031-09-06 US8749712B2 (en) | 2009-06-18 | 2010-06-18 | Method for processing on-screen display and associated embedded system |
Country Status (3)
Country | Link |
---|---|
US (4) | US20100321380A1 (en) |
CN (4) | CN101930620B (en) |
TW (4) | TWI493500B (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110149020A1 (en) * | 2009-12-17 | 2011-06-23 | Ilya Klebanov | Method and system for video post-processing based on 3d data |
KR20120017649A (en) * | 2010-08-19 | 2012-02-29 | 삼성전자주식회사 | Display apparatus and control method |
JP5857567B2 (en) * | 2011-09-15 | 2016-02-10 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
US8935629B2 (en) * | 2011-10-28 | 2015-01-13 | Flipboard Inc. | Systems and methods for flipping through content |
TWI444794B (en) | 2011-12-23 | 2014-07-11 | Ind Tech Res Inst | Load control system and load control method |
US9110572B2 (en) * | 2013-02-04 | 2015-08-18 | Visible Spectrum, Inc. | Network based video creation |
US9424808B2 (en) * | 2013-08-22 | 2016-08-23 | Htc Corporation | Image cropping manipulation method and portable electronic device |
CN105814903A (en) * | 2013-09-10 | 2016-07-27 | 卡尔加里科技股份有限公司 | Architecture for distributed server-side and client-side image data rendering |
CN106063205B (en) | 2013-11-06 | 2018-06-29 | 卡尔加里科技股份有限公司 | The device and method that client traffic controls in remote access environment |
CN104796649B (en) * | 2014-01-21 | 2017-12-26 | 北京炬力北方微电子有限公司 | A kind of method and device of tripleplane |
KR101737089B1 (en) * | 2015-05-29 | 2017-05-17 | 삼성전자주식회사 | Method and device for displaying an image |
CN109144368B (en) * | 2018-08-23 | 2020-09-15 | 维沃移动通信有限公司 | Picture conversion method and terminal |
TWI789669B (en) * | 2020-12-31 | 2023-01-11 | 致茂電子股份有限公司 | Electronic device and image processing method |
US12008726B2 (en) * | 2022-09-13 | 2024-06-11 | Vizilu, Inc. | System and methods for providing a picture frame with an interactive experience |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69530824D1 (en) * | 1994-06-20 | 2003-06-26 | Sega Corp | METHOD AND DEVICE FOR IMAGE PROCESSING |
JPH08186844A (en) * | 1994-12-28 | 1996-07-16 | Sanyo Electric Co Ltd | Stereoscopic picture generator and stereoscopic picture generating method |
US5621906A (en) * | 1995-02-13 | 1997-04-15 | The Trustees Of Columbia University In The City Of New York | Perspective-based interface using an extended masthead |
JPH10164317A (en) * | 1996-12-05 | 1998-06-19 | Sharp Corp | Image reader |
EP0984397B1 (en) * | 1998-08-30 | 2005-03-02 | Gmd - Forschungszentrum Informationstechnik Gmbh | Method and device for elimination of unwanted steps in raster displays |
JP2000137830A (en) * | 1998-10-30 | 2000-05-16 | Nec Home Electronics Ltd | Graphic data processing method |
US6377273B1 (en) * | 1998-11-04 | 2002-04-23 | Industrial Technology Research Institute | Fast area-coverage computing method for anti-aliasing in graphics |
US6437793B1 (en) * | 1999-07-29 | 2002-08-20 | Bitstream Inc. | System for rapidly performing scan conversion with anti-aliasing upon outline fonts and other graphic elements |
US6674484B1 (en) * | 2000-01-10 | 2004-01-06 | Koninklijke Philips Electronics N.V. | Video sample rate conversion to achieve 3-D effects |
JP2002094764A (en) * | 2000-09-11 | 2002-03-29 | Matsushita Graphic Communication Systems Inc | Skew correction unit and image forming device |
US7171630B2 (en) * | 2001-11-06 | 2007-01-30 | Zinio Systems, Inc. | Electronic simulation of interaction with printed matter |
US6943805B2 (en) * | 2002-06-28 | 2005-09-13 | Microsoft Corporation | Systems and methods for providing image rendering using variable rate source sampling |
CN1251157C (en) * | 2002-12-27 | 2006-04-12 | 中国科学院自动化研究所 | Object three-dimensional model quick obtaining method based on active vision |
EP1635297B1 (en) * | 2003-05-30 | 2012-06-13 | Lattice Technology, Inc. | 3-dimensional graphics data display device |
CN1254949C (en) * | 2003-06-27 | 2006-05-03 | 光宝科技股份有限公司 | Automatic correction method for oblique image |
KR100510146B1 (en) * | 2003-08-20 | 2005-08-25 | 삼성전자주식회사 | Method and apparatus for graphic user interface |
WO2005055034A1 (en) | 2003-12-01 | 2005-06-16 | Research In Motion Limited | Previewing a new event on a small screen device |
KR100699265B1 (en) * | 2005-07-25 | 2007-03-27 | 삼성전자주식회사 | Display apparatus and control mathod thereof |
JP2007066012A (en) * | 2005-08-31 | 2007-03-15 | Toshiba Corp | Apparatus, method and program for drawing image |
US20070136681A1 (en) * | 2005-12-08 | 2007-06-14 | Syntax Brillian Corp. | On-screen display for configuring a display apparatus using graphic icons |
JP4463215B2 (en) * | 2006-01-30 | 2010-05-19 | 日本電気株式会社 | Three-dimensional processing apparatus and three-dimensional information terminal |
US20070250787A1 (en) * | 2006-04-21 | 2007-10-25 | Hideya Kawahara | Enhancing visual representation and other effects for application management on a device with a small screen |
KR101423915B1 (en) * | 2006-04-21 | 2014-07-29 | 삼성전자주식회사 | Method and apparatus for generating 3D On screen display |
US8203564B2 (en) * | 2007-02-16 | 2012-06-19 | Qualcomm Incorporated | Efficient 2-D and 3-D graphics processing |
US8111913B2 (en) * | 2008-09-17 | 2012-02-07 | Motorola Solutions, Inc. | Countermeasures against original background retrieval |
CN101452582B (en) * | 2008-12-18 | 2013-09-18 | 北京中星微电子有限公司 | Method and device for implementing three-dimensional video specific action |
2010
- 2010-05-18 TW TW099115793A patent/TWI493500B/en not_active IP Right Cessation
- 2010-05-20 CN CN2010101901172A patent/CN101930620B/en not_active Expired - Fee Related
- 2010-05-26 TW TW099116901A patent/TWI484824B/en not_active IP Right Cessation
- 2010-05-27 CN CN2010101948802A patent/CN101964859B/en not_active Expired - Fee Related
- 2010-06-04 TW TW099118127A patent/TWI425441B/en not_active IP Right Cessation
- 2010-06-07 US US12/794,943 patent/US20100321380A1/en not_active Abandoned
- 2010-06-07 TW TW099118436A patent/TWI517711B/en not_active IP Right Cessation
- 2010-06-07 CN CN2010102030770A patent/CN101930621B/en not_active Expired - Fee Related
- 2010-06-10 CN CN201010206830.1A patent/CN101930337B/en not_active Expired - Fee Related
- 2010-06-14 US US12/814,617 patent/US8442346B2/en active Active
- 2010-06-17 US US12/817,244 patent/US8576220B2/en active Active
- 2010-06-18 US US12/818,734 patent/US8749712B2/en active Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010005204A1 (en) * | 1996-08-29 | 2001-06-28 | Sanyo Electric Co., Ltd. | Texture information assignment method, object extraction method, three-dimensional model generating method, and apparatus thereof |
US6577330B1 (en) * | 1997-08-12 | 2003-06-10 | Matsushita Electric Industrial Co., Ltd. | Window display device with a three-dimensional orientation of windows |
US6760488B1 (en) * | 1999-07-12 | 2004-07-06 | Carnegie Mellon University | System and method for generating a three-dimensional model from a two-dimensional image sequence |
US20020050988A1 (en) * | 2000-03-28 | 2002-05-02 | Michael Petrov | System and method of three-dimensional image capture and modeling |
US20060227133A1 (en) * | 2000-03-28 | 2006-10-12 | Michael Petrov | System and method of three-dimensional image capture and modeling |
US20060232583A1 (en) * | 2000-03-28 | 2006-10-19 | Michael Petrov | System and method of three-dimensional image capture and modeling |
US20060103650A1 (en) * | 2001-02-23 | 2006-05-18 | Fujitsu Limited | Display controlling apparatus, information terminal unit provided with display controlling apparatus, and viewpoint location controlling apparatus |
US20040155877A1 (en) * | 2003-02-12 | 2004-08-12 | Canon Europa N.V. | Image processing apparatus |
US20070257904A1 (en) * | 2006-05-05 | 2007-11-08 | Microsoft Corporation | Editing text within a three-dimensional graphic |
US20090058883A1 (en) * | 2007-09-05 | 2009-03-05 | Osmosys S.A. | Method for rotating images |
US20090058851A1 (en) * | 2007-09-05 | 2009-03-05 | Osmosys S.A. | Method for drawing geometric shapes |
US20110034246A1 (en) * | 2008-04-09 | 2011-02-10 | Eyal Amitzur | System and method for a two dimensional to three dimensional game transformation |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103544725A (en) * | 2013-11-19 | 2014-01-29 | 马宁 | Two-dimensional stereoscopic animation making method |
US20160063765A1 (en) * | 2014-08-27 | 2016-03-03 | Ricoh Company, Ltd. | Image processing apparatus and image processing method |
US10996829B2 (en) * | 2019-05-14 | 2021-05-04 | Furuno Electric Company Limited | Apparatus, method and program for processing data |
US11494059B2 (en) * | 2019-05-14 | 2022-11-08 | Furuno Electric Company Limited | Apparatus, method and program for processing data |
US11609687B2 (en) * | 2019-05-14 | 2023-03-21 | Furuno Electric Company Limited | Apparatus, method and program for processing data |
CN110456517A (en) * | 2019-08-20 | 2019-11-15 | 上海驾馥电子科技有限公司 | 3D display screen and its 3D display method |
Also Published As
Publication number | Publication date |
---|---|
TWI517711B (en) | 2016-01-11 |
CN101964859A (en) | 2011-02-02 |
CN101930620A (en) | 2010-12-29 |
CN101930621B (en) | 2012-02-01 |
CN101930621A (en) | 2010-12-29 |
TWI493500B (en) | 2015-07-21 |
TWI425441B (en) | 2014-02-01 |
TW201101228A (en) | 2011-01-01 |
US8442346B2 (en) | 2013-05-14 |
US20100321575A1 (en) | 2010-12-23 |
CN101930620B (en) | 2012-04-04 |
US8576220B2 (en) | 2013-11-05 |
CN101930337B (en) | 2015-04-22 |
US20100322531A1 (en) | 2010-12-23 |
CN101964859B (en) | 2012-09-19 |
CN101930337A (en) | 2010-12-29 |
TW201101824A (en) | 2011-01-01 |
TW201127040A (en) | 2011-08-01 |
TWI484824B (en) | 2015-05-11 |
US8749712B2 (en) | 2014-06-10 |
US20100321381A1 (en) | 2010-12-23 |
TW201101226A (en) | 2011-01-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20100321380A1 (en) | Image Processing Method and Associated Apparatus for Rendering Three-dimensional Effect Using Two-dimensional Image | |
TWI578266B (en) | Varying effective resolution by screen location in graphics processing by approximating projection of vertices onto curved viewport | |
US9697647B2 (en) | Blending real and virtual construction jobsite objects in a dynamic augmented reality scene of a construction jobsite in real-time | |
US9538164B2 (en) | Stereoscopic conversion with viewing orientation for shader based graphics content | |
US20140064607A1 (en) | Systems, methods, and computer program products for low-latency warping of a depth map | |
CN107924556B (en) | Image generation device and image display control device | |
JP2022543729A (en) | System and method for foveated rendering | |
US20090244066A1 (en) | Multi parallax image generation apparatus and method | |
JP5061227B2 (en) | Video signal processing apparatus and virtual reality generation system | |
JP2007251914A (en) | Image signal processing apparatus, and virtual reality creating system | |
CN112740278B (en) | Method and apparatus for graphics processing | |
CN114513646A (en) | Method and device for generating panoramic video in three-dimensional virtual scene | |
JP2004356789A (en) | Stereoscopic video image display apparatus and program | |
US20220108420A1 (en) | Method and system of efficient image rendering for near-eye light field displays | |
TW202334803A (en) | Memory structures to support changing view direction | |
US20090058851A1 (en) | Method for drawing geometric shapes | |
JP2005165283A (en) | Map display device | |
US20240153200A1 (en) | View synthesis system and method using depth map | |
TWI852053B (en) | View synthesis system and method using depth map | |
JP3866267B2 (en) | Graphics equipment | |
JP2007312420A (en) | Video signal processing apparatus and virtual reality creating system | |
CN118827891A (en) | Image display method, device, electronic equipment and program product | |
KR20090058687A (en) | A low cost view-volume clipping method and an apparatus therefor | |
JP2006331062A (en) | Solid image generation method and solid image generation device | |
JP2005346194A (en) | Image generation device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MSTAR SEMICONDUCTOR, INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, RUEN-RONE;WANG, TSAI-SHENG;REEL/FRAME:024495/0267 Effective date: 20100601 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |