KR20150004483A - Real time animation apparatus and method for Java TV graphics service - Google Patents

Real time animation apparatus and method for Java TV graphics service Download PDF

Info

Publication number
KR20150004483A
Authority
KR
South Korea
Prior art keywords
texture image
image
output area
texture
animation
Prior art date
Application number
KR1020130077210A
Other languages
Korean (ko)
Inventor
박경주
카지미에즈 크럼피에츠 프세므슬리브
Original Assignee
중앙대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 중앙대학교 산학협력단
Priority to KR1020130077210A priority Critical patent/KR20150004483A/en
Publication of KR20150004483A publication Critical patent/KR20150004483A/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/001 - Texturing; Colouring; Generation of texture or colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 - Animation
    • G06T 13/20 - 3D [Three Dimensional] animation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/04 - Texture mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A real-time animation apparatus for a Java TV graphics service is disclosed. The storage unit stores a source image and a texture image. The mapping unit transforms and maps the texture image corresponding to an output area of the texture image set on the source image, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation. The rendering unit outputs each image constituting the animation on the basis of the source image and the texture image mapped corresponding to the output area of the texture image. The transformed coordinate calculation unit calculates a transformation matrix based on update information including a movement amount, a rotation angle, and a magnitude change ratio for changing the mapped texture image, and applies the coordinate information of the mapped texture image to the transformation matrix to update that coordinate information. According to the present invention, real-time animation can be performed quickly and simply in a Java TV environment, without upgrading firmware, on a terminal with limited memory capacity and processing speed.

Description

Technical Field [0001] The present invention relates to a real-time animation apparatus and method for a Java TV graphics service.

The present invention relates to a real-time animation apparatus and method, and more particularly, to a real-time animation apparatus and method for providing a graphic service in a Java TV environment.

Graphics in the Java TV environment in current use can represent only simple two-dimensional forms, in a very early style, without texture mapping or format conversion. The Java language itself implements a two-dimensional graphics library as of version 2.0, but this library cannot be used with Java version 1.1, on which the Java TV environment is based. Because the Java TV environment supports neither affine transformation nor image mapping, developers of games and other media applications are forced to mimic such behavior with special techniques borrowed from the game industry.

The most common solution to achieve this goal is called sprite animation. In the sprite animation technique, the different states of a shape transformation are stored as separate source images, and the programmer exchanges and outputs them as needed to mimic transformations such as scaling or rotation, as well as other behaviors. FIG. 1 shows a game developed for Java TV in which object transformation is replaced by the sprite animation technique. Referring to FIG. 1, the image shown on the left is the sprite image of a character in the game in its state before a jump, and the image shown on the right is the sprite image, swapped in through the player's interaction with the game, of the same character deformed as it jumps over a line.

Sprite animation techniques are also used to represent texture image mapping, in addition to transformation: instead of mapping a texture image onto a specific shape, the developer prepares an image whose texture is already mapped. FIG. 2 shows an example in which the billiard texture is mapped and replaced while the billiard balls are moving. Referring to FIG. 2, the billiard sprite images are animated so that the ball appears to move in a certain direction from its current position after being struck. The rotation of the ball toward the upper, lower, right, and left sides, and its change of position, are produced by drawing a different image at a different position on the canvas.
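As a minimal sketch of this sprite-swapping approach (the class below is hypothetical and not part of the patent; standard java.awt drawing is assumed), "rotation" is faked by advancing through pre-rendered frames while moving the drawing position:

```java
import java.awt.Graphics;
import java.awt.Image;

// Hypothetical sketch of sprite animation: every rotation state of the
// billiard ball is a separate pre-rendered image, and "rotation" is
// mimicked by drawing the next stored frame at the next canvas position.
public class SpriteBall {
    private final Image[] frames; // one pre-rendered image per state
    private int frame;            // index of the current state
    private int x, y;             // position on the canvas

    public SpriteBall(Image[] frames, int x, int y) {
        this.frames = frames;
        this.x = x;
        this.y = y;
    }

    // Advance the animation: move the ball and swap in the next frame.
    public void step(int dx, int dy) {
        x += dx;
        y += dy;
        frame = (frame + 1) % frames.length;
    }

    public void draw(Graphics g) {
        g.drawImage(frames[frame], x, y, null);
    }
}
```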

The main disadvantage of this sprite image technique is that the amount of image data to be stored on the server or client device is enormous, because every possible state of every object must be represented by a different image. The sprite image technique also incurs a certain overhead each time an image is read and output again.
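As a rough, purely illustrative calculation (the frame count and pixel format are assumptions chosen for this example, not figures from the patent), storing 36 rotation states of a single 150 × 150 pixel, 32-bit sprite already requires

$$36 \times 150 \times 150 \times 4\ \text{bytes} \approx 3.1\ \text{MB},$$

and every additional object, pose, or scale step multiplies this amount.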

Korea Patent Publication No. 2011-0036947 (published on April 12, 2011, entitled "Partitioning-based performance analysis for grapheme imaging"); Japanese Patent Laid-Open No. 2002-63595 (published on February 28, 2002, entitled "Graphics Device Having Skeleton Animation Sketch Hardware").

SUMMARY OF THE INVENTION It is an object of the present invention to provide a real-time animation apparatus and method capable of realizing animation in real time while reducing the required capacity of a storage device.

Another object of the present invention is to provide a computer-readable recording medium storing a program for causing a computer to execute a real-time animation method capable of realizing animation in real time while reducing the required capacity of a storage device.

According to an aspect of the present invention, there is provided a real-time animation apparatus including: a storage unit which stores a source image used as a background in each image constituting an animation and a texture image whose shape is deformed in each image constituting the animation; a mapping unit which transforms and maps the texture image corresponding to an output area of the texture image set on the source image, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation; a rendering unit which outputs each image constituting the animation on the basis of the source image and the texture image mapped corresponding to the output area of the texture image; and a transformed coordinate calculation unit which calculates a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing the texture image mapped corresponding to the output area of the texture image, and applies the coordinate information of the mapped texture image to the transformation matrix to update that coordinate information.

Preferably, the coordinate information is the vertex coordinates of the area where the polygonal texture image is to be positioned in each image constituting the animation, given in an order corresponding to the vertexes of the texture image; more preferably, the texture image and the area where it is to be positioned are rectangles.

Preferably, the rotation angle is an angle that rotates the texture image clockwise after locating the texture image at the origin of the coordinate plane.

Preferably, the mapping unit transforms the texture image by dividing it into a plurality of triangles and calculating the area coordinates of the vertexes of each triangle.

Preferably, the rendering unit selects a pixel position whose pixel value is to be determined for an image constituting the animation; if the selected pixel position belongs to the output area of the texture image, the pixel value at the corresponding position of the converted texture image is determined as the pixel value of the selected pixel position, and if it does not, the pixel value at the corresponding position of the source image is determined as the pixel value of the selected pixel position.

Preferably, the rendering unit selects a pixel position whose pixel value is to be determined within a bounding box of minimum size that includes the output area of the texture image set for an image constituting the animation and whose sides are parallel to the coordinate axes of the coordinate plane; if the selected pixel position belongs to the output area of the texture image, the pixel value at the corresponding position of the converted texture image is determined as the pixel value of the selected pixel position, and if it does not, the pixel value at the corresponding position of the source image is determined as the pixel value of the selected pixel position.

Preferably, the mapping unit transforms and remaps the texture image corresponding to the output area of the texture image set on the source image, based on the updated coordinate information of the texture image mapped corresponding to that output area, and the rendering unit outputs each image constituting the animation on the basis of the source image and the remapped texture image corresponding to the output area of the texture image.

According to another aspect of the present invention, there is provided a real-time animation method comprising: a first mapping step of transforming and mapping a texture image whose shape is deformed in each image constituting an animation, corresponding to an output area of the texture image set on a source image used as a background in each image constituting the animation, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation; a first rendering step of outputting each image constituting the animation based on the source image and the texture image mapped corresponding to the output area of the texture image; a transformation matrix calculation step of computing a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing the texture image mapped corresponding to the output area of the texture image; a transformed coordinate calculation step of applying the coordinate information of the mapped texture image to the transformation matrix to update the coordinate information of the texture image mapped corresponding to the output area of the texture image; a second mapping step of transforming and remapping the texture image corresponding to the output area of the texture image set on the source image, based on the updated coordinate information; and a second rendering step of outputting each image constituting the animation based on the source image and the remapped texture image corresponding to the output area of the texture image.

Preferably, the coordinate information is the vertex coordinates of the area where the polygonal texture image is to be positioned in each image constituting the animation, given in an order corresponding to the vertexes of the texture image; more preferably, the rotation angle is an angle by which the texture image is rotated clockwise after being positioned at the origin of the coordinate plane.

Preferably, the first rendering step includes: a pixel position selection step of selecting a pixel position whose pixel value is to be determined for an image constituting the animation; and a pixel value determination step of determining the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not.

Preferably, the first rendering step includes: a pixel position selection step of selecting a pixel position whose pixel value is to be determined within a bounding box of minimum size that includes the output area of the texture image set for an image constituting the animation and whose sides are parallel to the coordinate axes of the coordinate plane; and a pixel value determination step of determining the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not.

According to the real-time animation apparatus and method for a Java TV graphics service of the present invention, a terminal with limited memory capacity and processing speed can perform real-time animation quickly and simply, without a firmware upgrade, through the implementation of a two-dimensional graphics library, and 3D graphics can be implemented in the Java TV environment in the future.

FIG. 1 illustrates a game developed for Java TV in which object transformation is replaced by the sprite animation technique;
FIG. 2 illustrates an example in which a billiard texture is mapped and replaced while the billiard balls are moving;
FIG. 3 is a block diagram illustrating a configuration of a preferred embodiment of a real-time animation apparatus 300 according to the present invention;
FIG. 4 shows an example of a source image (a), a texture image (b) and a resultant image (c);
FIG. 5 is a flowchart illustrating a process of outputting a resultant image by the rendering unit 330; and
FIG. 6 is a flowchart illustrating a process of performing a preferred embodiment of a real-time animation method according to the present invention.

Hereinafter, a preferred embodiment of a real-time animation apparatus and method for a Java TV graphics service according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 3 is a block diagram illustrating a configuration of a preferred embodiment of a real-time animation apparatus 300 according to the present invention.

Referring to FIG. 3, the real-time animation apparatus 300 according to the present invention includes a storage unit 310, a mapping unit 320, a rendering unit 330, and a transformed coordinate calculation unit 340.

The storage unit 310 stores a source image, a texture image, a transformation matrix, transformed coordinates, and the like. The storage unit 310 may include a nonvolatile memory and a volatile memory. In this case, the source image and the texture image are stored in the nonvolatile memory, and data generated during the operation of the real-time animation apparatus 300 according to the present invention, such as the transformation matrix and transformed coordinates, is preferably stored in the volatile memory. The nonvolatile memory may also be partitioned to allocate independent storage space to each component. FIG. 4 shows an example of the source image (a), the texture image (b), and the resultant image (c).

The mapping unit 320 maps the texture image to the texture image output area set on the source image, based on the received coordinate information. The coordinate information consists of vertex coordinates indicating the area where the texture image is to be positioned on the resultant image. The coordinates are values in a coordinate system whose origin is the upper-left pixel of the resultant image (i.e., the pixel in the first column of the first row), whose x-axis points in the direction of increasing column number, and whose y-axis points in the direction of increasing row number. The coordinate values are input to the mapping unit 320 in an order corresponding to the vertexes of the texture image. For example, when the texture image is a quadrangle, the coordinate values are input in the order of the texture image's upper-left, upper-right, lower-right, and lower-left vertexes. Taking the source image, texture image, and resultant image shown in FIG. 4 as an example, where the source image is 800 × 600 pixels and the texture image is 150 × 150 pixels, the coordinate information is input to the mapping unit 320 in the order (458, 254), (368, 240), (382, 144), and (474, 158). The number of coordinate values corresponds to the number of vertexes of the texture image, and the texture image may be any polygon. In the following description, the case where the texture image is a rectangle is described as an example.
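As a concrete illustration of this vertex ordering (the array layout and the map call are hypothetical, not the patent's API), the FIG. 4 example could be expressed as:

```java
// Output-area vertices for the FIG. 4 example, listed in the same order
// as the texture's own corners.
int[][] outputArea = {
    {458, 254},   // corresponds to the texture's upper-left vertex
    {368, 240},   // upper-right
    {382, 144},   // lower-right
    {474, 158}    // lower-left
};
mappingUnit.map(textureImage, outputArea);  // hypothetical call to the mapping unit
```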

In the present invention, the texture image used for texture mapping must be converted in real time into an arbitrary shape. Known texture mapping techniques can be used for this. In the present invention, an affine matrix transformation is applied to the vertexes to obtain the transformed shape, and the texture image is then mapped onto the transformed shape using a barycentric (center-of-gravity) coordinate algorithm. The affine matrix transformation divides a given rectangle into two triangles and obtains the transformed shape from the vertexes of each triangle. That is, affine texture mapping computes each pixel of an arbitrary triangle using barycentric coordinates and maps it back to the source texture image. In this way, the barycentric coordinates of any point within a given triangle can be calculated.

These coordinates are expressed as weights with respect to each vertex of the triangle, as shown in the following equations.

$$r = \lambda_1 a + \lambda_2 b + \lambda_3 c \tag{1}$$

$$\lambda_1 + \lambda_2 + \lambda_3 = 1 \tag{2}$$

In Equations (1) and (2), r is a point within the triangle, a, b and c are the vertexes of the triangle, and λ1, λ2 and λ3 are the area (barycentric) coordinates of r with respect to those vertexes.

From these properties, area coordinates for each of the three vertexes can be calculated by the following equation.

$$\lambda_1 = \frac{A(r, b, c)}{A(a, b, c)}, \qquad \lambda_2 = \frac{A(r, c, a)}{A(a, b, c)}, \qquad \lambda_3 = \frac{A(r, a, b)}{A(a, b, c)} \tag{3}$$

where $A(\cdot)$ denotes the area of the triangle formed by the three given points.

Once the area coordinates are obtained, the corresponding point on the source texture image can easily be found and mapped to the target triangle. The area coordinates are converted into the x and y coordinates of a point on the texture image using the following equation.

$$(x, y) = \lambda_1 (x_a, y_a) + \lambda_2 (x_b, y_b) + \lambda_3 (x_c, y_c) \tag{4}$$

where $(x_a, y_a)$, $(x_b, y_b)$ and $(x_c, y_c)$ are the texture coordinates of the vertexes $a$, $b$ and $c$.

In this way, pixels from the source texture image can be mapped to the pixels located in the area of a given triangle. The Java TV environment does not support texture mapping beyond simple image display in a pixel coordinate system, whereas other graphics libraries such as OpenGL and DirectX provide texture mapping algorithms and functions. The present invention performs texture mapping with affine texture mapping, the simplest and most robust of these methods, thereby realizing simple and fast real-time animation.
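A minimal sketch of affine texture mapping for one of the two triangles, under the assumption (not stated in the patent) that images are plain int[] pixel buffers in row-major order; each pixel of the target triangle is looked up in the source texture through its area coordinates, as in Equations (3) and (4):

```java
public final class AffineMapper {

    // Maps the texture triangle (sx0,sy0)-(sx1,sy1)-(sx2,sy2) onto the
    // target triangle (tx0,ty0)-(tx1,ty1)-(tx2,ty2) of the output image
    // using area (barycentric) coordinates.
    public static void mapTriangle(int[] tex, int texW,
                                   int[] out, int outW, int outH,
                                   double sx0, double sy0, double sx1, double sy1,
                                   double sx2, double sy2,
                                   double tx0, double ty0, double tx1, double ty1,
                                   double tx2, double ty2) {
        // Bounding box of the target triangle, clipped to the output image.
        int minX = (int) Math.max(0, Math.floor(Math.min(tx0, Math.min(tx1, tx2))));
        int maxX = (int) Math.min(outW - 1, Math.ceil(Math.max(tx0, Math.max(tx1, tx2))));
        int minY = (int) Math.max(0, Math.floor(Math.min(ty0, Math.min(ty1, ty2))));
        int maxY = (int) Math.min(outH - 1, Math.ceil(Math.max(ty0, Math.max(ty1, ty2))));

        // Twice the signed area of the target triangle (denominator of Equation (3)).
        double area = (tx1 - tx0) * (ty2 - ty0) - (tx2 - tx0) * (ty1 - ty0);
        if (area == 0) return; // degenerate triangle, nothing to draw

        for (int y = minY; y <= maxY; y++) {
            for (int x = minX; x <= maxX; x++) {
                // Area coordinates of (x, y) with respect to the target triangle.
                double l1 = ((tx1 - x) * (ty2 - y) - (tx2 - x) * (ty1 - y)) / area;
                double l2 = ((tx2 - x) * (ty0 - y) - (tx0 - x) * (ty2 - y)) / area;
                double l3 = 1.0 - l1 - l2;
                if (l1 < 0 || l2 < 0 || l3 < 0) continue; // outside the triangle

                // Corresponding point on the source texture, Equation (4).
                int sx = (int) (l1 * sx0 + l2 * sx1 + l3 * sx2);
                int sy = (int) (l1 * sy0 + l2 * sy1 + l3 * sy2);
                out[y * outW + x] = tex[sy * texW + sx];
            }
        }
    }
}
```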

Table 1 shows the results of mapping source images of different sizes to target areas of different sizes.

Target area (pixels) | Source texture image 50 × 50 | 100 × 100 | 300 × 300
100 × 100 | 6–8 ms | 7–9 ms | 13–15 ms
150 × 150 | 12–14 ms | 13–15 ms | 19–21 ms
200 × 200 | 18–22 ms | 25–28 ms | 28–32 ms
Memory usage: 3780 K

The rendering unit 330 determines the pixel value of each pixel of the resultant image (i.e., the color value and luminance value of each pixel) and outputs the resultant image. The rendering unit 330 may output the resultant image either by storing the pixel value of each pixel constituting the resultant image in the storage unit 310 or by outputting the resultant image directly to a display device. The rendering unit 330 outputs the resultant image by assigning pixel values sequentially from the upper-left pixel of the resultant image to the lower-right pixel.

FIG. 5 is a flowchart illustrating a process of outputting a resultant image by the rendering unit 330.

Referring to FIG. 5, the rendering unit 330 selects a pixel position whose pixel value is to be determined (S500). The rendering unit 330 selects pixel positions sequentially, increasing the column index from the first pixel of the resultant image (i.e., the pixel in the first column of the first row); when the last column is reached, it moves on to the next row and again increases the column index. This selection is repeated until the last column of the last row is reached. Next, the rendering unit 330 determines whether the selected pixel position belongs to the output area of the texture image (S510). If it does, the rendering unit 330 determines the pixel value at the corresponding position of the texture image as the pixel value of that pixel in the resultant image (S520); otherwise, it determines the pixel value at the corresponding position of the source image as the pixel value of that pixel in the resultant image (S530). This process is repeated until pixel values have been determined for all pixel positions of the resultant image (S540).

On the other hand, the rendering method described with reference to FIG. 5 must check, for every pixel of the resultant image, whether it belongs to the output area of the texture image. This can be avoided by applying a bounding-box filtering technique: after setting a bounding box of minimum size (i.e., a rectangle whose sides are parallel to the x- and y-axes) that encloses the output area of the texture image, only the pixel positions within the bounding box are tested for membership in the output area. This reduces the amount of computation and the computation time, improving the rendering speed.
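The following sketch shows this bounding-box variant of the rendering loop; width, height, the box bounds, and the helpers insideOutputArea and mappedTexturePixel are hypothetical stand-ins for the point-in-polygon test and the converted-texture lookup described above:

```java
// Sketch of the FIG. 5 loop with bounding-box filtering: the source image
// is copied wholesale, and only pixels inside the minimal axis-aligned box
// around the texture output area are tested against the output polygon
// and, when inside, overwritten with pixels of the converted texture.
int[] result = new int[width * height];
System.arraycopy(source, 0, result, 0, source.length);
for (int y = boxMinY; y <= boxMaxY; y++) {        // rows of the bounding box only
    for (int x = boxMinX; x <= boxMaxX; x++) {    // columns of the bounding box only
        if (insideOutputArea(x, y)) {             // hypothetical point-in-polygon test
            result[y * width + x] = mappedTexturePixel(x, y); // hypothetical texture lookup
        }
    }
}
```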

The transformed coordinate calculation unit 340 calculates a transformation matrix for changing the texture image on the resultant image based on the input transformation information, and applies the vertex coordinates of the texture image on the resultant image to the transformation matrix to calculate the transformed vertex coordinates. The texture image is changed by movement, rotation, and size change. The transformation information input to the transformed coordinate calculation unit 340 for this purpose consists of a movement amount, a rotation angle, and a magnitude change ratio for the texture image on the resultant image. Here, the movement amount is the distance each vertex of the texture image moves in the x-axis and y-axis directions from its position in the existing resultant image; the rotation angle is the angle by which the texture image, after being positioned at the origin, is rotated clockwise relative to its orientation in the existing resultant image; and the magnitude change ratio is the magnification or reduction ratio along the x- and y-axes after the texture image shown in the existing resultant image is positioned at the origin.

For example, the transformation matrix that shifts the texture image shown in FIG. 4 by a pixels and b pixels along the x-axis and y-axis respectively, scales it by factors of c and d along the x-axis and y-axis, and rotates it by θ is calculated as follows.

First, the movement matrix is expressed by the following equation.

$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & a \\ 0 & 1 & b \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \tag{5}$$

Next, the size conversion matrix is expressed by the following equation.

$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} c & 0 & 0 \\ 0 & d & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \tag{6}$$

Next, the rotation matrix is expressed by the following equation.

$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \tag{7}$$

In Equations (5) to (7), x and y are the coordinates of each vertex of the texture image shown in the existing resultant image, and x′ and y′ are the coordinates of the corresponding vertex of the texture image after the transformation.

In this case, Equations (5) to (7) can be expressed by one transformation matrix as follows.

$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & a \\ 0 & 1 & b \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} c & 0 & 0 \\ 0 & d & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \tag{8}$$

As described above, the transformed coordinate calculation unit 340 calculates the transformed vertex coordinates of the texture image by applying the vertex coordinates of the texture image appearing in the resultant image to the transformation matrix of Equation (8), which is computed from the transformation information including the movement amount, the rotation angle, and the magnitude change ratio. The transformed vertex coordinates are stored in the storage unit 310. The mapping unit 320 then reads the transformed vertex coordinates and the texture image from the storage unit 310 and maps the texture image accordingly.
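A sketch of Equations (5) through (8) in code, assuming (this is a choice of this sketch, not something the patent fixes) 3 × 3 row-major double arrays and the composition order translate, then rotate, then scale:

```java
public final class Transform2D {

    // 3x3 matrix product m = a * b (row-major).
    static double[][] multiply(double[][] a, double[][] b) {
        double[][] m = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    m[i][j] += a[i][k] * b[k][j];
        return m;
    }

    // Combined transform of Equation (8): translate by (a, b), rotate by
    // theta, and scale by (c, d). The composition order is an assumption.
    static double[][] build(double a, double b, double theta, double c, double d) {
        double[][] t = {{1, 0, a}, {0, 1, b}, {0, 0, 1}};       // Equation (5)
        double[][] s = {{c, 0, 0}, {0, d, 0}, {0, 0, 1}};       // Equation (6)
        double[][] r = {{Math.cos(theta), -Math.sin(theta), 0}, // Equation (7)
                        {Math.sin(theta),  Math.cos(theta), 0},
                        {0, 0, 1}};
        return multiply(t, multiply(r, s));
    }

    // Applies the transform to every vertex in place; vertices[i] = {x, y}.
    static void apply(double[][] m, double[][] vertices) {
        for (int i = 0; i < vertices.length; i++) {
            double x = vertices[i][0], y = vertices[i][1];
            vertices[i][0] = m[0][0] * x + m[0][1] * y + m[0][2];
            vertices[i][1] = m[1][0] * x + m[1][1] * y + m[1][2];
        }
    }
}
```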

Although the mapping unit 320 and the rendering unit 330 have been described as separate components, they may be integrated into a single component. Alternatively, only the source image and the texture image may be stored in the storage unit 310 while a separate processor controls the operations of the mapping unit 320, the rendering unit 330, and the transformed coordinate calculation unit 340; in that case, the data generated during operation (i.e., the resultant image, the transformation matrix, the transformed coordinates, and the like) may be stored in volatile memory provided in the processor itself.

FIG. 6 is a flowchart illustrating a process of performing a preferred embodiment of a real-time animation method according to the present invention.

Referring to FIG. 6, the mapping unit 320 maps the texture image to the texture image output area set on the source image based on the received coordinate information (S600). Next, the rendering unit 330 determines the pixel values of the pixels constituting the resultant image (i.e., the color value and luminance value of each pixel) as described with reference to FIG. 5, and outputs the resultant image (S610). Next, the transformed coordinate calculation unit 340 calculates a transformation matrix for changing the texture image on the resultant image based on the input transformation information, and applies the existing vertex coordinates of the texture image on the resultant image to the transformation matrix to calculate the vertex coordinates of the changed texture image (S620). Next, the mapping unit 320 maps the texture image to the newly set texture image output area on the source image based on the changed vertex coordinates calculated by the transformed coordinate calculation unit 340 (S630). Next, the rendering unit 330 determines the pixel value of each pixel constituting the resultant image, as described with reference to FIG. 5, based on the source image and the texture image mapped to the newly set output area, and outputs the resultant image (S640). Steps S620 through S640 are repeated until the animation ends.

The real-time animation method according to the present invention as described with reference to FIG. 6 can be implemented as a computer program. The program implementing the real-time animation method according to the present invention is preferably produced as an API (Application Programming Interface) so that it can operate in the Java TV environment. In this case, the API may consist of a texture class for mapping the texture image and outputting the resultant image, and a transformation class for calculating the transformation matrix and the transformed coordinates. When the real-time animation method according to the present invention is implemented as a computer program, the real-time animation apparatus according to the present invention can be implemented with a memory storing the source image, the texture image, the transformation matrix, and the transformed coordinates, and a processor on which the program is installed and executed.
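As an illustration only (the Texture class and its methods below are hypothetical; the patent specifies only that a texture class and a transformation class exist), an animation built on such an API might follow the S600 through S640 loop of FIG. 6:

```java
// Hypothetical usage sketch mirroring FIG. 6: map and render once, then
// repeatedly transform the vertex coordinates, remap, and re-render.
Texture texture = new Texture(sourceImage, textureImage, vertices); // S600
texture.render();                                                   // S610
while (animating) {
    double[][] m = Transform2D.build(dx, dy, theta, cx, cy);        // S620: transformation matrix
    Transform2D.apply(m, texture.vertexCoordinates());              // S620: transformed coordinates
    texture.remap();                                                // S630: remap the texture
    texture.render();                                               // S640: output the resultant image
}
```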

The present invention can also be embodied as computer-readable codes on a computer-readable recording medium. A computer-readable recording medium includes all kinds of recording apparatuses in which data that can be read by a computer system is stored. Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and may be implemented in the form of a carrier wave (for example, transmission via the Internet) . The computer-readable recording medium may also be distributed over a networked computer system so that computer readable code can be stored and executed in a distributed manner.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, these embodiments are offered by way of illustration and example only and are not to be taken as limiting. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the scope of the appended claims.

Claims (14)

1. A real-time animation apparatus comprising:
A storage unit for storing a source image used as a background in each of the images constituting an animation and a texture image whose shape is deformed in each of the images constituting the animation;
A mapping unit for transforming and mapping the texture image corresponding to an output area of a texture image set on the source image based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation;
A rendering unit for outputting each of the images constituting the animation on the basis of the source image and the texture image mapped corresponding to the output area of the texture image; and
A transformed coordinate calculation unit for calculating a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing the texture image mapped corresponding to the output area of the texture image, and for applying the coordinate information of the mapped texture image to the transformation matrix to update the coordinate information of the texture image mapped corresponding to the output area of the texture image.
2. The apparatus according to claim 1,
Wherein the coordinate information is a vertex coordinate of an area where the texture image of a polygonal shape is to be positioned in each of the images constituting the animation, and has a sequence corresponding to each vertex of the texture image.
3. The apparatus according to claim 2,
Wherein the texture image and the shape of the area where the texture image is to be positioned are rectangles.
4. The apparatus according to any one of claims 1 to 3,
Wherein the rotation angle is an angle that rotates the texture image in a clockwise direction after locating the texture image at the origin of the coordinate plane.
5. The apparatus according to claim 2 or 3,
Wherein the mapping unit transforms the texture image by dividing it into a plurality of triangles and calculating the area coordinates of the vertexes of each triangle.
6. The apparatus according to claim 1,
Wherein the rendering unit selects a pixel position whose pixel value is to be determined for an image constituting the animation, determines the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determines the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not belong to the output area of the texture image.
7. The apparatus according to claim 1,
Wherein the rendering unit selects a pixel position whose pixel value is to be determined within a bounding box of minimum size that includes the output area of the texture image set for the image constituting the animation and whose sides are parallel to the coordinate axes of the coordinate plane, determines the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determines the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not.
8. The apparatus according to claim 1,
Wherein the mapping unit transforms and remaps the texture image corresponding to the output area of the texture image set on the source image, based on the updated coordinate information of the texture image mapped corresponding to that output area, and
Wherein the rendering unit outputs each image constituting the animation on the basis of the source image and the remapped texture image corresponding to the output area of the texture image.
9. A real-time animation method comprising:
A first mapping step of transforming and mapping a texture image whose shape is deformed in each of the images constituting an animation, corresponding to an output area of the texture image set on a source image used as a background in each of the images constituting the animation, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation;
A first rendering step of outputting each image constituting the animation based on the source image and the texture image mapped corresponding to the output area of the texture image;
A transformation matrix calculation step of computing a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing a texture image mapped corresponding to an output area of the texture image;
A transformed coordinate calculation step of applying coordinate information of a texture image mapped to the output area of the texture image to the transformation matrix to update coordinate information of the mapped texture image corresponding to the output area of the texture image;
A second mapping step of transforming and remapping the texture image corresponding to the output area of the texture image set on the source image, based on the updated coordinate information of the texture image mapped corresponding to the output area of the texture image; and
A second rendering step of outputting each of the images constituting the animation based on the source image and the remapped texture image corresponding to the output area of the texture image.
10. The method of claim 9,
Wherein the coordinate information is a vertex coordinate of an area where the texture image of a polygonal shape is to be positioned in each of the images constituting the animation, and has a sequence corresponding to each vertex of the texture image.
11. The method of claim 9,
Wherein the rotation angle is an angle that rotates the texture image in a clockwise direction after locating the texture image at the origin of the coordinate plane.
12. The method of claim 9,
Wherein the first rendering step comprises:
A pixel position selection step of selecting a pixel position whose pixel value is to be determined for an image constituting the animation; and
A pixel value determination step of determining the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not.
13. The method of claim 9,
Wherein the first rendering step comprises:
A pixel position selection step of selecting a pixel position whose pixel value is to be determined within a bounding box of minimum size that includes the output area of the texture image set for the image constituting the animation and whose sides are parallel to the coordinate axes of the coordinate plane; and
A pixel value determination step of determining the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not.
14. A computer-readable recording medium recording a program for causing a computer to execute the real-time animation method according to any one of claims 9 to 13.
KR1020130077210A 2013-07-02 2013-07-02 Real time animation apparatus and method for Java TV graphics service KR20150004483A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130077210A KR20150004483A (en) 2013-07-02 2013-07-02 Real time animation apparatus and method for Java TV graphics service


Publications (1)

Publication Number Publication Date
KR20150004483A true KR20150004483A (en) 2015-01-13

Family

ID=52476646

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130077210A KR20150004483A (en) 2013-07-02 2013-07-02 Real time animation apparatus and method for Java TV graphics service

Country Status (1)

Country Link
KR (1) KR20150004483A (en)


Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E601 Decision to refuse application