KR20150004483A - Real time animation apparatus and method for Java TV graphics service - Google Patents
- Publication number
- KR20150004483A (application KR1020130077210A)
- Authority
- KR
- South Korea
- Prior art keywords
- texture image
- image
- output area
- texture
- animation
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
Abstract
A real-time animation apparatus for a Java TV graphics service is disclosed. The storage unit stores a source image and a texture image. The mapping unit transforms and maps the texture image corresponding to an output area of the texture image set on the source image, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation. The rendering unit outputs each image constituting the animation based on the source image and the texture image mapped corresponding to the output area of the texture image. The transformed coordinate calculation unit calculates a transformation matrix based on update information including a movement amount, a rotation angle, and a magnitude change ratio for changing the mapped texture image, and applies the coordinate information of the mapped texture image to the transformation matrix so that the coordinate information corresponding to the output area of the texture image is updated. According to the present invention, real-time animation can be performed quickly and simply in a Java TV environment, without upgrading firmware, on a terminal with limited memory capacity and processing speed.
Description
The present invention relates to a real-time animation apparatus and method, and more particularly, to a real-time animation apparatus and method for providing a graphic service in a Java TV environment.
Graphics in the Java TV environments in recent use can represent only simple two-dimensional forms, in a very early style, without texture mapping or format conversion. The Java language implements a two-dimensional graphics library only as of Java 2, so it cannot be used in Java versions below 1.1-based platforms such as the Java TV environment. Because these Java TV environments support neither affine transformation nor image mapping, developers of games and other media applications are forced to apply special techniques from the game industry to mimic similar behavior.
The most common solution to this problem is called sprite animation. In the sprite animation technique, the different states of a shape transformation are stored as separate source images, and the programmer swaps them in and out as needed to mimic transformations such as scaling or rotation, as well as other behaviors. FIG. 1 shows a game developed for Java TV in which object transformation is replaced by the sprite animation technique. Referring to FIG. 1, the image on the left is a sprite of the game character in its state before a jump, and the image on the right, produced by the player's interaction with the game, is a sprite of the character holding the rope being transformed as it jumps over the rope.
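The frame-swapping idea described above can be sketched in a few lines of Java. This is a minimal illustration, not code from the patent; the class, its methods, and the file names are all hypothetical, and strings stand in for the pre-rendered images.

```java
// Sketch of the sprite-animation technique: every transformation state
// is a separate pre-rendered frame, and the program merely selects which
// frame to output each tick. Names and file names are illustrative.
public class SpriteAnimation {
    private final String[] frames; // stand-ins for pre-rendered images
    private int current;

    public SpriteAnimation(String[] frames) {
        this.frames = frames;
        this.current = 0;
    }

    // Advance to the next stored state (e.g. the next pose of a jump).
    public String nextFrame() {
        current = (current + 1) % frames.length;
        return frames[current];
    }

    public String currentFrame() {
        return frames[current];
    }

    public static void main(String[] args) {
        SpriteAnimation jump = new SpriteAnimation(
            new String[] {"stand.png", "crouch.png", "jump.png"});
        System.out.println(jump.currentFrame()); // stand.png
        System.out.println(jump.nextFrame());    // crouch.png
        System.out.println(jump.nextFrame());    // jump.png
    }
}
```

The cost this sketch makes visible is exactly the drawback discussed below: every distinct pose must exist as a stored image before the program runs.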
Sprite animation techniques are also used to represent texture image mapping in addition to transformation: instead of mapping a texture image onto a specific shape at run time, developers prepare images with the texture already mapped. FIG. 2 shows an example in which billiard-ball textures are mapped and swapped as the balls move. Referring to FIG. 2, the billiard sprite images are animated so that the ball appears to move in a given direction from its starting position when struck. The ball can also be shown rotating upward, downward, rightward, or leftward, and its position changes as the image is drawn at a different location on the canvas.
The main disadvantage of the sprite image technique is that the number of images to be stored on the server or client device is enormous, because every possible state of every object must be represented by a separate image. The technique therefore also incurs overhead each time an image must be read and output again.
SUMMARY OF THE INVENTION It is an object of the present invention to provide a real-time animation apparatus and method capable of realizing animation in real time while reducing the required storage capacity.
Another object of the present invention is to provide a computer-readable recording medium storing a program for causing a computer to execute such a real-time animation method.
According to an aspect of the present invention, there is provided a real-time animation apparatus comprising: a storage unit for storing a source image used as a background in each image constituting an animation and a texture image whose shape is deformed in each of the images constituting the animation; a mapping unit for transforming and mapping the texture image corresponding to an output area of the texture image set on the source image, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation; a rendering unit for outputting each of the images constituting the animation based on the source image and the texture image mapped corresponding to the output area of the texture image; and a transformed coordinate calculation unit for calculating a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing the texture image mapped corresponding to the output area of the texture image, and for applying the coordinate information of the mapped texture image to the transformation matrix to update the coordinate information of the mapped texture image corresponding to the output area of the texture image.
Preferably, the coordinate information comprises the vertex coordinates of the area where the polygonal texture image is to be positioned in each image constituting the animation, in a sequence corresponding to each vertex of the texture image; more preferably, the texture image and the area where it is to be positioned are rectangular.
Preferably, the rotation angle is an angle that rotates the texture image clockwise after locating the texture image at the origin of the coordinate plane.
Preferably, the mapping unit transforms the texture image by dividing it into a plurality of triangles and calculating area coordinates with respect to the vertexes of the triangles.
Preferably, the rendering unit selects a pixel position for which a pixel value is to be determined in an image constituting the animation; if the selected pixel position belongs to the output area of the texture image, the pixel value at the corresponding position of the converted texture image is determined as the pixel value of the selected pixel position, and if it does not belong to the output area of the texture image, the pixel value at the corresponding position of the source image is determined as the pixel value of the selected pixel position.
Preferably, the rendering unit selects the pixel position for which a pixel value is to be determined within a minimum-size bounding box that includes the output area of the texture image set for an image constituting the animation and whose sides are parallel to the coordinate axes of the coordinate plane; if the selected pixel position belongs to the output area of the texture image, the pixel value at the corresponding position of the converted texture image is determined as the pixel value of the selected pixel position, and otherwise the pixel value at the corresponding position of the source image is determined as the pixel value of the selected pixel position.
Preferably, the mapping unit transforms and remaps the mapped texture image corresponding to the output area of the texture image set on the source image, based on the updated coordinate information, and the rendering unit outputs each image constituting the animation based on the source image and the remapped texture image corresponding to the output area of the texture image.
According to another aspect of the present invention, there is provided a real-time animation method comprising: a first mapping step of transforming and mapping a texture image corresponding to an output area of the texture image set on a source image used as a background in each image constituting an animation, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation; a first rendering step of outputting each image constituting the animation based on the source image and the texture image mapped corresponding to the output area of the texture image; a transformation matrix calculation step of computing a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing the texture image mapped corresponding to the output area of the texture image; a transformed coordinate calculation step of applying the coordinate information of the mapped texture image to the transformation matrix to update the coordinate information of the mapped texture image corresponding to the output area of the texture image; a second mapping step of transforming and remapping the texture image corresponding to the output area of the texture image set on the source image, based on the updated coordinate information; and a second rendering step of outputting each of the images constituting the animation based on the source image and the remapped texture image corresponding to the output area of the texture image.
Preferably, the coordinate information comprises the vertex coordinates of the area where the polygonal texture image is to be positioned in each image constituting the animation, in a sequence corresponding to each vertex of the texture image; more preferably, the rotation angle is the angle by which the texture image is rotated clockwise after the texture image is located at the origin of the coordinate plane.
Preferably, the first rendering step includes: a pixel position selection step of selecting a pixel position for which a pixel value is to be determined in an image constituting the animation; and a pixel value determination step of determining the pixel value at the corresponding position of the transformed texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position otherwise.
Preferably, the first rendering step includes: a pixel position selection step of selecting the pixel position for which a pixel value is to be determined within a minimum-size bounding box that includes the output area of the texture image set for an image constituting the animation and whose sides are parallel to the coordinate axes of the coordinate plane; and a pixel value determination step of determining the pixel value at the corresponding position of the transformed texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position otherwise.
According to the real-time animation apparatus and method for a Java TV graphics service of the present invention, a terminal with limited memory capacity and processing speed can quickly and simply perform real-time animation without a firmware upgrade to add a two-dimensional graphics library, and 3D graphics may be implemented in the Java TV environment in the future.
FIG. 1 illustrates a game developed for Java TV in which object transformation is replaced by the sprite animation technique;
FIG. 2 illustrates an example in which billiard-ball textures are mapped and swapped when the billiard balls are moving;
FIG. 3 is a block diagram illustrating a configuration of a preferred embodiment of a real-time animation apparatus according to the present invention;
FIG. 4 shows an example of a source image (a), a texture image (b), and a result image (c);
FIG. 5 is a flowchart illustrating a process of outputting a resultant image by the rendering unit; and
FIG. 6 is a flowchart illustrating a process of performing a preferred embodiment of a real-time animation method according to the present invention.
Hereinafter, a preferred embodiment of a real-time animation apparatus and method for a Java TV graphics service according to the present invention will be described in detail with reference to the accompanying drawings.
FIG. 3 is a block diagram illustrating a configuration of a preferred embodiment of a real-time animation apparatus according to the present invention.
Referring to FIG. 3, the real-time animation apparatus comprises a storage unit, a mapping unit, a rendering unit, and a transformed coordinate calculation unit.
The storage unit stores the source image used as a background in each image constituting the animation and the texture image whose shape is deformed in each of the images.
The mapping unit transforms and maps the texture image corresponding to the output area of the texture image set on the source image, based on the coordinate information of the texture image in the coordinate plane set for each image constituting the animation.
In the present invention, the texture image used for texture mapping must be convertible in real time into an arbitrary shape. Known texture mapping techniques can be used; in the present invention, an affine matrix transformation is applied to the vertexes to obtain the transformed shape, and the texture image is then mapped onto the transformed shape using a barycentric (center-of-gravity) coordinate algorithm. The affine matrix transformation divides a given rectangle into two triangles and obtains the transformed position of each vertex of each triangle. That is, affine texture mapping computes the pixels of an arbitrary triangle using barycentric coordinates and maps them to the source texture image. By this method, coordinates relative to the center of gravity of a given triangle can be calculated.
These coordinates are expressed with respect to each vertex of the triangle, as shown in the following equations.
In Equations (1) and (2), r is a point within the triangle, a, b, and c are the vertexes of the triangle, and λ1, λ2, and λ3 are the area coordinates obtained from the interior angles α, β, and γ.
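The equations referred to as (1) and (2) appear only as images in the original publication and are missing from the extracted text. Their conventional barycentric (area-coordinate) form, which the surrounding description matches, is reconstructed below; the numbering is an assumption.

```latex
% Reconstructed standard form of Equations (1) and (2):
% a point r inside triangle (a, b, c) as a convex combination of
% its vertexes, with the area coordinates summing to one.
\[
r = \lambda_1\, a + \lambda_2\, b + \lambda_3\, c
\tag{1}
\]
\[
\lambda_1 + \lambda_2 + \lambda_3 = 1
\tag{2}
\]
```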
From these properties, area coordinates for each of the three vertexes can be calculated by the following equation.
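The equation image referenced here is also missing. The standard way to calculate each area coordinate, consistent with the name "area coordinates" used in the text, is as the ratio of a sub-triangle area to the full triangle area; this reconstruction is labeled (3) by assumption, with $A(\cdot)$ denoting triangle area.

```latex
% Reconstructed standard form of the area-coordinate computation:
% each lambda is the area of the sub-triangle opposite a vertex,
% normalized by the area of the whole triangle (a, b, c).
\[
\lambda_1 = \frac{A(r, b, c)}{A(a, b, c)}, \qquad
\lambda_2 = \frac{A(r, c, a)}{A(a, b, c)}, \qquad
\lambda_3 = \frac{A(r, a, b)}{A(a, b, c)}
\tag{3}
\]
```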
By obtaining the area coordinates, the same point on the source texture image can be easily detected, and it can be mapped to the target triangle. Therefore, area coordinates can be converted into x and y coordinates of a point on the texture image using the following equation.
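The conversion equation referenced here is likewise missing from the extracted text. In the standard formulation, the texture-space point is the same barycentric combination of the texture triangle's vertex coordinates; the numbering (4) and the vertex symbols $(x_a, y_a)$, $(x_b, y_b)$, $(x_c, y_c)$ are assumptions.

```latex
% Reconstructed standard form of the texture-coordinate conversion:
% the same area coordinates applied to the vertexes of the
% corresponding triangle on the source texture image.
\[
(x, y) = \lambda_1\,(x_a, y_a) + \lambda_2\,(x_b, y_b) + \lambda_3\,(x_c, y_c)
\tag{4}
\]
```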
In this way, pixels from the source texture image can be mapped to pixels located in the area of a given triangle. The Java TV environment does not support texture mapping beyond simple image display in a pixel coordinate system, whereas other graphics libraries such as OpenGL and DirectX provide texture mapping algorithms and functions. The present invention performs texture mapping using affine texture mapping, the simplest and most robust of these methods, thereby realizing simple and fast real-time animation.
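The area-coordinate mapping described above can be sketched in Java. This is an illustrative implementation under the standard barycentric formulation, not the patent's actual code; the class and method names, and the flat `{x0,y0,x1,y1,x2,y2}` vertex layout, are assumptions.

```java
// Sketch of affine texture mapping with area (barycentric) coordinates:
// for a point inside the target triangle, compute lambda1..3 from
// sub-triangle areas, then evaluate the same barycentric combination
// on the source texture triangle. Names are illustrative.
public class AffineMap {
    // Twice the signed area of triangle (ax,ay)-(bx,by)-(cx,cy).
    static double area2(double ax, double ay, double bx, double by,
                        double cx, double cy) {
        return (bx - ax) * (cy - ay) - (cx - ax) * (by - ay);
    }

    // Map point (px,py) in the target triangle t = {ax,ay,bx,by,cx,cy}
    // to the corresponding point in the source triangle s; returns {x, y}.
    static double[] mapPoint(double[] t, double[] s, double px, double py) {
        double full = area2(t[0], t[1], t[2], t[3], t[4], t[5]);
        double l1 = area2(px, py, t[2], t[3], t[4], t[5]) / full;
        double l2 = area2(t[0], t[1], px, py, t[4], t[5]) / full;
        double l3 = 1.0 - l1 - l2; // lambdas sum to one
        return new double[] {
            l1 * s[0] + l2 * s[2] + l3 * s[4],
            l1 * s[1] + l2 * s[3] + l3 * s[5]
        };
    }

    public static void main(String[] args) {
        double[] target = {0, 0, 10, 0, 0, 10};   // transformed triangle
        double[] source = {0, 0, 100, 0, 0, 100}; // texture triangle
        // Point (2,2) of the target maps to (20,20) on the texture,
        // since the source triangle is the target scaled by 10.
        double[] uv = mapPoint(target, source, 2, 2);
        System.out.println(uv[0] + "," + uv[1]); // 20.0,20.0
    }
}
```

Because the mapping is affine per triangle, the two triangles of the rectangle can be processed independently with the same routine.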
Table 1 shows the results of mapping source images of different sizes to target areas of different sizes.
[Table 1: mapping results for source images and target areas of various sizes (sizes in pixels)]
The rendering unit outputs each image constituting the animation based on the source image and the texture image mapped corresponding to the output area of the texture image.
FIG. 5 is a flowchart illustrating a process of outputting a resultant image by the rendering unit.
Referring to FIG. 5, the rendering unit selects a pixel position for which a pixel value is to be determined, checks whether the selected pixel position belongs to the output area of the texture image, determines the pixel value from the corresponding position of the converted texture image if it does and from the corresponding position of the source image if it does not, and repeats this for every pixel of the resultant image.
The rendering method described with reference to FIG. 5, however, has the drawback that every pixel of the resultant image must be checked against the output area of the texture image. This problem can be solved by applying the bounding-box filtering technique. That is, after setting a minimum-size bounding box (that is, a rectangle) that contains the output area of the texture image and whose sides are parallel to the x- and y-axes, only the pixel positions inside the bounding box are checked against the output area of the texture image, reducing the amount and time of computation and thereby improving rendering speed.
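The bounding-box filtered rendering loop can be sketched as follows. This is an illustrative version, not the patent's code: pixel buffers are plain `int` arrays, the output area is a quadrilateral given by vertex arrays, and the point-in-polygon test uses a standard even-odd rule; all names are assumptions.

```java
// Sketch of bounding-box filtering: only pixels inside the axis-aligned
// bounding box of the output quadrilateral are tested against it; every
// other pixel keeps the source (background) value. Names are illustrative.
public class BoundingBoxRender {
    // Even-odd rule: is point (px,py) inside the polygon (xs, ys)?
    static boolean inside(double[] xs, double[] ys, double px, double py) {
        boolean in = false;
        for (int i = 0, j = xs.length - 1; i < xs.length; j = i++) {
            if ((ys[i] > py) != (ys[j] > py)
                && px < (xs[j] - xs[i]) * (py - ys[i]) / (ys[j] - ys[i]) + xs[i]) {
                in = !in;
            }
        }
        return in;
    }

    static void render(int[] src, int[] tex, int[] out, int w, int h,
                       double[] xs, double[] ys) {
        System.arraycopy(src, 0, out, 0, src.length); // source as background
        // Minimum-size axis-aligned bounding box of the output area.
        int minX = w, minY = h, maxX = 0, maxY = 0;
        for (int i = 0; i < xs.length; i++) {
            minX = Math.min(minX, (int) Math.floor(xs[i]));
            maxX = Math.max(maxX, (int) Math.ceil(xs[i]));
            minY = Math.min(minY, (int) Math.floor(ys[i]));
            maxY = Math.max(maxY, (int) Math.ceil(ys[i]));
        }
        // Only pixels inside the box are tested against the polygon.
        for (int y = Math.max(0, minY); y < Math.min(h, maxY); y++) {
            for (int x = Math.max(0, minX); x < Math.min(w, maxX); x++) {
                if (inside(xs, ys, x + 0.5, y + 0.5)) {
                    out[y * w + x] = tex[y * w + x]; // already-converted texture
                }
            }
        }
    }

    public static void main(String[] args) {
        int w = 4, h = 4;
        int[] src = new int[w * h];            // background value 0
        int[] tex = new int[w * h];
        java.util.Arrays.fill(tex, 9);         // transformed texture value
        int[] out = new int[w * h];
        // Output area: square covering pixel centers (1,1)..(2,2).
        render(src, tex, out, w, h,
               new double[] {1, 3, 3, 1}, new double[] {1, 1, 3, 3});
        System.out.println(out[1 * w + 1] + " " + out[0]); // 9 0
    }
}
```

On a small output area the loop visits only the boxed pixels rather than all `w * h` of them, which is the speedup the text claims.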
The transformed coordinate calculation unit calculates a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio, and applies the coordinate information of the mapped texture image to the transformation matrix to update the coordinate information.
For example, suppose the resultant image shown in FIG. 4 is to be shifted by a pixels and b pixels along the x-axis and y-axis, respectively, scaled by factors of c and d along the x-axis and y-axis, and rotated by an angle θ. The corresponding transformation matrices are calculated as follows.
First, the movement matrix is expressed by the following equation.
Next, the size conversion matrix is expressed by the following equation.
Next, the rotation matrix is expressed by the following equation.
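The equation images for (5) through (7) are missing from the extracted text. Their conventional homogeneous-coordinate forms, consistent with the parameters a, b, c, d, and θ defined above, are reconstructed below; the numbering, and the sign convention of the sine terms (which depends on whether the y-axis points up or down on screen), are assumptions.

```latex
% Reconstructed conventional forms of the translation (5),
% scaling (6), and rotation (7) matrices in homogeneous coordinates.
\[
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}
=
\begin{bmatrix} 1 & 0 & a \\ 0 & 1 & b \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
\tag{5}
\]
\[
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}
=
\begin{bmatrix} c & 0 & 0 \\ 0 & d & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
\tag{6}
\]
\[
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}
=
\begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
\tag{7}
\]
```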
In Equations (5) to (7), x and y denote the coordinates of each vertex of the texture image shown in the existing resultant image, and x' and y' denote the coordinates of each vertex of the texture image after the transformation.
In this case, Equations (5) to (7) can be expressed by one transformation matrix as follows.
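The combined-matrix equation is also missing from the extracted text. One plausible reconstruction is the product of the three matrices above; note that the composition order is not fixed by the surviving text, so the order shown is an assumption.

```latex
% One plausible composition of Equations (5)-(7) into a single
% transformation matrix M (translation T, scaling S, rotation R);
% the order of composition is an assumption.
\[
M = T(a, b)\, S(c, d)\, R(\theta), \qquad
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}
= M \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
\]
```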
As described above, the transformed coordinate calculation unit updates the coordinate information of the mapped texture image by applying it to the transformation matrix.
The mapping unit then transforms and remaps the texture image corresponding to the output area set on the source image based on the updated coordinate information, and the rendering unit outputs the next image constituting the animation.
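The coordinate-update step can be sketched in Java. The matrix layout and the composition order (translation of scaled, rotated coordinates) are assumptions, as is every name; the patent text does not fix them.

```java
// Sketch of the transformed-coordinate update: a combined 3x3 affine
// matrix is applied to each vertex of the mapped texture image to get
// the vertex coordinates for the next frame. Names are illustrative.
public class TransformUpdate {
    // Build one matrix for translation (a,b), scaling (c,d), and
    // rotation by theta radians (composition order is one plausible
    // choice; the patent text does not specify it).
    static double[][] matrix(double a, double b, double c, double d,
                             double theta) {
        double cos = Math.cos(theta), sin = Math.sin(theta);
        return new double[][] {
            {c * cos, -c * sin, a},
            {d * sin,  d * cos, b},
            {0, 0, 1}
        };
    }

    // Apply m to vertex (x, y); returns {x', y'}.
    static double[] apply(double[][] m, double x, double y) {
        return new double[] {
            m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2]
        };
    }

    public static void main(String[] args) {
        // Move by (5, 3), double the size, no rotation.
        double[][] m = matrix(5, 3, 2, 2, 0);
        double[] v = apply(m, 10, 10);
        System.out.println(v[0] + "," + v[1]); // 25.0,23.0
    }
}
```

Updating only the four vertex coordinates, rather than every pixel, is what keeps this step cheap enough for the constrained Java TV terminals the document targets.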
FIG. 6 is a flowchart illustrating a process of performing a preferred embodiment of a real-time animation method according to the present invention.
Referring to FIG. 6, the texture image is first transformed and mapped to the output area of the texture image set on the source image, based on the coordinate information of the texture image (first mapping step), and each image constituting the animation is output based on the source image and the mapped texture image (first rendering step). A transformation matrix is then computed from the update information (transformation matrix calculation step), and the coordinate information of the mapped texture image is updated by applying it to the transformation matrix (transformed coordinate calculation step). The texture image is transformed and remapped based on the updated coordinate information (second mapping step), and each image constituting the animation is output again based on the source image and the remapped texture image (second rendering step).
The real-time animation method according to the present invention as described with reference to FIG. 6 can be implemented as a computer program. The program implementing the real-time animation method is preferably produced as an API (Application Programming Interface) so that it can operate in the Java TV environment; in this case, the API may include a texture class for performing the mapping of the texture image and outputting the resultant image, and a conversion class for calculating the transformation matrix and the transformed coordinates. When the real-time animation method according to the present invention is implemented as a computer program, the real-time animation apparatus according to the present invention comprises a memory in which the source image, the texture image, the transformation matrix, and the coordinate information are stored, and a processor in which the program is installed and executed.
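One way the two-class API described above might be organized is sketched below. All names and signatures are hypothetical, since the patent names only "a texture class" and "a conversion class" without detailing them.

```java
// Hypothetical sketch of the API organization described above: a
// conversion class that holds update information and produces
// transformed coordinates, and a texture class that keeps the
// output-area vertices and updates them before each frame.
public class JavaTvAnimationApi {
    // Conversion class: movement (dx,dy), scaling (sx,sy), rotation theta.
    static class Conversion {
        double dx, dy, sx, sy, theta;
        Conversion(double dx, double dy, double sx, double sy, double theta) {
            this.dx = dx; this.dy = dy;
            this.sx = sx; this.sy = sy;
            this.theta = theta;
        }
        // Transform one vertex; returns {x', y'}.
        double[] transform(double x, double y) {
            double cos = Math.cos(theta), sin = Math.sin(theta);
            return new double[] {sx * (x * cos - y * sin) + dx,
                                 sy * (x * sin + y * cos) + dy};
        }
    }

    // Texture class: holds output-area vertices and applies updates.
    static class Texture {
        double[][] vertices;
        Texture(double[][] vertices) { this.vertices = vertices; }
        void update(Conversion c) {
            for (int i = 0; i < vertices.length; i++) {
                vertices[i] = c.transform(vertices[i][0], vertices[i][1]);
            }
        }
    }

    public static void main(String[] args) {
        Texture t = new Texture(new double[][] {{0, 0}, {1, 0}, {1, 1}, {0, 1}});
        t.update(new Conversion(10, 10, 1, 1, 0)); // pure translation
        System.out.println(t.vertices[2][0] + "," + t.vertices[2][1]); // 11.0,11.0
    }
}
```

An application would call the conversion class once per frame to update the texture class's vertices, then hand the vertices to the mapping and rendering steps.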
The present invention can also be embodied as computer-readable code on a computer-readable recording medium. The computer-readable recording medium includes any type of recording device in which data readable by a computer system is stored. Examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; the code may also be embodied in the form of carrier waves (for example, transmission over the Internet). The computer-readable recording medium can also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, these embodiments are given by way of illustration and example only and are not to be taken as limiting. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the scope of the appended claims.
Claims (14)
A real-time animation apparatus comprising: a storage unit for storing a source image used as a background in each image constituting an animation and a texture image whose shape is deformed in each of the images constituting the animation; a mapping unit for transforming and mapping the texture image corresponding to an output area of the texture image set on the source image, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation;
A rendering unit for outputting each of the images constituting the animation on the basis of the texture image mapped corresponding to the source image and the output area of the texture image; And
A transformed coordinate calculation unit for calculating a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing the texture image mapped corresponding to the output area of the texture image, and for applying the coordinate information of the mapped texture image to the transformation matrix to update the coordinate information of the mapped texture image corresponding to the output area of the texture image.
Wherein the coordinate information is a vertex coordinate of an area where the texture image of a polygonal shape is to be positioned in each of the images constituting the animation, and has a sequence corresponding to each vertex of the texture image.
Wherein the shape of the texture image and of the area where the texture image is to be positioned is a rectangle.
Wherein the rotation angle is an angle that rotates the texture image in a clockwise direction after locating the texture image at the origin of the coordinate plane.
Wherein the mapping unit converts the texture image by dividing the texture image into a plurality of triangles and calculating area coordinates with respect to the vertexes of the triangles.
Wherein the rendering unit selects a pixel position for which a pixel value is to be determined in an image constituting the animation, determines the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determines the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not belong to the output area of the texture image.
Wherein the rendering unit selects the pixel position for which a pixel value is to be determined within a minimum-size bounding box that includes the output area of the texture image set for the image constituting the animation and whose sides are parallel to the coordinate axes of the coordinate plane, determines the pixel value at the corresponding position of the transformed texture image as the pixel value of the selected pixel position if the selected pixel position is within the output area of the texture image, and determines the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position otherwise.
Wherein the mapping unit transforms and remaps the texture image mapped corresponding to the output area of the texture image set on the source image, based on the updated coordinate information of the mapped texture image, and
Wherein the rendering unit outputs each image constituting the animation on the basis of the remapped texture image corresponding to the source image and the output area of the texture image.
A real-time animation method comprising: a first mapping step of transforming and mapping a texture image, whose shape is deformed in each image constituting an animation, corresponding to an output area of the texture image set on a source image used as a background in each image constituting the animation, based on coordinate information of the texture image in a coordinate plane set for each image constituting the animation; a first rendering step of outputting each image constituting the animation based on the source image and the texture image mapped corresponding to the output area of the texture image;
A transformation matrix calculation step of computing a transformation matrix based on update information including at least one of a movement amount, a rotation angle, and a magnitude change ratio for changing a texture image mapped corresponding to an output area of the texture image;
A transformed coordinate calculation step of applying coordinate information of a texture image mapped to the output area of the texture image to the transformation matrix to update coordinate information of the mapped texture image corresponding to the output area of the texture image;
A texture image mapped corresponding to an output area of the texture image, corresponding to an output area of the texture image set on the source image, based on updated coordinate information of the texture image mapped corresponding to the output area of the texture image, A second mapping step of converting and remapping; And
And a second rendering step of outputting each of the images constituting the animation based on the source image and the remapped texture image corresponding to the output area of the texture image.
Wherein the coordinate information is a vertex coordinate of an area where the texture image of a polygonal shape is to be positioned in each of the images constituting the animation, and has a sequence corresponding to each vertex of the texture image.
Wherein the rotation angle is an angle that rotates the texture image in a clockwise direction after locating the texture image at the origin of the coordinate plane.
Wherein the first rendering step comprises:
A pixel position selecting step of selecting a pixel position to determine a pixel value for an image constituting the animation; And
A pixel value determination step of determining the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not belong to the output area of the texture image.
Wherein the first rendering step comprises:
A pixel position selection step of selecting a pixel position to determine a pixel value within a bounding box of a minimum size including an output area of the texture image set for the image constituting the animation and each side parallel to a coordinate axis of the coordinate plane; And
A pixel value determination step of determining the pixel value at the corresponding position of the converted texture image as the pixel value of the selected pixel position if the selected pixel position belongs to the output area of the texture image, and determining the pixel value at the corresponding position of the source image as the pixel value of the selected pixel position if it does not belong to the output area of the texture image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130077210A KR20150004483A (en) | 2013-07-02 | 2013-07-02 | Real time animation apparatus and method for Java TV graphics service |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20150004483A true KR20150004483A (en) | 2015-01-13 |
Family
ID=52476646
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130077210A KR20150004483A (en) | 2013-07-02 | 2013-07-02 | Real time animation apparatus and method for Java TV graphics service |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20150004483A (en) |
- 2013-07-02: application KR1020130077210A filed in KR, published as KR20150004483A; status: not active (Application Discontinuation)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |